21 May 2017

Healthcare ... A Universal Moral Right

A Barber-surgeon practising blood-letting
Posted by Keith Tidman

Is healthcare a universal moral right — an irrefutably fundamental ‘good’ within society — that all nations ought to provide as faithfully and practically as they can? Is it a right that all human beings, worldwide, are entitled to share in as a matter of justice, fairness, dignity, and goodness?

To be clear, no one can claim a right to health as such. As a practical matter, it is an unachievable goal — but there is a perceived right to healthcare. Where health and healthcare intersect — that is, where both are foundational to society — is in the realisation that people have a need for both. Among the distinctions, ‘health’ is a result of sundry determinants, access to adequate healthcare being just one. Other determinants include behaviours (such as smoking, drug use, and alcohol abuse), access to nutritious and sufficient food and potable water, the absence or prevalence of violence or oppression, and rates of criminal activity, among others. And to be sure, people will continue to suffer from health disorders, despite the best intentions of science and medicine. ‘Healthcare’, on the other hand, is something society can and does make choices about, largely as a matter of policymaking and access to resources.

The United Nations, in Article 25 of its ‘Universal Declaration of Human Rights’, provides a framework for theories of healthcare’s essential nature:
“Everyone has the right to a standard of living adequate for the health and well-being of himself and of his family, including . . . medical care and necessary social services, and the right to security in the event of . . . sickness . . . in circumstances beyond his [or her] control.”
The challenge is whether and how nations live up to that well-intentioned declaration, in the spirit of protecting the vulnerable.

At a fundamental level, healthcare ethics comprises values — judgments as to what’s right and wrong, including obligations toward the welfare of other human beings. Rights and obligations are routinely woven into the deliberations of policymakers around the world. In practice, a key challenge in ensuring just practices — and figuring out how to divvy up finite (sometimes sorely constrained) material resources and economic benefits — is how society weighs the relative value of competing demands. Those jostling demands are many and familiar: education, industrial advancement, economic growth, agricultural development, security, equality of prosperity, housing, civil peace, environmental conditions — and all the rest of the demands on resources that societies grapple with in order to prioritise spending.

Across these competing needs, similar constraints and inequalities of access persist across socioeconomic demographics and groups, within and across nations. Some of these needs, besides being important in their own right, also contribute — even if sometimes only obliquely — to health and healthcare. Their interconnectedness and interdependence are folded into what one might label ‘entitlements’, aimed at the wellbeing of individuals and whole populations alike. They are eminently relatable, as well as part and parcel of the overarching issue of social fairness and justice.

The current vexed debate over healthcare provision within the United States among policymakers, academics, pundits, the news media, other stakeholders (such as business executives), and the public at large is just one example of how those competing needs collide. It is also evidence of how the nuts and bolts of healthcare policy rapidly become entangled in the frenzy of opposing dogmas.

On the level of ideology, the healthcare debate is a well-trodden one: how much of the solution to the availability and funding of healthcare services should rest with the public sector, including government programming, mandating, regulation, and spending; and how much (with a nod to the laissez-faire philosophy of Adam Smith in support of free markets) should rest with the private sector, including businesses such as insurance companies, hospitals, and doctors? Yet often missing in all this urgency, and in the decisions about how to ration healthcare, is the recognition that the money being spent has not resulted in the best health outcomes, based on comparisons of certain health metrics with select other countries.

Sparring over public-sector versus private-sector solutions to social issues — as well as over states’ rights versus federalism among the constitutionally enumerated powers — has marked American politics for generations. Healthcare has been no exception. And even in a wealthy nation like the United States, challenges in cobbling together healthcare policy have drilled down into a series of consequential factors. They include whether to exclude specified ailments from coverage, whether preexisting conditions get carved out of (affordable) insured coverage, whether to impose annual or lifetime limits on protections, how much of the nation's gross domestic product to consign to healthcare, and how many tens of millions of people might remain without healthcare or be ominously underinsured, among others — all precariously resting on arbitrary decisions. True reform might require starting with a blank slate, then cherry-picking from among other countries’ models of healthcare policy, based on their lessons learned as to what did and did not work over many years. Ideas as to America’s national healthcare are still on the anvil, being hammered by Congress and others into final policy.

Amid all this policy ‘sausage making’, there’s the political sleight-of-hand rhetoric that misdirects by acts of either commission or omission within debates. Yet, do the uninsured still have a moral right to affordable healthcare? Do the underinsured still have a moral right to healthcare? Do people with preexisting conditions still have a moral right to healthcare? Do people who are older, but who do not yet qualify for age-related Medicare protections, have a moral right to healthcare? Absolutely, on all counts. The moral right to healthcare — within society’s financial means — is universal, irreducible, non-dilutable; that is, no authority may discount or deny the moral right of people to at least basic healthcare provision. Within that philosophical context of morally rightful access to healthcare, the bucket of healthcare services provided will understandably vary wildly, from one country to another, pragmatically contingent on how wealthy or poor a country is.

Of course, the needs, perceptions, priorities — and solutions — surrounding the matter of healthcare differ quite dramatically among countries. And to be clear, there’s no imperative that the provision of effective, efficient, fair healthcare services hinge on liberally democratic, Enlightenment-inspired forms of government. Whatever the style of governance, there is, more fundamentally, no alternative to local sovereignty in shaping policy. Consider the distinctly different countries of sub-Saharan Africa, which pose an interesting case. The value of available and robust healthcare systems is as readily recognised in this part of the world as elsewhere. However, there has been a broadly articulated belief that the healthcare provided is of poor quality. Also, healthcare is considered less important among competing national priorities — such as jobs, agriculture, poverty, corruption, and conflict, among others. Yet, surely the right to healthcare is no less essential to these many populations.

Everything is finite, of course, and healthcare resources are no exception. The provision of healthcare is subject to zero-sum budgeting: the availability of funds for healthcare must compete with the tug of providing other services — from education to defence, from housing to environmental protections, from commerce to energy, from agriculture to transportation. This reality complicates the role of government in its trying to be socially fair and responsive. Yet, it remains incumbent on governments to forge the best healthcare system that circumstances allow. Accordingly, limited resources compel nations to take a fair, rational, nondiscriminatory approach to prioritising who gets what by way of healthcare services, which medical disorders to target at the time of allocation, and how society should reasonably be expected to shoulder the burden of service delivery and costs.

As long ago as the 17th century, René Descartes declared that:
‘. . . the conservation of health . . . is without doubt the primary good and the foundation of all other goods of this life’.
However, how much societies spend, and how they decide who gets what share of the available healthcare capital, are questions that continue to divide. The endgame may be summed up, to follow in the spirit of the 18th-century English philosopher Jeremy Bentham, as ‘the greatest happiness for the greatest number [of people]’ for the greatest return on investment of public and private funds dedicated to healthcare. How successfully public and private institutions — in their thinking about resources, distribution, priorities, and obligations — mobilise and agitate for greater commitment comes with implied decisions, moral and practical, about good health to be maintained or restored, lives to be saved, and general wellbeing to be sustained.

Policymakers, in channelling their nations’ integrity and conscience, are pulled in different directions by competing social imperatives. At a macro level, depending on the country, these may include different mixes of crises of the moment, political and social disorder, the shifting sands of declared ideological purity, challenges to social orthodoxy, or attention to simply satiating raw urges for influence (chasing power). In that brew of prioritisation and conflict, policymakers may struggle in coming to grips with what’s ‘too many’ or ‘too few’ resources to devote to healthcare rather than other services and perceived commitments. Decisions must take into account that healthcare is multidimensional: a social, political, economic, humanistic, and ethical matter holistically rolled into one. Therefore, some models for providing healthcare turn out to be more responsible, responsive, and accountable than others. These concerns make it all the more vital for governments, institutions, philanthropic organisations, and businesses to collaborate in policymaking, public outreach, program implementation, gauging of outcomes, and decisions about change going forward.

A line is thus often drawn between healthcare needs and other national needs — with the tensions of altruism and self-interest opposed. The distinctions between decisions and actions deemed altruistic and those deemed self-interested are blurred, since they must hinge on motives, which are not always transparent. In some cases, actions taken to provide healthcare nationally serve both purposes — for example, what might improve healthcare, and in turn health, on one front (continent, nation, local community) may well keep certain health disorders from spreading to another.

The ground-level aspiration is to maintain people’s health, treat the ill, and crucially, not financially burden families, because what’s not affordable to families in effect doesn’t really exist. That nobly said, there will always be tiered access to healthcare — steered by the emptiness or fullness of coffers, political clout, effectiveness of advocacy, sense of urgency, disease burden, and beneficiaries. Tiered access prompts questions about justice, standards, and equity in healthcare’s administration — as well as about government discretion and compassion. Matters of fairness and equity are more abstract, speculative metrics than are actual healthcare outcomes with respect to a population’s wellbeing, yet the two are inseparable.

Some three centuries after Descartes’ proclamation in favour of health as ‘the primary good’, the United Nations issued to the world the ‘International Covenant on Economic, Social, and Cultural Rights’, thereby placing its imprimatur on ‘the right of everyone to the enjoyment of the highest attainable standard of physical and mental health’. The world has made headway: many nations have instituted intricate, encompassing healthcare systems for their own populations, while also collaborating with the governments and local communities of financially stressed nations to undergird treatments through financial aid, program design and implementation, resource distribution, teaching of indigenous populations (and local service providers), setting up of healthcare facilities, provision of preventions and cures, follow-up as to program efficacy, and accountability of responsible parties.

In short, the overarching aim is to convert ethical axioms into practical, implementable social policies and programs.

14 May 2017

The Philosophy of Jokes

I say, I say, I say...
Posted by Martin Cohen
Ludwig Wittgenstein, that splendidly dour 20th century philosopher, usually admired for trying to make language more logical, once remarked, in his earnest Eastern European way, that a very serious work (a ‘zery serieuse verk’) in philosophy could consist entirely of jokes.
Now Wittgenstein probably meant to shock his audience, which consisted of his American friend Norman Malcolm (whom he also once advised to avoid an academic career and to work instead on a farm), but he was also in deadly earnest. For humour is, as he is also on record as saying, ‘not a mood, but a way of looking at the world’. Understanding jokes, just like understanding the world, hinges on having first adopted the right kind of perspective.

So here's one to test his idea out on.
‘A traveler is staying at a monastery, where the Order has a vow of silence and can only speak at the evening meal. On his first night, as they are eating, one of the monks stands up and shouts ‘Twenty-two!’. Immediately the rest of the monks break out into raucous laughter. Then they return to silence. A little while later, another shouts out ‘One hundred and ten’, to even more uproarious mirth. This goes on for two more nights with no real conversation, just different numbers being shouted out, followed by ribald laughing and much downing of ale. At last, no longer able to contain his curiosity, the traveler asks the Abbot what it is all about. The Abbot explains that the monastery has only one non-religious book in it, which consists of a series of jokes, each headed with its own number. Since all the monks know them by heart, instead of telling the jokes they just call out the number.
Hearing this, the traveler decides to have a look at the book for himself. He goes to the library and carefully makes a note of the numbers of the funniest jokes. Then, that evening, he stands up and calls out the number of his favourite joke – which is ‘seventy-six’. But nobody laughs; instead there is an embarrassed silence. The next night he tries again. ‘One hundred and thirteen!’, he exclaims loudly into the silence – but still no response.
After the meal he asks the Abbot whether the jokes he picked were not considered funny by the monks. ‘Ooh no’, says the Abbot. ‘The jokes are funny – it’s just that some people don’t know how to tell them!’
I like that one! And incredibly, it is one of the oldest jokes around. This, we might say, is a joke with a pedigree. A version of it appears in the Philogelos, or Laughter Lover, a collection of some 265 jokes, written in Greek and compiled some 1,600-odd years ago. Yet despite its antiquity, the style of this and at least some of the other jokes is very familiar.

Clearly, humour is something that transcends communities and periods in history. It seems to draw on something common to all peoples. Yet jokes are also clearly things rooted in their times and places. At the time of this joke, monks and secret books were serious business. But the first philosophical observation to make, and principle to note, is that this joke involves one of those ‘ah-ha!’ moments.

Humour often involves a sudden, unexpected shift in perspective forcing a rapid reassessment of assumptions. Philosophy, at its best, does much the same thing.

07 May 2017

The Pleasures of Idle Thought?

Posted by John Hansen
What is the purpose of thought?  This was the focus of a monumental series of essays, chiefly written by the English lexicographer and essayist Dr. Samuel Johnson.  His essays, however, had a sting in the tail.
During the years 1758 to 1760, the Universal Chronicle published 103 weekly essays, of which 91 were written by Dr. Johnson.  These proved to be enormously popular.  The subject of the essays was a fictional character called The Idler, whose aspiration it was to engage in the pleasures of idle thought, to “keep the mind in a state of action but not labour”. Among other things, Dr. Johnson contemplates the many forms that idleness of thought can take – of which we describe a sample here: 
There is the kind of Idler, Dr. Johnson begins, who carries idleness as a “silent and peaceful quality, that neither raises envy by ostentation, nor hatred by opposition”.  His life will be less dreadful and more peaceful if he refrains from any serious engagement with matters, and yet he should not “languish for want of amusement”.  He needs the beguilement of ideas.

There is the Idler, too, who is on the point of more serious thought, yet “always in a state of preparation”.  It cannot fully be classified as idleness, since he is constantly forming plans and accumulating materials for the “main affair”.  But perhaps he fears failure, or he is simply captivated by the methods of preparation.  The main affair never arrives.

Then there is the Idler who, in his idleness, begins to feel the stirring of a certain unease.  He fills his days with petty business, and while he does so productively, yet he does not “lie quite at rest”.  When he retires from his business to be alone, he discovers little comfort.  His thoughts “do not make him sufficiently useful to others”, and make him “weary of himself”.

In fact, in time, there is the Idler who begins to tremble at the thought that he must go home, so that friends may sleep. At this time, “all the world agrees to shut out interruption”.  While his favourite pastime has been to shut out inner reflection, yet such inner reflection now seems to press in on him from all sides.

As life nears its end, there is the Idler who fears the end, yet in continuing idleness of thought, he seeks to ignore the fact that each moment brings him closer to his demise.  He now finds that his idle thoughts have trapped him.  His own mortality is disconcerting, yet something which he has never known how to face before.

In his final essay, which is written in a “solemn week” of the Church – a week of “the review of life” and “the renovation of holy purposes” – Dr. Johnson expresses the hope that “my readers are already disposed to view every incident with seriousness and improve it by meditation”.  Any other approach to thought will finally be self-defeating.
There are many, writes Dr. Johnson, who when they finally understand this, find that it is too late for them to capture the moments lost.  The last good gesture of The Idler is to warn his readers that the hour may be at hand when “probation ceases and repentance will be vain”.  Idleness of thought is not after all as innocent as it seems.  It comes back to bite you.  The purpose of thought, then, is ultimately to engage with life’s biggest questions.

It seems a remarkable achievement that Dr. Johnson apparently held an overview of about 100 essays in his head, which followed a meaningful progression over a period of three full years.  These essays continue to provoke and inspire today.  All but one – which was thought to be seditious – were bound into a single volume. An edition which is still in print and still being read by “Idlers” today is recommended below.



Read more:

Johnson, Samuel. “The Idler.” Samuel Johnson: Selected Poetry and Prose, edited by Frank Brady and W.K. Wimsatt, University of California Press, Ltd., 1977, 241-75.

By the same author:

Eastern and Western Philosophy: Personal Identity.

30 April 2017

Picture Post #24: The Privilege of Being Near and Far


'Because things don’t appear to be the known thing; they aren’t what they seemed to be, neither will they become what they might appear to become.'

Posted by Tessa den Uyl and Martin Cohen


Image credit: from an original photographic plate created by Thomas Scarborough

The pictured child is not far, but too far to be near; or too close to be far, but not that near.  Instead, they are halfway, as the background, or foreground, seem to be as well. In-between is where we make distinctions; the difference is always in-between. But rather than representing elements between which a difference is made, this picture seems to represent the in-between itself.

Humidity and temperature change have touched the chemicals of this slide, and ‘X-rayed by life’ in this way, existence reaffirms itself as an ever-changing movement. Within the invisible that becomes visible, we might think to collect memories, freeze moments into pictures, and hence even to think of something as permanent...  yet, little by little, these perceptions are all erased by the visible that withdraws into the unknown.

Stability does not exist. The in-between hands to us that which we think, but do not truly know, and maybe if life could see us, it would say ‘we do not know much’. Thoughts alone make a thread that by stiff perseverance does not break, however often we may have to observe that the tissue is of dubious nature...

23 April 2017

Fact and Value: The Way Ahead

Grateful acknowledgement to Bannor Toys for the image
Posted by Thomas Scarborough
Philosophy may begin to solve a problem as soon as it has identified it.  All too often, it has not.  This post, then, is about defining a problem—no more.  It is one of the most urgent problems of philosophy.
One of the most important aspects of philosophy is ethics.  Yet there is an issue which is prior to ethics, which has to be addressed first.  It is the problem of the fact-value distinction—a problem which, since it first appeared on the philosophical map, has cut a divide between fact and value, and more importantly, philosophy and ethics.  In the words of Ludwig Wittgenstein, ethics has become ‘what we cannot speak about’.  Yet ethics is all that we do, from morning until night, from year to year.  Today, this problem has filtered through to the common person, and has caused profound disorientation in our time.  On a social level, we are conflicted and confused with multiple ethics, while on a global level, our ethics increasingly seem to have come apart, with widespread poverty, social disintegration, and environmental destruction. 

It seems easy to describe the philosophical problem, yet far from easy to offer a solution.  Should I take a walk in the woods today, or should I write letters instead?  Should I be a ‘bachelor girl’, or should I marry Joe?  Should we travel to Mars?  Should we drop the Bomb?  On the surface of it, our reasons for choosing one course of action over another might seem obvious, yet it is not something we find ourselves able to decide on the basis of facts.  The problem is basically this: we know that this is how the world ‘is’—yet how should we know how it ‘ought’ to be?  The philosopher David Hume gave the problem its classical formulation: it is impossible to derive an ‘ought’ from an ‘is’.  It is impossible to establish any value amidst an ocean of facts—and on the surface of it, Hume would seem to be unimpeachably right.  The facts cannot tell us what to do. 

As we seek a solution to the problem—because we must solve this problem if we are to find our way through to any discussion of ethics—Hume’s conclusion would seem to mean only one of two things: either he identified a problem which cannot be solved, or he was thinking in such a way that he created his own problem.  What, therefore, if Hume laid the very foundation on which the fact-value distinction rests? 

Hume considered that all knowledge may be subdivided into relations of ideas on the one hand, and matters of fact on the other.  That is, one begins with a handful of facts, then relates them to one another.  It is the simple matter of a world where facts exist, and these exist in a certain relation to one another—yet one finds no basis on which to determine what that relation ought to be.  Generations later, the philosopher Bertrand Russell wrote that many philosophers, following Kant, have maintained that relations are the work of the mind, while things in themselves have no relations.  While Russell was not saying precisely the same as Hume, he was not far off.  A similar view is reflected in the theory of language.  The philosopher Rudolf Carnap held a similar view of the ‘material mode of speech’, in which, as philosophy professor Simon Blackburn puts it, objects and their relations are the topic.  Wittgenstein, too, held this view, in his own unique way, through his multiplicity of language-games.

A pebble is a thing.  A house is a thing.  Even gravity, ideology, taxonomy are ‘things’ in a way (we call them constructs), which in turn may be related to other things.  In a sense, even a unicorn is a thing, although we are unlikely ever to find one.  Things, then, may further be involved in what we call truth conditions—which means that they may be inserted into statements, which can be affirmed or denied.  And when we affirm such statements, we call them facts.  For example, we insert the thing ‘pebble’ into a statement: ‘A pebble sinks’—or we insert the thing ‘unicorn’ into a statement: ‘The Scots keep unicorns.’  Our things are now involved in truth conditions, which means that our world is filled with facts.  And if not facts, then denials of facts.

Here, I think, is where the problem lies—and the way ahead.  To say that there is a fact-value distinction means that we have first divided up our reality into things on the one hand, and relations on the other.  On what basis, then, might we find our way back to a ‘grounded’ ethics?  Personally I believe the solution lies in the direction of levelling both fact and value to value alone—or things and relations to relations alone—in all fields, including science and mathematics.  Yet even then, we would not finally have reached the goal.  Even if we should be able to see everything in terms of value, which values should then be true, and which false?  And having once solved which values are true, we would need to establish on what basis I should—or could—submit to them.

16 April 2017

On Quanta and Trees

Does Observation Create Physical Reality?

Image found on Mythapi Facebook page. Author unknown
Posted by Keith Tidman
The intervention of conscious observation into the quantum world — that observing an object to be in a particular location causes it actually to be there — is one of the core tenets of quantum theory, a tenet rigorously upheld through multiple experiments. The observer — his or her consciousness — cannot be separated from that physical reality. There is no reality independent of observation.
As the visionary quantum physicist John Wheeler stated it, “No . . . property is a property until it is observed.” Which seems to apply as much to macro-sized objects — things in everyday life — as to micro-sized objects.

Three hundred-plus years ago, the philosopher George Berkeley prefigured the spirit of quantum theory’s then-future influence on the nature of reality, declaring, esse est percipi [to be is to be perceived]. Perception, he presciently advocated, is the essential benchmark — the necessary condition — for existence. The reality of things thus emerges from perception. As long as conscious observation is involved — a manifestation of the observer's capacity to consummate physical reality — all objects, large and small, acquire their existence.

So, how does this work? Quantum theory explains that until observation occurs, a potential object is in what’s called a state of ‘superposition’. An object, while in superposition, can be in any number of places, with observation causing it to be in just one location. There is no object isolated in space before it is observed or measured. Upon being observed, the object goes from potentiality to actuality in that one location, the same for everyone.

What’s in superposition is the so-called ‘wave function’ — a mathematical description of all the possible states of an object. Only upon being observed does the wave function instantaneously and irreversibly ‘collapse’, causing the object to be in just one location. There is no distinction between the wave function and the object. According to the physics, the wave function is the object — in one-to-one correspondence with the physical thing.

The effect of observation and measurement has also been demonstrated by the so-called ‘double-slit experiment’. A stream of photons (light particles) passes one at a time through a screen with two slits. Behind the screen is a photographic plate, to capture what comes through the slits. In the absence of an observer, each photon will have appeared to pass through both slits simultaneously before creating a distinct interference pattern on the back plate — acting, in other words, like a wave, able to pass through both slits at once. However, in the presence of an observer — a person or detecting device in front of or behind each slit to see which slit the photon goes through — the interference pattern no longer shows up. Each photon appears to have passed through only one slit or the other. The photon has no location in spacetime until it’s observed or measured.

As suggested by both examples — the collapse of a wave function and the double-slit experiment — observation may be performed by a person directly and in real time. Or, observation may be accomplished by an apparatus (detector), whose measurements are observed by scientists later. In either case, observation remains critical, as explained by physicist and philosopher Roger Penrose:
“Almost all the interpretations of quantum mechanics . . . depend to some degree on the presence of consciousness for providing the ‘observer’ that is required [for] the emergence of a classical-like world.”
Meanwhile, the effects of these events also play into what’s known as quantum entanglement — what Albert Einstein famously dubbed ‘spooky action at a distance’. Quantum entanglement occurs when two particles remain ‘connected’, without regard to time and distance — that is, instantaneously, even at enormous distances — in such a way that actions performed on one particle are observed to have an immediate and direct effect on the other. This curious phenomenon, spooky or not, has been confirmed.

So, to the point, what does all this tell us about physical reality?

Causing the reality of an object by observation points to this initial moment of creation being subjective. It’s where an observer first intervenes — until which point there is only ‘potential reality’. Accordingly, ‘initial reality’, as we might call it, requires intervention by an observer — either a person or a measuring device. Again, that initial moment of reality is subjective.

However, once initial conscious observation has occurred, the object henceforth exists for everyone. Further instances of observation change nothing about the physical reality already having been created. Reality is thus locked in for everyone — everywhere. Everyone who looks will find the object there, already existing. At that moment, reality is objective — the initially observed object remains so, existing for everyone.

In sum, then, the key takeaway is the presence of both a subjective and objective aspect to reality, depending on the moment — initial observation followed by subsequent observation.

At the moment of causing the object to exist, the observer also causes that object’s entire history to exist. Observation causes both the current reality and related past realities (history) to exist. Whether this is so for literally all observed things remains debatable — quantum mechanically, cosmologically, and philosophically.

Might the notion include, for example, the whole universe — the ultimate macro-sized object? As Wheeler postulates, in our looking rearward to the universe’s beginnings, might our observations result in selecting one out of alternative possible cosmic quantum histories, back to the Big Bang almost fourteen billion years ago? And, in line with the ‘anthropic principle’, might that quantum history account for the many finely tuned features of the universe essential for its and our existence — resulting in an objective macro-reality, the same for everyone, throughout the universe?

Accordingly, Berkeley argued that observation accounts for what gives material things their experienced qualities — an object’s initially experienced reality (its presence and qualities) as well as an object’s subsequently experienced reality.

Where Berkeley’s philosophy converges with the core of this discussion regarding the basis of objects’ reality is in his argument that observation — perception — is essential for something to exist, a position that has been characterised as Berkeley’s empirical idealism. Berkeley argued that material objects are dependent on, not independent of, observation. In this important sense, observation and existence are the same. That is, they ‘cohere’.

09 April 2017

Breaking the Myth of Equality

Posted by Sifiso Mkhonto
‘Know the enemy and know yourself,’ wrote Sun Tzu, ‘and you need not fear the result of a hundred battles.’ Sun Tzu was referring to knowledge—and the right kind of knowledge, he noted, brings victory.
The enemy I speak of is colonialism. Not the colonialism which, for some, may seem to lie in the distant past, but colonialism in the new and (almost) universal understanding of the word — namely, those features of colonialism which persist long after the coloniser has formally withdrawn. Colonialism in the new view refers to influence, ties, privilege, specialisation, domination, exploitation, and superiority.

The contrast between the old and the new was tragically highlighted recently when a South African premier, Helen Zille, respected for her role in exposing a major apartheid era cover-up, took to social media to declare that colonialism had brought about positives, including the judiciary and the transport system. This was the old view, a shallow understanding of colonialism, out of touch with the world in which we now live, and heartless. In the new view of colonialism, many consider the positives unintended benefits, for the reason that the system was not created to benefit the majority, but only a certain group.

Colonialism, in the new understanding of the term, cannot be justified under any circumstances. To justify it may be compared with a woman who is raped, falls pregnant, and gives birth to a beautiful child. The child grows to be a successful young man, and now the rapist sends a letter to the victim, declaring that the rape has brought blessing for all. In this case, the victim wishes not to be on equal terms with the perpetrator. The Martinican poet Aimé Césaire said about colonialism, ‘I am talking about societies drained of their essence, cultures trampled underfoot, institutions undermined, lands confiscated, religions smashed, magnificent artistic creations destroyed, extraordinary possibilities wiped out.’

How then shall we overcome the enduring legacy of colonialism? How may we finally break the heavy yoke? I return to the subject of knowledge. In three areas in particular, I see our knowledge of the situation as being critical to its transformation. All of these areas need to be clearly understood, because they serve as enduring instruments which contribute to the reluctance of the oppressor to be equal to the oppressed:
• Knowledge of racism. Race is the major factor which the oppressor uses to exploit natives, and even foreigners. The ability to convince a certain group, often implicitly and insidiously, that a certain colour of skin is lordly, has to be the greatest instrument used to ensure that equality remains a myth. Even today, it would seem that most people give credence to this illogical belief. Through this perspective, many cultures have distorted and damaged their own norms, values, and practices, particularly in Africa.

• Knowledge of religion. Is the information provided by religious leaders a message that promotes equality, while uplifting the soul and uniting society? Or does religion factor into inequality because, with disregard for our social status, it proclaims that in God’s eyes we are all equal? Many religious leaders, too, proclaim a prosperity gospel, selling the idea to many in society that wealth is accumulated through worshipping God in a certain way—yet through the same message, they themselves become rich, so practically denying the quest for equality.

• Knowledge of generational wealth. Generational wealth proves that inheritance is a foreign word to many Black Africans. This wealth was fashioned through taking advantage of the preferential treatment that White people received through colonialism—and in South Africa, through apartheid. It is common knowledge that a White South African has a greater opportunity of living a monetarily comfortable life than a Black South African counterpart. To equalise this requires that the one who is ahead uplifts the one who is left behind. The anti-apartheid activist Steve Biko emphasised that White people should gather amongst themselves to discuss their common problem.

Sadly, those who ought to be at the forefront of developing Africa’s knowledge in these things have so often failed us themselves. The very people who should be helping us—religious and non-religious leaders, and members of state—operate on the same level as colonialists. They perpetuate a system of oppressor vs. oppressed. Only a sectional few desire equality, and in South Africa, we have now become the most unequal nation on earth.

The need is knowledge which teaches, rebukes, corrects, and trains—a kind of knowledge which is above all price for building a united society, because it helps us to do what is right. The Liberty Life Group in South Africa issued the following statement: ‘Knowledge is not merely fact. It is not a badge. It is not a bragging right. It is those few words that completely, utterly alter the way you see things.’ May we share knowledge that will build, unite, and assist us in redressing the injustices of our continent.

Image acknowledgement: Collectie Tropenmuseum.

02 April 2017

Picture Post #23: Politicians Seeking to Picture the Historic Moment



'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'


Posted by Tessa den Uyl and Martin Cohen

The UK Prime Minister, Mrs May, signing the Article 50 notification

March 29th 2017 was the date that the United Kingdom, in the form of its Prime Minister, Theresa May, formally notified the European Union, by letter, of its intention to leave.

The UK is split down the middle by the plan, with a fraction over half voting 'for' and just under half voting 'against'. But within the ruling Conservatives, a resurgent nationalism and indeed triumphalism dominates.

And this is what first stands out about this carefully prepared and balanced image. The very dominant ‘Union Jack’. Now British politicians do not usually cloak themselves in the national flag—it used to be considered inappropriate, a usurpation. The flag has also been for too long associated with 'unacceptable' political parties, like the National Front, who stood for Parliament (and fought in the streets) on an openly racist platform about expelling 'other races' from the country.

But now a Prime Minister with a very similar platform, expelling European ‘migrants’ from the country, involving a backward-looking and divisive notion of Britishness, is actually in power.

So the Union Jack is there, on Mrs May's left, carefully arranged to display the red English cross to good effect. And on her right is … an empty chair, conveying a sense of isolation. Above the Prime Minister is not Big Ben, but a small, wind-up, carriage clock. Such things are anachronisms in an age of digital timepieces. They need winding regularly and are much less accurate. Here, the clock looks sad, and conveys only the impression of monotonous, dusty waiting rooms …

Mrs May sits exactly in front of a marble fireplace. Marble sends subliminal messages about wealth and importance. But it is also the stone of cemeteries and mausoleums.

The British Prime Minister wears a black outfit, which despite the white speckles does not quite manage to dispel the funereal feel. Mrs May used to be a banker, working at the Bank of England and the Association for Payment Clearing Services. But here she looks less like a banker and more like a lawyer, signing a Very Important Document, such as a death certificate.

And so in a sense it may actually turn out to be—for her Brexit programme could be the death of the United Kingdom.

26 March 2017

Poetry: On Thinking

Posted by Chengde Chen*


It is said that man is the animal that thinks
I don’t know whether animals think or not
but the crowd often doesn’t

From the unifying roar of the Third Reich saluting the Führer
to the wave of the ‘red ocean’ rolling towards the Red Sun
from rock stars’ pretended madness surging into real madness
to Manchester United’s football directing the eyes of the world
the crowd is so simple and so easy to manipulate
Whether it is past or present, east or west
whether it is about religion, war, rock stars or football stars
different fanaticisms are not different!

Why don’t people who can think think? Because
trends are greater than thought
traditions are heavier than thought
faiths are stronger than thought
power is more powerful than thought
A madman’s hysteria can become a nation’s reason
a dead dogma can become a social movement, because
a head without thinking can be filled with anything!

What is thinking? Thinking is not memory
nor reciting hundreds of classical poems
Thinking is not longing, nor lingering under the moonlight
Thinking is not calculation, nor differentiation or integration
Thinking is not fantasy, nor a dream in the daylight
Thinking is the deity’s atheistic advancement –
with reason cutting through the magnetic field of concepts
generating the omnipresent electricity of criticism

For a trend, it is a cold reef
For tradition, it is a rude drunkard
For religion, it is a self-appointed God
For power, it is the blind who see nothing
It is the will of water, and the breath of fire
Logic is its iron hooves, galloping through the universe
It may not be difficult to subdue a thinker
but a thought cannot be conquered
much as no force can make one equal two!


To think is not man’s instinct or necessary function
Without Copernicus, the Earth would still rotate
Without Darwin, apes would still have evolved into man
Yet thoughts make the difference between men
greater than that between man and a deity

Kant, who never travelled beyond his Königsberg
invited God into his heart to discuss ‘Practical Reason’
Einstein, being a junior clerk of a patent office
caught up with light to gain eternal life in four dimensional space
Some people have never thought throughout their lives
so life owes them a world
While some who have, have created new worlds!

Today we are proud of our digital capability
But machines carrying out man’s instructions
is man executing machines’ orders
Hence the Matthew Effect:
those who think, think more; those who don’t, even less
The net may have caught everybody
the high-performance screen can be more desolate

It is said that “when man thinks, God laughs”
But if man doesn’t, God would be bored
He likes the fun of thinking but not the hard work
so He created man to do the job for Him
How should the creature deserve the creation?
This oldest of Greek issues about thinking
is still the first thing that needs to be thought



* Chengde Chen is the author of Five Themes of Today, Open Gate Press, London. chengde@sipgroup.com

19 March 2017

The Trouble With Fallacy

Posted by Thomas Scarborough
‘That’s fallacious!’ people say, and no greater fault can be laid at the foot of philosophers, or anyone else who offers arguments. And yet, outside its tidy logical definition, the term ‘fallacy’ comes with many far from straightforward assumptions ...
The first thinker in the Western world to approach the concept of fallacy in a systematic way was Aristotle, and his thinking on the matter, set out in a work known as On Sophistical Refutations, remains a touchstone on the subject to this day. Yet Aristotle's work shows us just how far we have drifted—and, it is argued here, lost our way:
• For Aristotle, the point of identifying fallacy was to avoid ‘the semblance of wisdom without the reality’. Today, the emphasis is rather on syllogistic reasoning (see below), and reasoning would seem to have become an end in itself. In the words of philosophy writer Tim Ruggiero, ‘the focus is the method’. That is, Aristotle, in his time, placed a far greater emphasis on what one would hope to produce through sound reasoning, rather than on the reasoning itself.

• Aristotle set no limits to what fallacy might include. Fallacy had to do with getting at the truth, and wherever the truth was impeded, there was fallacy. Aristotle was interested in ‘reasoning about any theme put before us from the most generally accepted premisses that there are’. Today, however, fallacies tend to be ruled in or out by rules that are technical. Philosophy professor Robert Audi notes that if we do not have an argument—even as we subvert the goal which is wisdom—this may not qualify as a fallacy today.

• Aristotle considered that a fallacy has either to do with ‘silly’ mistakes, or with the failure to take account of all of reality. Fallacy, he noted, occurs either through ‘stupidity’, or ‘whenever some question is left out’. That is, all fallacy, unless it is ‘stupid’, fails to take something into consideration. Today, by way of contrast, the emphasis on that which is left out would seem to be all but completely overlooked.
These three points are, in fact, intertwined. The goal of sound reasoning is wisdom, wisdom takes everything into account, and where one fails to take everything into account, one falls short of the wisdom that one seeks—apart from the 'silly' mistakes, that is. The implication is that fallacies occur where our minds fail to range broadly through our world.

By contrast, formal fallacies, today, generally concern only classical syllogisms without variables. A well-known example of a valid syllogism is this:
All men are mortal
Socrates is a man
Therefore Socrates is mortal
And an example of an invalid syllogism is this:
Socrates has two legs
Birds have two legs
Therefore Socrates is a bird
As with many fallacies, we immediately feel that something is wrong with the second example, yet it may be hard to define just what.
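The difficulty can be made explicit in modern logical notation (a sketch; the predicate names are mine). The first argument is valid: it instantiates the universal premise and applies modus ponens. The second commits what logicians call the fallacy of the undistributed middle: the shared term ‘two-legged’ is the predicate of both premises, so the inference amounts to affirming the consequent.

```latex
% Valid: universal instantiation, then modus ponens
\forall x\,\bigl(\mathrm{Man}(x) \rightarrow \mathrm{Mortal}(x)\bigr),
\quad \mathrm{Man}(s) \;\vdash\; \mathrm{Mortal}(s)

% Invalid: the middle term `two-legged' is undistributed,
% so this is structurally affirming the consequent
\forall x\,\bigl(\mathrm{Bird}(x) \rightarrow \mathrm{TwoLegged}(x)\bigr),
\quad \mathrm{TwoLegged}(s) \;\nvdash\; \mathrm{Bird}(s)
```

Here s names Socrates. Any two-legged non-bird, such as a man, satisfies both premises of the second argument while falsifying its conclusion, which is precisely the ‘something left out’ that Aristotle points to.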

But then, in fact, it may be said that every fallacy leaves something out of consideration. The ad hominem fallacy, for example—the argument that rests on a criticism of whoever has uttered it—fails to consider the facts of the matter; the fallacy of denying the antecedent fails to consider the excluded set; the genetic fallacy fails to consider the present reality, and so on.

Fallacy, then, is not merely about its more recent focus—correct syllogisms and sound conclusions, among other things. Rather, it is about what we leave out of our thinking. There may be nothing more needful in our time:
What is it that has been left out of our economic thinking that has led to social inequality? What is it that has been omitted from our technological thinking that has led to ecological ruin? What is it that has been left out of our political thinking that has led to our transgressions of human rights?
Fallacy, wherever it is found, comes down to a kind of short-sightedness that fails to range through all the world. In an important sense, it is not about mere ‘reasoning’ alone.