01 July 2018

Picture Post #37: A Celebration of Brashness!



'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Tessa den Uyl and Martin Cohen

A postcard presentation of Times Square

Times Square, New York.
‘The soft rush of taxis by him, and laughter, laughters hoarse as a crow’s, incessant and loud, with the rumble of the subways underneath - and over all, the revolutions of light, the growings and recedings of light - light dividing like pearls - forming and reforming in glittering bars and circles and monstrous grotesque figures cut amazingly on the sky.’
During the so-called Jazz Age, that optimistic time after ‘the Great War’ and before the Depression, the rise of Nazism and the Second World War, F. Scott Fitzgerald’s imagery in his novel The Beautiful and Damned reflects so well human despair combined with hope.

Acts of freedom and expression intertwine to be heard and noticed, to forget and to distract, to employ, and to hope... In those days, Times Square must have appeared promising, like a colourful stamp on the continent. But what did its message say?

Ideas about segregation and freedom brought ‘silent’ new horizons and made former distinctions tremble. With all there was to come, in those years of the Roaring Twenties, all the layers that combine to make a society were looking for ‘a voice’ and the call echoed, near and far. 
 
People rather grandly called Times Square the ‘crossroads of the world’ and in those days, that might well have been so. And today, on the edge of the square, the NASDAQ controls a good slice of the world’s wealth and the New York Times does likewise for the world’s news.
 
Yet it is after dark, after the office day has finished, that the square really comes alive. It is doubtful, though, whether that liveliness today is filled with the same complexity and struggle, or with that necessity, literal and symbolic, to survive. While the square once stimulated a proper voice, ‘light dividing like pearls’, Times Square now embraces more of a homogenisation, and offers monstrous grotesque figures cut amazingly out of the sky.


24 June 2018

The Importance of Being Seen

Posted by Simon Thomas
Of all our innermost human desires, nothing seems to trump that of being seen. Humanity is meant for community and for togetherness. There is something very fundamental about being acknowledged, of being seen, as a person.
The Internet, with all its wonders and its ability to connect people from all over the world, does indeed meet some of this need, but virtual reality does not on any level substitute for real human interaction on a personal level. It does, however, point to the fact that there is a real need for it. A person can have untold ‘friends’ or followers on social media, but this does not replace real human conversation. And we are sick because of it.

Everyone needs to know that what they say, or the very fact of their being, is important to somebody. The heart longs for a human embrace, for the attentive ear of another who shows some interest in being with them and listening to what they say. Psychological research has shown that when people are perpetually in a situation where they are ignored, this causes real emotional pain, which in turn leads to physical problems through the stress of being ignored on an ongoing basis.

This is all the more prevalent in our society of individualism. Everyone wants to connect, but there seems to be an incessant preoccupation with connecting with everything and everyone except that which is in our present reality. It is true, a person can feel lonely in a crowd, and experience intense feelings of abandonment even in the company of others. This is especially true in our society, with its preoccupation with distraction.

Things become more important than people; virtual friendships on-line become more important than friendships we can experience in real time and in real situations. We see, but we don’t see each other. We put each other in categories, and fail to recognise how much we are the same, with the same need to communicate, and the real need simply to communicate with those who share our time and space.

Families today, too, have gone this route. People live under the same roof but do not communicate; there is little or no interaction. The whole emphasis has shifted from ‘how can I serve’ to ‘how can I get something out of this person’.

I have this kind of relationship with my dog. He comes to me when he is hungry or wants something from me. And that is okay. Animals do not have the complex relationship needs that human beings have. But my dog has what I call ‘cupboard love’ -- he loves me for what he can get from me. That, however, is not to be the way we interact with our fellow human beings. It is the height of selfishness. And very often the cause of much emotional and mental anguish.

I have noticed, however, that this is how much of the Internet works. Someone has something on offer which the other wants, and while felt needs are met on a superficial level, there is no lasting connection. It is understandable that people want the connection, but they don’t want to acknowledge the person they interact with. We come across many people in our daily lives: in the office, at the bus station, in the shops, at church. But as many can testify, even after we exit a party or a group of people we can feel drained.

It is important to listen to one another. Even a brief interaction can be meaningful if the person we talk to makes us feel that we have been seen, that we have been acknowledged. It is not uncommon to go through a day and, while we do many things in the course of our day’s activities, be left empty. What is that? Well, I perceive that the reason we fail to connect is that we objectify people and treat them as less than they are.

Human beings are made imago dei -- in the image of God. We were created to interact and communicate; we were made to live in community and not in isolation. To be human is to share in the common human experience, and to live in such a way that we acknowledge one another, and not allow our many distractions to detract from how we relate to one another.

17 June 2018

White Lies – Malevolence or Defence?

Little White Lies, by e9Art
Posted by Christian Sötemann
A little thought experiment: In the year 2088, a mentally highly volatile leader of an autocratic world power is undergoing yet another personal crisis. His wife, so he has heard, is secretly planning to leave him. Without her, he sees no meaning in going on. Since he is also a narcissistic megalomaniac, in his dark mood he decides that the world should perish if she left him. He prepares to give the order for a nuclear strike and confronts his wife about her secret plans.
Now, what would be a wise thing for her to answer, even if she actually planned on leaving him? Surely, most people would say something along these lines: calm him down, say that everything is fine, just keep him from ordering a nuclear strike. The rest will be sorted out later. Hence she should lie to save the world from a nuclear attack.

That’s that then, right? Not so fast. In ethics, the role of the lie has been hotly debated. Among the ethical stances, there are some which emphasise the consequences of an action in determining whether it is moral or not. Many of the supporters of these approaches would probably have few issues with the wife’s lie. The argument would go like this: lying in this particular case prevents unfathomable damage occurring to millions of people, so it is the right decision.

There are, however, perspectives in ethics that focus more on principles and duties rather than consequences of actions, notably in Kant’s categorical imperative: ‘Act only according to that maxim by which you can at the same time will that it should become a universal law’. From this point of view, in its strictest form, a lie cannot ever be legitimate, because human relationships would become poisoned if everybody lied to each other all the time.

In many cases, there is some validity to that principle. We have to be able, at least most of the time, to rely on what people around us tell us. The lie has to be the exception rather than the rule. Our everyday life would be seriously impaired if we all lied to each other all or most of the time.

Still, there is a point to be made for white lies. Schopenhauer viewed lies as a legitimate form of self-defence in cases of extortion, threat or unauthorised interference or intrusion, among other things. If I am exposed to an evil will, lying can be part of the arsenal to defend myself.

For example, if somebody broke into my house, thus violating my right to privacy, my exclamation telling the burglar that the police were already on their way would represent a perfectly legitimate lie to make this intruder leave my house as quickly as possible. Similarly, a child threatened by bullies on the way home from school might want to use the white lie that its parents or elder brother were just around the corner. There is no malevolent deceit in situations such as these.

It seems that the most important aspect here is that there is a predicament which can make a white lie a suitable means to an end. To avert a catastrophe or a crime, white lies can come into consideration. Besides, from this perspective, the ‘lie’ aspect of the white lie becomes less relevant – rather, it becomes one of several means to defend oneself. It is something one can do to get out of a dangerous situation.

The application of the categorical imperative in this case should therefore not denounce the white lie as harmful, but could be reformulated as: ‘In a dangerous situation threatening the physical and psychological integrity of an individual in an illegitimate way, every individual should have the right to undertake sufficient actions to avert this threat’.

In German, one translation of ‘white lie’ is Notlüge, meaning, literally, ‘emergency lie’. Perhaps this serves to illustrate some cases in which a white lie seems appropriate. It is something that is more a verbal form of defence rather than a mere lie.

Certainly, it would be harmful to lie all of the time. And it can be harmful never to lie. The potential Kantian counterargument, that this takes the consequences of actions into consideration rather than holding to a principled stance regardless of what happens afterwards, is one that can be addressed.

But it represents another example of morality not necessarily being beholden to one orthodoxy throughout.  We may consider principles as well as consequences in our moral deliberations. There is something to be found between the extremes of rigidity and arbitrariness. So, we should not blame the dictator’s wife for her white lie. Those living in the year 2088 will be grateful for our leniency.

10 June 2018

BOOK REVIEWS: Back to the Future with the Food Gathering Diet

Posted by Martin Cohen*



How we imagine hunting and gathering - in this case, on the South Texas Plains


Food Sanity: How to Eat in a World of Fads and Fiction
By David Friedman (Turner 2018).

Psst! Maybe someone should have told David Friedman, well-known media personality as well as the author of this new look at food issues – there are hardly any vegans. So if you pitch a book on ‘how to eat’ to that crowd, you take the risk of ending up preaching to a much reduced congregation. Add to which, the serious vegans in town won’t like some of what Friedman has to say, because vegans don’t eat eggs and certainly don’t eat fish. All of which only goes to show that food is a pretty controversial and divisive issue these days, and if you want to be honest, as Friedman evidently does, you’re going to have to risk trampling on the dearly held, indeed dearly munched, beliefs of lots of people.

But I hope Food Sanity does find that wider readership, because I’ve read a lot of books and articles recently about food and this one really does clear out a lot of the deadwood and present some pretty mind-boggling facts (and figures) to ‘put the record straight’, as Jack Canfield (of Chicken Soup for the Soul fame) puts it, by way of an endorsement of the book.

Take one opening salvo that, as I say, will surely lose Friedman lots of readers in one fell swoop: the Paleo or ‘Caveman’ Diet. This is probably the most popular diet going, and that’s likely because it fits people’s dearly held prejudices so excellently. Plus, it allows them to eat lots of beef-burgers and chips, while cutting out things like muesli, which only hippies eat anyway. But oh no, Friedman has done his research and found out that Stone Age folk didn’t really eat lots of red meat washed down with a beaker of blood, as we like to imagine. Instead, using both archaeological and anthropological research as a guide, he says that the earliest human tribes spent most of their time eating fruits and seeds, which they gathered, and probably only really sharpened the spears (or so, at least, I imagine) for internecine human disputes.

Friedman finishes his deconstruction of Paleo by considering human biology too: notably the fact that we just aren’t built to catch our fellow animals. We lack the right claws, teeth and general physique. He points out, a thing curiously overlooked, that Stone Age people would have been rather short and squat - not the fine figures wielding clubs that we imagine. He retells Jared Diamond’s tale of a hunting trip by one of today’s last remaining ‘stone age’ tribes, in New Guinea. At the end of the hunt, the tribe had caught only some baby birds, frogs and mushrooms.

This is all fascinating to me, but compelling too are Friedman’s physiological observations, most particularly on the acidity of the human stomach. The gastric fluids of carnivores are very acidic (pH 1), which is essential if they are to break down proteins and kill bacteria. Our stomachs, however, are much less acidic (pH 5), and simply can’t tolerate much uncooked meat. And while, yes, Stone Age man might have done a bit of cooking, it would probably have been rather rudimentary, with parts of the meat not really cooked.

Actually, by the time I had finished reading all of the reasons that ‘humans can’t eat meat’, I was left puzzled by Friedman’s conclusion, which was that a significant proportion of the prehistoric human diet (nonetheless) seems to have been meat. Less surprising was Friedman’s hearty endorsement of eggs, which surely everyone has heard by now really are not dangerous and don’t cause heart attacks after all, and of fish, which he carefully defends from claims that they are today dangerously contaminated with things like mercury.

However, dairy gets the thumbs down, with a disdain that I personally felt was unjustified. Dairy, after all, is much more than drinks of cow’s milk - it is goat and sheep milk, cheese and cream too - and an inseparable part of many dishes. We are advised here instead to swap to things like ‘almond milk’ and ‘hemp milk’, but I know these substitutes very well, and, well, they ain’t one. At least Friedman doesn’t try to suggest we switch to soya milk because, as he rightly observes, that is a food disaster just in itself.

There is, to be honest, a bit too much bad news in this book - so much so that I started to skip some sections, which fortunately the book’s modular structure permits. On the other hand, Friedman makes an effort to leaven the mix by including some good news and positive suggestions, including a two-page table of the healthiest foods on earth. What are they? They’re all fruits and veggies - the things that Plato and Pythagoras were praising and recommending some two and a half thousand years ago. It seems that it’s time, if not indeed long overdue, to go back to following their advice.


*Martin Cohen is the author of a forthcoming book on food issues too, called I Think Therefore I Eat, also published by Turner and due out in November 2018.


04 June 2018

Picture Post #36: A Postcard from Taroudant

'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'


Posted by Tessa den Uyl and Martin Cohen

A postcard from Taroudant, Morocco

One piece of advice offered is to lower the gaze, to not allow it to dwell, as if the eye serves distraction.

The woman seated in front of the painting is possibly homeless. Her posture dissolves into the two figures on the wall, characterised by their carved-out eyes, and urges us to imagine where this woman can put her gaze.

Eyes and hearts, their combination invites a myriad of symbolic attributions. One of them is that a woman with her eyes can reach the man in his heart. The carved-out eyes suggest that women, even when veiled, still look (and distract), which they should not... Or is the image saying something quite different, that the time for women to be veiled is consigned to history and that these days we can 'forget about the eyes’?

An eye is connected with light, and light with reflection. The ‘seduction’ begins with the question of where the reflection should place its attention.

27 May 2018

Occam's Razor: On the Virtue of Simplicity

As a Franciscan monk, William had simplicity at the heart of his daily life.
Posted by Keith Tidman

The English philosopher and monk William of Occam (c. 1287–1347) surely got it about right with his ‘law of parsimony’, which asserts, as a general principle, that when there are two competing explanations or theories, the one with the fewest assumptions (and fewest guesses or variables) is more often to be preferred. As the ‘More than Subtle Doctor’ couched the concept in his Summa Logicae, ‘It is futile to do with more what can be done with fewer’ — itself an example of ‘economy’. William’s law is typically referred to as Occam’s razor — the word ‘razor’ signifying a slicing away of arguably unnecessary postulates. In many instances, Occam’s razor is indeed right; in other examples, well, perhaps not. Let’s explore the ideas further.

Although the law of parsimony has always been most closely associated with William of Occam (Occam, now spelt ‘Ockham’, being the village where he was born), he hasn’t been the principle’s only proponent. Just as famously, a millennium and a half earlier, the Greek philosopher Aristotle said something similar in his Posterior Analytics:
‘We may assume the superiority ceteris paribus [other things being equal] of the demonstration which derives from fewer postulates or hypotheses.’
And seven centuries after William, Albert Einstein, perhaps thinking of his own formulation of special relativity, noted that ‘the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible’. Many other philosophers, scientists, and thinkers have also admired the concept.

Science’s favouritism towards the parsimony of Occam’s razor is nowhere more apparent than in the search for a so-called ‘theory of everything’ — an umbrella theory unifying harmoniously all the physical forces of the cosmos, including the two cornerstones of 20th-century physics: the general theory of relativity (describing the macro scale) and quantum theory (describing the micro scale). This holy grail of science has proven an immense but irresistible challenge, having occupied much of Einstein’s life, as it has the imagination of other physicists. But the appeal to scientists is in a unified (presumed final or all-encompassing) theory being condensed into a single set of equations, or perhaps just one equation, to describe all physical reality. The appeal of the theory’s potential frugality in coherently and irreducibly explaining the universe remains immense.

Certainly, philosophers too often regard parsimony as a virtue — although there have been exceptions. For clarity, we must first note that parsimony and simplicity are usually, as a practical matter, considered one and the same thing — that is, largely interchangeable. For its part, simplicity comes in at least two variants: one equates to the number and complexity of kinds of things hypothesised, and is sometimes referred to as ‘elegance’ or ‘qualitative parsimony’; the second equates to the number and complexity of individual, independent things (entities) hypothesised, and is sometimes referred to as ‘quantitative parsimony’. Intuitively, people in their daily lives usually favour simpler hypotheses; so do philosophers and scientists. For example, we assume that Earth’s gravity will always apply rather than its suddenly ceasing — that is, rather than objects falling upward unassisted.
Among the philosophers who weighed in on the principle was Thomas Aquinas, who noted in Summa Theologica in the 13th century, ‘If a thing can be done adequately by means of one, it is superfluous to do it by means of several; for we observe that nature does not employ two instruments where one suffices.’ And the 18th-century German philosopher Immanuel Kant, in the Critique of Pure Reason, similarly observed that ‘rudiments or principles must not be unnecessarily multiplied.’ In this manner, philosophers have sometimes turned to Occam’s razor to criticise broad metaphysical hypotheses that purportedly include the baggage of unnecessary ontological concepts. One hypothesis falling under such criticism is Cartesian dualism, which physicalists argue is flawed by an extra category — that is, the notion that the mind is entirely apart from the neuronal and synaptic activity of the brain (the physical and mental purportedly being two separate entities).

Returning to Einstein, his iconic equation, E = mc², is an example of Occam’s razor. This ‘simple’ mathematical formula, which had more-complex precursors, has only two variables and one constant, relating (via conversion) the amount of energy to the amount of matter (mass) multiplied by the speed of light squared. It allows one to calculate how much energy is tied up in the mass of any given object, such as a chickpea or granite boulder. The result is a perfectly parsimonious snapshot of physical reality. But simplicity isn’t always enough, of course. There must also be consistency with the available data, with the model necessarily accommodating new (better) data as they become available.

Other eminent scientists, like the 17th-century physicist and mathematician Isaac Newton, similarly valued this principle of frugality. The first of Newton’s three ‘rules of reasoning in philosophy’ expressed in his Principia Mathematica offers:
‘We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances. . . . Nature is pleased with simplicity, and affects not the pomp of superfluous causes.’
But, as noted above, Occam’s razor doesn’t always lead to truth per se. Nor, importantly, does the notion of ‘simplicity’ necessarily equate to ease of explanation or ease of understanding. Here are two examples where frugality arguably doesn’t win the day. One theory presents a complex cosmological explanation of the Big Bang and the physical evolution of a 13.8-billion-year-old universe. A single, but very-late-on-the-stage thread of that cosmological account is the intricate biological evolution of modern human beings. A second, creationist explanation of the current universe and of human beings — with far fewer assumptions and hypotheses — describes both as having roots in a single event some 6,000 to 10,000 years ago, with the cosmos conveniently made to look older. Available evidence suggests, however, that the first explanation is correct, despite the second explanation’s parsimony.

In broad ways, Occam’s razor has been supported by the empirical successes of theories that proved parsimonious in their explanations: with fewer causes, entities, properties, variables, and processes embedded in fewer assumptions and hypotheses. However, even though people tend instinctively and understandably to be drawn toward simpler accounts of hoped-for reality, simplicity hasn’t always triumphed. For example, the earlier nature-versus-nurture debate posed a simpler, albeit false, either-or dichotomy in trying to understand a person’s development and behaviour on the basis of either the environment — the influence of external factors, such as experience and learning, on an otherwise blank slate or perhaps set of instincts — or genes and heritability — that is, biological pre-wiring. Reality is, of course, a complex mix of both nature and nurture, with one influencing the other.

To avoid such pitfalls, as the English mathematician and philosopher Alfred North Whitehead pointedly (and parsimoniously) suggested:
‘. . . every natural philosopher should seek simplicity and distrust it.’

20 May 2018

‘Purposeful Living’ Through Grief

Rainy Night In The City, by Alina Madan. Poster: Giclee Print
Posted by Lina Ufimtseva
Grief is like a rude neighbour in the night, knocking at your mind’s door at all kinds of inopportune moments.  Hush, you want to tell it, go away, let me sleep.  But not only is grief rude in its all-encompassing demands for attention, it also is disobedient, and stubbornly stays.  Often, for years.
I am stirring a pot of soup on the stove, and I switch it off.  The boiling liquid quickly settles, and the rolling of the surface stops.  ‘Just like my mother's blood,’ I think instinctively.  Her blood stopped moving, too. ‘Just so,’ I think, ‘a loved one's life can slip away, unceremoniously.’ And so, in the sudden memory which the soup brings back, grief stands rudely knocking.  Go away, go away.

Time allows for the body to regenerate and to heal, provided it is not put under more stress.  Years later, one may feel the strain in a joint from an old injury, but it will often be no more than a lingering nuisance.  Grief, on the other hand, can hit one like a train, no matter how much time has passed since tragedy struck. Why is emotional pain more difficult to bear than physical pain? 

The brain uses a single neural system to detect and feel pain.  The anterior insula cortex and the anterior cingulate cortex are responsible for detecting pain, regardless whether it is of a physical or emotional nature.  Even painkillers may numb emotional pain temporarily.  But they don’t help in healing.

This raises the question: why does emotional pain not heal as if it were physical?

Asked how her labour went, a mother may underplay her experience and reply that it was ‘painful’ or ‘a lot of pressure’.  Yet mothers who lay in agony giving birth will voluntarily unleash the same process upon their bodies again and again.  Physical pain lingers only as an awareness that it was indeed at one time painful.

Grief, however, has the unique ability to reiterate itself at the most seemingly random moments.  Therein lies a clue.  If we want physical pain to leave our bodies, and assuming that, as is usually the case, it affects only a certain limb or area of the body, we may use a crutch to prevent too much strain, say, on a leg.  But how does one rest from grief?

Generally one does not.

Our brains process the pain of grief in a non-linear manner.  Physical trauma leaves scars—smooth scars.  Emotional pain creates what I would call neural scabs of sorts that can be—and often will be—picked at, voluntarily or not.

The psychologist Thomas Crook has noted:
‘Indeed, when brain imaging studies are done on people who are grieving, increased activity is seen along a broad network of neurons.  These link areas associated not only with mood but also with memory, perception, conceptualization, and even the regulation of the heart, the digestive system, and other organs.  This shows the pervasive impact loss or even disappointment can have.’
Grief affects the neural pathways in a far more pervasive and ineluctable manner than physical pain.  Emotional pain, like a scab, can very easily get picked by a casual scratch of an old memory, and the blood of grief starts pouring again.

Those who have been severely distraught by their circumstances often come to the conclusion that the greater meaning in life lies not in seeking happiness and hedonism, but in creating a purposeful living.  The word choice here, ‘a purposeful living’ rather than ‘a purposeful life’, is itself deliberate.  Meaning is not stagnant.  One cannot create a purposeful life and leave it at that.  Purpose must continue to be lived out, to be striven for, to continue in some kind of endeavour.

Purpose without struggle often loses its meaning.  In this light, grief can be given a purpose.  Severe emotional pain can be the catalyst to re-evaluate one’s values, choices, and path in life.  It can be one’s very own personal as well as professional springboard.

Do you wish to leap into the bounds of further despair?  Go ahead, and grief will get you there.  Do you wish to see an armour around yourself unveiled?  Go ahead, and grief can give you the thickest skin and the thinnest heart you ever imagined.

Grief can and will redefine who you thought you were.  Can you hear it knocking?

13 May 2018

African Propaganda In a Nutshell

Posted by Sifiso Mkhonto
Change is happening all over the world. It is impossible to stand still. Yet as we change, there are those who would wish to influence that change—some in a positive and some in a negative way. My intention is to focus on invidious change that others seek to bring about through propaganda. Specifically, in Africa.
Propaganda is biased and misleading; it intends to shape perceptions, manipulate cognitions, and direct behaviour. The Oxford Dictionary of Philosophy defines propaganda as ‘the active manipulation of opinion by means that include distortion or concealment of the truth.’ It usefully distinguishes between ‘agitation propaganda’, which seeks to change attitudes, and ‘integration propaganda’, which seeks to reinforce existing attitudes.

Africa has been the victim of both agitation propaganda and integration propaganda—and while propaganda anywhere in the world may share the same characteristics, I here offer examples which are characteristically African, and which Africans are primarily aware of—or ought to be. Mark Nichol, a writer, offers these four useful descriptions of propaganda, from which I develop my analysis:
An appeal to prejudice, or the black-and-white fallacy. Africa is a place of unusually stark contrasts, historical, cultural, social, and geographical. Politicians and religious leaders exploit this by presenting only two alternatives, one of which is identified as undesirable. They do so to exploit an audience’s desire to believe that it is morally or otherwise superior. However, the goal is the pleasure of the propagandists, regardless of whether the victim is in poverty or has riches.

An appeal to fear. Africa still wrestles with fundamental issues, more so than other regions of the world, so that it faces many fears and uncertainties. Propagandists exploit fear and doubt, disseminating false or negative information, to undermine adherence to an undesirable belief or opinion. They do so to exploit audience anxieties or concerns through fear of political identity, gender, race, tribes, and religious or traditional practices.

Half-truths. Governments and political parties in Africa tend to be secretive about information, which may further be difficult for the public to access. Knowing the full truth, they still make statements that are partly true or otherwise deceptive to further their own agenda. The government often disguises this as a matter of national security, so that the full truth lies under a veil of secrecy.

Obfuscation and glittering generalities. In Africa, the spoken word may have priority over the written word, so that it is received personally, not critically. Propagandists resort to vague communication and word prejudices intended to confuse the audience as it seeks to interpret the message. In South Africa, the ruling party has used this method in each election campaign to continue holding power. It tells the story of apartheid history and how its injustices ought to be fixed, yet claims they may only be fixed if each person votes in remembrance of the leaders who fought the apartheid system.
Where does the solution lie? It surely lies in our personal choice, as to whether to accept or reject what we see, read, and hear. Our identity and its underlying attitudes are changed over time, through the choices that we make—and our ideology, which is the consequence of what we have been and are exposed to, often plays a crucial role in shaping our perception of what is truth and what is propaganda.

As individuals, we need to examine our judgements of information at the bar of mature reasoning, in order to avoid judging amiss and believing the propaganda. If we continue to fail this test, propaganda will prevail, allowing biased popular opinion to override the judgement of the minority. This then infringes on a right we all ought to have: freedom of speech.

The theologian Isaac Watts gives us this timely advice:
‘When a man of eloquence speaks or writes upon any subject, we are too ready to run into his sentiments, being sweetly and insensibly drawn by the smoothness of his harangue, and the pathetic power of his language. Rhetoric will varnish every error so that it shall appear in the dress of truth, and put such ornaments upon vice, as to make it look like virtue: it is an art of wondrous and extensive influence: it often conceals, obscures, or overwhelms the truth and places sometimes a gross falsehood in a most alluring light.’ 
Let us use logic as the measure of reasoning and sharing information, not the biased opinion of an eloquent man.

07 May 2018

Picture Post #35: The House Number

Posted by Thomas Scarborough

Mountain View township, South Africa

House numbers:- laser cut aluminium, cast iron plaques, illuminated perspex, oil rubbed bronze, stencils and paint, carvings in wood. These not only identify the house; they reveal something of the occupant.

The number on a front door in an African township. We are immediately impressed by the attitude it expresses:- a bold and careless statement that this is no. 1251, so put that in your pipe and smoke it. 

29 April 2018

Is There a Rational Basis For Human Compassion?

By Thomas Scarborough
Søren Kierkegaard wrote that Immanuel Kant’s moral philosophy was ‘utterly without grace’. It was a fierce condemnation of Kant.
Kant favoured autonomy—which is defined as the capacity of an agent to act in accordance with objective morality rather than under the influence of desires. Today this is a view which, by and large, drives all of our ethical thinking. The problem, in Kierkegaard’s eyes, was that it lacked compassion. This is true. We place great emphasis on civil rights, the rule of law, social norms, and so on, while compassion is not comfortably accommodated in the scheme. How may it be possible to bridge the gap—rationally? This is the subject of this post.

Ethics is a very human thing. Regardless of the intellectual debate, or the final framing of our ethics, private or public, it always originates in the human person. It is, above all, a person's formation of a certain outlook on the world. Aristotle thought of ethics as ‘the golden mean’—the balanced life—where the ‘mean’ is defined as a quality or action which is equally removed from two opposite extremes. Thus ethics represents the achievement of a balance in the human person—between economic and social goals, individual and communal goals, unity and diversity, novelty and tradition, thought and feeling, and so much more. This is our starting point in this post—that ethics is about balance—a theme which, unfortunately, the available space does not permit us to develop further.

In order to develop the ‘golden mean’, then, it stands to reason that we should weigh a great number of opposites in our minds, not to speak of variations, one against the other. The scope of this is important here: as we do so, we typically have as our goal to balance the world around us, no more and no less. I should say, I have as my goal to balance the world around me—in my own individual mind—so as to develop (I should hope) a balanced outlook on my world. This is true—but it is simplistic. It is a more nuanced view of the process which should help us to open up our ethical thinking to human compassion.

I live in a world of others—tens, thousands, millions, in fact billions of others. As soon as I take these others into account, not merely as numbers, entities, or abstractions, I open up some important considerations. Each of these others carries in their own mind an evaluation of the world—without which my own evaluation of the world cannot be complete. It matters a great deal, not merely that others exist in my world, but that they each arrange the world in their own particular way. Therefore, in a sense, we now have uncountable worlds within a world. It is easy to overlook this. These others perceive things, assess things, plan things, and act upon things which are of critical importance to that ‘golden mean’ which Aristotle spoke about. Perhaps this much goes without saying.

However, this now introduces a quantum leap of complexity to my task of arranging my world, since now I must combine their worlds with mine—tens, thousands, even millions of worlds in other people’s minds. Then, too, this all has to do with semiotic codes, which are the means through which others reveal their own arrangement of the world—codes that are all too often all but inscrutable. A smile, a jig, a nod of the head—candles on the table, or a hush in the hallway—President Kennedy's visit to West Berlin, the Bomb under Mururoa, the public appearances of Her Majesty the Queen, and a host of so-called ‘interpretative devices’. In order to have some command of such things, I need to have an intimate ‘feel’ for others.

The existence of others in my world—further, the existence of their worlds within my world, and the ways in which they communicate their worlds to me—means that ethics may often come down to something all too human. I now need to be sensitive to the expressions, gestures, and postures of others, and a great variety of semiotic codes besides—not to speak of the sufferings, desires, and hopes which lie behind them. I need to understand—to borrow a term from the polymath Thomas Browne—‘the motto of our souls’. This represents a rapport which rests to a very large extent on a careful, sensitive reading of the many others involved in my world, whether this involvement is direct or indirect. Thus we incorporate personal rapport in a rational ethics—which is human compassion.