22 July 2018

Seizing Control of Depression

The Man at the Tiller, 1892 | Théo van Rysselberghe
Posted by Simon Thomas
We know the symptoms of depression well. We read of them everywhere: sleeplessness, weight loss, reckless behaviour—and so on. Yet we tend to miss the fact that the foremost of these symptoms is deeply philosophical. 
The philosopher Tim Ruggerio defines depression, above all, as ‘the healthy suspicion that there may not be an aim or point to existence’. This broadly agrees with the symptom which stands at the top of many clinical lists: ‘Feelings of helplessness and hopelessness. A bleak outlook.’

Of course, depression does not exist purely on a philosophical plane. It is deeply felt. The symptoms one reads about do not begin to describe the darkness one feels in the throes of a depressive episode. It may be hard to see a way out when, frayed and tattered, one’s feelings start spiralling—and it seems no amount of positive talk can help.

Yet even then, there is one steady pole at the centre. My feelings belong to me. Only I can do something about them. This, too, is deeply philosophical. It is too easy to doubt or despair about something, without recognising that one is despairing over oneself. One needs to own it—and such ownership, in turn, forms the basis for a rational way forward.

The philosopher-theologian Paul Tillich wrote, ‘The acceptance of despair is in itself faith and on the boundary line of the courage to be ... The act of accepting meaninglessness in itself is a meaningful act.’ Here, then, is how this simple philosophical insight helps us further:
When we recognise that we are dealing with a philosophical struggle, our orientation to the problem may change. The acceptance of depression as my own, far from being acceptance in the sense of surrender, becomes the source of the resolve to face the real issue: the search for an all-embracing meaning of life.

When I see that depression is a philosophical problem, it stands to reason that I shall engage in activities which strengthen me philosophically—which enhance the mind and focus on the good.  Conversely, I shall as far as possible remove myself from the company of those who engage in negativity.

When I understand that it is too easy to doubt or despair about something, without recognising that I am despairing over myself, I know to set aside some of those thoughts and activities which are merely avoidant, which serve to continue a once-removed despair.

Knowing that the solution is philosophical, it stands to reason that it cannot be applied in a mere day off. It is a long-term process, and there are no quick fixes. One develops realistic expectations. Similarly, one does not let down one’s guard. Depression is a bit like the devil in Christian belief: it does not take time off, and there is no time for peace until one walks free.

The ownership of depression represents an acceptance of one's own weakness. Socrates was an avid proponent of the dictum ‘Know thyself.’ To know one's weakness in times of distress is of great help, because if one knows what causes one to fall, one can take steps to stop the downward spiral of one’s mindset.

Philosophy in all its fullness includes the spiritual and artistic aspects of our personality. Therefore it is valuable to have an appreciation for the spiritual and aesthetic inclinations of the human ‘soul’, and to exercise and expand on them.
Of course, prevention is always better than cure. ‘Guard your heart, for out of it come the issues of life,’ wrote the wise King Solomon. Watch your life, and be careful what and whom you allow into your heart. We are always under the influence of something or someone, at some stage of our life. It is sensible to guard what one allows oneself to be influenced by.

This is not intended to diminish the help that medication gives, or wise counsel. Yet philosophy plays a central role in depression, and may present a definitive anchor for the soul, which enables us to find the way back to a place of reason and not to spiral into despair.

15 July 2018

The Things | Relations Dichotomy

Yin-Yang by Sandi Baker
Posted by Thomas Scarborough
We humans have always been accused of dichotomous thinking: us and them, good and evil, for and against, and so on.  It pervades our thinking, and our existence.  Such dichotomous thinking is closely familiar to us.  Not a day goes by without someone suggesting that we should be more nuanced, less one-sided, better rounded. 
Yet there is a strange dichotomy which is more pervasive still, which passes all but unnoticed in our lives—and, I shall argue, bedevils all of our thinking.  Within its broader bounds, it goes by hundreds of names—which in itself suggests that it has too much escaped our attention.  One might describe it as the static and dynamic, or being and becoming—but there are many ways to describe it besides:
things and relations (Kant)
objects and arrangements (Wittgenstein)
the spatial and the temporal
nouns and verbs
operators and variables
And so on.  It is the simple matter of a world where things exist (we include events, which are things that happen), and exist in a certain relation to one another.  This dichotomy pervades all of our thinking—and it does so at a deeply embedded level, as one sees even in our grammar and our sums.

It all has to do with individuation.  We all begin, apparently, with what William James called ‘one great blooming, buzzing confusion’, then we single out complexes from nature and call them things, entities, objects, even concepts—or events, actions, processes, and so on.  We distinguish these then from the relations between them.

We may have said enough in these few words to identify the presence of this dichotomy at the core of some major philosophical problems, of which just a sample follows:
The fact-value distinction (Hume).  We have fact on the one hand—or statements which contain things—yet do not know on the other how we should arrange them or bring them into relation. 
The ‘own goal’ of science (Hawking).  By singling out things from nature, and discarding all that (we think) does not belong to them, we create a world of unforeseen side-effects, as we relate them.
Free will and determinism.  Free will goes to the question of cause and effect, and causation in turn is about the relation between two or more events. This, too, rests on the dichotomy of things and relations.
The mind-body problem.  This problem may rest on our experience that things exist in the world (or so we feel), while only relations can exist in our networking brain—not things, of course.
God.  The problem of God’s existence may rest on the notion of causality, since that which is caused is not influenced by God.  Again, causality rests on the distinction between events and their relations.
We may put it this way.  If we did not have this dichotomy of things (and the like) versus relations, it would be impossible that we should have any of the problems listed above—and many more.  This suggests that we may solve these problems by doing away with one side of the dichotomy—say, things.  This would leave us only with relations, and relations within relations.  It is not an entirely new idea.

Someone might object.  Even if we have no things, objects, entities, events, actions, and so on, we do still have relations—and these relations are governed by scientific law.  But wait a moment.  Without things, there is no scientific law.  At least, not as we know it.

The fact of the dichotomy is presented here simply as food for further thought.  In my view, the dichotomy is artificial and false.  It is a reflection of something in the human fabric that insists first on our individuating things, then on relating them one to the other.  Yet there never has been anything to set this on a firm foundation.

08 July 2018

Is Time What It Appears to Be?

Posted by Keith Tidman

Picture credit: Shutterstock via https://www.livescience.com/

“Time itself flows in constant motion, just like a river; for neither the river nor the swift hour can stop its course; but, as wave is pushed on by wave, and as each wave as it comes is both pressed on and itself presses the wave in front, so time both flees and follows and is ever new.” – Ovid
We understand time both metaphorically and poetically as a flowing river — a sequence of discrete but fleeting moments — coursing linearly from an onrushing future to a tangible present to an accumulating past. Yet, might ‘time’ be different than that?

Our instincts embrace this model of flowing time as reality. The metaphor extends to suppose a unidirectional flow, or an ‘arrow of time’. According to this, a rock flies through a window, shattering the glass; the splinters of glass never reform into a whole window. The model serves as a handy approximation for our everyday experiences. Yet what if the metaphor of time as a flowing river does not reflect reality? What then might be an alternative model of time?

What if, rather than the notion of flow, time actually entails only one now? Here, an important distinction must be made, for clarity. Time is not a sequence of ‘nows’, as proposed by some, such as the British author of alternative physics, Julian Barbour. That is, time is not points of time — corresponding to frames in a movie reel — with events and experiences following one another as ephemeral moments that, if slowed down, can be distinguished from one another. Rather, time entails just one now: a model of time in which the future is an illusion — it doesn’t exist. The future isn’t a predetermined block of about-to-occur happenings or about-to-exist things. Likewise, the past is an illusion — it doesn’t exist.

As to the past not existing, let me be specific. The point is that what we label as history, cosmology, anthropology, archaeology, evolution, and the like do not compose a separately distinguishable past. Rather, they are chronicles — memories, knowledge, understanding, awareness, information, insight, evidence — that exist only as seamless components of now. The Battle of Hastings did not add to an accumulating past as such; all that we know and have chronicled about the battle exists only in the now. ‘Now’ is the entirety of what exists — all things and all happenings: absent a future and past, absent a beginning and end. As the 4th-century philosopher St. Augustine of Hippo presciently noted:
‘There are three times: a present time about things past, a present time about things present, a present time about things future. The future exists only as expectations, the past exists only as memory, but expectation and memory exist in the present’.
In this construct, what we experience is not the flow of time — not temporal duration, as we are wont to envision — but change. All the diverse things and events that compose reality undergo change. Individual things change, as does the bigger landscape of which they are a part and to which they are bound. Critically, without change, we would not experience the illusion of time. And without things and events, we would not perceive change. Indeed, as Ernst Mach, the Austrian philosopher-physicist, pointed out: ‘... time is an abstraction, at which we arrive by means of the changes of things’.

It is change, therefore, that renders the apparition of ‘time’ visible to us — that is, change tricks the mind, making time seem real rather than the illusion it is. The illusion of time nonetheless remains helpful in our everyday lives — brown leaves drop from trees in autumn, we commute to work sipping our coffee, an apple rots under a tree, the embers of a campfire cool down, the newspaper is daily delivered to our front door, a lion chases down a gazelle, an orchestra performs Chopin to rapt audience members, and so forth. These kinds of experiences provide grounds for the illusion of time to exist rather than not to exist.

As Aristotle succinctly put it: ‘there is no time apart from change’. Yet, that said, change is not time. Change and time are often conflated, where change is commonly used as a measurement of the presumed passage (flow) of time. As such, change is more real to the illusion of time’s passing than is our observing the hands of a clock rotate. The movement of a clock’s hands simply marks off arbitrarily conventional units of something we call time; however, the hands’ rotation doesn’t tell us anything about the fundamental nature of time. Change leads to the orthodox illusion of time: a distinctly separate future, present, and past morphing from one to the other. Aristotle professed regarding this measurement aspect of time’s illusion:
‘Whether if soul [mind] did not exist, time would exist or not, is a question that may be asked; for if there cannot be someone to count, there cannot be anything that can be counted.’
So it is change — or more precisely, the neurophysiological perception of change in human consciousness — that deludes us into believing in time as a flowing river: a discrete future flowing into a discrete present flowing into a discrete past. The one-way arrow of time.

In this way, the expression of dynamic change provides our everyday illusion of time, flowing inexorably and eternally, as if to flow over us. The British idealist philosopher J.M.E. McTaggart wrote in the early years of the twentieth century that ‘in all ages the belief in the unreality of time has proved singularly attractive’. He underscored the point:
‘I believe that nothing that exists can be temporal, and that therefore time is unreal.’
To conclude, then: Although the intuitive illusion of time, passing from the future to the present to the past, serves as a convenient construct in our everyday lives at work, at home, and at play, in reality this model of time and its flow is a fiction. Actual experience exists only as a single, seamless ‘now’; there is no separately discrete future or past. Our sense of time’s allegorical flow — indeed, of time itself — arises from the occurrence of ‘change’ in things and events – and is ultimately an illusion.

01 July 2018

Picture Post #37 A Celebration of Brashness!



'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'

Posted by Tessa den Uyl and Martin Cohen

A postcard presentation of Times Square
      
Times Square, New York.
‘The soft rush of taxis by him, and laughter, laughter hoarse as a crow’s, incessant and loud, with the rumble of the subways underneath - and over all, the revolutions of light, the growings and recedings of light - light dividing like pearls - forming and reforming in glittering bars and circles and monstrous grotesque figures cut amazingly on the sky.’
During the so-called Jazz Age, that is, the optimistic time after ‘the Great War’ and before the Depression, the rise of Nazism and the Second World War, F. Scott Fitzgerald’s metaphor in his book The Beautiful and Damned reflects so well this human despair combined with hope.

Acts of freedom and expression intertwine to be heard and noticed, to forget and to distract, to employ, and to hope... In those days, Times Square must have appeared promising, like a colourful stamp on the continent. But what did its message say?

Ideas about segregation and freedom brought ‘silent’ new horizons and made former distinctions tremble. With all there was to come, in those years of the Roaring Twenties, all the layers that combine to make a society were looking for ‘a voice’ and the call echoed, near and far. 
 
People rather grandly called Times Square the ‘crossroads of the world’ and in those days, that might well have been so. And today, on the edge of the square, the NASDAQ controls a good slice of the world’s wealth and the New York Times does likewise for the world's news.
 
Yet it is after dark, after the office day has finished, that the square really comes alive. It is doubtful whether that liveliness today is filled with the same complexity and struggle, or with that necessity, literally and symbolically, to survive. While it once stimulated a proper voice, ‘light dividing like pearls’, now Times Square embraces more of a homogenisation and offers monstrous grotesque figures cut amazingly out of the sky.


24 June 2018

The Importance of Being Seen

Posted by Simon Thomas
Of all our innermost human desires, none seems to trump that of being seen. Humanity is meant for community and for togetherness. There is something very fundamental about being acknowledged, of being seen, as a person.
The Internet, with all its wonders and its ability to connect people from all over the world, does indeed meet some of this need, but virtual reality does not on any level substitute for real human interaction. It does, however, point to the fact that there is a real need for it. A person can have untold ‘friends’ or followers on social media, but this does not replace real human conversation. And we are sick because of it.

Everyone needs to know that what they say, or the very fact of their being, is important to somebody. The heart longs for a human embrace, for the attentive ear of another who shows some interest in being with them and listening to what they say. Psychological research has shown that when people are perpetually in a situation where they are ignored, this causes real emotional pain, which in turn will cause physical problems brought on by the stress of being ignored on an ongoing basis.

This is all the more prevalent in our society of individualism. Everyone wants to connect, but there seems to be this incessant preoccupation with connecting with everything and everyone except that which is in our present reality. It is true, a person can feel lonely in a crowd, and experience intense feelings of abandonment even in the company of others. This is especially true in our society with its preoccupation with distraction.

Things become more important than people; virtual friendships online become more important than friendships we can experience in real time and in real situations. We see, but we don’t see each other. We put each other in categories, and fail to recognise how much we are the same, with the same need to communicate, simply, with those who share our time and space.

Families today, too, have gone this route. People live under the same roof but do not communicate; there is no or very little interaction. The whole emphasis has shifted from ‘how can I serve’ to ‘how can I get something out of this person’.

I have this kind of relationship with my dog. He comes to me when he is hungry or wants something from me. And that is okay. Animals do not have the complex relationship needs that human beings have. My dog has what I call ‘cupboard love’: he loves me for what he can get from me. But that is not to be the way we interact with our fellow human beings. It is the height of selfishness. And very often the cause of much emotional and mental anguish.

I have noticed, however, that that is how the Internet of things works. Someone has something on offer which the other wants, and what happens is that, while felt needs are met on a superficial level, there is no lasting connection. It is understandable that people want the connection, but they don’t want to acknowledge the person they interact with. We come across many people in our daily lives: in the office, at the bus station, in the shops, at church. Yet, as many can testify, even after we exit a party or a group of people we feel drained.

It is important to listen to one another. Even a brief interaction can be meaningful if the person we talk to makes us feel that we have been seen, that we have been acknowledged. It is not uncommon to go through a day and, while we do many things in the course of our day’s activities, be left empty. Why is that? Well, I perceive that the reason we fail to connect is that we objectify people and treat them as less than they are.

Human beings are made imago dei -- in the image of God. We were created to interact and communicate; we were made to live in community and not in isolation. To be human is to share in the common human experience, and to live in such a way that we acknowledge one another, and not allow our many distractions to detract from how we relate to one another.

17 June 2018

White Lies – Malevolence or Defence?

Little White Lies, by e9Art
Posted by Christian Sötemann
A little thought experiment: In the year 2088, a mentally highly volatile leader of an autocratic world power is undergoing yet another personal crisis. His wife, so he has heard, is secretly planning to leave him. Without her, he sees no meaning in going on. Since he is also a narcissistic megalomaniac, in his dark mood he decides that the world should perish if she left him. He prepares to give the order for a nuclear strike and confronts his wife about her secret plans.
Now, what would be a wise thing for her to answer, even if she actually planned on leaving him? Surely, most people would say something along these lines: calm him down, say that everything is fine, just get him away from ordering a nuclear strike. The rest will be sorted out later. Hence she should lie to save the world from a nuclear attack.

That’s that then, right? Not so fast. In ethics, the role of the lie has been hotly debated. Among the ethical stances, there are some which emphasise the consequences of an action in determining whether it is moral or not. Many of the supporters of these approaches would probably have few issues with the wife’s lie. The argument would go like this: lying in this particular case prevents unfathomable damage occurring to millions of people, so it is the right decision.

There are, however, perspectives in ethics that focus more on principles and duties rather than consequences of actions, notably in Kant’s categorical imperative: ‘Act only according to that maxim by which you can at the same time will that it should become a universal law’. From this point of view, in its strictest form, a lie cannot ever be legitimate, because human relationships would become poisoned if everybody lied to each other all the time.

In many cases, there is some validity to that principle. We have to be able, at least most of the time, to trust what people around us tell us. The lie has to be the exception rather than the rule. Our everyday life would be seriously impaired if we all lied to each other all or most of the time.

Still, there is a point to be made for white lies. Schopenhauer viewed lies as a legitimate form of self-defence in cases of extortion, threat or unauthorised interference or intrusion, among other things. If I am exposed to an evil will, lying can be part of the arsenal to defend myself.

For example, if somebody broke into my house, thus violating my right to privacy, my exclamation telling the burglar that the police were already on their way would represent a perfectly legitimate lie to make this intruder leave my house as quickly as possible. Similarly, a child threatened by bullies on his way home from school might want to use the white lie that his parents or elder brother were just around the corner. There is no malevolent deceit in situations such as these.

It seems that the most important aspect here is that there is a predicament which can make a white lie a suitable means to an end. To avert a catastrophe or a crime, white lies can come into consideration. Besides, from this perspective, the ‘lie’ aspect of the white lie becomes less relevant – rather, it becomes one of several means to defend oneself. It is something one can do to get out of a dangerous situation.

The application of the categorical imperative in this case should therefore not denounce the white lie as harmful, but could be reformulated as: ‘In a dangerous situation threatening the physical and psychological integrity of an individual in an illegitimate way, every individual should have the right to undertake sufficient actions to avert this threat’.

In German, one translation of ‘white lie’ is Notlüge, meaning, literally, ‘emergency lie’. Perhaps this serves to illustrate some cases in which a white lie seems appropriate. It is something that is more a verbal form of defence rather than a mere lie.

Certainly, it would be harmful to lie all of the time. And it can be harmful never to lie at all. The potential Kantian counterargument, that this takes into consideration the consequences of actions rather than a principled stance regardless of what happens afterwards, is one that can be addressed.

But it represents another example of morality not necessarily being beholden to one orthodoxy throughout.  We may consider principles as well as consequences in our moral deliberations. There is something to be found between the extremes of rigidity and arbitrariness. So, we should not blame the dictator’s wife for her white lie. Those living in the year 2088 will be grateful for our leniency.

10 June 2018

BOOK REVIEWS: Back to the Future with the Food Gathering Diet

Posted by Martin Cohen*



How we imagine hunting and gathering - in this case, on the South Texas Plains


Food Sanity: How to Eat in a World of Fads and Fiction
By David Friedman (Turner 2018).

Psst! Maybe someone should have told David Friedman, well-known media personality as well as the author of this new look at food issues – there are hardly any vegans. So if you pitch a book on 'how to eat' to that crowd, you take the risk of ending up preaching to a much reduced congregation. Add to which, the serious vegans in town won't like some of what Friedman has to say, because vegans don’t eat eggs and certainly don’t eat fish. All of which only goes to show that food is a pretty controversial and divisive issue these days, and if you want to be honest, as Friedman evidently does, you're going to have to risk trampling on the dearly held, indeed dearly munched, beliefs of lots of people.

But I hope Food Sanity does find that wider readership, because I’ve read a lot of books and articles recently about food and this one really does clear out a lot of the deadwood and present some pretty mind-boggling facts (and figures) to ‘put the record straight’, as Jack Canfield (of Chicken Soup for the Soul fame) puts it, by way of an endorsement of the book.

Take one opening salvo that, as I say, will surely lose Friedman lots of readers in one fell swoop: the Paleo or ‘Caveman’ Diet. This is probably the most popular diet going, and that’s likely because it fits people’s dearly held prejudices so excellently. Plus, it allows them to eat lots of beef-burgers and chips, while cutting out things like muesli which only hippies eat anyway. But oh no, Friedman has done his research and found out that Stone Age folk didn’t really eat lots of red meat washed down with a beaker of blood, as we like to imagine. Instead, using both archaeological and anthropological research as a guide, he says that the earliest human tribes spent most of their time eating fruits and seeds, which they gathered, and probably only really sharpened the spears (or so, at least, I imagine) for internecine human disputes.

Friedman finishes his deconstruction of Paleo by considering human biology too: notably the fact that we just aren’t built to catch our fellow animals. We lack the right claws, teeth and general physique. He points out, a thing curiously overlooked, that Stone Age people would have been rather short and squat - not the fine figures wielding clubs that we imagine. He retells Jared Diamond’s tale of a hunting trip by one of today’s last remaining ‘stone age’ tribes, in New Guinea. At the end of the hunt, the tribe had caught only some baby birds, frogs and mushrooms.

This is all fascinating to me, but compelling too are Friedman’s physiological observations, most particularly on the acidity of the human stomach. The gastric fluids of carnivores are very acidic (pH 1), which is essential if they are to break down proteins and kill bacteria. Our stomachs, however, are much less acidic (pH 5), and simply can’t tolerate much uncooked meat. And yes, if Stone Age man did a bit of cooking, it would probably have been rather rudimentary, with parts of the meat not really cooked.

Actually, by the time I had finished reading all of the reasons that ‘humans can't eat meat’, I was left puzzled by Friedman’s conclusion, which was that a significant proportion of the prehistoric human diet (nonetheless) seems to have been meat. Less surprising was Friedman’s hearty endorsement of eggs, which surely everyone has heard by now are really not dangerous and don’t cause heart attacks after all, and fish, which he carefully defends from claims that they are today dangerously contaminated with things like mercury.

However, dairy gets the thumbs down, with a disdain that I personally felt was unjustified. Dairy, after all, is much more than drinks of cow’s milk - it is goat and sheep milk, cheese and cream too - and an inseparable part of many dishes. We are advised here instead to swap to things like ‘almond milk’ and ‘hemp milk’, but I know these substitutes very well, and, well, they ain’t one. At least Friedman doesn’t try to suggest we switch to soya milk because, as he rightly observes, that is a food disaster just in itself.

There is, to be honest, a bit too much bad news in this book - so much so that I started to skip some sections, which fortunately the book’s modular structure permits. On the other hand, Friedman makes an effort to leaven the mix by including some good news and positive suggestions, including a two-page table of the healthiest foods on earth. What are they? They’re all fruits and veggies - the things that Plato and Pythagoras were praising and recommending some two and a half thousand years ago. It seems that it’s time, if not indeed long overdue, to go back to following their advice.


*Martin Cohen is the author of a forthcoming book on food issues, I Think Therefore I Eat, also published by Turner and due out in November 2018.


04 June 2018

Picture Post #36 A postcard from Taroudant









'Because things don’t appear to be the known thing; they aren’t what they seemed to be neither will they become what they might appear to become.'


Posted by Tessa den Uyl and Martin Cohen

A postcard from Taroudant, Morocco

One piece of advice offered is to lower the gaze, to not allow it to dwell, as if the eye serves distraction.

The woman seated in front of the painting is possibly homeless. Her posture dissolves into the two figures on the wall, characterised by their carved-out eyes, and urges us to imagine where this woman can put her gaze.

Eyes and hearts, their combination invites a myriad of symbolic attributions. One of them is that a woman with her eyes can reach the man in his heart. The carved-out eyes suggest that women, even when veiled, still look (and distract), which they should not... Or is the image saying something quite different, that the time for women to be veiled is consigned to history and that these days we can 'forget about the eyes’?

An eye is connected with light, and light with reflection. The ‘seduction’ begins with the question of where the reflection should place its attention.

27 May 2018

Occam's Razor: On the Virtue of Simplicity

As a Franciscan monk, William had simplicity at the heart of his daily life.
Posted by Keith Tidman

The English philosopher and monk, William of Occam (c. 1287–1347), surely got it about right with his ‘law of parsimony’, which asserts, as a general principle, that when there are two competing explanations or theories, the one with the fewest assumptions (and fewest guesses or variables) is more often to be preferred. As the ‘More than Subtle Doctor’ couched the concept in his Summa Logicae, ‘It is futile to do with more what can be done with fewer’ — itself an example of ‘economy’. William’s law is typically referred to as Occam’s razor — the word ‘razor’ signifying a slicing away of arguably unnecessary postulates. In many instances, Occam’s razor is indeed right; in other examples, well, perhaps not. Let’s explore the ideas further.

Although the law of parsimony has always been most closely associated with William of Occam (Occam, now called ‘Ockham’, being the village where he was born), he hasn’t been the principle’s only proponent. Just as famously, a millennium and a half earlier, the Greek philosopher Aristotle said something similar in his Posterior Analytics:
‘We may assume the superiority ceteris paribus [other things being equal] of the demonstration which derives from fewer postulates or hypotheses.’
And seven centuries after William, Albert Einstein, perhaps thinking of his own formulation of special relativity, noted that ‘the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible’. Many other philosophers, scientists, and thinkers have also admired the concept.

Science’s favoritism toward the parsimony of Occam’s razor is nowhere more apparent than in the search for a so-called ‘theory of everything’ — an umbrella theory unifying harmoniously all the physical forces of the cosmos, including the two cornerstones of 20th-century physics: the general theory of relativity (describing the macro scale) and quantum theory (describing the micro scale). This holy grail of science has proven an immense but irresistible challenge, having occupied much of Einstein’s life, as it has the imagination of other physicists. But the appeal to scientists is in a unified (presumed final or all-encompassing) theory being condensed into a single set of equations, or perhaps just one equation, to describe all physical reality. The appeal of the theory’s potential frugality in coherently and irreducibly explaining the universe remains immense.

Certainly, philosophers too, often regard parsimony as a virtue — although there have been exceptions. For clarity, we must first note that parsimony and simplicity are usually, as a practical matter, considered one and the same thing — that is, largely interchangeable. For its part, simplicity comes in at least two variants: one equates to the number and complexity of kinds of things hypothesised, and sometimes referred to as ‘elegance’ or ‘qualitative parsimony’; the second equates to the number and complexity of individual, independent things (entities) hypothesised, and sometimes referred to as ‘quantitative parsimony’. Intuitively, people in their daily lives usually favor simpler hypotheses; so do philosophers and scientists. For example, we assume that Earth’s gravity will always apply rather than its suddenly ceasing — that is, rather than objects falling upward unassisted.
Among the philosophers who weighed in on the principle was Thomas Aquinas, who noted in Summa Theologica in the 13th century, ‘If a thing can be done adequately by means of one, it is superfluous to do it by means of several; for we observe that nature does not employ two instruments where one suffices.’ And the 18th-century German philosopher Immanuel Kant, in the Critique of Pure Reason, similarly observed that ‘rudiments or principles must not be unnecessarily multiplied.’ In this manner, philosophers have sometimes turned to Occam’s razor to criticise broad metaphysical hypotheses that purportedly include the baggage of unnecessary ontological concepts. An example of falling under such criticism via the application of Occam’s razor is Cartesian dualism, which physicalists argue is flawed by an extra category — that is, the notion that the mind is entirely apart from the neuronal and synaptic activity of the brain (the physical and mental purportedly being two separate entities).

Returning to Einstein, his iconic equation, E = mc², is an example of Occam’s razor. This ‘simple’ mathematical formula, which had more-complex precursors, has only two variables and one constant, relating (via conversion) the amount of energy to the amount of matter (mass) multiplied by the speed of light squared. It allows one to calculate how much energy is tied up in the mass of any given object, such as a chickpea or granite boulder. The result is a perfectly parsimonious snapshot of physical reality. But simplicity isn’t always enough, of course. There must also be consistency with the available data, with the model necessarily accommodating new (better) data as they become available.
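By way of a quick worked illustration (the numbers here are added for concreteness; they are not part of the original discussion), take a one-gram object, roughly the mass of a chickpea:

\[ E = mc^{2} = (1 \times 10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^{2} = 9 \times 10^{13}\,\mathrm{J} \]

That is some 25 million kilowatt-hours, about a day’s output of a large power station, all captured in a formula with two variables and one constant.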

Other eminent scientists, like the 17th-century physicist and mathematician Isaac Newton, similarly valued this principle of frugality. The first of Newton’s three ‘rules of reasoning in philosophy’ expressed in his Principia Mathematica offers:
‘We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances. . . . Nature is pleased with simplicity, and affects not the pomp of superfluous causes.’
But, as noted above, Occam’s razor doesn’t always lead to truth per se. Nor, importantly, does the notion of ‘simplicity’ necessarily equate to ease of explanation or ease of understanding. Here are two examples where frugality arguably doesn’t win the day. One theory presents a complex cosmological explanation of the Big Bang and the physical evolution of a 13.8-billion-year-old universe. A single, but very-late-on-the-stage thread of that cosmological account is the intricate biological evolution of modern human beings. A second, creationist explanation of the current universe and of human beings — with far fewer assumptions and hypotheses — describes both as having roots in a single event some 6,000 to 10,000 years ago, with the cosmos conveniently made to look older. Available evidence suggests, however, that the first explanation is correct, despite the second explanation’s parsimony.

In broad ways, Occam’s razor has been supported by the empirical successes of theories that proved parsimonious in their explanations: with fewer causes, entities, properties, variables, and processes embedded in fewer assumptions and hypotheses. However, even though people tend instinctively and understandably to be drawn toward simpler accounts of hoped-for reality, simplicity hasn’t always triumphed. For example, the earlier nature-versus-nurture debate posed a simpler, albeit false, either-or dichotomy in trying to understand a person’s development and behaviour on the basis of either the environment — the influence of external factors, such as experience and learning, on an otherwise blank slate or perhaps set of instincts — or genes and heritability — that is, biological pre-wiring. Reality is, of course, a complex mix of both nature and nurture, with one influencing the other.

To avoid such pitfalls, as the English mathematician and philosopher Alfred North Whitehead pointedly (and parsimoniously) suggested:
‘. . . every natural philosopher should seek simplicity and distrust it.’

20 May 2018

‘Purposeful Living’ Through Grief

Rainy Night In The City, by Alina Madan
Posted by Lina Ufimtseva
Grief is like a rude neighbour in the night, knocking at your mind’s door at all kinds of inopportune moments.  Hush, you want to tell it, go away, let me sleep.  But not only is grief rude in its all-encompassing demands for attention, it also is disobedient, and stubbornly stays.  Often, for years.
I am stirring a pot of soup on the stove, and I switch it off.  The boiling liquid quickly settles, and the rolling of the surface stops.  ‘Just like my mother's blood,’ I think instinctively.  Her blood stopped moving, too. ‘Just so,’ I think, ‘a loved one's life can slip away, unceremoniously.’ And so, in the sudden memory which the soup brings back, grief stands rudely knocking.  Go away, go away.

Time allows for the body to regenerate and to heal, provided it is not put under more stress.  Years later, one may feel the strain in a joint from an old injury, but it will often be no more than a lingering nuisance.  Grief, on the other hand, can hit one like a train, no matter how much time has passed since tragedy struck. Why is emotional pain more difficult to bear than physical pain? 

The brain uses a single neural system to detect and feel pain.  The anterior insula cortex and the anterior cingulate cortex are responsible for detecting pain, regardless of whether it is of a physical or emotional nature.  Even painkillers may numb emotional pain temporarily.  But they don’t help in healing.

This raises the question: why does emotional pain not heal as physical pain does?

Asked how her labour went, a mother may underplay her experience and reply that it was ‘painful’ or ‘a lot of pressure’.  Yet those mothers who lay in agony giving birth will voluntarily unleash the same process upon their bodies again and again.  Physical pain lingers only as an awareness that it was indeed at one time painful.

Grief, however, has the unique ability to reiterate itself at the most seemingly random moments.  Therein lies a clue.  If we want physical pain to leave our bodies—assuming that, as is usually the case, it affects only a certain limb or area of the body—we may use a crutch to prevent too much strain, say, on a leg.  But how does one rest from grief?

Generally one does not.

Our brains process the pain of grief in a non-linear manner.  Physical trauma leaves scars—smooth scars.  Emotional pain creates what I would call neural scabs of sorts that can be—and often will be—picked at, voluntarily or not.

The psychologist Thomas Crook has noted:
‘Indeed, when brain imaging studies are done on people who are grieving, increased activity is seen along a broad network of neurons.  These link areas associated not only with mood but also with memory, perception, conceptualization, and even the regulation of the heart, the digestive system, and other organs.  This shows the pervasive impact loss or even disappointment can have.’
Grief affects the neural pathways in a far more pervasive and ineluctable manner than physical pain does.  Emotional pain, like a scab, can very easily be picked open by the casual scratch of an old memory, and the blood of grief starts pouring again.

Those who have been severely distraught by their circumstances often come to the conclusion that the greater meaning in life lies not in seeking happiness and hedonism, but in creating a purposeful living.  The word choice here, ‘a purposeful living’ rather than ‘a purposeful life’, is itself deliberate.  Meaning is not stagnant.  One cannot create a purposeful life and leave it at that.  Purpose must continue to be lived out, to be striven for, to continue in some kind of endeavour.

Purpose without struggle often loses its meaning.  In this light, grief can be given a purpose.  Severe emotional pain can be the catalyst to re-evaluate one’s values, choices, and path in life.  It can be one’s very own personal as well as professional springboard.

Do you wish to leap into the bounds of further despair?  Go ahead, and grief will get you there.  Do you wish to see an armour around yourself unveiled?  Go ahead, and grief can give you the thickest skin and the thinnest heart you ever imagined.

Grief can and will redefine who you thought you were.  Can you hear it knocking?