21 November 2021

COVID-19: We Would Do It All Again

by Thomas Scarborough

'Compound Risks and Complex Emergencies.' PNAS.

While staying in a South African township in the autumn of 2021, I found that once a week an old woman in worn-out clothes slipped into the house unannounced, and sat down. She sat quietly for four hours, until lunch was served. She quietly ate her lunch, and left. I was curious to know the reason for her visits.

Her son, she said, had lost his work through the pandemic. He had been supporting her, and she could not now afford to feed herself. She couldn’t keep the lights burning after dark. She couldn’t even pay for candles—let alone the rest. Every day, she would slip quietly into a house like this, she said, and wait for a meal.

This was a direct result of a COVID-19 lockdown. Not that lockdowns are all the same, or have the same effects. The University of Oxford has developed a Stringency Index, which monitors a wide range of measures adopted by governments across the world in response to the pandemic.

Without weighing up the rationale behind them, it is clear that these various measures have had grievous effects.

It is estimated that more than 200 million jobs were lost worldwide in 2020 alone. The United Nations’ International Labour Organisation estimates that 8.8 percent of global working hours were lost, which is equivalent to 255 million full-time jobs. This is without considering the knock-on effects—apart from which, such losses are seldom made up in the years which follow.

According to the World Economic Forum, 38 percent of global cancer surgery was postponed or cancelled in the early months of the pandemic. The backlog, they said, would take nearly a year to clear. Of course, one can’t afford to postpone or cancel cancer surgery. Many other surgeries were halted besides—in fact, millions of surgeries per week.

Frustrations and the pressures of life under lockdown brought about huge increases in certain types of crime. Gender-based violence soared. The United Nations Secretary-General reported a ‘horrifying global surge’. Scattered statistics confirm it. Many cities reported increases in gender-based violence of more than 30 percent. Some reported more than 200 percent. The effects of such violence never go away.

The lockdowns, in all their complexity and diversity, had negative effects on personal freedoms, supply chains, mental health, inequalities, and any number of things—and since everything is related to everything, almost anything one may think of was skewed.

Of course, not all of the effects of lockdowns were negative. Drug arrests plummeted in various places. Break-ins, not surprisingly, decreased as more people stayed home. In South Africa, a ban on liquor sales quickly emptied out hospital emergency rooms. Most importantly, it is thought that very many lives were saved from COVID-19.

How many lives were saved? This is hard to tell. Imperial College London judged that ‘the death toll would have been huge’—that is, there would have been millions more deaths. Surely this is true. At the same time, they noted that there are ‘the health consequences of lockdowns that may take years to fully uncover’—and paradoxically, the many attempts to stall the pandemic may have prolonged it.

How do we calculate the advantages and disadvantages of a pandemic response? It is, in fact, frightfully difficult. One needs to identify the real issues—or some would say, select them. One needs to weigh all relevant factors—however that might be done. There is a place, too, for the human trauma, whatever its logical status may be, and this is hard to quantify. 

At the end of the day, however, it all comes down to priorities.

An absolute priority during the pandemic was life. Lives were to be saved, no matter how many or how few. This emphasis is easy to see. Almost any graph which has traced the pandemic shows two lines: the number of cases, and the number of deaths. Other effects of the pandemic are by and large excluded from everyday graphs and charts—which is not to say that they are completely overlooked.

What does one mean, then, by loss of life? One means rapid loss of life, of the kind which overcomes one in the space of a few days from hospital admission to death. Such death, in most published instances, has been an almost complete abstraction—a number attached to COVID-19—signifying the priority of death pure and simple.

At the end of the day, the avoidance of rapid loss of life was the absolute priority in this pandemic. All other priorities were demoted, put on hold, or even repudiated as side-issues. Ought life to have been given absolute priority? Who can say? It has to do with human, cultural, and social values as we find them in the early 21st century.

The fact is that life—or the possibility of losing it quickly—was the immutable priority. Therefore, we would do it all again.

14 November 2021

The Limits of the ‘Unknowable’

In this image, the indeterminacy principle concerns the initial state of a particle. The colour (white, blue, green) indicates the phase, that is, the position and direction of motion of the particle. The position is initially determined with high precision, but the momentum is not.

By Keith Tidman

 

We’re used to talking about the known and unknown. But rarely do we talk about the unknowable, which is a very different thing. The unknowable can make us uncomfortable; yet the shadow of unknowability stretches across all disciplines, from the natural sciences to history and philosophy, as people encounter the limits of their individual fields in the course of research. For this reason, unknowability invites a closer look.

 

Over the years there has been a noteworthy shift. What I mean is this: human intellectual endeavour has been steadily turning academic disciplines from the islands they had increasingly become over the centuries back into continents of shared interests, where specialised knowledge flows across disciplinary boundaries in recognition of the interconnectedness of ideas and of our understanding of reality.

 

The result is fewer margins and gaps separating the assorted sciences and humanities. Interdependence has been regaining respectability. What we know benefits from these commonalities and this collaboration, allowing knowledge to expand and evolve across disciplinary lines. And yet, despite this growing matrix of knowledge, unknowables still persist.

 

Consider some examples.

 

Forecasts of future outcomes characteristically fall into the unknowable, with outcomes often different from predictions. Such forecasts range widely, from the weather to political contests, economic conditions, vagaries of language, technology inventions, stock prices, occurrence of accidents, human behaviour, moment of death, demographics, wars and revolutions, roulette wheels, human development, and artificial intelligence, among many others. The longer the reach of a forecast, often the more unknowable the outcome. The ‘now’ and the short term come with improved certainty, but still not absolute. Reasons for many predictions’ dubiousness may include the following.

 

First, the initial conditions may be too many and too indeterminate to yield a coherent, comprehensive picture of the starting points.


Second, the untold, opaquely diverging and converging paths along which initial conditions travel may overwhelm: too many to trace. 


Third, how forces jostle those pathways, in both subtle and large ways, is impossible to model and take account of with precision and confidence.


Fourth, chaos and complexity — along with volatility, temperamentality, and imperceptibly tiny fluctuations — may make deep understanding impossible to attain.
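The fourth point can be made concrete with a toy simulation, a minimal sketch in Python (mine, not the essay’s), using the textbook logistic map in its chaotic regime. Two starting values that differ by one part in ten billion, far below any conceivable measurement error, disagree completely within about forty steps.

# The logistic map x_{n+1} = r * x_n * (1 - x_n) with r = 4 is a
# standard example of chaos: tiny differences in the starting value
# grow exponentially until the two trajectories are unrelated.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0, returning the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)  # perturbed by one part in ten billion

for n in (0, 10, 20, 30, 40, 50):
    gap = abs(a[n] - b[n])
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f} (gap {gap:.2e})")

Shrinking the initial discrepancy does not rescue the forecast; because the gap grows exponentially, each extra digit of measurement precision buys only a few more steps of agreement.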

 

Ethics is another domain where unknowability persists. The subjectivity of societies’ norms, values, standards, and belief systems — derived from a society’s history, culture, language, traditions, lore, and religions, where change provides a backdraft to ‘moral truths’ — leaves objective ethics outside the realm of what is knowable. Contingencies and indefiniteness can interfere with moral decision-making. Accordingly, no matter how rational and informed individuals might be, there will remain unsettled moral disagreements.


On the level of being, why there is something rather than nothing is similarly unknowable. In principle, ‘nothingness’ is just as possible as ‘something’; yet for some unknown reason, beyond the sheer unlikelihood of spontaneous manifestation, ‘something’ demonstrably prevailed over its absence. Conspicuously, ‘nothingness’ would preclude the initial conditions required for ‘something’ to emerge from it. However, we and the universe of course exist; in its fine-tuned balance, the model of being is not just thinkable, it discernibly works. Yet the reason why ‘something’ won out over ‘nothingness’ is not just unknown, it’s unknowable.

 

Anthropology arguably offers a narrower instance of unknowability, concerning our understanding of early hominids. The inevitable skimpiness of evidence and of fine-grained confirmatory records — compounded by uncertain interpretations stemming from the paucity of physical remains, and of their unvalidated connections and meaning in pre-historical context — suggests that the big picture of our more-distant predecessors will remain incomplete. A case of epistemic limits.


Another important instance of unknowability comes out of physics. The Heisenberg uncertainty principle, at the foundation of quantum mechanics, famously tells us that the more precisely we know about a subatomic particle’s position, the less we know about its momentum, and vice versa. There is a fundamental limit, therefore, to what one can know about a quantum system.
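For the record, the usual mathematical statement of the principle (the standard textbook form, not spelled out in the post itself) bounds the product of the two uncertainties:

\[ \sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2} \]

where \( \sigma_x \) and \( \sigma_p \) are the standard deviations of position and momentum, and \( \hbar \) is the reduced Planck constant. No refinement of instruments can squeeze both factors below this bound; the limit belongs to the theory, not to our tools.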

 

To be clear, though, seemingly intractable intellectual problems may not ultimately be insoluble, that is, they need not join the ranks of the unknowable. There’s an important distinction. Let me briefly suggest three examples.

 

The first is ‘dark energy and dark matter’, which together compose 95% of the universe. Remarkably, the tiny 5% left over constitutes the entire visible contents of the universe! Science is attempting to learn what dark energy and dark matter are, despite their prevalence compared with observable matter. The direct effects of dark energy and dark matter, such as on the universe’s known accelerating expansion, offer a glimpse. Someday, investigators will understand them; they are not unknowable.

 

Second is Fermat’s ‘last theorem’, the one that he teed up in the seventeenth century as a note in the margin of his copy of an ancient Greek text. He explained, to the dismay of generations of mathematicians, that the page’s margin was ‘too small to contain’ the proof. Fermat did suggest, however, that he had a marvellous proof. More than three centuries passed before a twentieth-century British mathematician proved the theorem. The proof, which turned out to be anything but short, was shown not to be unknowable as some had speculated, just terribly difficult.

 

A last instance that I’ll offer involves our understanding of consciousness. For millennia, we’ve been spellbound by the attributes that define our experience as persons, holding that ‘consciousness’ is the vital glue of mind and identity. Yet a decisive explanation of consciousness, despite earnest attempts, has continued to elude us through the ages. Inventive hypotheses have abounded, though they remain unsettled. Maybe that’s not surprising, in light of the human brain’s physiological and functional complexity.

 

But as the investigative tools that neuroscientists and philosophers of mind wield in the course of collaboration become more powerful in dissecting the layers of the brain and mind, consciousness will probably yield its secrets: why and how, through the physical processes of the brain, we have very personalised experiences. It’s likely that one day we will get a sounder handle on what makes us, us. Difficult, yes; unknowable, no.

 

Even as we might take some satisfaction in what we know and anticipate knowing, we are at the same time humbled by two epistemic factors. First, much of what we presume to know will turn out to be wrong or at most partially right, subject to revised models of reality. The second humbling factor is a paradox: the full extent of what is unknowable is itself unknowable.

 

07 November 2021

Picture Post #69: The Wallpaper

by Martin Cohen

Robert Polidori, Hotel Petra, Beirut, Lebanon, 2010

If there was something ‘a little spooky’ about last month’s Picture Post, on the face of it there should be something spooky, too, about this abandoned hotel room in a partly abandoned city, Beirut.

And yet, that’s not my own reaction to it. On the contrary, the emptiness of the room creates the palette, and the symmetry of the disappearing doorways provides all the action the scene needs.

The colours, too, seem to have been chosen by a master artist; in this case, they evidently were, by the photographer Robert Polidori. Unlike many of our other photographers, Polidori is well known for his images of urban environments and interiors, with his work exhibited at the Metropolitan Museum of Art (New York), the Musée d'art contemporain de Montréal, the Martin-Gropius-Bau museum (Berlin), and the Instituto Moreira Salles (São Paulo and Rio de Janeiro), to mention just a few.

Polidori has photographed the restoration of the Château de Versailles since the early 1980s and recorded the architecture and interiors of Havana. This portrait of the Hotel Petra, once one of the most popular hotels in Beirut, located in the city centre adjacent to the Grand Theatre, seems to me to show that, for an artist, all buildings are equally valid as canvases.

31 October 2021

Briefly: On the Growing Sense of Insignificance that Comes with Aging

by Simbarashe Nyatsanza
 

Lord Chesterfield: "I look upon all that is past as one of those 
romantic dreams, which opium commonly occasions."

I will be, hopefully, turning 28 on the 10th of April this coming year, and I recently, reluctantly, came to the end of my university studies in December 2020. I was 26 years old at the time; late, somehow feeling outgrown and out of place, too aware of my fading sense of wonder and merriment with everything around me to continue on with the farce of mock-ignorance often needed for one to successfully allow themselves to be ‘educated’. 

I had to complete my studies and end the nonsense. I was also feeling quite stagnant and of stunted progress, with nothing to show for my life (if those ever-fleeting moments of glee and glum that characterized my existence back then could safely be called life). But, good god, I miss those old wicked times! Things were hard and confusing and drunken and exciting and draining and enraging and saddening and thrilling and depressing and carefree and sexy and sexual and niggerish and nightmarish and orgasmic and purifying and, sometimes, there was nothing but the rousing possibility, the potential, for more innocuous but meaningful meaninglessness. I was alive for, and because of, that. 

Memories of the debauched moments of belligerence, the often psychotic, sporadically violent and extremely intoxicating sense of selflessness that came with the demonstration of inebriated impulses during those days, now assume a faint kind of beauty that is no longer reachable, simply impossible to replicate, way out of par and incompatible with the forced sense of self-responsibility that often finds itself creeping in and enlarging in the crevices of the mind as the years add on. Yet these memories are somehow reassuring, as if they were a faint picture of a monument - the strange and saddening beauty of a wilting flower - a remembrancer of a series of moments fully exhausted, while they were exhausting - and yet the closest thing to liberation - to the very soul of the young, misguided misfit I so proudly was. 

 It often feels that it won't be long till I start describing myself as a 'once-was', or as an 'i-used-to'. Like those busy-sounding, busy losers who speak of a past laden with potential and yet say nothing of their rotten and dried out, washed up presences. Or those forcefully eccentric Africans who still speak with a misplaced ‘White’ accent ever since they went to Europe for a time as brief as a fortnight when they were as young as ten years old, who desperately hang on to a fading sense of sophistication, of a ‘difference’, who greatly overestimated their own sense of importance. Like them, it feels like it won't be long until my mind finally accepts its role as a repository for failed tests, failed relationships, failed prayers, failed exams, failed apologies, failed attempts at reconciliation, failed learning, failed loving - failed everythings, and the mouth resigns drunkenly into an amplifier of the uselessness of wisdom that comes with hindsight, always blasting even in the forced silence of one's mind. 

 It is The Irishman telling Jimmy Hoffa that 'It is what it is'. It is Red, in The Shawshank Redemption, marking his name beneath a dead one, and then moving on. It's the red at the center of the flame that burns your fingertips as you light another cigarette that gently pushes you, drag by drag, towards permanent oblivion. It's the most gentlemanly Robert Mugabe finally dying in an Adidas tracksuit, when he had worn suits all his life. It is like shaving your hair every two days because of the gray strands that always eagerly sprout in it, reminding you of the old man whose face is starting to come out of yours, when you hadn't thought of yourself as that old. In fact, you would never have thought of yourself as old at all. It's that ageless voice inside of you, the one that keeps coming up with the reassurances, the reminders of how everyone is God's favorite child, of how there's still a chance to turn things around, to be something, like the others, finally gently screaming, “Get over yourself!” from the center of your brain. 

It's that desperate yearning for longevity, which almost comes across as a series of threatening promises of mediocrity. It's really a well-crafted and articulate declaration of complacency: aging.

24 October 2021

Is there a RIGHT not to be vaccinated?

Eleanor Roosevelt and the Universal Declaration of Human Rights
(1948) which gives the right to medical treatment, but is silent 
on the right to decline it

Is there a RIGHT not to be vaccinated? The question was raised over at Quora and I was spurred to answer it there.

People certainly think they have a right to their own body; indeed, years ago Thomas Hobbes made this the basis of all our rights. But Hobbes recognised that people could be forced to do many things “with” their bodies.

And today, unvaccinated people are certainly finding that a lot of things they thought were their rights are not really. We are in unknown territory with vaccine mandates, really, and the ambiguities reveal themselves as governments perform all sorts of contortions to “appear” to leave people a choice, while in effect making the choice almost impossible to exercise. The US government, as many other governments do, will sack people for not being vaccinated, but it does not explicitly seek a law to make vaccination obligatory.

And so, there are concerted attempts all over the world to make ordinary, healthy people take coronavirus vaccines that are known to have non-negligible side-effects, including in some cases death. Databases like the EudraVigilance one operated by the European Medicines Agency indicate that adverse side-effects are real enough. Two justifications offered for this are that (side-effects apart) the vaccine will protect you from the more serious effects of a Covid infection, and that it reduces transmission of the virus throughout society.

Many governments already mandate vaccinations on the basis that they are in the individual’s health interest: for example, four doses of the polio vaccine are recommended in the United States for children, and within Europe eleven countries have mandatory vaccination for at least one of the diphtheria, tetanus, pertussis, hepatitis B, poliovirus, Haemophilus influenzae type B, measles, mumps, rubella and varicella vaccines.

So the idea that governments can force you to be vaccinated is a bridge largely crossed already: vaccines are not considered to be experimental health treatments of the kind that the Nuremberg Code has highlighted and banned ever since the excesses of the Nazi regime in the Second World War.

However, the coronavirus vaccine does seem to me to come with many problematic ethical issues. Firstly, it is not actually a vaccine in the traditional sense. This matters, as the protection it offers against the disease is far from established. Today, governments like Israel’s that were first to vaccinate their populations (setting to one side the separate and inferior effort for people in the Occupied Territories) are now mandating third and even fourth shots as the virus continues to spread and cause illness there.

Secondly, it is experimental in the very real sense that the gene-therapy technology is novel and comes with significant unknowns. It is for this reason that all the companies making the vaccines insisted on, and got, blanket exemption from prosecution for the effects of their products. One of the early developers of the mRNA vaccine technology, Robert Malone, considers that there is a risk of the method actually worsening the illness in the long run, so-called antibody enhancement, and that the unprecedented effort to universalise the vaccine also creates unprecedented downside risks from such an enhancement.

A third area of concern is that there is no doubt that vaccinated people can both be infected with the coronavirus and be infectious while infected. Although you hear politicians say things like “get vaccinated and then we can get back to normal”, this is just political rhetoric, as there never was any reason to think that the inoculations against the coronavirus really were equivalent to the successful campaigns against things like polio and rubella.

So, to answer the specific ethical point! The right not to be vaccinated, or in this case injected with gene therapies, does not exist. In which sense, we cannot lose this right, much as I personally think there should be such protections (protections going well beyond marginal cases such as “religious exemptions”). What seems to be new is that governments have taken upon themselves the right to impose a much more risky programme of gene-therapy treatments, dished out, it seems, at six-monthly intervals in perpetuity, backed by pretty much unprecedented sanctions on people who would, if allowed, choose not to be inoculated. But the principle of government compulsion is established already: by which I mean we are fined for driving too fast, or disallowed from certain jobs if we don’t have the right training certificates.

What the corona vaccine mandates and penalties for being “unvaccinated” (the restrictions on working, social life, social activity, travel) really reflect is not so much the loss of rights as the weakness of existing civic rights. Like taxation, there should be no vaccination without a process of consent expressed through genuine and informed public debate and political representation. But as I say, this is not a right that we have at the moment, so it can hardly be said to be lost.

At the moment, governments claiming to be acting in the “general interest” have targeted individuals and groups, and criminalised certain aspects of normal life, but this is merely an extension of a politics that we have long allowed our governments to exercise.

17 October 2021

On the Appeal of Authoritarianism — and Its Risks

 

On March 30th, Hungary's populist leader, Viktor Orbán, obtained indefinite special powers from his parliament, to the shock of many in Europe, and indeed in Hungary.

By Keith Tidman

Authoritarianism is back in fashion. Seventy years after the European dictators brought the world to the brink of ruin, authoritarian leaders have again ascended across the globe, preaching firebrand nationalism. And there’s again no shortage of zealous supporters, even as there are equally passionate objectors. So, what has spurred authoritarianism’s renewed appeal? Let’s start by briefly looking at how authoritarianism and its adversarial ideology, liberal democracy, differ in their implied ‘social contract’.

 

One psychological factor in authoritarianism’s allure is its paternal claims, based on all-powerful, all-knowing central regimes substituting for the independent thought and responsibility of citizens. Decisions are made and actions taken on the people’s behalf; individual responsibility is confined to conformance and outright obedience. Worrying about getting choices right, and contending with their good and bad consequences, rests in the government’s lap, not in the individual’s. Constitutional principles start to be viewed as an extravagance, one that thwarts efficiency. For some people, this contract, exchanging freedom for reassuring paternalism, may appeal. For others, it’s a slippery slope that rapidly descends from the illiberalism of populists to something much worse.

 

Liberal democracy is hard work. It requires accountability based on individual agency. It requires people to become informed, assess information’s credibility, analyse arguments’ soundness, and arrive at independent choices and actions. Citizens must be vigilant on democracy’s behalf, with vigilance aided by the free flow of diverse, even contentious, ideas that enlighten and fill the intellectual storehouse on which democracy’s vibrancy depends. Often, individuals must get it right for themselves. They bear the consequences, including in their free and fair choice of elected representatives; ultimately, there are fewer options for offloading blame for bad outcomes. The rewards can be large, but so can the downsides. Constitutional bills of rights, the co-equal separation of powers, and the rule of law are democracy’s valued hallmarks. There’s likewise a social contract, though with allowance for revision to account for conditions at the moment. For many people, this model of democratic governance appeals; for others, it’s disorderly and ineffectual, even messy.

 

It requires only a small shift for the tension between authoritarianism and the personal agency and accountability of liberal democracy to end up tilting in authoritarianism’s favour. Individual perspectives and backgrounds, and particular leaders’ cult of personality, matter greatly here. With this in mind, let’s dig a bit deeper into what authoritarianism is all about and try to understand its appeal.

 

Authoritarianism was once seen more as the refuge of poor countries on far-away continents; nowadays we’ve witnessed its ascendancy in many developed nations too, such as in Europe, where the brittleness of former democracies snapped. Countries like Russia and China briefly underwent ‘liberal springs’, inquisitively flirting with the freedoms associated with democracy before becoming disenchanted with what they saw, rolling back the gains and increasing statist control over the levers of power. In other countries, what starts as extreme rightwing or leftwing populism, as in some quarters of Asia and Central and South America, has turned to authoritarianism. Strongmen have surrounded themselves with a carefully chosen entourage, doing their bidding. Security forces, like modern-day praetorians, shield and enforce. Social and political norms alter, to serve the wishes of centralised powers. It’s about power and control; to be in command is paramount. Challenges to officialdom are quick to set off alarms and, as necessary, are met with violence to enforce the restoration of conformity.

 

The authoritarian leader’s rationale is to sideline challengers, democratic or otherwise, turning to mock charges of fraudulence and ineptness to neutralize the opposition. The aim is structural submission and compliance with sanctioned doctrine. The leader asserts he or she ‘knows best’, to which flatterers nod in agreement. Other branches of government, from the legislature to the courts and holders of the nation’s purse strings, along with the country’s intelligentsia and news outlets, are disenfranchised in order to do the bidding of the charismatic demagogue. Such heads of state may see themselves as the singular wellspring of wise decision-making, for some citizens raising the disconcerting spectre of democratic principles teetering in their supposedly fragile balance.

 

When authoritarian leaders monopolise the messaging for public consumption, for the purpose of swaying behaviour, it commonly becomes an exercise in copycatting the ‘doublespeak’ of George Orwell’s 1984: war is peace; freedom is slavery; ignorance is strength (slogans inscribed by the Party’s Ministry of Truth). Social activism is no longer brooked and thus may be trodden down by heavy-handed trusted handlers. Racism and xenophobia are ever out in front, as has been seen throughout Europe and in the United States, leading to a zealously protective circling of the wagons into increased sectarianism, hyper-partisanship, and the rise of extremist belief systems. In autocracies, criticism — and economic sanctions or withdrawal of official international recognition — from democracies abroad, humanitarian nongovernmental organisations, and supranational unions is scornfully brushed aside.

 

Yet, it may be wrong to suggest that enthusiasts of authoritarian leaders are hapless, prone to make imprudent choices. Populations may feel so stressed by their circumstances they conclude that a populist powerbroker, unhampered by democracy’s imagined rule-of-law ‘manacles’, is attractive. Those stresses on society might range widely: an unnerving haste toward globalisation; fear of an influx of migrants, putting pressure on presumed zero-sum resources, all the while raising hackles over the nation’s majority race or ethnicity becoming the minority; the fierce pitting of social and political identity groups against one another over policymaking; the disquieting sense of lost cohesion and one’s place in society; and a blend of anxiety and suspicion over unknowns about the nation’s future. In such fraught situations, democracy might be viewed as irresolute and clogging problem-solving, whereas authoritarianism might be viewed as decisive.

 

Quashing the voice of the ‘other social philosophy’, the ‘other community’, the ‘other reality’ has become increasingly popular among the world’s growing list of authoritarian regimes. A parallel, ambivalent wariness of the pluralism of democracy has been fueling this dynamic. It might be that this trend continues indefinitely, with democracy having run its course. Or, perhaps, the world’s nations will cycle unevenly in and out of democracy and authoritarianism, as a natural course of events. Either way, it’s arguable that democracy isn’t anywhere nearly as fragile as avowed, nor is authoritarianism as formidable.

 

09 October 2021

A Moral Rupture

by Thomas Scarborough


Virtually all of our assumptions with regard to ethics are based on theories in which we see no rupture between past, present, and future, but some kind of continuity

If we are with Aristotle, we hold out happiness as the goal, and assume that, as this was true for Aristotle, so it is for me, and forever will be. Or if we are with Moore, we believe that our moral sense is informed by intuition, always was, and always will be. If we are religiously minded, we assume that God himself has told us how to live, which was and is eternal and unchanging. 

I propose, instead, that ethics is not in fact constant, and at the present time we are witnessing a fundamental moral rupture. This is based upon a distinct ethical theory, which runs like this: 
As we look upon the world, we arrange the world in our understanding. Depending on how we have so arranged it, we act in this world. A concise way of putting this is that our behaviour is controlled by mental models. Since no one person so arranges the world in quite the same way as the next, our ethics are in many cases strikingly different. 
In past ages, people arranged the world in their minds in such a way that this was largely in keeping with their personal experience of the world. They based it on the people they met, the environment which surrounded them, and so on. Of course, people had access also to parchments, listened to orators, or explored new ideas, so that various influences came to bear upon them. Mostly, however, they interacted with a real world. 

In more recent history, the age of mass communications descended upon us. We invented the printing press, the postal system, then radio, and TV. And certainly, increased travel exposed us all to broader ideas. However, we still understood our world largely in terms of personal experience. Our personal experience, too, informed our interpretation of events and opinions further afield, and we had the leisure to ponder them, and often make sport of them. 

Since the turn of the century, however, we have increasingly been involved in instant, two-way communications, and in many cases dispersed communications, where many people are included at the same time. The result is that, for the first time in history, many (and some say most) of our interactions and reactions are electronic. 

One could list any number of implications. In terms of this post, our arrangement of the world in our understanding is changing. In fact, change is not the word. There is a rupture. The basis on which we arrange the world is not at all what it was. 

Images of the world swamp our experience of it;
The consequences of our views dissipate in the aether;
Feedback to our words and actions often eludes us; and
Ideology is little tempered by direct observation.

If we accept that mental models drive our behaviour, all older notions of ethics are uprooted. Just how may only become clear to us in the decades to come. Marshall McLuhan wrote in 1962, in his landmark The Gutenberg Galaxy, "There can only be disaster arising from unawareness of the causalities and effects inherent in our technologies."

03 October 2021

Picture Post #68: The Sitting Room

by Martin Cohen


Photo credit: Micelo

There's something a little spooky about this picture, emphasised by the face in the mirror above the fireplace – but present too in the ‘empty chairs’. Where are their proud occupants? What did they talk about, or do, those long evenings in their high-ceilinged castle? For this is a room carefully restored (if not quite brought back to life) by some French enterprise or other.

Indeed the French – and English too – do seem to live in an imaginary past, of posh families in big chateaux / country houses with not much to do except count their silver cutlery. I think it's rather a sad way to live, and so perhaps it is appropriate that this picture seems to me to speak only of a rather forlorn and empty existence.

26 September 2021

The Recounting of History: Getting From Then to Now



Double Herm of Thucydides and Herodotus

Thucydides was a historian of the wars between Athens and Sparta, in which he championed the Athenian general Perikles. Herodotus travelled and wrote widely and tried to be more impartial.



Posted by Keith Tidman

 

Are historians obliged to be unwaveringly objective in their telling of the past? After all, as Hegel asserted: ‘People and government never have learned anything from history or acted on principles deduced from it’.

 

History seems to be something more than just stirring fable, yet less than incontestable reality. Do historians’ accounts live up to the tall order of accurately informing and properly influencing the present and future? Certainly, history is not exempt from human frailty. And we do seem ‘condemned’ to repeat some version of even half-remembered history, such as stumbling repeatedly into unsustainable, unwinnable wars.

 

In broad terms, history has an ambitious task: to recount all of human activity — ideological, political, institutional, social, cultural, philosophical, judicial, intellectual, religious, economic, military, scientific, technological and familial. Cicero, who honoured Herodotus with the title ‘the father of history’, seems to have had such a lofty role in mind for the discipline when he pondered: ‘What is the worth of human life, unless it is woven into the life of our ancestors by the records of history?’ The vast scope of that task implies both great challenges and vulnerabilities.

 

History provides the landscape of past events, situations, changes, people, decisions, and actions. Both the big picture and the subtler details of the historical record spur deliberation, and help render what we hope are wise choices about society’s current and future direction. How wise such choices are — and the extent to which they are soundly based on, or at least influenced by, how historians parse and interpret the past — reflects how ‘wise’ the historians are in fulfilment of the task. At its best, the recounting of history tracks the ebb and flow of changes in transparent ways, taking into account context for those moments in time. A pitfall to avoid, however, is tilting conclusions by superimposing on the past the knowledge and beliefs we hold today.

 

To these ends, historians and consumers of history strive to measure the evidence, complexities, inconsistencies, conflicts, and selective interpretations of past events. The challenge of chronicling and interpretation is made harder by the many alternative paths along which events might have unfolded, riven by changes in direction. There is no single linear progression or trajectory to history, extending expediently from the past to the present; twists and turns abound. The resulting tenuousness of causes and effects, and the fact that accounts of history and human affairs might not always align with one another, influence what we believe and how we behave generations later. 

 

The fact is, historical interpretations pile up, one upon another, as time passes. These coagulating layers can only make the picture of the past murkier. To recognise and skilfully scrape away the buildup of past interpretations, whether biased or context-bound or a result of history’s confounding ebb and flow, becomes a monumental undertaking. Indeed, it may never fully happen, as the task of cleaning up history is less alluring than capturing and recounting it.


Up to a point, it aids accuracy that historians may turn to primary, or original, sources of past happenings. These sources may be judged on their own merits: weighed for their evidence and widely differing interpretations, for their credibility and personal agendas, and for the relative importance of their observations to the true fabric of history. Artifacts, icons, and monuments tell a story, too, filling in the gaps of written and oral accounts. Such histories are intended to endure, leading us to insights into how the rhythms of social, reformist, and cultural forces brought society to where it is today.


And yet, contemporaneous chroniclers of events also fall victim to errors of commission and omission. It’s hard for history to be unimpeachably neutral in re-creating themes in human endeavour, like the victories and letdowns of ideological movements, leaders, governments, economic systems, religions, and cultures, as well as of course the imposing, disruptive succession of wars and revolutions. In the worst of instances, historians are the voice of powerful elites seeking to champion their own interests. 

 

When the past is presented to us, many questions remain. Whose voice is allowed to be loudest in the recounting and interpretation? Is it that of the conquerors, the elites, the powerful, the holders of wealth, the well-represented, the wielders of authority, the patrons? And is the softest or silenced voice only that of the conquered, the weak, the disregarded, the disenfranchised, including groups marginalised by race or gender? To get beyond fable, where is human agency truly allowed to flourish unfettered?

 

Therein lies the historian’s moral test. A history that is only partial and selective risks cementing in the privileges of the elite and the disadvantages of the silenced. ‘Revisionism’ in the best sense of the word is a noble task, aimed at putting flawed historical storytelling right, so that society can indeed then ‘act on the principles deduced from it’.



19 September 2021

The Cow in the Field and the Riddle of What Do We REALLY Know?


Pi looks at a wide range of things that go well beyond the scope of academic philosophy, but that shouldn't mean that we can't occasionally return to the narrow path. Talking with existentialists at a new website (one that I would recommend to everyone) called Moti-Tribe brought me back to thinking about one of my favorite philosophical problems.

This is the story of ‘The Cow in the Field’, which I came up with many years ago, at a time when the academic (boring) philosophers were obsessed with someone who had some coins in his pocket but wasn't sure exactly what they were, calling it, grandly, the ‘Gettier Problem’.

You’d have been forgiven for being put right off the issue by how the academics approached it, but indeed the riddle is very old: it can be traced back certainly to Plato, and is rooted even further back in Eastern philosophy, where the assumption that we don't really understand anything is part of mysticism and monkishness.

It’s a kind of koan which, as I understand them, is meant to startle you out of your everyday assumptions and oblige you to think more intuitively. The conventional account is that koans are a tool of Zen Buddhism used to demonstrate *the inadequacy of logical reasoning* - and open the way to enlightenment.

Well, once you explore the origins of Western philosophy, and the ideas of people like Pythagoras, Heraclitus, Socrates and Plato, you soon find out that there is a lot of riddling actually going on. And the reason why is exactly the same: in order to demonstrate this inadequacy of logical reasoning and provoke enlightenment.

Slightly bizarrely, conventional books and courses on philosophy seek to reinvent ancient philosophy to make it all about ‘the discovery’ of logic! But Western philosophy and Eastern mysticism are two sides of the same coin; we can learn from both.

So on to the puzzle!

THE COW IN THE FIELD


Imagine a farmer who has a rather fine cow called Daisy. He is so proud of his cow that he often checks up on her. In fact, he is so concerned that one day, when he asks his dairyman how Daisy is doing and the dairyman tells him that Daisy is in the field happily grazing, the farmer decides that he needs to know for certain.

He doesn’t want just a 99% idea that Daisy is safe; he wants to be able to say that he knows, 100%, that Daisy is okay.

The farmer goes out to the field and, standing by the gate, sees in the distance, behind some trees, a white-and-black shape that he recognises as his favourite cow. He goes back to the dairy and tells his friend the dairyman that he knows Daisy is in the field. 



Okay, so what’s the ‘problem’? Simply this: at this point, does our farmer really ‘know’ it - or does he merely think that he knows?

Pause for a moment and ask yourself what your intuition is. Because we have to allow that the farmer not only thinks that he knows; he has evidence - the evidence of his eyes, we might say - for his belief too.

Anyway, maybe you still think that there’s some doubt about him really knowing, but then we add a new twist. Responding to the farmer’s worries, the dairyman decides that he will go and check on Daisy, and goes out to the field. And there he does indeed find Daisy, having a nap in a hollow, behind a bush, well out of sight of the gate. He also spots a large piece of black-and-white paper that has got caught in a tree. The point is: yes, Daisy WAS in the field, but the farmer could not have seen her, only the piece of paper.

So the philosophical debate is this: when the farmer returned from the field after (as he thought) checking up on his cow, did he really KNOW she was in it?

Because now you see, it seems that Farmer Field has satisfied the three conventional requirements for ‘knowledge’.

• He believes something,

• he has a relevant reason for his belief,

• and in fact his belief is correct...
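In the shorthand that epistemologists sometimes use (a sketch of the standard analysis, not notation from the story itself), the three conditions come to:

\[ K(S, p) \iff p \,\wedge\, B(S, p) \,\wedge\, J(S, p) \]

read as: a subject S knows that p just in case p is true, S believes that p, and S is justified in believing that p.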

Philosophers say that he had a ‘justified true belief’. And yet we would not want to say that he really did know what he thought he knew. In this case, that his cow was in the field...

It's a simple story, okay, silly if you like, but entirely possible. And what the possibility shows is that the three conventional requirements for knowledge are simply not enough to give certainty. What THAT implies is that we know nothing!

Which is back to the Eastern philosophies, which put so much more emphasis on what we don't know - and seek exotic ways to compensate.