26 December 2021

Can Thought Experiments Solve Ethical Dilemmas?


In ethics, the appeal to expand the “moral circle” typically requires moving from consideration of yourself to that of all of nature.

By Keith Tidman

What, in ethical terms, do we owe others, especially when lives are at stake? This is the crux of the ‘Drowning Child’ thought experiment posed by the contemporary philosopher, Peter Singer.

Singer illustrates the question to his students in this way:

You are walking to class when you spot a child drowning in a campus pond. You know nothing of the child’s life; and there is no personal affiliation. The pond is shallow, so it would be easy to wade in and rescue her. You would not endanger yourself, or anyone else, by going into the water and pulling the child out.
But, he adds, there are two catches. One is that your clothes will become saturated, caked in mud, and possibly ruined. The other is that taking the time to go back to your dorm to dry off and change clothes will mean missing the class you were crossing the campus for.

Singer then asks his students, ‘Do you have an obligation to rescue the child?’

The students, without exception and as one might expect, think that they do. The circumstances seem simple, not least because events unfold just yards away. The students, unprompted, recognise their direct responsibility to save the flailing child. Their moral, and even pragmatic, calculus is that the life of the child outweighs the possibility of ruined clothes and a missed class, and, for that matter, the sheer ‘nuisance’ of it all. To the students, there is no ambiguity; the moral obligation is obvious; the costs, even to cash-strapped students, are trivial.

The students’ answer to the hypothetical about saving a drowning child, as framed above, is straightforward — a one-off situation, perhaps, whose altruistic consequences end upon saving the drowning child who is then safe with family. But ought the situation be so narrowly prescribed? After all, as the stakes are raised, the moral issues, including the range of consequences, arguably become more ambiguous, nuanced, and soul-searching.

At this point, let’s pivot away from Singer’s students and toward the rest of us more generally. In pivoting, let’s also switch situations.

Suppose you are walking across the grounds of a ritzy hotel, about to celebrate your fiftieth anniversary in a lavish rented ballroom, where many guests gleefully await you. Because of the once-in-a-lifetime occasion, you’re wearing an expensive suit, carrying a wallet filled with several one-hundred-dollar bills, and have on a family legacy watch that you bring out only rarely.
Plainly, the stakes, at least in terms of potential material sacrifices, are much higher than in the first scenario.

If, then, you spot a child drowning in the hotel’s shallow pond nearby, would you wade in and save the child? Even if the expensive suit will be ruined, the paper money will fall apart from saturation, the family antique watch will not be repairable, and the long-planned event will have to be canceled, disappointing the many guests who expectantly flew in at significant expense?

 

The answer to ‘Do you have an obligation to rescue the child?’ is probably still a resounding yes — at least, let’s hope, for most of us. The moral calculus arguably doesn’t change, even if what materially is at risk for you and others does intensify. Sure, there may be momentary hesitation because of the costlier circumstances. Self-interests may marginally intrude, perhaps causing a pause to see if someone else might jump in instead. But hesitation is likely quickly set aside as altruistic and humanitarian instincts kick in.

To ratchet up the circumstances further, Singer turns to a child starving in an impoverished village, in a faraway country whose resources are insufficient to sustain its population, many of whom live in wretched conditions. Taking moral action to give that child a chance to survive, through a donation, would still be within most people’s finances in the developed world, including the person about to celebrate his anniversary. However, there are two obvious catches: one is that the child is far off, in an unfamiliar land; the other is that remoteness makes it easier to avert eyes and ears, in an effort at psychological detachment.

We might equivocate further on other grounds, searching for differentiators that would morally justify not donating to save the starving child abroad after all. Platitudinous rationales might enter our thinking, such as local government corruption, the excessive administrative costs of charities, or the bigger, systemic problem of over-population needing to be solved first. Such rationales are intended to trick and assuage our consciences, and to repress the urge to help.

Strapped for money and consumed by tuition debt, Singer’s students likely won’t be able to afford donating much, if anything, toward the welfare of the faraway starving child. Circumstances matter, like the inaccessibility; there’s therefore seemingly less of a moral imperative. However, the wealthier individual celebrating his anniversary arguably has a commensurately higher moral obligation to donate, despite the remoteness. A donation equal, let’s say, to the expense of the suit, money, and watch that would be ruined in saving the child in the hotel pond.

So, ought we donate? Would we donate? Even if there might appear to be a gnawing conflict between the morality of altruism and the hard-to-ignore sense of ostensible pointlessness in light of the systemic conditions in the country that perpetuate widespread childhood starvation? Under those circumstances, how might we calculate ‘effective altruism’, combining the empathy felt and the odds of meaningful utilitarian outcomes?

After all, what we ought to do and how we actually act not infrequently diverge, even when we are confronted with stark images on television, social media, and newspapers of the distended stomachs of toddlers, with flies hovering around their eyes.

For most people, the cost of a donation to save the starving child far away is reasonable and socially just. But the concept of social justice might seem nebulous as we hurry on in the clamour of our daily lives. We don’t necessarily equate, in our minds, saving the drowning child with saving the starving child; moral dissonance might influence choices.

To summarise, Singer presented the ethical calculus in all these situations this way: ‘If it is within our power to prevent something bad from happening, without sacrificing anything of comparable moral importance, we ought, morally, to do it’. That includes saving the life of a stranger’s child whose death is preventable.

For someone like the financially comfortable anniversary celebrator — if not for the financially struggling college students, who would nevertheless feel morally responsible for saving the child drowning on campus — there’s an equally direct line of responsibility in donating to support the starving child far away. Both situations entail moral imperatives in their own fashion, though again circumstances matter.

The important core of these ethical expectations is the idea of ‘cosmopolitanism’: simply, to value everyone equally, as citizens of the world. Idealistic, yes; but in the context of personal moral responsibility, there’s an obligation to the welfare of others, even strangers, and to treat human life reverentially. Humanitarianism and the ‘common good’ writ large, we suppose.

To this critical point, Singer directs us to the political theorist William Lecky, who wrote of an ‘expanding circle of concern’. It is a circle that starts with the individual and family, and then widens to encompass ‘a class, then a nation, then a coalition of nations, then all humanity’. A circle that is a reflection of our rapid globalisation.

Perhaps, the ‘Drowning Child’ thought experiment exposes divides between how we hypothesize about doing right and actually doing right, and the ambiguity surrounding the consistency of moral decision-making.


20 December 2021

Mathematical Meditations

by Thomas Scarborough

I shall call this post an exploration—a survey. Is mathematics, as Galileo Galilei described it, ‘the language in which God has written the universe’? Are the numerical features of the world, in the words of the authors of the Collins Dictionary of Philosophy, Godfrey Vesey and Paul Foulkes, ‘free from the inaccuracies we meet in other fields’? Many would say yes—however, there are things which give us pause for thought.

  • Sometimes reality may be too complex for our mathematics to apply. It is impossible to calculate in advance something as simple as the trail of a snail on a wall. Stephen Hawking noted, 'Even if we do achieve a complete unified theory, we shall not be able to make detailed predictions in any but the simplest situations.' If we do try to do so, therefore, we abuse mathematics—or perhaps we should say, in many contexts, mathematics fails. 
  • Our measurement of the world may be inadequate to the task—in varying degrees. I take a ruler, and draw a line precisely 100 mm in length. But now I notice the grain of the paper, that my pencil mark is indistinct, and that the ruler's notches are crude. In many cases, mathematics is not the finest fit with the reality we deal with. In some cases, no fit at all. I measure the position of a particle, only to find that theoretical physicist Werner Heisenberg was right: I have lost its velocity. 
  • The cosmologist Rodney Holder notes that, with regard to numbers—all numbers—'a finite number of decimal places constitutes an error'. Owners of early Sinclair calculators, such as myself, viewed the propagation of errors in these devices with astonishment. While calculators are now much refined, the problem is still there, and always will be. This error, writes Holder, 'propagates so rapidly that prediction is impossible'. (A short computational sketch after this list illustrates how quickly such errors grow.) 
  • In 1931, the mathematician Kurt Gödel presented his incompleteness theorems. Number systems, he showed, have limits of provability. We cannot equate what is provable with what is true—though what this really means remains, in the words of Natalie Wolchover, ‘ill-understood’. A better-known consequence of this is that no program can find all the viruses on one’s computer. Consider also that no formal system can, from within itself, prove its own consistency. 
  • Then, it is we ourselves who decide what makes up each unit of mathematics. A unit may be one atom, one litre of water, or one summer. But it is not that simple. Albert Einstein noted that a unit 'singles out a complex from nature'. Units may represent clouds with noses, ants which fall off a wall, names which start with a 'J', and so on. How suitable are our units, in each case, for manipulation with mathematics?
  • Worst of all, there are always things which lie beyond our equations. Whenever we scope a system, in the words of philosophy professor Simon Blackburn, there is 'the selection of particular facts as the essential ones'. We must first define a system’s boundaries. We must choose what it will include and what not.  This is practically impossible, for the reason that, in the words of Thomas Berry, an Earth historian, 'nothing is completely itself without everything else'. 
  • I shall add, myself, a 'post-Gödelian' theorem. Any and every mathematical equation assumes that it represents totality. In the simple equation x + y = z, there is nothing beyond z. As human beings, we can see that many things lie outside z, but if the equation could speak, it would know nothing of it. z revolts against the world, because the equation assumes a unitary result, which treats itself as the whole.
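By way of illustration only (a minimal sketch in Python, and my own addition rather than Holder's), one may iterate a simple equation, the logistic map, from two starting values that differ by 1e-15, roughly the level of double-precision rounding error, and watch the gap between them grow:

    # Illustration only: iterate the logistic map from two starting values
    # that differ by one part in a thousand trillion (about the size of
    # double-precision rounding error), and print how far apart they drift.
    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    x, y = 0.2, 0.2 + 1e-15
    for step in range(1, 61):
        x, y = logistic(x), logistic(y)
        if step % 10 == 0:
            print(f"step {step:2d}: difference = {abs(x - y):.3e}")

By around the fiftieth step the two trajectories bear no relation to one another, even though they began as close together as the arithmetic allows: the ‘error’ of finite decimal places has propagated until prediction is impossible, just as Holder says.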
Certainly, we can calculate things with such stunning accuracy today that we can send a probe to land on a distant planet’s moon (Titan), to send back moving pictures. We have done even more wonderful things since, with ever increasing precision. Yet still the equations occupy their own totality. Everything else is banished. At what cost?

12 December 2021

Picture Post #70: Civilisation!



'Because things don’t appear to be the known thing; they aren’t what they seemed to be
neither will they become what they might appear to become.'


Posted by Tessa den Uyl

Rooftop Madrid. 2021. 

The last two picture posts dwelt on a certain idea of decay, and, completing the threesome, this picture might heighten the subtle fascination that surrounds such layered surfaces.

The abstract has often been criticised as either easy art or difficult thinking, even though we have grown used to abstract ideas and turned them into what is called rational thinking. Do we not mostly accept what is convenient in one way or another, no matter how illusory it is, and prefer to discard all the rest? 

We overlap thoughts with actions and emotions; overlapping is our strength and our misery. Continuous overlapping sketches the picture we live in, and so, rather unawares, we shake a cocktail that we define as ‘our life’. 

What we can apprehend simply by viewing even trash, refuse(d) by its very nature and often wished unseen, is that everything traces into something else, and transforms. 

Symbolically, when we look at ourselves, we might find a trashy landscape far worse than this rooftop. Yet the terror of seeing the veritable junkyard from within also reveals the many forgotten things that we touched and related to during our lives. And we are all of that, not just a part. 

The picture we prefer to see cannot match the reality we imagine. How real can we truly be?

05 December 2021

Looking Backward, Looking Forward

by Allister Marran

Who here thinks that life today will be what life is like in 20 years, or 50 or 100?

I mean, when I was born in 1976 there was no such thing as television in South Africa. That came later that year, with a measly one channel and a broadcast time of a couple of hours a day. And it was introduced as another propaganda arm of the apartheid government, carefully orchestrated to feed misinformation to those fortunate enough to afford one, which coincidentally was their target market.

Nobody came home from work or school back then and parked off in front of a 75 inch box with 1000 streaming channels at the ready. Life was vastly, fundamentally different.

Board games, visiting friends for coffee, swimming, clubs, dinner parties, cricket and soccer and four square in the road, reading and collecting and creating. People did stuff.

In 1982 I got my first computer for Christmas, a ZX Spectrum with 48k ram. It was mind blowing what it could do. My life irrevocably changed, the future had arrived and I was a part of it. And it was good.

I sat in front of that computer and learned the ins and outs of basic programming, and I played games. Boy, did I play a lot of games. I was hooked.

As processing power increased exponentially, so did graphics and game logic, and soon the game worlds were massively immersive: 2D became 3D, polygons became textured, and colour palettes went from 8 colours to 16 million in a matter of a few short years. It was glorious.

In the mid-90s I got my first mobile phone, an Ericsson 628 with a single line of display text. It was like a mini PDA, or one of those tricorder things they carry on Star Trek, or so it felt to me.

A year or so later the internet came into our homes. Email, webpages and rudimentary forms of social media like bulletin boards, IRC and later instant chat hubs and comment sections on web based message boards.

The world was connected in every way. You no longer had to get out of the house: you had a phone in your pocket, you could chat to your mates while sitting at your PC in your study, you could download episodes of your favourite shows and watch them. And a new thing was on the horizon: you could even shop online for books, and later groceries too. Soon you would never have to go out ever again.

The last puzzle piece was the smart phone in 2008 or so. A single device that drew it all together, your PC and tablet and gaming and television and phone and social media and alarm clock and watch and camera and video recorder and file server and banking and weather forecast and news source and calendar and email and GPS all in one place, on one device.

We can marvel at how in 45 years we have progressed from the first TV shows to a super computer in your pocket. And indeed, it's an engineering feat of incredible proportions. 

But is the world better for it?

In a world where people would visit friends and neighbors instead of turn on the telly, in a world where people would go for a run with a mate instead of spending an hour or two on Fortnite, in a world where people were not constantly bombarded with anxiety causing social media hysteria and toxic crazies yelling about the end times because of government over-reach and scaring the crap out of you every day, maybe the simple times without technology were indeed the good old days.

People certainly seemed happier back then.

The world will continue to change, it has to. It's called progress. More new things will come to be in the next 45 years, and your kids who are young today will lament their lot in life and yearn for the simpler times of 2021 when things were easier, when they only had to deal with an iPhone screen and a bit of harmless social media pressure.

Never lose sight of what is truly important in this world, things that have stayed constant throughout the ages, regardless of technological advances and social upheaval.

Treasure friends and family. Seek out and promote honesty and righteousness. Gravitate towards love and caring. Never lose sight of your morals, your values and your God. Fight the good fight.

Don't let them grind you down.

We live in 2021. It's a scary place, full of screens and bots and AIs and algorithms.
But our souls can still hark back to that day before it all changed back in 1976, before that test pattern brought the future into our homes.

Change will always affect your life, but don't let it define who you are.

28 November 2021

Whose Reality Is It Anyway?

Thomas Nagel wondered if the world a bat perceives is fundamentally different  to our own

By Keith Tidman

Do we experience the world as it objectively is, or only as an approximation shaped by the effects of information passing through our mind’s interpretative sieve? Does our individual reality align with anyone else’s, or is it exclusively ours, dwelling like a single point amid other people’s experienced realities?

 

We are swayed by our senses, whether through direct sensory observation of the world around us, or indirectly as we use apparatuses to observe, record, measure, and decipher. Either way, our minds filter the information absorbed, funneling and fashioning it into experiences that form a reality which is in turn affected by sundry factors. These influences include our life experiences and interpretations, our mental models of the world, how we sort and assimilate ideas, our unconscious predilections, our imaginings and intuitions untethered to particular facts, and our expectations of outcomes drawn from encounters with the world.

 

We believe that what serves as the lifeline in this modeling of personal reality is the presence of agency and ‘free will’. The tendency is to regard free will as orthodoxy. We assume we can freely reconsider and alter that reality, to account for new experiences and information that we mold through reason. To a point, that’s right; but to one degree or another we grapple with biases, some of which are hard-wired or at least deeply entrenched, that predispose us to particular choices and behaviours. So, how freely we can actually surmount those preconceptions and predispositions is problematic, in turn bearing on the limits of how we perceive the world.


The situation is complicated further by the vigorous debate over free will versus how much of what happens does so deterministically, where life’s course is set by forces beyond our control. Altering the models of reality to which we cling is hard; resistance to change is tempting. We shun hints of doubt in upholding our individual (subjective) representations of reality. The obscurity and inaccessibility of any single, universally accepted objective world exacerbates the circumstances. We realise, though, that subjective reality is not an illusion to be casually dismissed at our convenience, but is lastingly tangible.


In 1974, the American philosopher Thomas Nagel developed a classic metaphor to address these issues of conscious experience. He proposed that some knowledge is limited to what we acquire through our subjective experiences, differentiating those from underlying objective facts. To show how, Nagel turned to bats’ conscious use of echoed sounds as the equivalent of our vision in perceiving their surroundings for navigation. He argued that although we might be able to imagine some aspects of what it’s like to be a bat, like hanging upside down or flying, we cannot truly know what a bat experiences as physical reality. The bat’s experiences are its alone and, for the same reasons of filtering and interpretation, are likewise distinguishable from objective reality.

 

Sensory experience, however, does more than just filter objective reality. The very act of human observation (in particular, measurement) can also create reality. What do I mean? Repeated experiments have shown that, before measurement, a quantum object remains in what’s called ‘superposition’, or a state of suspension. What stays in superposition is an abstract mathematical description, called a ‘wavefunction’, of all the possible ways the object can become real. On this view, there is no distinction between the wavefunction and the physical thing.


While in superposition, the object can be in any number of places until measurement causes the wavefunction to ‘collapse’, leaving the object in a single location. Observation thus has implications for the nature of reality and the role of consciousness in bringing it about. According to the quantum physicist John Wheeler, ‘No ... property is a property until it is observed’, a notion presaged by the philosopher George Berkeley three centuries earlier when he declared ‘Esse est percipi’ – to be is to be perceived.
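To put this in the standard textbook shorthand (a bare-bones illustration, not a full account): a particle that could be found at either of two positions is described, before measurement, by the superposition

\[
\lvert \psi \rangle \;=\; \alpha \,\lvert x_1 \rangle \;+\; \beta \,\lvert x_2 \rangle,
\qquad \lvert \alpha \rvert^2 + \lvert \beta \rvert^2 = 1 ,
\]

and measurement finds it at \(x_1\) with probability \(\lvert\alpha\rvert^2\), or at \(x_2\) with probability \(\lvert\beta\rvert^2\) (the Born rule); only after that ‘collapse’ does the particle have a single, definite location.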


Evidence, furthermore, that experienced reality results from a subjective filtering of objective reality comes from how our minds react to externalities. For example, two friends are out for a stroll and look up at the summer sky. Do their individual perceptions of the sky’s ‘blueness’ precisely match each other’s or anyone else’s, or do they experience blueness differently? If those companions then wade into a lake, do their perceptions of ‘chilliness’ exactly match? How about their experiences of ‘roughness’ upon rubbing their hand on the craggy bark of a tree? These are interpretations of objective reality by the senses and the mind.


Despite the physiology of the friends’ brains and physical senses being alike, their filtered experiences nonetheless differ in both small and big ways. All this, even though the objective physical attributes of the sky, the lake, and the tree bark, independent of the mind, are the same for both companions. (Such as the wavelength of visible light that accounts for the blueness, before it is interpretatively, subjectively perceived by the senses and mind.) Notwithstanding the deceptive simplicity of these examples, they are telling of how our minds are attuned to processing sensory input, thereby creating subjective realities that might resemble yet not match other people’s, and that importantly don’t directly merge with underlying objective reality.

  

In this paradigm of experience, there are untold parsed and sieved realities: our own and everyone else’s. That’s not to say objective reality, independent of our mental parsing, is a myth. It exists, at least as backdrop. That is, both objective and subjective reality are credible in their respective ways, as sides of the whole. It’s just that our minds’ unavoidable filtering leads to an altered rendering of objective reality, which thus stays out of reach. The result is our being left with the personal reality our minds are capable of, a reality nonetheless easily but mistakenly conflated with objective reality.

 

That’s why our models of the underlying objective reality remain approximations, in states of flux. Because when it comes to understanding the holy grail of objective reality, our search is inspired by the belief that close is never close enough. We want more. Humankind’s curiosity strives to inch closer and closer to objective reality, however unending that tireless pursuit will likely prove.

 

21 November 2021

COVID-19: We Would Do It All Again

by Thomas Scarborough

'Compound Risks and Complex Emergencies.' PNAS.

While I was staying in a South African township in the autumn of 2021, once a week an old woman in worn-out clothes slipped into the house unannounced, and sat down. She sat quietly for four hours, until lunch was served. She quietly ate her lunch, and left. I was curious to know the reason for her visits.

Her son, she said, had lost his work through the pandemic. He had been supporting her, and she could not now afford to feed herself. She couldn’t keep the lights burning after dark. She couldn’t even pay for candles—let alone the rest. Every day, she would slip quietly into a house like this, she said, and wait for a meal.

This was a direct result of a COVID-19 lockdown. Not that lockdowns are all the same, or have the same effects. The University of Oxford has developed a Stringency Index, which monitors a wide range of measures adopted by governments across the world, in response to the pandemic.

Without weighing up the rationale behind them, it is clear that these various measures have had grievous effects.

It is estimated that more than 200 million jobs were lost worldwide, in 2020 alone. The United Nations’ International Labour Organisation estimates that 8.8 percent of global working hours were lost, which is equivalent to 255 million full-time jobs. This is without considering the knock-on effects—apart from which, such losses are seldom made up in the years which follow.

According to the World Economic Forum, 38 percent of global cancer surgery was postponed or cancelled in the early months of the pandemic. The backlog, they said, would take nearly a year to clear. Of course, one can’t afford to postpone or cancel cancer surgery. Many surgeries were stopped besides—in fact, millions of surgeries per week.

Frustrations and the pressures of life under lockdown brought about huge increases in certain types of crime. Gender-based violence soared. The United Nations General Secretary reported a ‘horrifying global surge’. Scattered statistics confirm it. Many cities reported increases in gender violence of more than 30%. Some reported more than 200%. The effects of such violence never go away.

The lockdowns, in all their complexity and diversity, had negative effects on personal freedoms, supply chains, mental health, inequalities, and any number of things—and since everything is related to everything, almost anything one may think of was skewed.

Of course, not all of the effects of lockdowns were negative. Drug arrests plummeted in various places. Break-ins, not surprisingly, decreased as more people stayed home. In South Africa, a ban on liquor sales quickly emptied out hospital emergency rooms. Most importantly, it is thought that very many lives were saved from COVID-19.

How many lives were saved? This is hard to tell. Imperial College London judged that ‘the death toll would have been huge’—which is to say, there would have been millions more deaths. Surely this is true. At the same time, they noted that there are ‘the health consequences of lockdowns that may take years to fully uncover’—and, paradoxically, the many attempts to stall the pandemic may have prolonged it.

How do we calculate the advantages and disadvantages of a pandemic response? It is, in fact, frightfully difficult. One needs to identify the real issues—or some would say, select them. One needs to weigh all relevant factors—however that might be done. There is a place, too, for the human trauma, whatever its logical status may be, and this is hard to quantify. 

At the end of the day, however, it all comes down to priorities.

An absolute priority during the pandemic was life. No matter how many or how few, lives should be saved. This emphasis is easy to see. Almost any graph which has traced the pandemic shows two lines: the number of cases, and the number of deaths. Other effects of the pandemic are by and large excluded from everyday graphs and charts—which is not to say that they are completely overlooked.

What does one mean, then, by loss of life? One means rapid loss of life, of the kind which overcomes one in the space of a few days from hospital admission to death. Such death, in most published instances, has been an almost complete abstraction—a number attached to COVID-19—signifying the priority of death pure and simple.

At the end of the day, the avoidance of rapid loss of life was the absolute priority in this pandemic. All other priorities were demoted, or put on hold, even repudiated as side-issues. Ought life to have been given absolute priority? Who can say? It has to do with human, cultural, and social values, as we find them, in the early 21st century.

The fact is that life—or the possibility of losing it quickly—was the immutable priority. Therefore, we would do it all again.

14 November 2021

The Limits of the ‘Unknowable’

In this image, the indeterminacy principle concerns the initial state of a particle. The colour (white, blue, green) indicates the phase of the particle, that is, its position and direction of motion. The position is initially determined with high precision, but the momentum is not.

By Keith Tidman

 

We’re used to talking about the known and unknown. But rarely do we talk about the unknowable, which is a very different thing. The unknowable can make us uncomfortable, yet, the shadow of unknowability stretches across all disciplines, from the natural sciences to history and philosophy, as people encounter limits of their individual fields in the course of research. For this reason, unknowability invites a closer look.

 

Over the years there has been a noteworthy shift. What I mean is this: human intellectual endeavour has been steadily turning academic disciplines from the islands they had increasingly become over the centuries back into continents of shared interests, where specialised knowledge flows across disciplinary boundaries in recognition of the interconnectedness of ideas and of our understanding of reality.

 

The result is fewer margins and gaps separating the assorted sciences and humanities. Interdependence has been regaining respectability. What we know benefits from these commonalities and this collaboration, allowing knowledge to profit: to expand and evolve across disciplines’ dimensions. And yet, despite this growing matrix of knowledge, unknowables still persist.

 

Consider some examples.

 

Forecasts of future outcomes characteristically fall into the unknowable, with outcomes often different from predictions. Such forecasts range widely, from the weather to political contests, economic conditions, vagaries of language, technology inventions, stock prices, occurrence of accidents, human behaviour, moment of death, demographics, wars and revolutions, roulette wheels, human development, and artificial intelligence, among many others. The longer the reach of a forecast, often the more unknowable the outcome. The ‘now’ and the short term come with improved certainty, but still not absolute. Reasons for many predictions’ dubiousness may include the following.

 

First, the initial conditions may be too many and indeterminate to acquire a coherent, comprehensive picture of starting points. 


Second, the untold, opaquely diverging and converging paths along which initial conditions travel may overwhelm: too many to trace. 


Third, how forces jostle those pathways, in both subtle and large ways, is impossible to model and take account of with precision and confidence. 


Fourth, chaos and complexity — along with volatility, temperamentality, and imperceptibly tiny fluctuations — may make deep understanding impossible to attain.

 

Ethics is another domain where unknowability persists. The subjectivity of societies’ norms, values, standards, and belief systems — derived from a society’s history, culture, language, traditions, lore, and religions, where change provides a backdraft to ‘moral truths’ — leaves objective ethics outside the realm of what is knowable. Contingencies and indefiniteness can interfere with moral decision-making. Accordingly, no matter how rational and informed individuals might be, there will remain unsettled moral disagreements.


On the level of being, why there is something rather than nothing is similarly unknowable. In principle,  ‘nothingness’ is just as possible as ‘something’, but for some unknown reason apart from the unlikelihood of spontaneous manifestation, ‘something’ demonstrably prevailed over its absence. Conspicuously, ‘nothingness’ would preclude the initial conditions required for ‘something’ to emerge from it. However, we and the universe of course exist; in its fine-tuned balance, the model of being is not just thinkable, it discernibly works. Yet, the reason why ‘something’ won out over ‘nothingness’ is not just unknown, it’s unknowable.

 

Anthropology arguably offers a narrower instance of unknowability, concerning our understanding of early hominids. The inevitable skimpiness of evidence and of fine-grained confirmatory records, compounded by uncertain interpretations stemming from the paucity of physical remains and their unvalidated connections and meaning in pre-historical context, suggests that the big picture of our more-distant predecessors will remain incomplete. A case of epistemic limits.


Another important instance of unknowability comes out of physics. The Heisenberg uncertainty principle, at the foundation of quantum mechanics, famously tells us that the more precisely we know about a subatomic particle’s position, the less we know about its momentum, and vice versa. There is a fundamental limit, therefore, to what one can know about a quantum system.
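Stated formally (a standard textbook rendering, included here only to make the limit concrete), the relation reads

\[
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2} ,
\]

where \(\Delta x\) is the uncertainty in position, \(\Delta p\) the uncertainty in momentum, and \(\hbar\) the reduced Planck constant: squeezing one of the two uncertainties necessarily inflates the other.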

 

To be clear, though, seemingly intractable intellectual problems may not ultimately be insoluble, that is, they need not join the ranks of the unknowable. There’s an important distinction. Let me briefly suggest three examples.

 

The first is ‘dark energy and dark matter’, which together compose 95% of the universe. Remarkably, the tiny 5% left over constitutes the entire visible contents of the universe! Science is attempting to learn what dark energy and dark matter are, despite their prevalence compared with observable matter. The direct effects of dark energy and dark matter, such as on the universe’s known accelerating expansion, offer a glimpse. Someday, investigators will understand them; they are not unknowable.

 

Second is Fermat’s ‘last theorem’, the one that he teed up in the seventeenth century as a note in the margin of his copy of an ancient Greek text. He explained, to the dismay of generations of mathematicians, that the page’s margin was ‘too small to contain’ the proof. Fermat did suggest, however, that the proof was short and elegant. More than three and a half centuries passed before a twentieth-century British mathematician, Andrew Wiles, proved the theorem. The proof, shown to be long, turned out not to be unknowable as some had speculated, just terribly difficult.

 

A last instance that I’ll offer involves our understanding of consciousness. For millennia, we’ve been spellbound by the attributes that define our experience as persons, holding that ‘consciousness’ is the vital glue of mind and identity. Yet, a decisive explanation of consciousness, despite earnest attempts, has continued to elude us through the ages. Inventive hypotheses have abounded, though they have remained unsettled. Maybe that’s not surprising, in light of the human brain’s physiological and functional complexity.

 

But as the investigative tools that neuroscientists and philosophers of mind wield in the course of collaboration become more powerful in dissecting the layers of the brain and mind, consciousness will probably yield its secrets, such as why and how, through the physical processes of the brain, we have very personalised experiences. It’s likely that one day we will get a sounder handle on what makes us, us. Difficult, yes; unknowable, no.

 

Even as we might take some satisfaction in what we know and anticipate knowing, we are at the same time humbled by two epistemic factors. First is that much of what we presume to know will turn out wrong or at most partially right, subject to revised models of reality. But the second humbling factor is a paradox: that the full extent of what is unknowable is itself unknowable.

 

07 November 2021

Picture Post #69: The Wallpaper

by Martin Cohen

Robert Polidori, Hotel Petra, Beirut, Lebanon, 2010

If there was something ‘a little spooky’ about last month’s Picture Post, on the face of it there should be something spooky, too, about this abandoned hotel room in a city, Beirut, that is itself to some extent abandoned. 

And yet, that’s not my own reaction to it. On the contrary, the emptiness of the room creates the palette, and the symmetry of the disappearing doorways provides all the action the scene needs.

The colours, too, seem to have been chosen by a master artist; in this case they evidently were, by the photographer, Robert Polidori. Unlike many of our other photographers, Polidori is well known for his images of urban environments and interiors, with his work exhibited at the Metropolitan Museum of Art (New York), the Musée d'art contemporain de Montréal, the Martin-Gropius-Bau museum (Berlin), and the Instituto Moreira Salles (São Paulo and Rio de Janeiro), to mention just a few. 

Polidori has photographed the restoration of the Château de Versailles since the early 1980s and recorded the architecture and interiors of Havana. This portrait of the Hotel Petra, once one of the most popular hotels in Beirut, located in the city centre adjacent to the Grand Theatre, seems to me to show that, for an artist, all buildings are equally valid as canvases.

31 October 2021

Briefly: on the Growing Sense of Insignificance that Comes with Aging

by Simbarashe Nyatsanza
 

Lord Chesterfield: "I look upon all that is past as one of those 
romantic dreams, which opium commonly occasions."

I will be, hopefully, turning 28 on the 10th of April this coming year, and I recently, reluctantly, came to the end of my university studies in December 2020. I was 26 years old at the time; late, somehow feeling outgrown and out of place, too aware of my fading sense of wonder and merriment with everything around me to continue on with the farce of mock-ignorance often needed for one to successfully allow themselves to be ‘educated’. 

I had to complete my studies and end the nonsense. I was also feeling quite stagnant and of stunted progress, with nothing to show for my life (if those ever-fleeting moments of glee and glum that characterized my existence back then could safely be called life). But, good god, I miss those old wicked times! Things were hard and confusing and drunken and exciting and draining and enraging and saddening and thrilling and depressing and carefree and sexy and sexual and niggerish and nightmarish and orgasmic and purifying and, sometimes, there was nothing but the rousing possibility, the potential, for more innocuous but meaningful meaninglessness. I was alive for, and because of, that. 

Memories of the debauched moments of belligerence, the often psychotic, sporadically violent and extremely intoxicating sense of selflessness that came with the demonstration of inebriated impulses during those days, now assume a faint kind of beauty that is no longer reachable, simply impossible to replicate, way out of par and incompatible with the forced sense of self-responsibility that often finds itself creeping in and enlarging in the crevices of the mind as the years add on. Yet these memories are somehow reassuring, as if they were a faint picture of a monument - the strange and saddening beauty of a wilting flower - a remembrancer of a series of moments fully exhausted, while they were exhausting - and yet the closest thing to liberation - to the very soul of the young, misguided misfit I so proudly was. 

 It often feels that it won't be long till I start describing myself as a 'once-was', or as an 'i-used-to'. Like those busy-sounding, busy losers who speak of a past laden with potential and yet say nothing of their rotten and dried out, washed up presences. Or those forcefully eccentric Africans who still speak with a misplaced ‘White’ accent ever since they went to Europe for a time as brief as a fortnight when they were as young as ten years old, who desperately hang on to a fading sense of sophistication, of a ‘difference’, who greatly overestimated their own sense of importance. Like them, it feels like it won't be long until my mind finally accepts its role as a repository for failed tests, failed relationships, failed prayers, failed exams, failed apologies, failed attempts at reconciliation, failed learning, failed loving - failed everythings, and the mouth resigns drunkenly into an amplifier of the uselessness of wisdom that comes with hindsight, always blasting even in the forced silence of one's mind. 

 It is The Irishman telling Jimmy Hoffa that 'It is what it is'. It is Red, in The Shawshank Redemption, marking his name beneath a dead one, and then moving on. It's the red at the center of the flame that burns your fingertips as you light another cigarette that gently pushes you, drag by drag, towards permanent oblivion. It's the most gentlemanly Robert Mugabe finally dying in an Adidas tracksuit, when he had worn suits all his life. It is like shaving your hair every two days because of the gray strands that always eagerly sprout in it, reminding you of the old man whose face is starting to come out of yours, when you hadn't thought of yourself as that old. In fact, you would never have thought of yourself as old at all. It's that ageless voice inside of you, the one that keeps coming up with the reassurances, the reminders of how everyone is God's favorite child, of how there's still a chance to turn things around, to be something, like the others, finally gently screaming, “Get over yourself!” from the center of your brain. 

It's that desperate yearning for longevity, which almost comes across as a series of threatening promises of mediocrity. It's really a well crafted and articulate declaration of complacency; aging.

24 October 2021

Is there a RIGHT not to be vaccinated?

Eleanor Roosevelt and the Universal Declaration of Human Rights
(1948) which gives the right to medical treatment, but is silent 
on the right to decline it

Is there a RIGHT not to be vaccinated? The question was raised over at Quora and I was spurred to answer it there.

People certainly think they have a right to their own body; indeed, years ago Thomas Hobbes made this the basis of all our rights. But Hobbes recognised that people could be forced to do many things “with” their bodies.

And today, unvaccinated people are certainly finding that a lot of things they thought were their rights are not really. We are in unknown territory with vaccine mandates, and the ambiguities reveal themselves as governments perform all sorts of contortions to “appear” to leave people a choice, while in effect making the choice almost impossible to exercise. The US government, as many other governments do, will sack people for not being vaccinated, but it does not explicitly seek a law to make vaccination obligatory.

And so, there are concerted attempts all over the world to make ordinary, healthy people take corona virus vaccines that are known to have non-negligible side-effects including in some cases death. Databases like the EudraVigilance one operated by the European Medical Agency indicate that adverse side-effects are real enough. Two justifications offered for this are that (side-effects apart) the vaccine will protect you from the more serious effects of a Covid infection, and that they reduce transmission of the virus throughout society.

Many governments already mandate vaccinations on the basis that they are in the individual’s health interest: for example, four doses of the polio vaccine are recommended in the United States for children, and within Europe eleven countries have mandatory vaccination against at least one of diphtheria, tetanus, pertussis, hepatitis B, poliovirus, Haemophilus influenzae type B, measles, mumps, rubella and varicella.

So the idea that governments can force you to be vaccinated is a bridge largely crossed already: vaccines are not considered to be experimental health treatments of the kind that the Nuremberg Code has highlighted and banned ever since the excesses of the Nazi regime in the Second World War.

However, the corona virus vaccine does seem to me to come with many problematic ethical issues. Firstly, it is not actually a vaccine in the traditional sense. This matters, as the protection it offers against the disease is far from established. Today, governments like Israel that were first to vaccinate their populations (setting to one side the separate and inferior effort for people in the Occupied Territories) are now mandating third and even fourth shots as the virus continues to spread and cause illness there.

Secondly, it is experimental in the very real sense that the gene-therapy technology is novel and comes with significant unknowns. It is for this reason that all the companies making the vaccines insisted on, and got, blanket exemption from prosecution for the effects of their products. One of the early developers of mRNA vaccine technology, Robert Malone, considers that there is a risk of the method actually worsening the illness in the long run, so-called antibody-dependent enhancement, and that the unprecedented effort to universalise the vaccine also creates unprecedented downside risks from such enhancement.

A third area of concern is that there is no doubt that vaccinated people can both be infected with the corona virus and can be infectious while infected. Although you hear politicians say things like “get vaccinated and then we can get back to normal” this is just political rhetoric, as there never was any reason to think that the inoculations against the corona virus really were equivalent to the successful campaigns for things like polio and rubella.

So, to answer the specific ethical point! The right not to be vaccinated, or in this case injected with gene therapies, does not exist. In which sense, we cannot lose this right, much as I personally think there should be such protections (protections going well beyond marginal cases such as “religious exemptions”). What seems to be new is that governments have taken upon themselves the right to impose a much more risky programme of gene therapy treatments, dished out it seems at six monthly intervals in perpetuity, backed by pretty much unprecedented sanctions on people who would, if allowed, choose not to be inoculated. But the principle of government compulsion is established already: by which I mean we are fined for driving too fast, or disallowed from certain jobs if we don’t have the right training certificates.

What the corona vaccine mandates and penalties for being “unvaccinated” (the restrictions on working, social life, social activity, travel) really reflect is not the loss of rights so much as the weakness of existing civic rights. As with taxation, there should be no vaccination without a process of consent expressed through genuine and informed public debate and political representation. But, as I say, this is not a right that we have at the moment, so it can hardly be said to be lost.

At the moment, governments claiming to be acting in the “general interest” have targeted individuals and groups, and criminalised certain aspects of normal life, but this is merely an extension of a politics that we have long allowed our governments to exercise.

17 October 2021

On the Appeal of Authoritarianism — and Its Risks

 

On March 30th, Hungary's populist leader, Viktor Orbán, obtained the indefinite recognition of special powers from his parliament, to the shock of many in Europe, and indeed in Hungary

By Keith Tidman

Authoritarianism is back in fashion. Seventy years after the European dictators brought the world to the brink of ruin, authoritarian leaders have again ascended across the globe, preaching firebrand nationalism. And there’s again no shortage of zealous supporters, even as there are equally passionate objectors. So, what has spurred authoritarianism’s renewed appeal? Let’s start by briefly looking at how authoritarianism and its adversarial ideology, liberal democracy, differ in their implied ‘social contract’.

 

One psychological factor for authoritarianism’s allure is its paternal claims, based on all-powerful, all-knowing central regimes substituting for the independent thought and responsibility of citizens. Decisions are made and actions taken on the people’s behalf; individual responsibility is confined to conformance and outright obedience. Worrying about getting choices right, and contending with their good and bad consequences, rests in the government’s lap, not in the individual’s. Constitutional principles start to be viewed as an extravagance, one that thwarts efficiency. For some people, this contract, exchanging freedom for reassuring paternalism, may appeal. For others, it’s a slippery slope that rapidly descends from the illiberalism of populists to something much worse.

 

Liberal democracy is hard work. It requires accountability based on individual agency. It requires people to become informed, assess information’s credibility, analyse arguments’ soundness, and arrive at independent choices and actions. Citizens must be vigilant on democracy’s behalf, with vigilance aided by the free flow of diverse, even contentious, ideas that enlighten and fill the intellectual storehouse on which democracy’s vibrancy depends. Often, individuals must get it right for themselves. They bear the consequences, including in their free and fair choice of elected representatives; ultimately, there are fewer options for offloading blame for bad outcomes. The rewards can be large, but so can the downsides. Constitutional bills of rights, the co-equal separation of powers, and the rule of law are democracy’s valued hallmarks. There’s likewise a social contract, though with allowance for revision to account for conditions at the moment. For many people, this model of democratic governance appeals; for others, it’s disorderly and ineffectual, even messy.

 

It requires only a small shift for the tension between authoritarianism and the personal agency and accountability of liberal democracy to end up tilting in authoritarianism’s favour. Individual perspectives and backgrounds, and particular leaders’ cult of personality, matter greatly here. With this in mind, let’s dig a bit deeper into what authoritarianism is all about and try to understand its appeal.

 

Authoritarianism was once seen more as the refuge of poor countries on far-away continents; nowadays we’ve witnessed its ascendancy in many developed nations too, such as in Europe, where the brittleness of former democracies snapped. Countries like Russia and China briefly underwent ‘liberal springs’, inquisitively flirting with the freedoms associated with democracy before becoming disenchanted with what they saw, rolling back the gains and increasing statist control over the levers of power. In other countries, what starts as extreme rightwing or leftwing populism, as in some quarters of Asia and Central and South America, has turned to authoritarianism. Strongmen have surrounded themselves with a carefully chosen entourage, doing their bidding. Security forces, like modern-day praetorians, shield and enforce. Social and political norms alter, to serve the wishes of centralised powers. It’s about power and control; to be in command is paramount. Challenges to officialdom are quick to set off alarms, and as necessary result in violence to enforce the restoration of conformity.

 

The authoritarian leader’s rationale is to sideline challengers, democratic or otherwise, turning to mock charges of fraudulence and ineptness to neutralize the opposition. The aim is structural submission and compliance with sanctioned doctrine. The leader asserts he or she ‘knows best’, to which flatterers nod in agreement. Other branches of government, from the legislature to the courts and holders of the nation’s purse strings, along with the country’s intelligentsia and news outlets, are disenfranchised in order to do the bidding of the charismatic demagogue. Such heads of state may see themselves as the singular wellspring of wise decision-making, raising, for some citizens, the disconcerting spectre of democratic principles teetering in their supposedly fragile balance.

 

Authoritarian leaders’ monopolising of the messaging for public consumption, for the purpose of swaying behaviour, commonly becomes an exercise in copycatting the ‘doublethink’ of George Orwell’s 1984: war is peace; freedom is slavery; ignorance is strength (slogans inscribed by the Party’s Ministry of Truth). Social activism is no longer brooked and thus may be trodden down by heavy-handed trusted handlers. Racism and xenophobia are ever out in front, as has been seen throughout Europe and in the United States, leading to a zealously protective circling of the wagons into increased sectarianism, hyper-partisanship, and the rise of extremist belief systems. In autocracies, criticism — and economic sanctions or withdrawal of official international recognition — from democracies abroad, humanitarian nongovernmental organisations, and supranational unions is scornfully brushed aside.

 

Yet, it may be wrong to suggest that enthusiasts of authoritarian leaders are hapless, prone to make imprudent choices. Populations may feel so stressed by their circumstances they conclude that a populist powerbroker, unhampered by democracy’s imagined rule-of-law ‘manacles’, is attractive. Those stresses on society might range widely: an unnerving haste toward globalisation; fear of an influx of migrants, putting pressure on presumed zero-sum resources, all the while raising hackles over the nation’s majority race or ethnicity becoming the minority; the fierce pitting of social and political identity groups against one another over policymaking; the disquieting sense of lost cohesion and one’s place in society; and a blend of anxiety and suspicion over unknowns about the nation’s future. In such fraught situations, democracy might be viewed as irresolute and clogging problem-solving, whereas authoritarianism might be viewed as decisive.

 

Quashing the voice of the ‘other social philosophy’, the ‘other community’, the ‘other reality’ has become increasingly popular among the world’s growing list of authoritarian regimes. A parallel, ambivalent wariness of democracy’s pluralism has been fuelling this dynamic. It might be that this trend continues indefinitely, with democracy having run its course. Or, perhaps, the world’s nations will cycle unevenly in and out of democracy and authoritarianism, as a natural course of events. Either way, it’s arguable that democracy isn’t anywhere nearly as fragile as avowed, nor authoritarianism as formidable.

 

09 October 2021

A Moral Rupture

by Thomas Scarborough


Virtually all of our assumptions with regard to ethics are based on theories in which we see no rupture between past, present, and future, but some kind of continuity

If we are with Aristotle, we hold out happiness as the goal, and assume that, as this was true for Aristotle, so it is for us, and forever will be. Or if we are with Moore, we believe that our moral sense is informed by intuition, always was, and always will be. If we are religiously minded, we assume that God himself has told us how to live, in a way that was and is eternal and unchanging. 

I propose, instead, that ethics is not in fact constant, and that at the present time we are witnessing a fundamental moral rupture. This rests on a distinct ethical theory, which runs like this:

As we look upon the world, we arrange the world in our understanding. Depending on how we have so arranged it, we act in this world. A concise way of putting this is that our behaviour is controlled by mental models. Since no one person arranges the world in quite the same way as the next, our ethics are in many cases strikingly different.

In past ages, people arranged the world in their minds in a way that was largely in keeping with their personal experience of it. They based it on the people they met, the environment which surrounded them, and so on. Of course, people also had access to parchments, listened to orators, or explored new ideas, so that various influences came to bear upon them. Mostly, however, they interacted with a real world.

In more recent history, the age of mass communications descended upon us. We invented the printing press, the postal system, then radio, and TV. And certainly, increased travel exposed us all to broader ideas. However, we still understood our world largely in terms of personal experience. Our personal experience, too, informed our interpretation of events and opinions further afield, and we had the leisure to ponder them, and often make sport of them. 

Since the turn of the century, however, we have increasingly been involved in instant, two-way communications, and in many cases dispersed communications, where many people are included at the same time. The result is that, for the first time in history, many (and some say most) of our interactions and reactions are electronic. 

One could list any number of implications. In terms of this post, our arrangement of the world in our understanding is changing. In fact, change is not the word. There is a rupture. The basis on which we arrange the world is not at all what it was. 

Images of the world swamp our experience of it.
The consequences of our views dissipate in the aether.
Feedback to our words and actions often eludes us.
Ideology is little tempered by direct observation.

If we accept that mental models drive our behaviour, all older notions of ethics are uprooted. Just how may only become clear to us in the decades to come. Marshall McLuhan wrote in 1962, in his landmark The Gutenberg Galaxy: ‘There can only be disaster arising from unawareness of the causalities and effects inherent in our technologies’.

03 October 2021

Picture Post #68 The Sitting Room

by Martin Cohen


Photo credit: Micelo

There's something a little spooky about this picture, emphasised by the face in the mirror above the fireplace – but present too in the ‘empty chairs’. Where are their proud occupants? What did they talk about, or do, those long evenings in their high-ceilinged castle? For this is a room carefully restored (if not quite brought back to life) by some French enterprise or other.

Indeed the French – and English too – do seem to live in an imaginary past, of posh families in big chateaux / country houses with not much to do except count their silver cutlery. I think it's rather a sad way to live, and so perhaps it is appropriate that this picture seems to me to speak only of a rather forlorn and empty existence.

26 September 2021

The Recounting of History: Getting From Then to Now



Double Herm of Thucydides and Herodotus

Thucydides was a historian of the wars between Athens and Sparta, in which he championed the Athenian general Perikles. Herodotus travelled and wrote widely and tried to be more impartial.



Posted by Keith Tidman

 

Are historians obliged to be unwaveringly objective in their telling of the past? After all, as Hegel asserted: ‘People and governments never have learned anything from history, or acted on principles deduced from it’.

 

History seems to be something more than just stirring fable, yet less than incontestable reality. Do historians’ accounts live up to the tall order of accurately informing and properly influencing the present and future? Certainly, history is not exempt from human frailty. And we do seem ‘condemned’ to repeat some version of even half-remembered history, such as stumbling repeatedly into unsustainable, unwinnable wars.

 

In broad terms, history has an ambitious task: to recount all of human activity — ideological, political, institutional, social, cultural, philosophical, judicial, intellectual, religious, economic, military, scientific, technological and familial. Cicero, who honoured Herodotus with the title ‘the father of history’, seems to have had such a lofty role in mind for the discipline when he pondered: ‘What is the worth of human life, unless it is woven into the life of our ancestors by the records of history?’ The vast scope of that task implies both great challenges and vulnerabilities.

 

History provides the landscape of past events, situations, changes, people, decisions, and actions. Both the big picture and the subtler details of the historical record spur deliberation, and help render what we hope are wise choices about society’s current and future direction. How wise such choices are — and the extent to which they are soundly based on, or at least influenced by, how historians parse and interpret the past — reflects how ‘wise’ the historians are in fulfilment of the task. At its best, the recounting of history tracks the ebb and flow of changes in transparent ways, taking into account the context of those moments in time. A pitfall to avoid, however, is tilting conclusions by superimposing on the past the knowledge and beliefs we hold today.

 

To these ends, historians and consumers of history strive to weigh the evidence, complexities, inconsistencies, conflicts, and selective interpretations of past events. The challenge of chronicling and interpretation is made harder by the many alternative paths along which events might have unfolded, marked by changes in direction. There is no single linear progression or trajectory to history, extending expediently from the past to the present; twists and turns abound. The resulting tenuousness of causes and effects, and the fact that accounts of history and human affairs might not always align with one another, influence what we believe and how we behave generations later.

 

The fact is, historical interpretations pile up, one upon another, as time passes. These coagulating layers can only make the picture of the past murkier. To recognise and skilfully scrape away the buildup of past interpretations, whether biased or context-bound or a result of history’s confounding ebb and flow, becomes a monumental undertaking. Indeed, it may never fully happen, as the task of cleaning up history is less alluring than that of capturing and recounting it.


Up to a point, it aids accuracy that historians may turn to primary, or original, sources of past happenings. These sources may be judged on their own merits: to weigh the evidence and its widely differing interpretations, to gauge credibility and ferret out personal agendas, and to assess the relative importance of observations to the true fabric of history. Artifacts, icons, and monuments tell a story, too, filling in the gaps of written and oral accounts. Such histories are intended to endure, leading us to insights into how the rhythms of social, reformist, and cultural forces brought society to where it is today.


And yet, contemporaneous chroniclers of events also fall victim to errors of commission and omission. It is hard for history to be unimpeachably neutral in re-creating themes in human endeavour, like the victories and letdowns of ideological movements, leaders, governments, economic systems, religions, and cultures, as well as, of course, the imposing, disruptive succession of wars and revolutions. In the worst of instances, historians are the voice of powerful elites seeking to champion their own interests.

 

When the past is presented to us, many questions remain. Whose voice is allowed to be loudest in the recounting and interpretation? Is it that of the conquerors, the elites, the powerful, the holders of wealth, the well-represented, the wielders of authority, the patrons? And is the softest, or silenced, voice only that of the conquered, the weak, the disregarded, the disenfranchised, including groups marginalised by race or gender? To get beyond fable, where is human agency truly allowed to flourish unfettered?

 

Therein lies the historian’s moral test. A history that is only partial and selective risks cementing the privileges of the elite and the disadvantages of the silenced. ‘Revisionism’ in the best sense of the word is a noble task, aimed at putting flawed historical storytelling right, so that society can indeed then ‘act on the principles deduced from it’.