21 August 2022

Thence We Will Create Superhumans

by Corinne Othenin-Girard *


IMAGINE A WORLD IN WHICH parents have the option to go to a geneticist to discuss the ‘genetic fix’ choices of their unborn child.

If you think this is a fantasy from dystopian fiction, you would be mistaken. Not only is the above, to a point, technologically possible today, but the parents' option could become available, too, in the not-too-distant future.

Human Genome Editing is a kind of genetic engineering, in which DNA is deleted, inserted, modified, or replaced.

The main argument in support of this technology is that it would be used to prevent the transmission of genetic diseases from one generation to the next.

There now seems to be an instrumentalisation of individuals with disability, meaning that concepts become instruments which serve as guides to action. The proponents of (Germline) Genome Editing are using ‘the prevention of disability’ as a concept that coincides with how people with disabilities are usually portrayed and viewed by the broad public.

There are two kinds of such editing—Somatic Genome Editing, and Germline Genome Editing—and there are, broadly, three possible applications. These applications include the following: 

1. Somatic Genome Editing is performed in non-reproductive cells, and may contribute to treating diseases in existing individuals. It is said to have the potential to revolutionise healthcare. A stunning success of this method was shown recently in the (possibly permanent) cure of haemophilia. By now, nearly 300 experimental gene-based therapies are in clinical testing. Changes made by somatic genome therapy are not passed down to future generations.

2. Germline Genome Editing is performed in the early-stage embryo (before ‘it’ is even called an embryo), or in germ cells (sperm and egg cells). These modifications affect all cells of the potential future child, and will also be passed on to future generations. This technology would be used to prevent the transmission of diseases from one generation to the next. In other words, genome editing would be used to fix genetic ‘defects’ or ‘variations’ which cause rare diseases. Germline Genome Editing does not treat, cure, or prevent disease in any living individual. It is used to create embryos with altered genomes.

3. From there on, the technology of Germline Genome Editing will inevitably expand into the area of generating ‘new’ or ‘improved’ abilities. Any gene could become a target for change, based on the ability-development it promises. ‘Treating disease’ or ‘preventing disability’ would therefore merge with ‘enhancement’. If genome editing were deemed ‘sufficiently safe’, it could be applied to all kinds of gene variations—and what is seen as ‘normal’ might be up for debate. The proponents of enhancement by genome editing mean to improve the human body and mind to its maximum potential. They conceive of the natural human body as limited, defective, and in need of improvement, and support functioning beyond species-typical boundaries.

Assuming that the so-called ‘glitches’ of gene editing could be overcome, is it ethically acceptable to use this technology to ‘design’ future babies? In fact, it has already been done, and the issue has already come up, through the so-called CRISPR-Baby Scandal. In 2018, the Chinese researcher He Jiankui created the first CRISPR-edited babies, twin girls called Lulu and Nana. Many researchers condemned his action. The actual editing was not executed well.

At the moment, public opinion is thought to carry a lot of weight, and various polls have been conducted to assess it. They ask, for example, whether gene editing of (unborn) babies is acceptable when a parent carries a severe heritable muscle disease and editing would greatly reduce the child’s risk of serious diseases or conditions, assuming, again, that the technology is safe and effective.

But for the technology to be declared safe, don’t individuals with changed DNA need to be monitored throughout their lives?

The emerging field of enhancement medicine is set to push the boundaries through genetic manipulation, and will shift what counts as the human norm.

Would using genome editing technology to create the ‘perfect’ or ‘ideal’ human risk making us less tolerant of ‘imperfections’? A person who could not meet the norm of perfection would be perceived as ‘disabled’, and not as a person with a difference that deserves to be sustained.

A genuinely inclusive and pro-equality society has no preference between possible future persons. Instead, all existing and future individuals are perceived as having equal worth and value.

-------------------------------------

* Corinne Othenin-Girard is a PhD student in sociology in Basle, Switzerland. She is currently working on a participatory project on the topic of Human Germline Genome Editing. Corinne invites readers of Pi to join a Zoom conference on 9 September 2022 on Human Germline Genome Editing (HGGE), more specifically on how it could change the future of humanity.

15 August 2022

The Tangled Web We Weave


By Keith Tidman
 

Kant believed, as a universal ethical principle, that lying was always morally wrong. But was he right? And how might we decide that?

 

The eighteenth-century German philosopher asserted that everyone has ‘intrinsic worth’: that people are characteristically rational and free to make their own choices. Lying, he believed, degrades that aspect of moral worth, undermining others’ ability to exercise autonomy and make logical decisions, as we presume they would in possessing the truth.

 

Kant’s ground-level belief in these regards was that we should value others strictly ‘as ends’, and never see people ‘as merely means to ends’. It is a maxim that is valued and commonly espoused in human affairs today, too, even if people sometimes come up short.

 

The belief that judgements of morality should be based on universal principles, or ‘directives’, without reference to the practical outcomes, is termed deontology. For example, according to this approach, all lies are immoral and condemnable. There are no attempts to parse right and wrong, to dig into nuance. It’s blanket censure.

 

But it’s easy to think of innumerable drawbacks to the inviolable rule of wholesale condemnation. Consider how you might respond to a terrorist demanding the place and time of a meeting to be held by his intended target. Most of us would lie to protect the target; yet deontologists like Kant would consider even such a lie immoral.

 

Virtue ethics, to this extent compatible with Kant’s beliefs, also says that lying is morally wrong. Its reasoning, though, is that lying violates a core virtue: honesty. Virtue ethicists are concerned to protect people’s character, where ‘virtues’ — like fairness, generosity, compassion, courage, fidelity, integrity, prudence, and kindness — lead people to behave in ways others will judge morally laudable.

 

Other philosophers argue that, instead of turning to the rules-based beliefs of Kant and of virtue ethicists, we ought to weigh the (supposed) benefits and harms of a lie’s outcomes. This principle is called consequentialist ethics, mirroring the utilitarianism of the eighteenth- and nineteenth-century philosophers Jeremy Bentham and John Stuart Mill, with its emphasis on the greatest happiness.

 

Advocates of consequentialism claim that actions, including lying, are morally acceptable when the results of behaviour maximise benefits and minimise harms. A tall order! A lie is not always immoral, as long as outcomes on net balance favour the stakeholders.

 

Take the case of your saving a toddler from a burning house. Perhaps, however, you believe in not taking credit for altruism, concerned about being perceived as conceitedly self-serving. You thus tell the emergency responders a different story about how the child came to safety, a lie that harms no one. Per Bentham’s utilitarianism, the ‘deception’ in this instance is not immoral.

 

Kant’s dyed-in-the-wool unforgiveness of lies invites examples that challenge the concept’s wisdom. Take the historical case of a Jewish woman concealed, from Nazi military occupiers, under the floorboards of a farmer’s cottage. The situation seems clear-cut, perhaps.

 

If grilled by enemy soldiers as to the woman’s whereabouts, the farmer lies rather than dooming her to being shot or sent to a concentration camp. The farmer chooses good over bad, echoing consequentialism and virtue ethics. His choice answers the question of whether the lie elicits a better outcome than the truth would. It would have been immoral not to lie.

 

Of course, the consequences of lying, even for an honourable person, may sometimes be hard to get right, differing in significant ways from reality or from the greater good as subjectively judged. One may overvalue or undervalue benefits and harms — nontrivial possibilities.

 

But maybe what matters most in gauging consequences are motive and goal. As long as the purpose is to benefit, not to beguile or harm, then trust remains intact — of great benefit in itself.

 

Consider two more cases as examples. In the first, a doctor knowingly gives a cancer-ridden patient and family false (inflated) hope for recovery from treatment. In the second, a politician knowingly gives constituents false (inflated) expectations of benefits from legislation he sponsored and pushed through.

 

The doctor and politician both engage in ‘deceptions’, but critically with very different intent: Rightly or wrongly, the doctor believes, on personal principle, that he is being kind by uplifting the patient’s despondency. And the politician, rightly or wrongly, believes that his hold on his legislative seat will be bolstered, convinced that’s to his constituents’ benefit.

 

From a deontological — rules-focused — standpoint, both lies are immoral. Both parties know that they mislead — that what they say is false. (Though both might prefer to say something like they ‘bent the truth’, as if more palatable.) But how about from the standpoint of either consequentialism or virtue ethics? 

 

The Roman orator Quintilian is supposed to have advised, ‘A liar should have a good memory’. Handy practical advice, for those who ‘weave tangled webs’, benign or malign, and attempt to evade being called out for duplicity.

 

And damning all lies seems like a crude, blunt tool, of little real value because it is wholly unworkable outside Kant’s absolutist disposition toward the matter; no one could unswervingly meet so rigorous a standard. Indeed, a study by the psychologist Robert Feldman claimed that people lie two to three times, in trivial and major ways, for every ten minutes of conversation!

 

However, consequentialism and virtue ethics have their own shortcomings. They leave us with the problematic task of figuring out which consequences and virtues matter most in a given situation, and of tailoring our decisions and actions accordingly. No small feat.

 

So, in parsing which lies on balance are ‘beneficial’ or ‘harmful’, and how to arrive at those assessments, ethicists still haven’t ventured close to crafting an airtight model: one that dots all the i’s and crosses all the t’s of the ethics of lying. 


At the very least, we can say that, no, Kant got it wrong in overbearingly rebuffing all lies as immoral. Not allowing for reasonable exceptions may have been obvious folly. Yet that may be cold comfort for some people, as lapses into excessive risk — weaving ever more tangled webs — court danger for unwary souls.


Meantime, while some more than others may feel they have been cut some slack, they might be advised to keep Quintilian’s advice close.




* ’O what a tangled web we weave / When first we practice to deceive’, Sir Walter Scott, poem, ‘Marmion: A Tale of Flodden Field’.

 

07 August 2022

A Linguistic Theory of Creation

by Thomas Scarborough

Creation of the Earth, by Wenceslas Hollar (1607-1677)

Perhaps it has been obscured through familiarity, but there is an obvious curiosity in the opening chapters of Genesis (the creation of the world). Step by step, God creates the world, then names the world—repeatedly both coupling and separating his* creating and his naming.

Would it not be more natural simply to describe God’s creative acts without embellishment? Would not a description of his creative acts alone suffice? Unless God's naming has some special significance in the narrative, it may seem quite superfluous.

Under any circumstances, the opening chapters of Genesis are supremely difficult to interpret. Bearing this very much in mind, the purpose here is to present an alternative view—unfinished, unrefined—as a new possibility.

Existing interpretations of Genesis include the following:

  • Heaven and earth were created in six days
  • The six days were six (longer) periods of time
  • The earth’s great age was ‘created into’ a six-day sequence
  • Genesis represents the re-creation of the world
  • Genesis stitches various creation stories together
  • Its purpose is to glorify God, not first to be factual
  • It is a synopsis, which may not be sequential
  • It is a myth
  • It is a spiritual allegory
  • It describes a dream of Moses

Here, then, is a new alternative—presented merely as a possibility—for greater minds to examine its rough edges and (possibly) inadmissible ideas, concerning an exceedingly complex text.

We begin with a simple linguistic fact. Names, in the Bible, were often commemorative. The ATS Bible Dictionary sums it up well: ‘Names were assumed afterwards to commemorate some striking occurrence in one’s history.’ Thus an event took place—then it, or the place of its happening, was named: Babel, Israel, the Passover, and so on. In fact, the naming often came after a pause.

If we assume that the creation account in Genesis includes, similarly, a commemorative naming, then the account may separate a stage-by-stage creation of the world from a stage-by-stage naming of it. With this in mind, there would then be four stages to each act of creation in Genesis. For example, in the NASB translation of the Bible (abridged):

  • ‘Then God said, Let there be light.’
  • ‘And there was light.’
  • ‘And God called the light day.’
  • ‘And there was evening and there was morning, one day.’

One may reduce this to two stages:

  • God created.
  • Then God named it.
 
And with some nuance, we may possibly say:

  • God created, within unspecified periods of time.
  • God named his creation during equal pauses (days), as commemorative acts.


In this case, Genesis could be viewed as a series of linguistic events. Its opening verses could set the tone, as a linguistic announcement: ‘And the earth was formless and void’—reminiscent of the linguist Ferdinand de Saussure, ‘In itself, thought is like a swirling cloud, where no shape is intrinsically determinate. No ideas are established in advance, and nothing is distinct, before the introduction of linguistic structure.’ 

Further, one may see a major linguistic shift in Genesis 3:7: ‘Then the eyes of both of them were opened …’ We have, from this point, the language of ‘ought’, as the first rational creatures ostensibly discern right from wrong. Then, needless to say, Babel represents a major linguistic shift in Genesis chapter 11, as languages (plural) appear.

From this, two major issues arise.

Firstly, is God's creating, in each stage of creation, coincident with his naming of it? In other words, did God name things on the same day that he created them, or did he name them afterwards? 

If it was on the same day that he created them, then the theory suggested here would presumably unravel. But arguably, in its favour, each naming is preceded by the word ‘And … ,’ which in the creation account is mostly used to indicate sequences in time. ‘And God called ...’ may represent separate periods of time in which namings occurred, after acts of creation.

A possible problem lies in Genesis 5:2, ‘God named them … in the day they were created.’ However, the word ‘day’ may here encompass every day, as we find in Genesis 2:4. ‘In the day’ may not refer to the separate stages of creation of Genesis chapter 1.

A second issue arises: God's naming does not seem to appear in the text consistently. ‘God called …’ appears only three times in Genesis 1, in connection with the first three days of creation. 

However Genesis, in general, liberally makes use of related words. Take the key words ‘God created ...’ Alternatives that we find in the text are ‘made’, ‘formed’, ‘brought forth’, and so on. The same is true of the key words ‘God called …’ Alternatives are ‘saw’, ‘blessed’, ‘sanctified’. An act of commemoration may be implied in all of these words.

In short, the time periods which are described in Genesis may be attached, not first to the creation of the world, but to God’s naming of it—and, incidentally, to man's naming of it. On the sixth day, ‘the man gave names …’

Such a theory would potentially remove major problems of other creation theories. In particular, it could possibly move beyond both literal and liberal readings of Genesis, without colliding with them.

----------------------------------

* I follow Rabbi Aryeh Kaplan: “We refer to G-d using masculine terms simply for convenience’s sake.”

Also by Thomas Scarborough: Hell: A Thought Experiment.

31 July 2022

Picture Post #77: The Picnic



'Because things don’t appear to be the known thing; they aren’t what they seemed to be
neither will they become what they might appear to become.'

 

Posted by Martin Cohen



 
Another image from another war. The 1999–2000 battle of Grozny saw the siege and assault of the Chechen capital by Russian forces, and left the city devastated. In 2003, the United Nations called Grozny the most destroyed city on Earth.

But pause to look at this image. There’s a bizarre juxtaposition of suburban normalcy and wartime horror here. For a start, the table set with four chairs. Who else will be coming to dinner? Notice, too, that the two soldiers at the table are, at this moment, a man and a woman, again echoing many a more homely, family scene.

Of course, as with most picnics, it is the setting that makes the moment, but here it is a nightmare scene of blasted apartment blocks and grey, smoking ruins. Not the family car, but the “family tank” is parked nearby.

On the table, the actual food is rather meagre, which may explain why both figures at the table look, frankly, rather miserable.

24 July 2022

‘Philosophical Zombies’: A Thought Experiment

Zombies are essentially machines that appear human.

By Keith Tidman
 

Some philosophers have used the notion of ‘philosophical zombies’ in a bid to make a point about the source and nature of human consciousness. Have they been on the right track?

 

One thought experiment begins by hypothesising the existence of zombies who are indistinguishable in appearance and behaviour from ordinary people. These zombies match our comportment, seeming to think, know, understand, believe, and communicate just as we do. Or, at least, they appear to. You and a zombie could not tell each other apart. 

 

Except, there is one important difference: philosophical zombies lack conscious experience. Which means that if, for example, a zombie were to drop an anvil on its foot, it might give itself away by not reacting at all or, perhaps, by reacting very differently than normal. It would not have the inward, natural, individualised experience of actual pain the way the rest of us would. On the other hand, a smarter kind of zombie might know what humans would do in such situations and pretend to recoil and curse as if in extreme pain.

 

Accordingly, philosophical zombies lead us to what’s called the ‘hard problem of consciousness’: whether or not each human has individually unique feelings while experiencing things – whereby each person produces his or her own reactions to stimuli, unlike everyone else’s – such as the taste of a tart orange, the chilliness of snow, the discomfort of grit in the eye, the awe in gazing at ancient relics, the warmth of holding a squirming puppy, and so on.

 

Likewise, they lead us to wonder whether or not there are experiences (reactions, if you will) that humans subjectively feel in authentic ways that are the product of physical processes, such as neuronal and synaptic activity as regions of the brain fire up. Experiences beyond those that zombies only copycat, or are conditioned or programmed to feign, the way automatons might, lacking true self-awareness. If there are, then there remains a commonsense difference between ‘philosophical zombies’ and us.

 

Zombie thought experiments have been used by some to argue against the notion called ‘physicalism’, whereby human consciousness and subjective experience are considered to be based in the material activity of the brain. That is, an understanding of reality, revealed by philosophers of mind and neuroscientists who are jointly peeling back how the brain works as it experiences, imagines, ponders, assesses, and decides.

 

The key objection to such ‘physicalism’ is the contention that mind and body are separable properties, the venerable philosophical theory also known as dualism. And that by extrapolation, the brain is not (cannot be) the source of conscious experience. Instead, it is argued by some that conscious experience — like the pain from the dropped anvil or joy in response to the bright yellow of fields of sunflowers — is separate from brain function, even though natural law strongly tells us such brain function is the root of everyone's subjective experience.

 

But does the ‘philosophical zombie’ argument against brain function being the seed of conscious experience hold up?

 

After all, the argument from philosophical zombies, whose clever posing makes us assume there are no differences between them and us, seems problematic. Surely, there is insufficient evidence that the brain does not give rise to consciousness and individual experience. Yet, many people who argue against a material basis of experience, residing in brain function, rest their case on the notion that philosophical zombies are at least conceivable.

 

They argue that ‘conceivability’ is enough to make zombies possible. However, such arguments neglect that being conceivable is really just another expression for something ‘being imaginable’. Isn’t that the reason young children look under their beds at night? But is being imaginable actually enough to conclude something’s real-world existence? How many children actually come face to face with monsters in their closets? There are innumerable other examples, as we’ll get to momentarily, illustrating that all sorts of irrational, unreal things are imaginable, in the same sense that they’re conceivable, yet surely with no sound basis in reality.

 

Proponents of conceivability might be said to stumble into a dilemma: that of logical incoherence. Why so? Because, on the same supposedly logical framework, it is logically imaginable that garden gnomes come to life at night, or that fire-breathing dragons live on an as-yet-undiscovered island, or that the channels scoured on the surface of Mars are signs of an intelligent alien civilisation!

 

Such extraordinary notions are imaginable, but at the same time implausible, even nonsensical. Imagining something doesn’t make it so. These ‘netherworld notions’ simply don’t hold up. Philosophical zombies arguably fall into this group. 

 

Moreover, zombies wouldn’t (couldn’t) have free will; that is, free will and zombiism conflict with one another. Yes, zombies might fabricate self-awareness and free will convincingly enough to trick a casual, uncritical observer — but this would be a sham, insufficient to satisfy the conditions for true free will.

 

The fact remains that the authentic experience of, for example, peacefully listening to gentle waves splashing ashore cannot happen if the complex functionality of the brain were not to exist. A blob that only looks like a brain (as in the case for philosophical zombies) would not be the equivalent of a human brain if, critically, those functions were missing.


It’s those brain functions that, contrary to theories like dualism which assert the separation of mind from body, make consciousness and individualised sentience possible. The emergence of mind from brain activity is the likeliest explanation of experienced reality. Contemporary philosophers of mind and neuroscientists would agree on this, even as they continue to work jointly on figuring out the details of how all that happens.


The idea of philosophical zombies existing among us thus collapses. Yet, very similar questions of mind, consciousness, sentience, experience, and personhood could easily pop up again. Likely not as recycled philosophical zombies, but instead, as new issues arising longer term as developments in artificial intelligence begin to match and perhaps eventually exceed the vast array of abilities of human intelligence.



 

17 July 2022

Poetry:

Mountaintop to Mountaintop

Aphorists in Dialogue


Yahia Lababidi and Fraser Logan




Yahia Lababidi: We live and unlearn. 
Fraser Logan: Few learn the difference between critical thinking and criticising without thinking. 
YL: When the student is not ready, imposters appear.
Torment is a prerequisite of intellectual growth, not something to stifle in students. 
Let humiliation be your teacher. 
Mockery exposes falsehoods, but truth is already naked. 
Desert: the spiritual sensuality of the world, denuded. 
Alone, the heart denudes itself of pleasantries and splatters forth vulgarities. 

Style is the garment we shed on our way to nakedness.
Stylised nakedness puts the art in cathartic. 
Art for art’s sake is a dead end; art for heart’s sake is the way out. 
Art today defies old masters and defiles the world with new disasters. 
Good artists are heralds of the world to come. 
A few artists hear the echo of an ancient starting gun; but we deem them rigid and unfree. 
Being hostage to beauty: the strength and weakness of artists. 
The best artists love morals as much as they hate them. 

We inhabit a moral universe—amorality is immorality.
Truthfulness is moral, but there are immoral truths. 
The best and most dangerous lies are mixed with truth. It’s truth that attracts others and allows them to accept untruths. 
Speak the truth, or speak to soothe. 
We are granted different powers to help one another. 
Socialite virtues are troglodyte vices. 
Some human sins, such as jealousy and pride, are Divine virtues. 
We sinned against the earth—by inventing sin. 

Strange, how what is life-giving―if not handled with care―can become life-threatening.
Those who contradict life, invariably suffer from life—until they are contradicted by death. 
Those with apparent contradictions are better equipped to understand Life’s inherent paradoxes. 
Fear of contradiction ties the tongue. 
If you cannot do good, practice biting your tongue. 
There are nowadays no individual palates, only tastebuds on a common tongue. 
Silence is a powerful punctuation mark. 
The talkative hiker spoils the view. 

The straight path is available to all who forsake crookedness. 
Beware the will which never wandered. 
We are unfree when we stray from Divine Will. 
True freedom oscillates between anarchy and constraint, tires of oscillating, tires of being tired… 
The path less trodden is harder on the feet, but better for the soul. 
When we advance on our enemies, we often use their paths and lose our friends along the way. 
To despair, hate or seek revenge is to be seduced by evil. 
Honesty marries hatred in the prison cell of love. 

When we love our prison, we no longer see the bars. 
Man is carried in the wind, like a leaf; but the wind is also carried in man. 
One day, leaf. One day, branch. One day, root. 
Youth climbs up single branches, hoping to be seen at the top of the tree. Adolescence longs for the all-encompassing view, but ends with a snap. 
Doubt, as a season of the soul, is like a strong wind that prunes trees — loosening dead leaves and weak branches — to fortify our foundation against future storms of the spirit.
Maturity stands back to contemplate the tree, judging that every branch can bear fruit. Wisdom seeks out new seeds, advising youth to do the same. 
Wisdom is recovered innocence. 
The sight of youth upsets the old, but overjoys the elderly. 

Much of the suffering in our world is the result of wounded children, parenting. 
Man can suffer from pleasure or take pleasure in suffering. 
Spiritually understood, everything can be used for our development and advantage. 
The eradication of pain would be insufferable; the maximisation of pleasure miserable. 
The punishment for avoiding suffering is superficiality; the reward for embracing it is spirituality. 
Even stars have blackspots. 
We inhabit ourselves more fully when we find our inner light switches in the dark... 
Honesty is strike paper, and we are pyrophobic match-heads. 

Metaphors, like all possible explosives, should be handled with care and by those who know what they’re doing. 
Honesty erupts at the election booth; scholars hide away in yellowing ivory towers. 
Imagine if presidential candidates were required to be well-versed in moral philosophy. 
Politics is a zero-sum game in which dishonest politicians represent an alethophobic electorate. 
Nations fail for the same reason that people fall: they lose their balance. 
Most political arguments stem, not from curiosity, but from the impression that lopsided people need to be righted—or lefted. 
Birds use their wings, not only to fly, but also for balance―just like us. 
There is virtue in being a vole, if one is a vole; but life amongst soil and shrubbery is a cage to those with wings. 

Anything freed from the marble is an angel. Never cease chiselling... 
Greatness is sculpted. 
Better to be good than great. 
Only those who fall from great heights ever make a sound; but we are flattening the earth. 
If you ask to be raised in spiritual station, expect trials to match such an elevation. 
Hardship accelerates maturity. 

In imperilled states, the soul defends itself with poetry. 
The best writings begin with misery and find merriment along the way. 

Poetry is what happens to prose at boiling point. 
Aphorisms are muses brought to climax. 

Aphorisms are the sushi of literature. 
Aphorisms should be chewed on and swallowed—or spat out. 




Yahia Lababidi, an Egyptian-Palestinian author of ten books, has been called "our greatest living aphorist". He has contributed to news, literary and cultural institutions throughout the USA, Europe and the Middle East, such as: Oxford University, Pearson, PBS NewsHour, NPR, and HBO.

Lababidi’s latest work includes: Revolutions of the Heart (Wipf and Stock, 2020), a book of essays and conversations exploring crises and transformation; Learning to Pray (Kelsay Books, 2021), a collection of his spiritual aphorisms and poems; and Desert Songs (Rowayat, 2022), a bilingual, photographic account of his mystical experiences in the deserts of Egypt.

*

Fraser Logan is an M4C-funded PhD student in Philosophy at the University of Warwick, UK. His project examines Nietzsche's views of honesty. Fraser’s aphorisms have been commended by the Oscar Wilde Society, and he has written a book, titled Eruptions, of 600 original aphorisms.

10 July 2022

Religions as World History

Religious manuscripts in the fabulous library of Timbuktu. Such texts are a storehouse of ancient knowledge.
By Keith Tidman

Might it be desirable to add teaching about world religions to the history curriculum in schools?


Religions have been deeply instrumental in establishing the course of human civilisation, from the earliest stirrings of community and socialisation thousands of years ago. Yet even teaching about the world’s religions has often been guardedly held at arm’s length, for concern that instruction might lapse into proselytising.


Or at least, for apprehension over instructors’ actions being seen as such.


The pantheon of religions subject to being taught spans the breadth: from Hinduism, Islam, Zoroastrianism, and Judaism to Buddhism, Christianity, and Sikhism, including indigenous faiths. The richness of their histories, the literary and sacred quality of their storytelling, the complexities and directive principles held among their adherents, and religions’ seminal influences upon the advancement of human civilisation are truly consequential.


This suggests that religions might be taught as a version of world history. Done so without exhortation, judgment, or stereotyping. And without violating religious institutions’ desire to be solely responsible for nurturing the pureness of their faith. School instruction ought to be straightforwardly scholarly and factual, that is, without presumption, spin, or bias. Most crucially, both subject-matter content and manner of presentation should avert transgressing the beliefs and faiths of students or their families and communities. And avoid challenging what theologians may consider axiomatic about the existence and nature of God, the word of authoritative figures, the hallowed nature of practices like petitionary prayer, normative canon, or related matters.

 

Accordingly, the aim of such an education would not be to evangelise or favour any religion’s doctrine over another’s; after all, we might agree that choice in paving a child’s spiritual foundation is the province of families and religious leaders.


Rather, the vision I wish to offer here is a secularised, scholarly teaching of religious literacy in the context of broader world histories. Adding a philosophical, ideas-based, dialogue-based layer to the historical explanation of religions may ensure that content remains subject to the rationalism (critical reflection) seen in educational content generally: as, for example, in literature, art, political theory, music, civics, rhetoric, geography, classics, science and math, and critical thinking, among other fields of enquiry.

 

You see, there is, I propose, a kind of DNA inherent in religion. This is rooted in origin stories of course, but also revealed by their proclivity toward change achieved through a kind of natural selection not dissimilar to that of living organisms. An evolutionary change in which the faithful — individuals, whole peoples, and formal institutions — are the animating force. Where change is what’s constant. We have seen this dynamical process result in the shaping and reshaping of values, moral injunctions, institutions, creeds, epistemologies, language, organisation, orthodoxies, practices, symbols, and cultural trappings.

 

In light of this evolutionary change, a key supporting pillar of an intellectually robust, curious society is to learn — through the power of unencumbered ideas and intellectual exploration — what matters to the development and practice of the world’s many religions. The aim being to reveal how doctrine has been reinterpreted over time, as well as to help students shed blinkers to others’ faith, engage in free-ranging dialogue on the nature, mindset, and language of religion writ large, and assume greater respect and tolerance.


Democracies are one example of where teaching about religion as an academic exercise can take firmest hold. One goal would be to round out understanding, insights, skills, and even greater wisdom for effective, enlightened citizenship. Such a program’s aim would be to encompass all religions on a par with one another in importance and solemnity, including those spiritual belief systems practiced by maybe only a few — all such religious expression nonetheless enriched by the piloting of their scriptures, ideologies, philosophies, and primary texts.

 

The objective should be to teach religious tenets as neutral, academic concepts, rather than as doctrinal matters of faith, the latter being something individuals, families, and communities can and should choose for themselves. Questions of, for example, whose moral code and doctrinal fundamentals one ought to adopt and whose to shy from are thus avoided — these values-based issues being regarded as improper for a public-education forum. Although history has shown that worship is a common human impulse around the world, promoting worship per se ought not be part of teaching about religions. That’s for another time and place.


Part and parcel, the instructional program should respect secularist philosophies, too. Like those individuals and families who philosophically regard faith and notions of transcendentalism as untenable, and see morality (good works) in humanistic terms. And like those who remain agnostically quizzical while grappling with what they suppose is the unknowability of such matters as higher-order creators, yet are at peace with their personal indecision about the existence of a deity. People come to these philosophical camps on equal footing, through deliberation and well-intentioned purposes — seekers of truth in their own right.

 

In paralleling how traditional world histories are presented, the keystone to teaching about religions should be intellectual honesty. Knowing what judiciously to put in and leave out, while understanding that tolerance and inclusion are core to a curious society and informed citizenry. Opening minds, in a factual and scholarly way, to a range of new perspectives as windows on reality.

 

As such, teaching’s focus should be rigorously academic and instructional, not devotional. In that regard, it’s imperative that schools sidestep breaching the exclusive prerogative of families, communities, and religious institutions to frame whose ‘reality’ — whose truth, morality, orthodoxy, ritual, holy ambit, and counsel — to live by.


These cautions notwithstanding, it seems to me that schools ought indeed seek to teach the many ways in which the world’s religions are a cornerstone to humanity’s cultural, anthropological, and civilisational ecology, and thus a core component of the millennia-long narratives of world history.

 

03 July 2022

Picture Post #76 Ancient Salt Marshes



'Because things don’t appear to be the known thing; they aren’t what they seemed to be
neither will they become what they might appear to become.'

 

Posted by Thomas Scarborough


Remains of Cape Town's Salt Marshes (Thomas Scarborough, 2022)

In Cape Town, there are everywhere reminders of the Holocene ‘transgression’ which peaked around 3400 BC. At that time, most of the area on which the city now stands was invaded by sea and salt marshes. Then the sea retreated once more.
 
However, in this picture, railway lines, separating Cape Town’s suburbs of Milnerton and Rugby, reveal the remains of ancient marshes on either side of the tracks. One finds patches of these marshes all over: along freeways, on undeveloped properties, and surrounding remaining wetlands, in particular. Once one is aware of them, they seem to be everywhere. They are older than history, so often undervalued, and yet they are still among us.

The image reminds us that another kind of history surrounds us everywhere. This is cultural history. I think in particular of the culture of thought. Richard Feynman once said, ‘History is fundamentally irrelevant.’ But is it? Where did today’s thoughts come from? Why? How much more would we understand, if we were aware of the ancient remains among us?

26 June 2022

The Rules of Capitalism

by Allister J. Marran

The philosophical theologian Paul Tillich once wrote, ‘The fundamental virtues in the ethics of a capitalist society are economic efficiency, developed to the utmost degree of ruthless activity.’

The rules of capitalism put profit over everything else. Everything else. Nothing is sacred or taboo.

It is a complex, man-made set of rules; it does not exist in nature, and it requires its servants to ignore common sense and its obvious dangers and pitfalls.

It is a giant pyramid scheme of investors and producers at the top, and consumers down below, that requires the base to constantly grow, which is why we now have eight billion plus people on a planet that has very limited resources. It demands infinite growth cycles when raw materials are in short and finite supply.

To ensure its ongoing sustainability, we must constantly create hype about new products that nobody wanted or asked for in order to make another sale, with built-in obsolescence so that we can sell a new model again tomorrow.

Marketing costs for products and services often far exceed R&D and cost-of-production budgets, in order to convince you to fill your house with what might be called ‘trinkets’, or simply ‘junk’.

The over-mining, over-fishing, over-production, and mass pollution are not sustainable. That's simply a fact.

While every scientist on earth is predicting doom and gloom for future generations, the economist disagrees, and tells us to put our heads in the sand and ignore the signs. Keep calm and keep spending.

There is another thing. In its appetite to compete, capitalist economics has now become the science of scarcity.  In order to compete, we need to optimize—and optimize everything we possibly can. We strive for less wastage, smaller margins of error, faster turnover.

This means that we sail ever closer to the wind. Let one thing go wrong—a computer hack, a bacterial contamination, a military invasion in a faraway place—and millions of people’s livelihoods and even lives may be imperilled.

As capitalism multiplies the dangers, so it multiplies our vulnerability.

This generation, our generation, the ones who were told by the scientists and experts to just look around and heed the obvious warnings, will be known as the idiots who could have stopped it but chose greed over life, profit over common sense.

We have no water where I live, because the rains haven't come for nearly 10 years. The world is cooling where it's hot, and heating up where it's cold. Smog sits over the cities, and poison infects our water sources. Landfills are full, and growing fuller every day. Our oceans are being fished to extinction, and good farming land is being paved over and cleared for urban development and new roads and highways.

Having stuff, and being able to read and write, and exploit a man-made system, does not make a person smart. If people can't see beyond their basic, immediate, satiating needs and zoom out to see the bigger picture of an exhausted ecosystem with resources heading to zero, and the only world we will ever have struggling to cope, then perhaps we were never that smart or evolved in the first place.

We do not have a divine right to rule this planet. We are just the next animal to over-evolve and get to the top of the food chain. It is an awesome responsibility, which sees us on a perilous perch that can be toppled if we do not proceed with caution and humility.

Just ask the previous mantle holders, those fearsome and magnificent dinosaurs, how tenuous that grip on the top dog spot is.

We can’t ask them, of course. They are extinct.

19 June 2022

Modesty (a Poem)

by Carl-Theodor Olivet *

Henri Matisse, Branch of Flowers, 1906.

I love you and give myself to you,
Yet you must promise me one thing;
Do not betray too much of you to me,
It could break our happiness in pieces …

The way that you consider and joke and risk it,
All have their origins, I know,
Your clear look, the vague fear,
The cheeky mockermouth.

Your magic, which tenderly plays about you,
Just as it is, is entirely preserved for me,
The days, the nights, year after year,
Shall design me delightfully.

Do not forewarn me, that I’ve gone too far
With my sweet daydreams,
I’m treating myself to a piece of eternity
And would not want to squander any part of you.

Do not betray yourself, lest you manoeuvre
into place to thwart me in some way, wickedly,
I do not analyse either
Whether you will always remain so unique.

Don’t speak in reply; that would be a mistake,
Permit me to paint the greatest bliss,
It will yet be, as mother says:
There comes a day to pay one’s wages.

If it’s over, then say it coldly, boldly.
That has broken many hearts, indeed …
And would tear me likewise from my dream.
Surely, I will get over it. 

 

* Theo Olivet is an author, artist, and retired judge in Schleswig-Holstein.