26 April 2020

The Curiosity of Creativity and Imagination

In Chinese mythology, dragon energy is creative. It is a magical energy, the fire of the soul itself. The dragon is the symbol of our power to transmute and create with imagination and purpose.
Posted by Keith Tidman

Most people would agree that ‘creativity’ is the facility to produce ideas, artifacts, and performances that are both original and valuable. ‘Original’ as in novel, where new ground is tilled; the qualifier ‘valuable’ is considered necessary in order to address German philosopher Immanuel Kant’s point in The Critique of Judgment (1790) that:

‘Since there can also be original nonsense, its products [creativities] must at the same time be models, i.e., be exemplary’.

An example of output lacking value or appropriateness in this context would be a meaningless sequence of words, or gibberish.

Kant believed that creativity pertains mostly to the fine arts, or matters of aesthetics — a narrower perspective than today’s inclusive view. He contended, for example, that genius could not be found in science, believing (mistakenly, I would argue) that science only ever adheres to preset methods, and does not allow for the exercise of imagination. He even excluded Isaac Newton from history’s pantheon of geniuses, despite respecting him as a great man of science.

Today, however, creativity’s reach extends along vastly broader lines, encompassing fields like business, economics, history, philosophy, language, physics, biology, mathematics, technology, psychology, and social, political, and organisational endeavours. These are fields that, at their creative best, lend themselves to being illuminative, nontraditional, gestational, and transformational, open to abstract ideas that prompt the pondering of novel possibilities. A clue to the greatness of such endeavours is provided by the 16th/17th-century English philosopher Francis Bacon in the Novum Organum (1620), where he says that:

‘By far the greatest obstacle to the progress . . . and undertaking of new tasks and provinces therein is found in this — that men despair and think things impossible’.

Accordingly, such domains of human activity have been shown to involve the same explorative and generative functions associated with the brain’s large-scale neural networks. The result is a paradigm of creative cognition that is flexible and multidimensional, one that calls upon several features:
  • an unrestricted vision of what’s possible,
  • ideation, 
  • images, 
  • intuitions,
  • thought experiments, 
  • what-if gaming, 
  • analogical reasoning, 
  • metaphors, 
  • counterfactual reasoning, 
  • inventive free play, 
  • hypotheses, 
  • knowledge reconceptualisation, 
  • and theory selection.
Collectively, these are the cognitive wellspring of creative attainment. To that extent, creativity appears fundamental to defining humanity — what shapes us, and through which individual and collective expression occurs — and to humanity’s seemingly insatiable, untiring quest for progress and attainment.

Societies tend to applaud those who excel at original thought, both for its own sake and for how it advances human interests. That said, these principles are as relevant to the creative processes of everyday people as to those eventually recorded in the annals of history as geniuses. However, the creative process does not start out with the precise end (for example, a poem) and the precise means of getting there (for example, the approach to writing that poem) already known. Rather, both the means and the end product are discoverable only as the creative process unfolds.

Above all, imagination sits at the core of creativity. Imagination is representational: it depicts circumstances not yet real, which can nevertheless evoke emotions and behaviours in people. The world of imagination is, of course, boundless in theory and often in practice, depending on the power of one’s mind to stretch. The American philosopher John Dewey spoke to this point in The Quest for Certainty, boldly chalking up every major leap in science to ‘a new audacity of the imagination’. Albert Einstein echoed the sentiment, declaring in a 1929 interview that ‘Imagination is more important than knowledge’. Imagination is where new possibilities take shape; accordingly and importantly, it yields ideas that surpass what’s already supposed.

Imagination is much more, however, than a mere synonym for creativity; otherwise the term would simply be redundant. Imagination, rather, is a tool: freeing up, even catalysing, creativity. To those ends, imagination entails visualisation (including thought experiments, engaged across disciplines) that enables a person to reach out for assorted, and changing, possibilities — of things, times, places, people, and ideas unrestricted by what is presumed already experienced and known of external reality. Additionally, ‘mirroring’ might occur in the imaginative process, whereby absent features of a mental scenario are filled in with analogues plucked from the external world around us. Ultimately, new knowledge and beliefs emerge, in a progressive loop of creation, validation, application, and re-imagination.

Imagination might revolve around diverse domains, like unconstrained creative thought, play, pretence, the arts, allegorical language, predictive possibilities, and imagery, among others. Imagination cannot, however, guarantee creative outcomes — nor can intuition in human cognition — but it is essential (if not always sufficient) for creative results to happen. As explained by Kant, imagination has a ‘constitutive’ role in creativity, something demonstrated by a simple example offered in Leviathan by the 17th-century English philosopher Thomas Hobbes:

‘as when from the sight of a man at one time, and a horse at another, we conceive in our mind a Centaur’. 

Such imaginative, metaphorical playfulness is the stuff not only of absorbed, undaunted children, of course — though they are notably gifted with it in abundance — but also of freethinking adults: adults whose minds marvel at alternatives, whether starting from scratch (tabula rasa) or picking apart (divergence) and reassembling (convergence) presumed reality.

The complexities of imagination best nourish what one might call ‘purposeful creativity’ — where a person deliberately aims to achieve a broad, even if initially indeterminate, outcome. Such imagining might happen either alone or with the involvement of other participants. With purposeful creativity, there is agency, intentionality, and autonomy, as is quintessentially the case with the best thought experiments; it occasions deep immersion in the creative process. ‘Passive creativity’, on the other hand, is where someone arrives at a spontaneous, unsought solution (a Eureka! moment) to a matter at hand.

Purposeful, or directed, creativity draws on both conscious and unconscious mechanisms. Passive creativity — with the mind open to the unexpected — largely depends on unconscious mental apparatus, though the mind’s executive function not uncommonly ‘edits’ the result afterwards in order to arrive at the final product. To be sure, either purposeful or passive creativity is capable of summoning remarkable insights.

The 6th-century BC Chinese spiritual philosopher Laozi perhaps most pithily described people’s capacity for creativity, and its sometimes-companion genius, with a figurative depiction in the Tao Te Ching, defining ‘genius’ as the ability to see potential — ‘To see things in the seed’ — long before germination makes those ‘things’ apparent, even obvious, to everyone else, and long before they become stitched into the fabric of society and culture.

27 May 2018

Occam's Razor: On the Virtue of Simplicity

As a Franciscan friar, William made simplicity the heart of his daily life.
Posted by Keith Tidman

The English philosopher and Franciscan friar William of Occam (c. 1287–1347) surely got it about right with his ‘law of parsimony’, which asserts, as a general principle, that when there are two competing explanations or theories, the one with the fewest assumptions (and fewest guesses or variables) is more often to be preferred. As the ‘More than Subtle Doctor’ couched the concept in his Summa Logicae, ‘It is futile to do with more what can be done with fewer’ — itself an example of ‘economy’. William’s law is typically referred to as Occam’s razor, the word ‘razor’ signifying a slicing away of arguably unnecessary postulates. In many instances, Occam’s razor is indeed right; in others, well, perhaps not. Let’s explore the ideas further.

Although the law of parsimony has always been most closely associated with William of Occam (Occam, now spelled ‘Ockham’, being the village where he was born), he hasn’t been the principle’s only proponent. Just as famously, a millennium and a half earlier, the Greek philosopher Aristotle said something similar in his Posterior Analytics:
‘We may assume the superiority ceteris paribus [other things being equal] of the demonstration which derives from fewer postulates or hypotheses.’
And some six centuries after William, Albert Einstein, perhaps thinking of his own formulation of special relativity, noted that ‘the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible’. Many other philosophers, scientists, and thinkers have admired the concept as well.

Science’s partiality toward the parsimony of Occam’s razor is nowhere more apparent than in the search for a so-called ‘theory of everything’ — an umbrella theory harmoniously unifying all the physical forces of the cosmos, including the two cornerstones of 20th-century physics: the general theory of relativity (describing the macro scale) and quantum theory (describing the micro scale). This holy grail of science has proven an immense but irresistible challenge, having occupied much of Einstein’s later life, as it has the imaginations of other physicists. The appeal to scientists lies in a unified (presumed final or all-encompassing) theory being condensed into a single set of equations, or perhaps just one equation, describing all physical reality — a frugality that would coherently and irreducibly explain the universe.

Certainly, philosophers, too, often regard parsimony as a virtue, although there have been exceptions. For clarity, we must first note that parsimony and simplicity are usually, as a practical matter, considered one and the same thing — that is, largely interchangeable. For its part, simplicity comes in at least two variants: one concerns the number and complexity of the kinds of things hypothesised, sometimes referred to as ‘elegance’ or ‘qualitative parsimony’; the second concerns the number and complexity of individual, independent things (entities) hypothesised, sometimes referred to as ‘quantitative parsimony’. Intuitively, people in their daily lives usually favour simpler hypotheses; so do philosophers and scientists. For example, we assume that Earth’s gravity will always apply rather than suddenly ceasing — that is, that objects will not begin falling upward unassisted.
Among the philosophers who weighed in on the principle was Thomas Aquinas, who noted in the Summa Theologica in the 13th century, ‘If a thing can be done adequately by means of one, it is superfluous to do it by means of several; for we observe that nature does not employ two instruments where one suffices.’ And the 18th-century German philosopher Immanuel Kant, in the Critique of Pure Reason, similarly observed that ‘rudiments or principles must not be unnecessarily multiplied.’

In this manner, philosophers have sometimes turned to Occam’s razor to criticise broad metaphysical hypotheses that purportedly carry the baggage of unnecessary ontological concepts. One example to fall under such criticism is Cartesian dualism, which physicalists argue is flawed by an extra category — the notion that the mind is entirely apart from the neuronal and synaptic activity of the brain, the physical and the mental purportedly being two separate entities.

Returning to Einstein, his iconic equation, E = mc², is an example of Occam’s razor. This ‘simple’ mathematical formula, which had more-complex precursors, has only two variables and one constant, relating (via conversion) the amount of energy to the amount of matter (mass) multiplied by the speed of light squared. It allows one to calculate how much energy is tied up in the mass of any given object, such as a chickpea or granite boulder. The result is a perfectly parsimonious snapshot of physical reality. But simplicity isn’t always enough, of course. There must also be consistency with the available data, with the model necessarily accommodating new (better) data as they become available.
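As a minimal illustration of that parsimony, here is a short Python sketch of the calculation; the half-gram chickpea mass is an assumed, purely illustrative figure:

```python
# A minimal sketch of Einstein's E = mc^2.
# The chickpea mass below is an assumed, illustrative value.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second (exact, by definition)

def rest_energy_joules(mass_kg: float) -> float:
    """Energy (in joules) equivalent to a given rest mass (in kilograms)."""
    return mass_kg * SPEED_OF_LIGHT ** 2

chickpea_mass_kg = 0.0005  # roughly half a gram, assumed for illustration
print(f"{rest_energy_joules(chickpea_mass_kg):.2e} J")  # ~4.49e+13 joules
```

Even so humble an object locks up on the order of tens of terajoules of rest energy, the formula’s two variables and one constant doing all the work.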

Other eminent scientists, like the 17th-century physicist and mathematician Isaac Newton, similarly valued this principle of frugality. The first of Newton’s three ‘rules of reasoning in philosophy’, expressed in his Principia Mathematica, reads:
‘We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances. . . . Nature is pleased with simplicity, and affects not the pomp of superfluous causes.’
But, as noted above, Occam’s razor doesn’t always lead to truth per se. Nor, importantly, does the notion of ‘simplicity’ necessarily equate to ease of explanation or ease of understanding. Here are two examples where frugality arguably doesn’t win the day. One theory presents a complex cosmological explanation of the Big Bang and the physical evolution of a 13.8-billion-year-old universe. A single thread of that cosmological account, arriving very late on the stage, is the intricate biological evolution of modern human beings. A second, creationist explanation of the current universe and of human beings — with far fewer assumptions and hypotheses — describes both as having roots in a single event some 6,000 to 10,000 years ago, with the cosmos conveniently made to look older. Available evidence suggests, however, that the first explanation is correct, despite the second explanation’s parsimony.

In broad ways, Occam’s razor has been supported by the empirical successes of theories that proved parsimonious in their explanations: fewer causes, entities, properties, variables, and processes embedded in fewer assumptions and hypotheses. However, even though people tend instinctively and understandably to be drawn toward simpler accounts of reality, simplicity hasn’t always triumphed. For example, the nature-versus-nurture debate long posed a simpler, albeit false, either-or dichotomy, trying to understand a person’s development and behaviour on the basis of either the environment — the influence of external factors, such as experience and learning, on an otherwise blank slate or perhaps set of instincts — or genes and heritability — that is, biological pre-wiring. Reality is, of course, a complex mix of both nature and nurture, with one influencing the other.

To avoid such pitfalls, as the English mathematician and philosopher Alfred North Whitehead pointedly (and parsimoniously) suggested:
‘. . . every natural philosopher should seek simplicity and distrust it’.