
30 December 2018

Breaking the Universal Speed Limit?


Well, how do you measure the speed of light - and thus check that everything is observing this ‘universal speed limit’? Seven years ago, the closing months of 2011 saw much excitement in sciencey circles with the much-publicised announcement that researchers at CERN, the world's most expensive physics laboratory, had detected sub-atomic particles apparently travelling faster than the speed of light. This, the papers assured us, was in defiance of Einstein and all the rules of relativity. Yet the plain ‘fact’ of the matter is that the speed of light is not magically ‘out there’ but merely a human convention. In a relativistic universe, how could it be otherwise?

Here the point is put nicely by Burt Jordaan in a blog posting of January 25, 2010. Burt writes:
‘In order to measure any one-way velocity, we essentially need two clocks: one at the start and one at the end. Obviously, the two clocks need to be synchronized and run at the same rate (and to be sure, they must not be moving relative to each other and also be at the same gravitational potential). Yet we reasonably assume that the two clocks run at the same rate, at least close enough for all practical purposes. Now we need to synchronize the two clocks to read the same at the same moment. How is this done?


In his 1905 paper on Special Relativity, Einstein says: “We have not defined a common ‘time’ for A and B, for the latter cannot be defined at all unless we establish by definition that the ‘time’ required by light to travel from A to B equals the ‘time’ it requires to travel from B to A”.

One can reasonably read Einstein's ‘by definition’ as ‘by convention’. 
Using Einstein’s convention to set the distant clock at a known distance, call it ‘D’, in empty space, we send a light signal at (say) time zero and when the distant clock detects the signal, it sets its time to D/c sec (the light travel time), where c is the standard speed of light in vacuum.

Now we can measure the speed of any object moving between the two clocks. We can also use the two clocks to measure the one-way speed of light, but we are obviously guaranteed to always get c. In this sense, we get the speed of any object only relative to c and not absolutely. 
In this way, the one-way speed of light is a convention, depending on the convention for clock synchronization.’
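
To make the convention concrete, here is a minimal numerical sketch (ours, not Burt's, written in Python, with an exaggerated ‘true’ one-way travel time assumed purely for illustration): however long the outward leg ‘really’ takes, the distant clock is simply set to D/c on arrival, so any subsequent one-way measurement can only ever return c.

```python
# A minimal sketch of the synchronisation convention described above.
# The 'true' one-way travel time is a purely hypothetical knob for
# illustration -- by construction it cannot be observed.

C = 299_792_458.0  # standard speed of light in vacuum, m/s


def distant_clock_offset(distance_m, true_one_way_time_s):
    """Offset written into the distant clock by Einstein's convention.

    When the signal arrives, the distant clock is simply SET to read
    distance / c, whatever the (unobservable) true travel time was.
    """
    return distance_m / C - true_one_way_time_s


def measured_one_way_speed(distance_m, true_one_way_time_s):
    """One-way light speed as measured with the conventionally set clocks."""
    offset = distant_clock_offset(distance_m, true_one_way_time_s)
    elapsed_on_distant_clock = true_one_way_time_s + offset
    return distance_m / elapsed_on_distant_clock


# Even if light 'really' took twice as long on the outward leg,
# the measurement still returns c, because the convention defined it so.
D = 3.0e8  # separation of the two clocks, in metres
print(measured_one_way_speed(D, 2 * D / C))  # 299792458.0
```

The circularity is plain in the sketch: the quantity being ‘measured’ was written into the distant clock at synchronisation time.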
Burt concludes by observing that there is a general belief system prevailing in physics that ‘whatever is known exists and the rest is non-existent’. It is because of this belief system that scientists tend to fill these existence-nonexistence gaps with coefficients. Yet there can be far more existent and important entities, quite apart from the usual quantities of space and time, which physicists are led to ignore. This attitude is the reason that the existence of Dark Matter was unimaginable for four hundred years. As to the speed of light itself, Burt says explicitly that he cannot understand why Einstein established a ‘religion of special abilities and qualities’ for light. Specifically, he objects that even though there are ways to measure the speed of light, there is no reason to believe that nothing can travel faster.

Our own correspondent, Muneeb Faiq, took up the issue for Pi too. Here he offers a thought experiment which again shows the arbitrariness of the ‘speed of light’.
‘In fact, there is a lot of confusion about the harmony between the classical and quantum definitions of speed. If both quantum speed and classical speed mean the same thing, then a very interesting difficulty comes to the fore.

Suppose there exists only one body in the universe: just a single point mass and space. Is it at rest or in motion? Suppose, further, that two photons of light emerge, moving parallel to each other. What speed are they moving at? For an observer stationed on the point mass, both photons are moving with the velocity of light. Now suppose, all of a sudden, the point mass ceases to exist. There are two photons moving with the same speed parallel to each other, and nothing else exists except space. Are these two photons moving at all now, given that they remain in the same position in relation to each other, which would ordinarily be defined as a state of rest?

It is interesting to note that while the point mass existed, the two photons were moving with the velocity of light. Now that the point mass has ceased to exist, even though nothing has changed about the photons, they are not supposed to be moving, even if they travel at the same speed as before.’

27 May 2018

Occam's Razor: On the Virtue of Simplicity

As a Franciscan monk, William made simplicity the heart of his daily life.
Posted by Keith Tidman

The English philosopher and monk, William of Occam (c. 1287–1347), surely got it about right with his ‘law of parsimony’, which asserts, as a general principle, that when there are two competing explanations or theories, the one with the fewest assumptions (and fewest guesses or variables) is more often to be preferred. As the ‘More than Subtle Doctor’ couched the concept in his Summa Logicae, ‘It is futile to do with more what can be done with fewer’ — itself an example of ‘economy’. William’s law is typically referred to as Occam’s razor — the word ‘razor’ signifying a slicing away of arguably unnecessary postulates. In many instances, Occam’s razor is indeed right; in other examples, well, perhaps not. Let’s explore the ideas further.

Although the law of parsimony has always been most closely associated with William of Occam (Occam, now spelled ‘Ockham’, being the village where he was born), he hasn’t been the principle’s only proponent. Just as famously, a millennium and a half earlier, the Greek philosopher Aristotle said something similar in his Posterior Analytics:
‘We may assume the superiority ceteris paribus [other things being equal] of the demonstration which derives from fewer postulates or hypotheses.’
And seven centuries after William, Albert Einstein, perhaps thinking of his own formulation of special relativity, noted that ‘the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible’. Many other philosophers, scientists, and thinkers have also admired the concept.

Science’s favoritism toward the parsimony of Occam’s razor is nowhere more apparent than in the search for a so-called ‘theory of everything’ — an umbrella theory harmoniously unifying all the physical forces of the cosmos, including the two cornerstones of 20th-century physics: the general theory of relativity (describing the macro scale) and quantum theory (describing the micro scale). This holy grail of science has proven an immense but irresistible challenge, one that occupied much of Einstein’s life, as it has the imagination of other physicists. The appeal to scientists lies in a unified (presumed final or all-encompassing) theory being condensed into a single set of equations, or perhaps just one equation, describing all physical reality. The attraction of such a theory’s potential frugality in coherently and irreducibly explaining the universe remains immense.

Certainly, philosophers, too, often regard parsimony as a virtue — although there have been exceptions. For clarity, we must first note that parsimony and simplicity are usually, as a practical matter, considered one and the same thing — that is, largely interchangeable. For its part, simplicity comes in at least two variants: one equates to the number and complexity of kinds of things hypothesised, sometimes referred to as ‘elegance’ or ‘qualitative parsimony’; the second equates to the number and complexity of individual, independent things (entities) hypothesised, sometimes referred to as ‘quantitative parsimony’. Intuitively, people in their daily lives usually favor simpler hypotheses; so do philosophers and scientists. For example, we assume that Earth’s gravity will always apply rather than suddenly ceasing — that is, rather than objects falling upward unassisted.
Among the philosophers who weighed in on the principle was Thomas Aquinas, who noted in Summa Theologica in the 13th century, ‘If a thing can be done adequately by means of one, it is superfluous to do it by means of several; for we observe that nature does not employ two instruments where one suffices.’ And the 18th-century German philosopher Immanuel Kant, in the Critique of Pure Reason, similarly observed that ‘rudiments or principles must not be unnecessarily multiplied.’ In this manner, philosophers have sometimes turned to Occam’s razor to criticise broad metaphysical hypotheses that purportedly carry the baggage of unnecessary ontological concepts. One hypothesis that falls under such criticism via the application of Occam’s razor is Cartesian dualism, which physicalists argue is flawed by an extra category — that is, the notion that the mind is entirely apart from the neuronal and synaptic activity of the brain (the physical and mental purportedly being two separate entities).

Returning to Einstein, his iconic equation, E = mc², is an example of Occam’s razor. This ‘simple’ mathematical formula, which had more-complex precursors, has only two variables and one constant, relating (via conversion) the amount of energy to the amount of matter (mass) multiplied by the speed of light squared. It allows one to calculate how much energy is tied up in the mass of any given object, such as a chickpea or granite boulder. The result is a perfectly parsimonious snapshot of physical reality. But simplicity isn’t always enough, of course. There must also be consistency with the available data, with the model necessarily accommodating new (better) data as they become available.
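
As a rough numerical illustration (ours, assuming a chickpea of about half a gram), the conversion is a one-line calculation:

```python
# A small illustrative calculation of rest-mass energy, E = m * c**2.
C = 299_792_458.0        # speed of light in vacuum, m/s (the lone constant)
m_chickpea_kg = 0.5e-3   # assumed mass of a chickpea: roughly 0.5 gram

E = m_chickpea_kg * C ** 2
print(f"{E:.2e} J")      # ~4.5e13 joules, on the order of ten kilotons of TNT
```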

Other eminent scientists, like the 17th-century physicist and mathematician Isaac Newton, similarly valued this principle of frugality. The first of Newton’s three ‘rules of reasoning in philosophy’ expressed in his Principia Mathematica offers:
‘We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances. . . . Nature is pleased with simplicity, and affects not the pomp of superfluous causes.’
But, as noted above, Occam’s razor doesn’t always lead to truth per se. Nor, importantly, does the notion of ‘simplicity’ necessarily equate to ease of explanation or ease of understanding. Here are two examples where frugality arguably doesn’t win the day. One theory presents a complex cosmological explanation of the Big Bang and the physical evolution of a 13.8-billion-year-old universe. A single, but very-late-on-the-stage thread of that cosmological account is the intricate biological evolution of modern human beings. A second, creationist explanation of the current universe and of human beings — with far fewer assumptions and hypotheses — describes both as having roots in a single event some 6,000 to 10,000 years ago, with the cosmos conveniently made to look older. Available evidence suggests, however, that the first explanation is correct, despite the second explanation’s parsimony.

In broad ways, Occam’s razor has been supported by the empirical successes of theories that proved parsimonious in their explanations: with fewer causes, entities, properties, variables, and processes embedded in fewer assumptions and hypotheses. However, even though people tend instinctively and understandably to be drawn toward simpler accounts of hoped-for reality, simplicity hasn’t always triumphed. For example, the earlier nature-versus-nurture debate posed a simpler, albeit false, either-or dichotomy in trying to understand a person’s development and behaviour on the basis of either the environment — the influence of external factors, such as experience and learning, on an otherwise blank slate or perhaps set of instincts — or genes and heritability — that is, biological pre-wiring. Reality is, of course, a complex mix of both nature and nurture, with one influencing the other.

To avoid such pitfalls, as the English mathematician and philosopher Alfred North Whitehead pointedly (and parsimoniously) suggested:
‘. . . every natural philosopher should seek simplicity and distrust it.’