
12 January 2020

A Modest Proposal for Science

Posted by Andrew Porter

For several centuries, modern science has banked on and prided itself on ‘the scientific method’. This scheme of hypothesis and experiment has been useful and effective in countering superstition. Discoveries of all sorts have been made and verified, from the circumference of orbits to the range of elements to the function of organelles and proteins in a cell. Confirmation from experiment seems like a clear way to separate fact from fiction. But it is crucial to note that the scientific method also fails.

Recent conundrums of physicality, consciousness, entanglement, dark matter, and the nature of natural laws have spurred many to rethink assumptions and even findings. Our search for what is real and natural needs a new method, one that is in keeping with the natural facts themselves – natural facts not as reduced or squeezed or contorted by the scientific method, but as their own holistic selves. The method of approach and apprehending that seems to offer the most promising advance is that which consists of a whole person in a whole natural environment.

Why do I emphasise wholeness? Because facts shrink away at the first sign of partiality or limited agenda. Truth, conversely, tends to open itself to an apt seeker, to a method that goes whole at a host of levels. Nature tends to recognise her own, it seems.

Kristin Coyne, in an article called ‘Science on the Edge’ in the February 17, 2017 issue of the magazine, Fields: Science, Discovery & Magnetism, writes:
‘At the dividing line between two things, there’s often no hard line at all. Rather, there’s a system, phenomenon or region rich in diversity or novel behavior – something entirely different from the two things that created it.’
She offers various examples: fringe physics, borderline biology, and crossover chemistry. Such ‘science on the edge’ is one aspect of the changes typical science is undergoing. Other researchers, in areas such as telepathy and theoretical physics, are pushing the bounds of science while arguing that it certainly is science, just a deeper form.

This suggested new method, which would largely overturn contemporary science, would take its measure, as it were, from nature’s own measurements: it is anti-reductionist; it is synthetic more than analytic. As we are learning, it may not be too much to say that one has to be the facts to know the facts, to be a synergy of ‘observer’ and ‘observed’ at all levels. The knowledge gleaned from wholeness is like a star’s heat and light understood, not just the hydrogen and helium involved.

This idea of the ‘scientist’ in tune with nature in a thorough way would be the human equivalent of a goshawk whose instincts are a portion of Earth-wide wildness. There would be no disjunct producing results that turn self-referential and untrue. If one is studying an ecosystem, for instance, he or she, or his or her team, must, by the requirements of nature, be of the same stuff and of the same conceptions as the individualities, relations, and wholes of that ecosystem. So much more of the actuality reveals itself to the sympathetic, of-a-piece ‘observer’. If we ignore or shunt aside the question of what a whole person is, how can we ever expect to discern the deeper reality of nature?

It seems to hold true that the more receptive the subject is to the essence and character of the object, the better it is understood. Who knows one’s dog better: a sympathetic owner or an objective voice? If the dog is sick, perhaps the latter, but all the time the dog is exuberantly healthy, the former is the one who comprehends.

The goal, of course, is to elucidate facts, to unite in some meaningful way with reality. Delusion is all too easy, and partial truths sustain centuries of institutions, positions, governments, and cultures. Modern science started out as reactionary in the sense of being hostile to things like superstition or intuition or revelation. It substituted experiment and observation, keeping the studied apart from those who studied. This is fine for shallow comprehension, but it only gets you so far. It obscures another possibility, one somewhat similar to the communion and connection between the quantum realm and the macro world.

I suggest that deep facts only reveal themselves to a person metamorphosed, as it were, into ways of being in keeping with the parts or portions of nature studied. All nature may be of this type, open to human comprehension only as that comprehension is within a whole person. What a complete person is and what a fullness of nature is might not only be a philosopher’s job, but the focus of science itself, re-trained to benefit from its transformed method.

The hint in current puzzlements is that science in the 21st century and beyond may benefit significantly by re-crafting itself. A transformed method might yield deeper or actual knowledge. That is, knowing, as opposed to seeming to know, may require a new approach.

Jacob Needleman and David Applebaum wrote, ‘Unless scientific progress is balanced by another kind of enquiry, it will inevitably become an instrument of self-destruction.’

The ‘objective’ revolution need not be the last. In today’s world, we have the ball-and-chain of modern scientific ways and even scientism weighting our thinking; it would be good to free ourselves from this. But we are confused about what in objectivity is liberating or limiting, and what in subjectivity is useful or obfuscatory.

23 June 2019

The world in crisis: it’s not what we think

Posted by Thomas Scarborough

The real danger is an explosion - of Big Data

We lived once with the dream of a better world: more comfortable, more secure, and more advanced.  Political commentator Dinesh D’Souza called it ‘the notion that things are getting better, and will continue to get better in the future’.  We call it progress.  Yet while our world has in many ways advanced and improved, we seem unsure today whether the payoff matches the investment.  In fact, we all feel sure that something has gone peculiarly wrong—but what?  Why has the climate turned on us?  Why is the world still unsafe?  Why do we still suffer vast injustices and inequalities?  Why do we still struggle, if not materially, then with our sense of well-being and quality of life?  Is there anything in our travails which is common to all, and lies at the root of them all?

It will be helpful to consider what it is that has brought us progress—which in itself may lead us to the problem.  There have been various proposals:  that progress is of the inexorable kind; that it is illusory and rooted in the hubristic belief that earlier civilisations were always backward; or that it is a result of our escape from blind authority and appeal to tradition.  Yet above all, progress is associated with the liberating power of knowledge, which now expands at an exhilarating pace on all fronts.  ‘The idea of progress,’ wrote the philosopher Charles Frankel, ‘is peculiarly a response to ... organized scientific inquiry’.

Further, science, within our own generation, has quietly entered a major new phase, which began around the start of the 21st Century.  We now have big data: extremely large data sets which may be analysed computationally.

Now when we graph the explosion of big data, we find, interestingly, that it (roughly) coincides, when plotted, with various global trends—among them increased greenhouse gas emissions, sea level rise, economic growth, resource use, air travel—even increased substance abuse, and increased terrorism.  There is something, too, which seems more felt than demonstrable.  A great many people sense that modern society burdens us—more so than it did in former times.

Why should an explosion of big data roughly coincide—even correlate—with an explosion of global travails?
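As an aside, the comparison being invoked is easy to sketch computationally.  Below is a minimal illustration in Python, using entirely synthetic series (the figures and growth rates are assumptions made up for the sketch, not real statistics); it also shows why such coincidences must be read with care, since any two series that merely rise together will correlate:

```python
# Toy comparison of two rising series -- all numbers are invented.
import math

years = range(2000, 2020)

# Hypothetical 'global data volume': roughly exponential after 2000
data_volume = [2 ** ((y - 2000) / 2) for y in years]

# A hypothetical steadily rising global trend (e.g. emissions)
trend = [25 + 0.5 * (y - 2000) for y in years]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson(data_volume, trend):.2f}")
# r comes out around 0.8 -- but so it would for any two series that
# simply rise together; coincidence on a graph is where the question
# starts, not where it ends.
```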

On the one hand, big data has proved beyond doubt to have many benefits.  Through the analysis of extremely large data sets, we have found new correlations to spot business trends, prevent diseases, and combat crime—among other things.  At the same time, big data presents us with a raft of problems: privacy concerns, interoperability challenges, the problem of imperfect algorithms, and the law of diminishing returns.  A major difficulty lies in the interpretation of big data.  Researchers Danah Boyd and Kate Crawford observe, ‘Working with Big Data is still subjective, and what it quantifies does not necessarily have a closer claim on objective truth.’  Not least, big data depends on social sorting and segmentation—mostly invisible—which may have various unfair effects.

Yet apart from the familiar problems, we find a bigger one.  The goal of big data, to put it very simply, is to make things fit.  Production must fit consumption; foodstuffs must fit our dietary requirements and tastes; goods and services must fit our wants and inclinations; and so on.  As the demands for a better fit increase, so the demand for greater detail increases.  Advertisements are now tailored to our smallest, most fleeting interests, popping up at every turn.  The print on our foodstuffs has multiplied, even to become unreadable.  Farming now includes the elaborate testing and evaluation of seeds, pesticides, nutrients, and so much more.  There is no end to this tendency towards a better fit.

The more big data we have, the more we can tailor any number of things to our need:  insurances, medicines, regulations, news feeds, transport, and so on.  However, there is a problem.  As we increase the detail, so we require greater energy to achieve it.  There are increased demands on our faculties, and on our world—not merely on us as individuals, but on all that surrounds us.  To find a can of baked beans on a shop shelf is one thing.  To have a can of French navy beans delivered to my door in quick time is quite another.  This is crucial.  The goal of a better fit involves enormous activity, and stresses our society and environment.  Media academic Lloyd Spencer writes, ‘Reason itself appears insane as the world acquires systematic totality.’  Big data is a form of totalitarianism, in that it requires complete obedience to the need for a better fit.

Therefore the crisis of our world is not primarily that of production or consumption, of emissions, pollution, or even, in the final analysis, over-population.  It goes deeper than this.  It is a problem of knowledge—which now includes big data.  This in turn rests on another, fundamental problem of science: it progresses by screening things out.  Science must minimise unwanted influences on independent variables to succeed—and the biggest of these variables is the world itself.

Typically, we view the problems of big data from the inside, as it were—the familiar issues of privacy, the limits of big data, its interpretation, and so on.  Yet all these represent an enclosed view.  When we consider big data in the context of the open system which is the world, its danger becomes clear.  We have screened out its effects on the world—on a grand scale.  Through big data, we have over-stressed the system which is planet Earth.  The crisis which besets us is not what we think.  It is big data.



The top ten firms leveraging Big Data in January 2018: Alphabet, Amazon, Microsoft, Facebook, Chevron, Acxiom, National Security Agency, General Electric, Tencent, Wikimedia (Source: Data Science Graduate Programs).


Sample graphs. Red shade superimposed on statistics from 2000.

24 December 2018

Homeopaths, Holocaust Deniers and 'Philosophers of Science'

On January 20, 2010, at 10:23 (Oxford time, we may suppose), thousands of brilliant minds tried to prove, by guzzling homeopathy pills, that homeopathic remedies could not kill people, and thus that homeopathy doesn't work (and that "there's nothing in it"). A magnificent demonstration of public adherence to the scientific method!
Reposted and updated from Pi Alpha. Edited by Martin Cohen with original research by Perig Gouanvic


“The misrepresentations of history presented by Holocaust deniers and other pseudo-historians are very similar in nature to the misrepresentations of natural science promoted by creationists and homeopaths. ... we find a wide variety of movements and doctrines, such as creationism, astrology, homeopathy, and Holocaust denialism that are in conflict with results and methods that are generally accepted in the community of knowledge disciplines. ”

- Stanford Encyclopedia of Philosophy


The Mass Suicide of Homeopathy Skeptics

Almost all of the systematic reviews in conventional journals start on a skeptical note. Indeed, nine out of ten of the articles begin with a statement that questions the scientific plausibility of homeopathy. Some of the articles use relatively strong language to make the point. For example, one by Ernst and Pittler suggests that it is the use of ‘highly diluted material that overtly flies in the face of science and has caused homeopathy to be regarded as placebo therapy at best and quackery at worst’.

But to get a good sense of what the masses, including those who make up ‘the scientific consensus’, really think, Wikipedia is a passable indicator. Amongst Wikipedians we find the watchdogs of ‘reason’, including various hired professionals from the ‘Public Understanding of Science’ (and their trusted mercenaries), who love to indulge in this dusty old strawman argument:
‘a 12C [homeopathic] solution is equivalent to a 'pinch of salt in both the North and South Atlantic Oceans'... One third of a drop of some original substance diluted into all the water on earth would produce a remedy with a concentration of about 13C.’
This is a stunning demonstration of the lack of intelligence not only of the ‘scientific consensus’, but of the democratic process of knowledge itself. And leading the process is Wikipedia, which turns donkeys into horses on a daily basis, as Socrates would say, while in the background is the poor state of debate between the Orthodoxy and the scientists and philosophers who are trying to make sense of homeopathy. Hahnemann spoke about a ‘force’ that remained after dilutions and succussions, but pseudoskeptics have kept making the same strawman argument for the last 200 years.
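For the record, the arithmetic behind such oceanic metaphors is easily reproduced. Here is a minimal sketch in Python, assuming one mole of solute at the start (an illustrative quantity):

```python
# Back-of-the-envelope arithmetic for 'C' (centesimal) potencies.
AVOGADRO = 6.022e23   # molecules per mole

def molecules_remaining(moles: float, c_potency: int) -> float:
    """Expected solute molecules left after c_potency serial
    1:100 dilutions of the starting quantity."""
    return moles * AVOGADRO / (100 ** c_potency)

for c in (6, 12, 13, 30):
    print(f"{c}C: ~{molecules_remaining(1.0, c):.2g} molecules expected")

# By 12C the expectation falls to ~0.6 molecules; from 13C onward,
# effectively none remain.
```

The numbers themselves are not in dispute; the dispute, as we shall see, is over whether they are the point.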

The reality is that Hahnemann wrote a great deal and never shied away from philosophical questions. He argues:
‘A substance divided into ever so many parts must still contain in its smallest conceivable parts always some of this substance, and that the smallest conceivable part does not cease to be some of this substance and cannot possibly become nothing; - let them, if they are capable of being taught, hear from natural philosophers that there are enormously, powerful things (forces) which are perfectly destitute of weight.’
You may not agree, but it is not foolish stuff. Indeed, these days, the ‘homeopathic force’, for instance, could be described in a context of systems biology.

According to Ilya Prigogine, a Russian-born Belgian chemist best known for his definition of dissipative structures ‘and their role in thermodynamic systems far from equilibrium’ (work that led to his being awarded the Nobel Prize in Chemistry in 1977), in the domain of deterministic physics all processes are time-reversible, meaning that they can proceed backward as well as forward through time. As Prigogine explains, determinism is fundamentally a denial of the arrow of time. With no arrow of time, there is no longer a privileged moment known as the ‘present’, which follows a determined ‘past’ and precedes an undetermined ‘future’. Instead, all of time is simply a given, with the future just as determined as the past. With irreversibility, the arrow of time is reintroduced to physics. Prigogine notes numerous examples of irreversibility, including diffusion, radioactive decay, solar radiation, weather and the emergence and evolution of life.
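Prigogine's distinction can be made concrete with a toy simulation — a minimal sketch whose dynamics, step sizes and run lengths are arbitrary choices for illustration:

```python
# Toy contrast between a reversible and an irreversible process.
import random

def spring_step(x, v, dt=0.01):
    """One velocity-Verlet step for a unit mass on a unit spring (F = -x)."""
    a = -x
    x = x + v * dt + 0.5 * a * dt * dt
    v = v + 0.5 * (a + (-x)) * dt
    return x, v

# Deterministic case: run forward, flip the velocity, run again.
# The trajectory retraces itself -- no privileged 'present'.
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = spring_step(x, v)
v = -v                       # 'reverse the arrow of time'
for _ in range(1000):
    x, v = spring_step(x, v)
print(round(x, 6))           # ~1.0: the starting point is recovered

# Diffusive case: 'reversing' a random walk just gives another random
# walk; the spreading is never undone. This is Prigogine's arrow.
print(sum(random.choice((-1, 1)) for _ in range(1000)))
```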

This applies especially well to homeopathy. Orthodox scientists evaluate homeopathy through the lens of the results (it’s only water/alcohol!) and tirelessly calculate oceanographic metaphors to deride what they believe is homeopathy, oblivious to the fact that dilution is conceived as a process leading to a change in the way the molecules of the solvent behave together — a change in the structure of water and a concurrent change in the forces likely to make these structures possible.


Brian Josephson, Nobel laureate in physics, has commented, of a typical debunking exercise by the magazine New Scientist, that:
‘criticisms [of homeopathy] centred around the vanishingly small number of solute molecules present in a solution after it has been repeatedly diluted are beside the point, since advocates of homeopathic remedies attribute their effects not to molecules present in the water, but to modifications of the water's structure. Simple-minded analysis may suggest that water, being a fluid, cannot have a structure of the kind that such a picture would demand. But cases such as that of liquid crystals, which while flowing like an ordinary fluid can maintain an ordered structure over macroscopic distances, show the limitations of such ways of thinking. There have not, to the best of my knowledge, been any refutations of homeopathy that remain valid after this particular point is taken into account.’
The particular homeopathic claim that water can ‘remember’ substances with which it has been in contact, and that such memory might be mediated by hydrogen bonds, has also been criticised, typically on theoretical grounds. Many such arguments involve the short duration of individual hydrogen bonds in liquid water (about a picosecond).

However, it is not to be assumed that the mesoscale structure of water must change on the same time scale. For example, in ice, hydrogen bonds are also very short-lived, but an ice sculpture can ‘remember’ its shape over extended periods. (Here our essay assumes a suitably seasonal feel - Editor.) On a smaller scale, cation hydrates are commonly described with a particular structure (for example, the octahedral Na+(H2O)6 ion) even though the individual water molecules making up such structures have very brief residence times (measured in microseconds).

Such arguments ignore the fact that the behaviour of a large population of water molecules may be retained even if that of individual molecules is constantly changing, just as a wave can cross an ocean, remaining a wave although its molecular content is continuously changing.

Evidence denying the long life of water clusters is mostly based on computer simulations but these cover only nanoseconds of simulated time. Such short periods are insufficient to show longer temporal relationships, for example those produced by oscillating reactions. They also involve relatively few water molecules and small (nanometre) dimensions, insufficient to show mesoscale (micron) effects. In short, they use models of the water molecule whose predictions correspond poorly to the real properties of water.

Certain 'memory' effects in water are well established and uncontroversial: for instance, the formation of clathrate hydrates from aqueous solutions, whereby previously frozen clathrates within the solution, when subsequently melted, predispose the solution to more rapid clathrate formation later. This is explained by the presence of nanobubbles, extended chain silicates, or induced clathrate initiators.

Can a homeopathic remedy work if it contains none of the original curative substance?

John Dalton (1766 - 1844) was able to estimate relative atomic masses of various molecules, the molecule being the smallest unit that a chemical can exist in without losing its identity. His values were soon improved upon by Amedeo Avogadro (1776 - 1856), in 1811. Avogadro made the very important proposal that the volume of a gas (strictly, of an ideal gas) is proportional to the number of atoms or molecules that are present. Hence, the relative molecular mass of a gas can be calculated from the mass of a sample of known volume. But neither Avogadro nor Dalton knew how many molecules there were in a given mass of a substance.  This is historically significant because it means that, although Hahnemann realised that there was a limit to the dilutions that could be used, he had no way of knowing what that limit was. A historical curiosity - or confirmation of the importance of the homeopathic principle? - is the fact that Darwin tested out ultrahigh dilutions on carnivorous plants. In Insectivorous Plants (1875) he writes:
‘The reader will best realize this degree of dilution by remembering that 5,000 ounces would more than fill a thirty-one gallon cask [barrel]; and that to this large body of water one grain of the salt was added; only half a drachm, or thirty minims, of the solution being poured over a leaf. Yet this amount sufficed to cause the inflection of almost every tentacle, and often the blade of the leaf. … My results were for a long time incredible, even to myself, and I anxiously sought for every source of error. … The observations were repeated during several years. Two of my sons, who were as incredulous as myself, compared several lots of leaves simultaneously immersed in the weaker solutions and in water, and declared that there could be no doubt about the difference in their appearance. … In fact every time that we perceive an odor, we have evidence that infinitely smaller particles act on our nerves.’
But we have to be careful; homeopathy was not the declared, explicit subject of this text, although it may have been an underlying riddle for Darwin (we know that he visited a homeopath, out of despair about his condition, and felt better afterwards).

In any case, in the Sixth edition of Hahnemann's Organon, which is the ‘Bible’ for practising homeopaths, Hahnemann explicitly moves beyond ‘physical’ cause and effect into the mystical world of mesmerism - or healing by the mystical agency of the so-called vital force (popular at the time and perhaps similar to the notion of chi in Chinese medicine):
‘I find it necessary to allude here to animal magnetism, as it is termed, or rather mesmerism (as it should be called, out of gratitude to Mesmer, its first founder), which differs so much in its nature from all other therapeutic agents. 
 
This curative power, often so stupidly denied, which streams upon a patient by the contact of a well-intentioned person powerfully exerting his will, either acts homoeopathically, by the production of symptoms similar to those of the diseased state to be cured; and for this purpose a single pass made, without much exertion of the will, with the palms of the hands not too slowly from the top of the head downwards over the body to the tips of the toes, is serviceable in, for instance, uterine haemorrhages, even in the last stage when death seems approaching; or it is useful by distributing the vital force uniformly throughout the organism, when it is in abnormal excess in one part and deficient in other parts, for example, in rush of blood to the head and sleepless, anxious restlessness of weakly persons, etc., by means of a similar, single, but somewhat stronger pass; or for the immediate communication and restoration of the vital force to some one weakened part or to the whole organism, - an object that cannot be attained so certainly and with so little interference with the other medicinal treatment by any other agent besides mesmerism.’
According to the German newspaper Bild, a seventh edition of the Organon was recently unearthed in his native Germany, and this reveals that the doctor had continued his work on replacing dilutions with mesmerism and had completed experiments on the resuscitation of dead dogs. Alas, as the newspaper puts it, ‘He died shortly afterwards.’

The bottom line is that homeopathic dilution has not been shown to work, but nor yet has it been shown to be impossible. Some will say ‘well, you cannot prove a negative’, which may be true, but clearly the history of science is one of things that people rejected as impossible becoming accepted in the light of new and more sophisticated understandings. The same could yet be said for the mystery of homeopathic dilution.

22 May 2016

Revisiting Scientific Revolutions

Posted by Thomas Scarborough 
Thomas Kuhn was wrong.  He failed to understand the dynamics of scientific revolutions. Far from such revolutions occurring through an accumulation of evidence – until, so to speak, the dam bursts – they fail to occur until such time as the constraints of science – namely, those of the scientific method – have been weakened.  I shall explain.
In recent generations, we have witnessed a rising awareness of an inter-connected world, and cosmos. One of the results of modern science in particular is the perception that 'everything is related to everything else'.  Yet, paradoxically, at the same time we find that science requires the very opposite of openness to the totality of things, to survive and to thrive. For science to advance, there is the need for scientific control on the one hand, and a strictly normed language on the other. In the words of Wilhelm Kamlah and Paul Lorenzen, science must 'screen things out'. This applies to all four phases of the scientific method: characterisations, hypotheses, predictions, and experiments.

There is something equally true about science, to which we typically do not pay nearly as much attention. If the scientific method should exert any influence on those potential influences which it excludes, then scientific control is compromised. For instance, if in seeking to establish how much energy is required to convert a kilogram of ice into steam, I find that I am warming the laboratory at the same time, then the procedure is fundamentally flawed. Energy is being lost. We therefore require what I shall call a 'double isolation' in science. Not only does science screen things out, but it needs to screen itself out from its environment.
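For concreteness, here is the ice-to-steam figure worked out under the assumption of perfect isolation, using standard handbook values; any heat that escapes into the laboratory is simply missing from this account:

```python
# Textbook energy to turn 1 kg of ice at 0 deg C into steam at
# 100 deg C, under perfect isolation (standard handbook values).
m = 1.0            # kg of ice
L_fusion = 334.0   # kJ/kg, melting ice at 0 deg C
c_water = 4.186    # kJ/(kg K), heating liquid water
dT = 100.0         # K, from 0 to 100 deg C
L_vapour = 2257.0  # kJ/kg, boiling water at 100 deg C

q = m * (L_fusion + c_water * dT + L_vapour)
print(f"{q:.0f} kJ")   # about 3010 kJ -- in the ideal case only;
                       # heat lost to the room never enters the sum
```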

This 'double isolation' has led historically to two major problems:

Firstly, science, having screened itself out from the world, ultimately needs to 're-enter' the world. After the final, experimental stage of the scientific method, with the artificial conditions of the laboratory removed, science begins again to have an effect on the world. Yet little thought is given to what happens at this point. Science, when it re-enters the world, typically goes beyond anything that was formally taken into account in the scientific method. The disasters which have here occurred have led various thinkers to suppose that science is responsible for the ruination of our world. Stephen Hawking puts it simply: science may score an own goal.

Secondly, the isolation of science from the world has resulted in confusion as to how science really advances. The orthodox view is that science advances by and large through an inductive process: by making broad generalisations from specific observations. Yet consider that those specific observations have already followed the procedure of 'screening things out'. That is, such science has already minimised the effects of variables. It has excluded a great many possible relations in order to trace the relations which it does. There is a limit, therefore, to what can be achieved with previous scientific observations, as far as the tracing of relations is concerned.

Not only this. Experience tells us that scientific conjectures are not adequately explained by an inductive method. Here is an example from my own experience, dating from 2004. In that year I came up with a new principle for metal detecting, called coil coupled operation (CCO).  I was already familiar with the transformer coupled oscillator. This is governed by theory which, even in its full complexity, has little or no interest in outside influences on the oscillator. Now consider that such outside influences could include coins beneath the soil, which may change the frequency of the oscillator. In order to turn this into a metal detector, my mind needed to leap outside of the theory, to discover a principle which rested precisely on those influences which the present theory excluded.
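The underlying relationship is easy to sketch. The following is a minimal illustration in Python, assuming a simple LC tank oscillator; the component values and the 0.5% inductance shift are made-up numbers for the sketch, not those of the actual CCO design:

```python
# A toy LC tank oscillator's sensitivity to a nearby metal target.
# All values are illustrative assumptions.
import math

def resonant_frequency(L, C):
    """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

L_coil = 350e-6    # search-coil inductance, henries (assumed)
C_tank = 10e-9     # tank capacitance, farads (assumed)

f_free = resonant_frequency(L_coil, C_tank)

# A conductive or ferrous object beneath the coil perturbs its
# effective inductance -- precisely the 'outside influence' that
# standard oscillator theory screens out. Suppose a 0.5% shift:
f_target = resonant_frequency(L_coil * 1.005, C_tank)

print(f"free-air frequency: {f_free:.0f} Hz")
print(f"over a target:      {f_target:.0f} Hz")
print(f"shift:              {f_free - f_target:.0f} Hz")
```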

Science, therefore, would seem to require not only the inductive method, but something far larger – namely intuition. Albert Einstein wrote, 'Knowledge is limited to all we now know and understand, while imagination embraces the entire world, and all there ever will be to know and understand.' This has important implications for the scientific method. The inductive method should be taught only as one possible means of doing science – and probably not the best way. Rather, the emphasis should be on more frenetic and imaginative thinking. This is borne out, among other things, by the fact that many scientists of note were inter-disciplinary or multi-disciplinary in their pursuits – among them Archimedes, Leonardo da Vinci, and Albert Einstein.

On the other hand, science should take account not only of the individual mechanisms which are isolated in controlled experiments. It should deliberately keep track of those mechanisms which are excluded from such control. These may potentially be infinite – yet it is crucial that there be an attempt to list them. No experiment is truly complete until this has been done, and no experimenter has been truly responsible without it. Inconsistently, today, some of our scientific pursuits are systematically regulated and supervised after the final, experimental stage of the scientific method – most notably in the areas of food and drugs – while vast areas remain ill-considered. The scientific method, far from being closed after four stages, should be an open-ended process.

This is intimately connected with the philosophy of scientific revolutions. In the process of 'screening things out', scientists' thinking is constrained. Yet a paradigm shift requires an eye for the wider canvas of relations. Therefore science, through the very scientific method, works to prevent paradigm shifts. However, as a science advances, the need for scientists to 'screen things out' becomes weaker. The work of scientific control has been done, and the ability to think creatively becomes stronger. Rather than paradigm shifts occurring through an accumulation of evidence, they occur where the scientific method is weakened – like a housewife, perhaps, who after kneading her bread, looks up to see the sun rise.

16 August 2015

Special Investigation: Gene Therapy and the Origins of Life

The process of natural selection and survival of the fittest lies at the surface of the great molecular chronicle of gene therapy. This investigation argues that the approach will find great use in the near future - as long as attention is paid to the very spirit of its conceptualisation.

A Special Pi Investigation into the Biochemical Mechanisms involved in Origins and the Evolution of Life - centred on the role of Gene Shuffling.

by Muneeb Faiq and PI editors
Is gene therapy - or gene shuffling, as we might alternatively call it - a product of human genius, or a traditional method employed by evolution for the last 3.2 billion years in order to give rise to all the forms of life that the planet earth has seen?

Indisputably, this is a very important question, one which has escaped the attention of theoretical biologists for almost four decades (since gene therapy was conceptualised), and there seems to be almost no literature available on it. Instead, there is a general tendency to think that gene therapy is a very recent phenomenon, devised by the human mind to achieve the desired functioning of a gene, and consequently of an organism.

That notion is correct in its own right, but when you look at it with a little scrutiny, you are drawn to the conclusion that gene therapy has been the modus operandi of the process of evolution for billions of years, and that it is the process of gene therapy (or gene manipulation, for that matter) that has brought about the variety and complexity of life that we witness today.

This philosophical investigation will oppose the seemingly self-evident notion that the best survives (which begs the question of what 'best' means) by emphasising that it is the shuffling, the complexity, of gene manipulations that is the real engine of evolution.




Gene therapy lies at the crux of evolutionary mechanisms, and the manipulation of DNA (deoxyribonucleic acid, the hereditary material in humans and almost all other organisms) has achieved the target of producing millions of different species of plants and animals. But before discussing this further, it is imperative to define the terms first, so that it becomes easy to understand the arguments put forth in this investigation. Here is how we go, starting with what is known as the Central Dogma of Molecular Biology.


First, the most important thing to explain is that, broadly speaking (there are exceptions), DNA makes up the genetic material. DNA synthesis from DNA (to be passed on to progeny) is called Replication. RNA synthesis from DNA is called Transcription and protein synthesis from RNA is called Translation. This is all that we need to bear in mind in order to make sense of what we are about to discuss.
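As a toy illustration of these three processes (the nine-base 'gene' and the three-codon excerpt of the genetic code below are invented for the example; the real code has 64 codons):

```python
# Replication, Transcription and Translation on a made-up 9-base gene.
DNA = "TACCACGTG"                      # template strand (hypothetical)

# Replication: DNA -> DNA, via base pairing (A-T, G-C)
PAIR_DNA = {"A": "T", "T": "A", "G": "C", "C": "G"}
replicated = "".join(PAIR_DNA[b] for b in DNA)

# Transcription: DNA -> RNA (RNA uses uracil, U, in place of T)
PAIR_RNA = {"A": "U", "T": "A", "G": "C", "C": "G"}
mrna = "".join(PAIR_RNA[b] for b in DNA)

# Translation: RNA -> protein, read three bases (one codon) at a time
CODONS = {"AUG": "Met", "GUG": "Val", "CAC": "His"}   # tiny excerpt
protein = "-".join(CODONS[mrna[i:i+3]] for i in range(0, len(mrna), 3))

print(replicated)   # ATGGTGCAC
print(mrna)         # AUGGUGCAC
print(protein)      # Met-Val-His
```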

 

UNDERSTANDING THE CELL



To perform any function, the ultimate goal of a cell is to produce a protein which will bring about the requisite purpose. Proteins are the workhorses of the cell, while DNA and RNA molecules are just information-carrying molecules. DNA carries information across generations, while RNA carries information from DNA to the protein synthesis machinery of cells. Proteins form part of the structures of cells; they are enzymes; they play a role in most biochemical transformations; they are signalling molecules; and they carry signals to various parts of the cell, as well as to the various cells of the body of a multicellular organism (multicellular simply means an organism comprised of more than one cell).

Proteins also manipulate DNA to produce copies of it, bringing about the process called Replication. They also regulate which part of the DNA (i.e. which gene) is to be expressed in a particular cell type, by forming euchromatic and heterochromatic regions. Proteins also help cells to react to various environmental conditions, by carrying information to the DNA and then leading to the synthesis of relevant proteins to bring about the pertinent function. The expression of genes (simply understood as stretches of DNA) is also regulated by proteins (as transcription factors etc.).

So, in simple terms, life can be understood as the chemical hodgepodge, pot-pourri and conglomeration of these three types of molecules (viz. DNA, RNA and proteins), one managing the synthesis and function of the other. This intricate fabric of interconnected and interdependent chemical processes (brought about by these three types of molecules) is essentially life. All cells and multicellular organisms (no exceptions) are the extreme expression of the biochemical transformations brought about by these three important molecule types.

GENE MANIPULATIONS AND EVOLUTION


Now, what is gene therapy? This is an easy question, and definitions have been well developed for this process. Before going into that, let us first discuss what genes do and where gene therapy comes from. Living systems have been around on earth for no less than 3.2 billion years, with simpler forms of life evolving first and evolution then bringing about more and more complex forms. We don’t need to get confused about what is simple and what is complex. The distinction is conventional. (There is no need for a philosophical metaphysics of this.)

The earliest living organisms were single-celled and did not contain any internal organelles. (An organelle is a minuscule organ inside certain cells, called Eukaryotic cells.) The DNA of such organisms (recall that life is believed to have started with RNA as the first genetic material) is not bound by histones (proteins that bind to DNA and help in its packing inside the cell and its regulation). So we understand that the earliest living forms were simple and capable of a limited number of functions. Assuming that their genetic material was RNA, DNA gradually took over owing to its greater stability. (Even today, RNA serves as genetic material in certain viruses, but otherwise all living cells have DNA as genetic material.) DNA contains genes, and genes contain bits of information. The collection of all the genes (the DNA) decides every aspect (structure, function, behaviour, lifespan, shape etc.) of a cell. So everything a cell does is in fact an omnibus of the dictums of its genes embedded in the DNA molecule.

You may be wondering how this fits into the definitions of gene therapy. But we were, till now, just discussing the background of the concept; it is now easy to understand gene therapy by definition itself. If any cell has any kind of defect in one of its genes (or more than one gene), the function related to that gene will be hampered which, as a consequence, will jeopardise the life of the cell. This is what happens in many diseases like phenylketonuria, cystic fibrosis et cetera. In these diseases one of the genes is defective, which disturbs the whole cascade of cellular functions and precipitates conditions of metabolic stress inside the cells, thereby generating many metabolic defects (and even death, in certain lethal gene defects).

If, however, we have a method by which we can transfer a correct copy of the gene to the DNA of the cells (in place of the defective gene), we will overcome the problem of the lack of function due to the defect in the gene in question. This technological method is called gene therapy. So gene therapy can be thought of as the therapeutic use of genetic molecules (genes and RNAs) in order to set right and/or fine-tune the functioning of a living cell. We could, alternatively, define gene therapy as “the therapeutic use of DNA as a pharmaceutical agent”. This can be brought about by a variety of methods, like the use of viruses, plasmids etc., but I don’t intend to go into the details here, because our focus in this investigation is different.

POPULAR CULTURE


Historically speaking, it was in a classic paper by Friedmann and Roblin, titled 'Gene Therapy for Human Genetic Disease?' and published in Science in 1972, that this technique was conceptualised. But it was not until 1990 that a Food and Drug Administration-approved trial was carried out, for a disease called ADA-SCID. As of now, almost 1700 clinical trials of gene therapy have been conducted worldwide, using various methods of gene transfer (e.g. adenoviruses, lentiviruses, herpes simplex virus, vaccinia, pox virus, injection of naked DNA, electroporation, sonoporation, magnetofection, lipoplexes, nanoparticles, gene guns etc.). Gene therapy has also been a great topic for science fiction writers and Hollywood, appearing in the TV series 'Dark Angel', the video game 'Metal Gear Solid', the James Bond movie Die Another Day, the novel Next, and so on. The technique has seen considerable success in the last two decades, with alternating periods of ups and downs. But recently the world has once again seen great hope in this therapeutic domain of technological advancement of the human species.

GENE THERAPY: THE HEART OF EVOLUTION


As far as the current literature goes, gene therapy is viewed as a purely human development (artificial in its essence) which has nothing to do with natural processes. Gene therapy has always been thought of as the technological exploitation of some properties of genes and gene vectors, without realising that it might have a much deeper connection with life and its evolution and maintenance. No scientist (so far as the current status of the available literature is concerned) views gene therapy as a process that has been going on in nature for billions of years. Gene therapy is rather thought of as a very recent concept, restricted to the last four decades and involving a few human technological interventions to manipulate DNA.

It is important to understand that this is not the case. Gene therapy is the means nature has been exercising since the inception of life for the development and evolution of life. If a little thought is applied to the biochemical mechanism of evolution, one easily understands that evolution is a process of piecemeal improvement and modulation of the genes of an organism. This process initially took place through gene transfer, and then gene modification (via mutations and polymorphisms). The development of complex forms of life from simpler ones is just an outcome of the process of natural gene therapy. Though it may not be viewed as therapy per se, improvement of gene function in order to cope with environmental challenges is also a therapy, and that is what evolution has been doing for billions of years.

The hallmark of evolution is to lead to the formation of organisms better suited to the environments they live in. The development of a carnivorous set of teeth in flesh-eating animals, flat teeth in herbivores, the development of wings in birds, cellulose-digesting enzymes in grass-eating animals, taste buds on the tongue, the placement of most sense organs on the head, and modes of excretion are all decided and developed by the process of evolution.
How does evolution do all that? This is also a very important question, and we need to address it in detail. First, recall the initial discussion about the function of the three basic molecules of life (DNA, RNA and proteins), where it is explained that almost all cellular processes are carried out by proteins, which are directed and dictated ultimately by DNA. So if any process is to be enhanced, improved, modulated, upregulated, downregulated and so on, all that is needed is to alter the genes in the DNA. This could be done at the protein level, but proteins are not transmitted to the progeny. It makes much more sense to introduce changes in the DNA, in order to make the change in function (or any other aspect of the cell) manifest in coming progeny. This is simply what evolution does. Evolution works by altering DNA sequences (i.e. genes) in order to modulate and improve certain functions of an organism (both unicellular and multicellular). Scientists have been using various methods of changing DNA sequences (as mentioned above) in order to bring about a desired function in an organism, and they have learnt to introduce genes into the DNA that replace the malfunctioning genes of the organism. This is what gene therapy is, and that is what evolution also does (more or less).
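The parallel can even be caricatured in a few lines of code — a toy mutation-and-selection loop in which 'improving a function' is nothing but repeated gene modification (the target sequence, mutation rate and population size are arbitrary assumptions for illustration):

```python
# Toy mutation-selection loop -- all parameters are invented.
import random

TARGET = "GATTACA"                 # the 'well-adapted' gene (hypothetical)
BASES = "ACGT"

def fitness(gene: str) -> int:
    """How many positions already match the environment's demands."""
    return sum(g == t for g, t in zip(gene, TARGET))

def mutate(gene: str, rate: float = 0.1) -> str:
    """Copy the gene, occasionally altering a base -- the natural
    'gene modification' step."""
    return "".join(random.choice(BASES) if random.random() < rate else b
                   for b in gene)

population = ["AAAAAAA"] * 50      # identical, poorly adapted ancestors
for generation in range(100):
    # offspring are mutated copies; selection keeps the fittest half
    offspring = [mutate(g) for g in population]
    population = sorted(population + offspring, key=fitness)[-50:]

print(max(population, key=fitness))   # typically converges on GATTACA
```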

A PubMed search for the term 'gene therapy' (PubMed is the largest database of references and abstracts on life sciences and biomedical topics) returned 139950 entries on 9th August, 2012. In spite of so large a literature, the connection of gene therapy with the natural process of evolution has not been explored. There are certain reports which discuss gene therapy and the process of evolution, but they don’t tend to see the similarities between the two processes. These reports rather consider the effects of conventional gene therapy (a scientific development) on the process of evolution, thereby completely ignoring the fact that conventional gene therapy (as a kind of process) could well be a part of the evolutionary mechanism used by nature to produce the splendour of life.

Let us now come out of the complexities of these terms of molecular biology, and put it in very simple words. Evolution is gene modification in order to improve function (a natural process), and gene therapy is gene modification in order to improve function (a scientific method). From this we arrive at a surprising but strong conclusion: gene therapy, though a recently conceptualised technique, is a very important concept exploited by nature in order to drive evolution for no less than 3.2 billion years – and exploited by scientists in order to treat diseases.

At this point, it is necessary to justify the above hypothesis of this investigation with examples. We will consider a few biological processes, ranging from the origins of multicellularity, the development of complex forms of life, and the development of cellular organelles, to cellular processes like reproduction and recombination.

GENE THERAPY AND THE ORIGIN OF LIFE


The origin of life itself was (presumably) a process of gene therapy (this may or may not be true; it is just a presumption). Theoretically speaking, primitive cells are minuscule bags of chemicals containing DNA to manipulate those chemicals. So if a bag of such chemicals (what we now refer to as cytoplasm/protoplasm) is subjected to a process of gene therapy (i.e. supplied with a full-stretch DNA molecule), in principle this makes the bag of chemicals alive, and it starts behaving as a full-fledged living cell. So a simple process of gene therapy may presumably have brought life into existence in the first place. It should be noted that this concept is not as impractical as it seems to be.

A recent technique, Somatic Cell Nuclear Transfer (SCNT), exploits this phenomenon (more or less) to create stem cells or to bring fossil DNA molecules to life. SCNT is the technique that was used to create the world’s most famous animal (Dolly the sheep). This discussion gives impetus to the idea that the creation of life by putting DNA (or RNA) into a small chemical bag could very well have led to the formation of primitive forms of life, and that the whole process of evolution thereafter would have been the improvement and alteration of this DNA molecule inside this chemical bag (think of a primitive cell). Now that the concept of gene therapy has helped us to hypothesise how life could presumably have originated, we will see how evolution brought about its great orchestra of life using this beautiful process.

 

GENE THERAPY AND THE EVOLUTION OF MITOCHONDRIA AND CHLOROPLASTS


The development of eukaryotic cells (that is, cells with internal organelles, a nucleus bound in a membrane etc.) emerged early in evolution, about 1.6 to 2.7 billion years ago. Before that, cells were relatively simple. Eukaryotic cells are thought to have evolved from simpler forms (Prokaryotic cells). Eukaryotic cells have their own small organs (organelles) to perform various functions. One of the most important organelles is the energy-providing Mitochondrion (plural mitochondria). Mitochondria evolved by a process called endosymbiosis.

In this process, it is thought, one cell ate (engulfed) a prokaryotic cell; but instead of digesting it, used it as a servant inside its body. This was good for the servant cell too, because the servant got readily available food and shelter, and provided energy supplies to its master. So the relation turned out to be symbiotic rather than a strict master-servant relation. With the passage of time this servant cell lost some of its functions (efficiently carried out by the master cell for itself and the servant) and vice versa. Utilisation of oxygen (oxidative phosphorylation) became the duty of the servant (the cell-like creature created out of the servant cell with the passage of time), which retained the genes necessary for its replication and for oxidative phosphorylation, these being turned off in the master cell. A symbiosis was therefore established. The servant cell became the mitochondrion, and the master cell a more complex eukaryotic cell.

This process describing the emergence and evolution of mitochondria is also a typical gene therapy process. The small cell that was engulfed can be thought of as a vector with certain genes capable of doing some specialised functions (oxidative phosphorylation in this case). This gene transfer enhanced the oxidative respiration capacity of the host cell: this is simply what can be referred to as gene therapy. It was a process of transfer of genes (via a bag of chemicals containing a DNA stretch) into a cell in order to improve its bioenergetics. If a little thought is applied in this domain, one is compelled to think that mitochondria are the outcome of a process of gene therapy. The same is the case with chloroplasts (organelles for photosynthesis, created out of cyanobacteria - the blue-green algae), which also emerged by the same mechanism and led to the evolution of all the plants on earth.
So gene therapy has paved the way for the emergence and evolution of all animals and plants on earth, and yet scientists still think of gene therapy as a stroke of scientific genius a mere four decades of age.

GENE THERAPY AND EVOLUTION OF MULTICELLULARITY


The development of multicellularity is a very important event in evolution. More than one type of cell in an organism is particularly useful, because it makes every cell well suited to its functions (through division of labour). Sensory neurons in our body carry signals from the peripheral areas to the brain, retinal neurons carry the codes of photons to the brain, beta cells of the pancreas secrete insulin, cells on the intestinal villi absorb nutrients, hepatocytes (liver cells) produce bile etc. Every cell in a multicellular organism has its special functions. One cell type can do its own function at a significantly amplified pace and efficiency, but is not able to do many of the other functions. So there are different cells meant for different functions in a multicellular organism.

All these highly specialised, improved and efficiently executed functions are only possible in a multicellular organism if certain genes are upregulated, some downregulated and others switched off. This means that certain functions need to be improved and others downregulated, while a few others are completely switched off, depending on the cell type. Nature has methods for that. The DNA is packed in the chromosomes, and the packing pattern of the DNA decides the expression of particular genes in a particular cell type (though there are other ways also). Seen this way, this process is also gene therapy by definition. It is the regulation, modification and modulation of gene function.

 

GENE THERAPY AND REPRODUCTION


Let us now turn our attention towards reproduction. The most important process that takes place in the physiology and biochemistry of reproduction is fertilisation. In this process, the ovum (egg) waits for the sperm, and the sperm swims in the genital tract, reaches the egg and then pushes its genetic material into the egg. This initiates a saga of processes of life, starting from the formation of a zygote, followed by the formation of the morula, then the blastula, and moving through many embryonic stages to the birth of a new organism.

It is worth noting that before the sperm transfers the DNA into the ovum (egg), no process of embryonic development initiates and no new organism is formed. Once the sperm transfers its DNA to the ovum, the biochemical narrative of life starts, leading to the development of a new organism. What the sperm does is just the transfer of DNA into the ovum, and the start of all the life processes. Let us wait a little and think for ourselves. Isn’t this gene therapy of the ovum? At least in a certain sense, it is what leads to the embryogenesis and development of a new organism. It is worth mentioning that the sperm does not transfer anything other than DNA to the ovum (not even mitochondria or cytoplasm). So the process of fertilisation is just meant to transfer DNA to the ovum, and from there all the processes of life start, the consequence of which is all the variety of plants and animals around us. This is what evolutionary gene therapy has done for us. Moreover, the events referred to as recombination events in cellular division (meiosis) could also be described as some sort of gene therapy.

AND THE MORAL IS?


The process of natural selection and survival of the fittest lies at the surface of the great molecular chronicle of gene therapy. This investigation may find great use in the near future, if attention is paid to the very spirit of its conceptualisation. Gene therapy could be improved upon, and the technique could find new horizons inspired by evolution (biomimicry). I believe that gene therapy has much greater potential than is currently being spoken of. I would go so far as to say that gene therapy is the hope of future medicine. But right now, scientists need to acknowledge that it is a natural process which can and should be utilised - in a considered and controlled manner - to bring about many of those long-sought dreams of therapeutic medicine.


This is an edited version of an essay  by Muneeb Faiq (adapted for Pi), who is ICMR Senior Research Fellow at All India Institute of Medical Sciences, New Delhi.

10 May 2015

What is a philosophical problem? The irrefutable metahypothesis

By Matthew Blakeway

If we ban speculation about metahypotheses, does philosophical debate simply evaporate? 



Karl Popper explained how scientific knowledge grows in his book Conjectures and Refutations. A conjecture is a guess as to an explanation of a phenomenon. And an experiment is an attempt to refute a conjecture. Experiments can never prove a conjecture correct, but if successive experiments fail to refute it, then gradually it becomes accepted by scientists that the conjecture is the best available explanation. It is then a scientific theory. Scientists don’t like the word “conjecture” because it implies that it is merely a guess. They prefer the word “hypothesis”. Popper’s rule is that, for a hypothesis to be considered scientific, it must be empirically falsifiable.

When scientists consider a phenomenon that is truly mystifying, it seems reasonable to ask “what might a hypothesis for this look like?” At this point, scientists are hypothesising about hypotheses. Metahypothetical thinking is the first step in any scientific journey. When this produces no results, frustration gets the upper hand and they pursue the following line of reasoning: “the phenomenon is an effect, and must have a cause. But since we don’t know what that cause is, let’s give it a name ‘X’ and then speculate about its properties.” A metahypothesis is now presumed to be 'A Thing', rather than merely an idea about an idea.

The problem is the irrefutability of its existence.
X is a metahypothetical idea, and until we have a hypothesis, we don’t actually know what we are supposed to be refuting. Popper would say that it wasn’t scientific, yet it sprang from a scientific speculation. There is a false impression of truth that actually derives from the misrepresentation of an axiom. “X is a thing” actually means “‘X’ is a name we have given to an idea where we don’t even know what the idea represents”, and the confusion between idea and thing is born. A false logical conclusion arises, not from truth, but because incoherent statements are irrefutable by their nature.

We can trace this through the history of philosophy. Most of it can be reduced to the following two questions:

• “What is X?” and
• “Does X exist?”

- where “X” is a metahypothetical idea that sprang from a scientist speculating about a cause of an unexplained phenomenon. The “X” could represent: God, evil, freewill, the soul, knowledge, etc. Each of these is a metahypothesis that originated with a scientist seeking to explain respectively: the existence of the universe, destructive actions by humans, seemingly random actions by humans, human actions that no one else can understand, human understanding.

The question “what is knowledge?” led to thousands of years of debate that ended when everybody lost interest in it. And I'm sure that the questions “what is freewill?” and “do humans have it?” are currently going through their death throes – again after a thousand years of debate. Or take the statement: “Evil people perform evil actions because they are evil.” If you are reading this blog, you will recognise that as so incoherent that it is barely a sentence, yet the individual components of it frequently pass as explanation for human actions that we don’t like. The idea of “evil” being some sort of thing is irrefutable despite being meaningless. What is there here to refute?

The sheer persistence of any proposition concerning a metahypothesis represented as 'A Thing' is illustrated by a recent real debate. The British actor Stephen Fry gave an interview on Irish television in which he argued that if God exists, then he is a maniacal bastard. [To paraphrase!]

Yes, the world is very splendid but it also has in it insects whose whole lifecycle is to burrow into the eyes of children and make them blind. They eat outwards from the eyes. Why? Why did you do that to us? You could easily have made a creation in which that didn’t exist.

Giles Fraser, a Christian, responded with an article “I don’t believe in the God that Stephen Fry doesn’t believe in either.”

If we are imagining a God whose only power, indeed whose only existence, is love itself – and yes, this means we will have to think metaphorically about a lot of the Bible – then God cannot stand accused as the cause of humanity’s suffering.

I expect that you are positively itching to take a side in this debate. But resist the urge! Instead, imagine that you are a Martian gazing down at the tragic poverty of the debates of Earth people. Fry is taking a literal interpretation of God, and thereby converting a metahypothesis into a hypothesis – but he is doing this purely with the intention of refuting it. Deliberately establishing a hypothesis you believe to be false is a good debating tactic, but a dishonest one.

Fraser responds by taking the literal interpretation and passing it back into the metahypothetical – an equally dishonest tactic, of making a debate unwinnable by undefining its terms. It's like stopping the other team winning at football by hiding the ball. The effect of debates like this is to create a stasis in which the word "God" is suspended between meaning and incoherence. If it is given a robust definition, it becomes a hypothesis and is empirically refutable. And since its origins lie in our inability to explain phenomena (the origin of the universe, life, etc.) for which we now have decent scientific explanations, it is pretty certain that it would indeed be refuted. But if the idea is completely incoherent, then it isn't possible to talk about it at all. So the word exists – fluidly semi-defined – in the mid-zone between these two states. The concept "God" is an idea about an idea about a cause of unexplained phenomena. It is therefore itself unexplainable.

We can examine the birth of a metahypothesis in real time. Richard Dawkins asked in The Selfish Gene what caused cultural elements to replicate. He speculated that it needed a replicator like a gene:

But do we have to go to distant worlds to find other kinds of replicator and other, consequent, kinds of evolution? I think that a new kind of replicator has recently emerged on this very planet. It is staring us in the face. It is still in its infancy, still drifting clumsily about in its primeval soup, but already it is achieving evolutionary change at a rate that leaves the old gene panting far behind.

The new soup is the soup of human culture. We need a name for the new replicator, a noun that conveys the idea of a unit of cultural transmission, or a unit of imitation. ‘Mimeme’ comes from a suitable Greek root, but I want a monosyllable that sounds a bit like ‘gene’. I hope my classicist friends will forgive me if I abbreviate mimeme to meme.

An effect needs a cause. And since we don’t know what that cause is, let us give it a name and then speculate as to what its properties must be. It is beyond funny that the world’s most famous atheist is here caught employing the same method of reasoning that gave birth to the idea of “God”. We will now debate for a thousand years whether memes exist or not. However, the idea is incoherent despite sounding convincingly sciencey. The idea of the “soul” sounded pretty sciencey in Aristotle’s day. Dawkins speculates that the idea of God is a meme, but he fails to notice that the idea of a meme is a meme, and therefore he is trying to lift himself off the floor by his bootstraps.

So... if we ban speculation about metahypotheses, does philosophical debate simply evaporate? Maybe! But it would probably also stop scientific progress in its tracks. If you are in the mood for a brain spin, you might consider whether the idea of a “metahypothesis” is itself a metahypothesis.

Taking this further, if we cannot hypothesise about hypotheses, then does science evaporate too?

06 April 2015

Wikipedia on Climate Change

How has the World's largest encyclopaedia been covering the Climate Change debate?

[Image: wpwarm.jpg] A typical Wikipedia 'smorgasbord' of pseudo-facts. The alarming red-hot globe, for example, is based not so much on temperature data as on computer 'filling in' of data – notably in the Arctic and Antarctic regions. This technique is so obviously unsatisfactory that no reputable climate statistician accepts it. And the IPCC itself, although used as the source, correctly calls the various scenarios 'projections', not 'predictions'. Wikipedians, like politicians, don't know the difference! (See notes)

A Philosophical Investigation by Martin Cohen

February 23 2010

Being a Classic post 'reposted' from Pi-alpha


Put 'Global Warming' into Google, let alone Wikipedia, and you will be offered, as 'settled fact', the following: 

Global warming is the increase in the average temperature of Earth's near-surface air and oceans since the mid-20th century and its projected continuation.... An increase in global temperature will cause sea levels to rise and will change the amount and pattern of precipitation, probably including expansion of subtropical deserts.[8]… Other likely effects include changes in the frequency and intensity of extreme weather events, species extinctions, and changes in agricultural yields. Warming and related changes will vary from region to region around the globe, though the nature of these regional variations are uncertain.

That is because you will be directed to Wikipedia (note 1). The Wikipedia page goes on to predict glacial retreat and Arctic shrinkage, including the long-term shrinkage of the Greenland ice sheet. Ocean acidification will lead to the extinction of between 18% and 35% of animal and plant species by 2050. Horrifying predictions of temperature rises are given in graphics, with a note that not all effects of global warming are accurately predicted by the climate models used by the IPCC. Aha! A small concession to the sceptics? Not at all; the encyclopaedia merely wants us to worry more because, "For example, observed Arctic shrinkage has been faster than that predicted."

This is, of course, the 'full throttle' version of the theory of man-made global warming, as advanced by certain scientists and green groups. (Apart from the highly politicised IPCC summaries, written by activists including government representatives with the aim of directing political policies, the sources are, variously, RealClimate.org, James Hansen at the Goddard Institute, and so on. That is three names for essentially the same outfit.)

In general, Wikipedia reprints the IPCC notes for policy makers, produced by its political steering committees, as a kind of holy writ. Actually, to say something is the 'view of the IPCC' is a shorthand, because many of the past and present authors of the IPCC reports do NOT agree with particular claims. Naturally, given their origins, the reports consist of endless weasel words and hair-splitting distinctions between degrees of plausibility: 'very likely' to happen, 'quite likely', 'likely'. None of these complications bog down Wikipedia, where the science is all very straightforward and unremittingly alarmist. To confirm its accuracy, the Global Warming page boasts a gold star, meaning it has been approved by the Wikipedia system as one of the best, the most objective and the most encyclopaedic.

Well down the page, long after most people have stopped reading, below the scary graphs and charts, is the heading "Debate and skepticism". But this debate is confined to 'how to combat Global Warming' and calculating the benefits of limiting industrial emissions of greenhouse gases against costs. "Using economic incentives, alternative and renewable energy have been promoted to reduce emissions while building infrastructure", the encyclopaedia explains. 

But keep on reading and there, at the very bottom of the page, XXX words and 122 learned footnotes later, comes a dissenting note! "Some global warming skeptics in the science or political communities dispute all or some of the global warming scientific consensus, questioning whether global warming is actually occurring, whether human activity has contributed significantly to the warming, and the magnitude of the threat posed by global warming."
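(A technical aside: word counts like these can be computed directly from the page itself. What follows is a minimal sketch only, assuming Python's requests library; the page title and section heading are illustrative, referring to the 2010 version of the article, which has long since changed. It counts the words of wikitext preceding a given heading.)

import requests

API = "https://en.wikipedia.org/w/api.php"

def words_before_heading(title, heading):
    # Fetch the current wikitext of the page and count the words
    # that appear before the first occurrence of the given heading.
    # If the heading is absent, the whole page is counted.
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "content",
        "rvslots": "main",
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    wikitext = page["revisions"][0]["slots"]["main"]["*"]
    before, _, _ = wikitext.partition(heading)
    return len(before.split())

# Illustrative call - page title and heading as they stood in 2010:
print(words_before_heading("Global warming", "==Debate and skepticism=="))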

That's all it says on the main page, but now, if we are curious, we might follow the link to see what these skeptics are saying.

The 'Climate Skeptics' page starts neutrally enough: 

"Climate Skeptics include many leading researchers and scientists, such as Professor Bob Carter of James Cook University and Dr David Bellamy and then, under the heading "View of prominent sceptics" offers short quotes to show the sort of things at issue: 

From the 'Climate Skeptics' page:

"Former UN Scientist Dr. Paul Reiter of the Pasteur Institute in Paris (who resigned from UN IPCC in protest): “As far as the science being ‘settled,’ I think that is an obscenity. The fact is the science is being distorted by people who are not scientists.”
UN IPCC scientist Vincent Gray of New Zealand: “This conference demonstrates that the [scientific] debate is not over. The climate is not being influenced by carbon dioxide.”
Climate researcher Dr. Craig Loehle, formerly of the Department of Energy Laboratories and currently with the National Council for Air and Stream Improvements, has published more than 100 peer-reviewed scientific papers: “The 2000-year temperature trend is not flat, so a warming period is not unprecedented. … a 1500-year temperature cycle as proposed by [Atmospheric physicist Fred] Singer and Dennis Avery is consistent with Loehle climate reconstruction… a 1500-year cycle implies that recent warming is part of natural trend.”
Hurricane expert and Meteorologist Dr. William Gray: “There are lots of skeptics out there, all over the U.S. and the rest of the world. Global warming has been over-hyped tremendously; most of the climate change we have seen is largely natural. I think we are brainwashing our children terribly.”
UK Astrophysicist Piers Corbyn: “There is no evidence that CO2 has ever driven or will ever drive world temperatures and climate change. The consequence of that is that worrying about CO2 is irrelevant. Our prediction is world temperatures will continue to decline until 2014 and probably continue to decline after that.”
Meteorologist Art Horn: “There are thousands of scientists around the world who believe that this issue is not settled. The climate is not being influenced by carbon dioxide.”
Climate statistician Dr. William M. Briggs, who serves on the American Meteorological Society's Probability and Statistics Committee and is an Associate Editor of Monthly Weather Review: “It is my belief that the strident and frequent claims of catastrophes caused by man-made global warming are stated with a degree of confidence not warranted by the data."

This splendidly neutral page then concludes with some longer sceptical accounts, including that of Professor John David Lewis of Duke University, USA, who has challenged many of the claims made by proponents of man-made climate change theory in an article in the politically neutral journal Social Philosophy and Policy (Volume 26, No. 2, Summer 2009): 'Those predicting environmental disasters today focus on particular issues in order to magnify the gravity of their general claims, and they push those issues until challenges make them untenable. Rhetorical skill and not logical argument has become the standard of success.'

Then there is the review article by Professor Gwyn Prins, director of the Mackinder Programme for the Study of Long Wave Events at the London School of Economics, published in the Times Higher on 3 December 2008, which says that the 'principal product of recent science is to confirm that we know less, less conclusively – not more, more conclusively – about the greatest open systems on the planet'.

And finally, Professor Mike Hulme, a 'climate scientist' at the University of East Anglia's centre for such research, offered a comprehensive defence of scepticism in the December Wall Street Journal, noting: "Science never writes closed textbooks. It does not offer us a holy scripture, infallible and complete."

What a fine summary, if I might say so myself! So 'Wikipedia' gives us (as the old legal refrain goes) the truth, the whole truth and nothing but the truth, no? 


No, no, and no! This 'sceptical page' was one I knocked up as a little test, to see if complaints about climate change bias on the 'open-to-all' encyclopaedia were justified. Once posted, it lasted exactly one minute. No, you read that right – one minute!

I finished writing the page at 22:34 on 6 February. At 22:35, 6 February 2010, an editor operating under the usual stupid (but effective, in terms of the propaganda function of Wikipedia) pseudonym MuffledThud added the template 'Requesting speedy deletion (CSD A10). (TW)'. And that was that! No more nasty scepticism on Wikipedia!

Now, I am a little more cognisant of WP than perhaps most users, so I attempted to defend my page four minutes later – that is, before the page could be 'formally' deleted. This required pasting in the gnome-like Wikipedia formula: {{hangon}}. Did that save my page? Well, yes and no. This time the page stayed there for half an hour. But then, at 23:09 on 6 February 2010, Tony Sidaway, a 'system operator' – that is to say, a Wikipedia editor who has been given extra powers over most of the rest – removed the page and replaced it with an electronic alias pointing at the 'Global Warming' page, which, as we have seen, covers the sceptical angle very thoroughly with all of that final, er... one sentence. As a system operator, Tony left a short note on the strategy in the so-called 'page history': "Redirect as per Global warming skeptic, stable for over two years". Later on, someone thought it safer to make the redirect 'permanent' and to make challenging it a 'bannable offence'.
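(None of this detective work requires special access, incidentally: the 'page history' that editors and system operators leave behind is public, and queryable by anyone. As a minimal sketch – again assuming Python's requests library, with the page title purely illustrative – here is how one might pull the most recent entries of a page's history from the MediaWiki API:)

import requests

API = "https://en.wikipedia.org/w/api.php"

def recent_history(title, limit=10):
    # Fetch the timestamp, editor and edit summary of the most recent
    # revisions of a page - the same 'page history' trail described above.
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",
        "rvlimit": str(limit),
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    for page in data["query"]["pages"].values():
        for rev in page.get("revisions", []):
            print(rev["timestamp"], rev.get("user", "?"), "--", rev.get("comment", ""))

# Illustrative call:
recent_history("Global warming")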

So why is it impossible to place on Wikipedia, just for the record, some of the 'other views' – 'dissenting voices', if you wish – which certainly include many distinguished scientists, professors and IPCC authors?

After all, Wikipedia has room for another 3 million articles, including ones on 'Fart Lighting' and 'Nipple clamps' (the encyclopaedia's origins lie in a rather sordid 'web-portal' called Bomis), and for lengthy accounts of what its editors have done that day. But indeed, it is not possible. Not only Tony Sidaway but a whole group of editors patrol the encyclopaedia, immediately removing any views not consonant with their uncompromising thesis.

Instead of the full range of views – to which even those IPCC reports give a nod – there is only one page describing other views, and it is headed, unprepossessingly:

List of scientists opposing the mainstream scientific assessment of global warming


Great title, guys! Makes you want to read on! Mind you, there is a rather off-putting opening disclaimer:

"This article lists living and deceased scientists who have made statements that conflict with the mainstream assessment of global warming as summarised by the Intergovernmental Panel on Climate Change and other scientific bodies."

That's just for starters. Read the first half page of background briefing next! 

Climate scientists agree that the global average surface temperature has risen over the last century. The scientific consensus was summarised in the 2001 Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC). The main conclusions relating directly to past and ongoing global warming were as follows:
1. The global average surface temperature has risen 0.6 ± 0.2 °C since the late 19th century, and 0.17 °C per decade in the last 30 years.
2. "There is new and stronger evidence that most of the warming observed over the last 50 years is attributable to human activities", in particular emissions of the greenhouse gases carbon dioxide and methane.
3. If greenhouse gas emissions continue the warming will also continue, with temperatures projected to increase by 1.4 °C to 5.8 °C between 1990 and 2100. Accompanying this temperature increase will be increases in some types of extreme weather and a projected sea level rise of 9 cm to 88 cm, excluding "uncertainty relating to ice dynamical changes in the West Antarctic ice sheet". On balance the impacts of global warming will be significantly negative, especially for larger values of warming.
Those listed here have, since the Third Assessment Report of the IPCC, made statements that conflict with at least one of these principal conclusions. Inclusion is based on specific, attributable statements in the individual's own words, and not on listings in petitions or surveys. In February 2007, the IPCC released a summary of a Fourth Assessment Report, which contains similar conclusions to the Third. For the purpose of this list, a scientist is defined as a person who published at least one peer-reviewed article during their lifetime in the broadly-construed area of natural sciences.

Are you still interested? Well, don't be. None of the views summarised here is presented in a way that makes any useful point. Add to which, there are apparently just three people who, as the page puts it, think that "Global warming is not occurring or has ceased".

All right, let's have 'em!
  • Timothy F. Ball, former Professor of Geography, University of Winnipeg, who sceptically observes: "There's been warming, no question. I've never debated that; never disputed that. The dispute is, what is the cause" – but then disputes himself by saying: "The temperature hasn't gone up. ... But the mood of the world has changed: It has heated up to this belief in global warming." (August 2006)
  • Robert M. Carter, geologist, researcher at the Marine Geophysical Laboratory at James Cook University in Australia, who is graciously allowed to say: "the accepted global average temperature statistics used by the Intergovernmental Panel on Climate Change show that no ground-based warming has occurred since 1998 ... there is every doubt whether any global warming at all is occurring at the moment, let alone human-caused warming."
and finally
  • Vincent R. Gray, coal chemist, who thinks that: "The two main 'scientific' claims of the IPCC are the claim that 'the globe is warming' and 'increases in carbon dioxide emissions are responsible'. Evidence for both of these claims is fatally flawed."[9]
That's the main business over – very quickly. But space is tight on those Wikipedia servers – send more money, please! Next come slightly longer sections noting those who think the "Accuracy of IPCC climate projections is questionable", or that "Global warming is primarily caused by natural processes", or (contrariwise, the duffers!) that the "Cause of global warming is unknown", before the page finishes with a section called, hilariously and in full, "Now deceased", thus rounding up the other sceptics.

And although the page offers at the top: "This is an incomplete list, which may never be able to satisfy certain standards for completion. You can help by expanding it with reliably sourced additions" – it also has a silver padlock, signifying that editing is not open to most users at all.
However, it is in the safe hands of a 'user-group' called the 'Climate Change Task-force', who have special powers to stop articles presenting views that they do not agree with. Or as a notice puts it on their 'home page': 

"A decision by the Wikipedia community has placed articles relating to climate change under article probation. Editors making disruptive edits may be blocked temporarily from editing the encyclopaedia, or subject to other administrative remedies, according to standards that may be higher than elsewhere on Wikipedia. Please see Wikipedia:General sanctions/Climate change probation for full information and to review the decision."

By 'higher standards' they mean 'lower standards', but then this is Wikipedia, where people can barely write. The point, though, is clear, as Lawrence Solomon has described in his articles on Wikipedia over at the Financial Post – of which more in a moment. Wikipedia is open to everyone to edit, but only if they either write drivel (which is what most pages there are) or stick to the political line. In the case of 'Climate Change', the line is that there is not only no scientific debate left to be had, but no political debate either.

The 'lower/higher standards' mean that people who have been given extra administrative powers on the encyclopaedia – 'system operators', with abilities such as blocking other users or 'protecting' pages (which means preventing people from editing them) – are formally granted dispensation to use these administrative powers on pages they also edit, and thus to promote their own views. The 'Chinese Wall' that supposedly exists to stop administrators abusing their powers in content debates has been torn down for articles on 'climate change'.

Take William Connolley, for example: a little-known Greenie (one of the RealClimate.org crowd – see note 2) whose views make up – literally! – the science of the Wikipedia pages, but who has some special role in the Wikipedia elite. He has banned more contributors than most websites have readers!
Here's what the climate-sceptic commentator Lawrence Solomon says about him, in an article posted on the Financial Post website on Saturday, 3 May 2008:

"Connolley is a big shot on Wikipedia, which honours him with an extensive biography, an honour Wikipedia did not see fit to bestow on his boss at the British Antarctic Survey. Or on his boss's's boss, or on his boss's boss's boss, or on his boss's boss's boss's boss, none of whose opinions seemingly count for much, despite their impressive accomplishments. William Connolley's opinions, in contrast, count for a great deal at Wikipedia, even though some might not think them particularly worthy of note." 

[From the Financial Post article:]

Connolley is … an administrator with unusual editorial clout. Using that clout, this 40-something scientist of minor relevance gets to tear down scientists of great accomplishment. Because Wikipedia has become the single biggest reference source in the world, and global warming is one of the most sought-after subjects, the ability to control information on Wikipedia by taking down authoritative scientists is no trifling matter.
One such scientist is Fred Singer, the First Director of the U.S. National Weather Satellite Service, the recipient of a White House commendation for his early design of space satellites; the recipient of a NASA commendation for research on particle clouds — in short, a scientist with dazzling achievements who is everything Connolley is not. Under Connolley's supervision, Singer is relentlessly smeared, and has been for years, as a kook who believes in Martians and a hack in the pay of the oil industry. When a smear is inadequate, or when a fair-minded Wikipedian tries to correct a smear, Connolley and his cohorts are there to widen the smear or remove the correction, often rebuking the Wikipedian in the process.

Lawrence Solomon adds, "Wikipedia is full of rules that editors are supposed to follow, as well as a code of civility. Those rules and codes don't apply to Connolley, or to those he favours."
Indeed they don't. Here are some of the occasions on which William Connolley has used his administrative powers to block other users he disagreed with, on the Climate Change topic alone. (A page called BLOCK#Disputes records such minutiae for each administrator.)

It's long, but sums up exactly the travesty of editing on the 'Encyclopaedia anyone can edit'. Remember too that, supposedly, 'blocks' are a tool there only for neutral 'uninvolved' administrators to stop 'vandals'. 
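(These block logs, too, are public records that anyone can pull for themselves. A minimal sketch, assuming Python's requests library – the username parameter is whichever administrator you wish to audit, and is purely illustrative here – using the MediaWiki API's public log-events list:)

import requests

API = "https://en.wikipedia.org/w/api.php"

def blocks_by(username, limit=50):
    # List block-log entries performed BY the given user
    # (i.e. blocks they handed out, not blocks they received).
    params = {
        "action": "query",
        "list": "logevents",
        "letype": "block",
        "leuser": username,
        "leprop": "title|timestamp|comment",
        "lelimit": str(limit),
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    for event in data["query"]["logevents"]:
        print(event["timestamp"], event.get("title", ""), "--", event.get("comment", ""))

# Illustrative call, auditing any administrator's block log:
blocks_by("Example admin")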

WILLIAM THE GREEN'S BUSY MONTH

1. In an edit war with User:Chris_Chittleborough on Hockey stick controversy, William blocks Chris. Another 'administrator', nicknamed Chaser, later says: "Will... you can't block users you're in disputes with. The policy is unambiguous and ArbCom [the Wikipedian cabal of the most powerful administrators] has indicated the same thing. This is the kind of thing that people get de-sysopped for." [Hop off, Chaser!]
2. In an edit war with User:Lapsed Pacifist on the page Shell to Sea, William blocks Lapsed for the reason "repeated re-insertion of unsourced material"
3. In an edit war with User:Jaymes2 on Global warming William blocks Jaymes2 for the reason, "repeated insertion of tripe"
4. In an edit war on Global Warming with User:Sterculius William blocks Sterculius for "Tendentious edits at GW"
5. In an edit war with User:Wedjj on Global Warming William blocks Wedjj for 8 hours, reason: "disruptive editing"
6. In an edit war with User:Supergreenred over Global Warming, William blocks User:Supergreenred
7. In an edit war with User:Britcom on List of scientists opposing the mainstream scientific assessment of global warming and Global Warming, William temporarily blocks Britcom, reason: 'incivility'. Brit says: "Don't be a hypocrite WC"
8. In the same edit war with User:Britcom on List of scientists opposing the mainstream scientific assessment of global warming and Global Warming William blocks Britcom for 24 hours reason: Incivility
9. In an edit war with User:Wikzilla at Global warming William personally blocks Wikzilla twice for Three-revert rule violations.
10. In an edit war with User:ConfuciusOrnis at Climate change denial, William blocks User:ConfuciusOrnis twice. William is chastised by admin User:FeloniousMonk for abusing his administrative powers.
11. In an edit war with user:207.237.232.228 on Intergovernmental Panel on Climate Change William blocks 'anon' for three hours.
12. With User:DHeyward on Global Warming, William blocks DHeyward, length: 8 hours, reason: "violation of 1RR on GW; incivil edit summaries"
13. In an edit war with User:Lapsed Pacifist on the page Shell to Sea William blocks Lapsed for 3 hours giving the reason as "incivility" for this edit.
14. For comments on List of scientists opposing the mainstream scientific assessment of global warming which William actively edits, William blocks 65.12.145.148 for incivility for this comment "A great read for all you cool aid drinkers."
15. William blocks User:HalfDome for incivility because of comments on the page Image talk:2000 Year Temperature Comparison.png, a page which he actively edits.
16. William again blocks User:HalfDome for incivility because of comments on the page Image talk:2000 Year Temperature Comparison.png.
17. William blocks User:Jepp for comments on List of scientists opposing the mainstream scientific assessment of global warming, an article William actively edits. Reason: "Inserting false information: incivility"
18. William blocks User:71.211.241.40 for comments on Global warming controversy.
19. William blocks User:Juanfermin for edits on the page List of scientists opposing global warming consensus, an article William edits regularly.
20. William blocks User:UBeR for comments on The Great Global Warming Swindle.
21. William blocks User:Peterlewis for comments on Historical climatology, an article William edits regularly.
22. William blocks User:69.19.14.31 for incivility on Global warming, an article William edits regularly.
23. William blocks User:Likwidshoe for incivility on IPCC Fourth Assessment Report, an article William edits regularly.
24. William blocks User:Kismatraval for "spam" on Global warming, an article William edits regularly.
25. William blocks User:69.19.14.29 for this comment "One thing is clear: this Wikipedia article and its fanatical guardians are a perfect example of how and why Wikipedia cannot be considered as a reliable source of knowledge."
26. William blocks User:Grimerking for 3rr on Global warming, an article William edits regularly.
27. William blocks User:Dick Wayne for posting youtube link on The Great Global Warming Swindle, an article William edits regularly.
28. William blocks User:DonaldDuck07 for "incivility" for comments on List of scientists opposing the mainstream scientific assessment of global warming, an article William actively edits.
29. William blocks User:Rotten for "incivility" for comments on The Great Global Warming Swindle, an article William actively edits.
30. William blocks User:219.64.26.28 for comments on Scientific opinion on climate change.

There's more, but that's enough to be going on with. The point is this: 

At Wikipedia, according to the bland and hypocritical publicity for the site, "anyone can edit a page" – "jump right in, and edit my page", as Jimmy Wales, founder, et cetera, et cetera, used to claim. And it is central to the methodology, and to the integrity of the content, that all editors are equal. Over time, the good edits are supposed to cancel out the bad edits. Is that true? Will they? No one will ever know, because in fact hardly anyone is even able to edit the 'Climate Change' or other controversial pages, and those who manage to are immediately banned if they disagree with the 'super-editors' managing the content there.

Executive summary:

Wikipedia is not neutral; it is dangerous propaganda delivered by anonymous non-entities.
Does it matter, though, what Wikipedia says? After all, we have the BBC and The Guardian newspaper all saying exactly the same thing in a more authoritative way. But indeed it does matter. The Guardian's environment writers use Wikipedia as a source for their stories, as its website editor, James Randerson, confirmed to me by telephone, volunteering (with endearing frankness) its use there as a supply of facts and sources, along with other details. I asked him, as the environment section's web specialist, whether he was aware of the controversies surrounding the online encyclopaedia's coverage of Climate Change – specifically, that it was heavily skewed to one side of the debate. No, he said, he was not aware of that. Nor, it seemed, was The Guardian concerned. As for the BBC, I have had dealings in the past with TV researchers, and rarely is there a group less inclined to look further than a convenient, ten-minute source like Wikipedia. Certainly, later on, they will talk properly to experts, but the initial research will come straight off the net, and so will skew the selection of who they speak to.

Sooo... does it matter? After all, facts are facts, aren't they? But facts are not facts. Facts are versions of reality put forward by people with agendas. For example, for the frightening temperature increases it records, the page uses as its source the Goddard Institute for Space Studies, which is run by James Hansen, the 'big spider' at the centre of the Global Warming web, who has such an 'extreme' position on the matter that he has fallen out with most of the others in the pro-camp. Quoting them is like quoting the Liverpool Supporters Club on who are the greatest football team. Or maybe like using George Monbiot's vegetable patch as a marker for global climate change.

Look at the details, too (in the small print): the 'record temperatures' result from spikes in measurements in the Arctic and 'parts' of the Antarctic – data sources considered so poor that the Met Office and other climate centres do not incorporate them into their models at all. But the Goddard not only uses these dubious statistics; as they say themselves, they then mathematically extrapolate them 'over the entire land mass' – obtaining many more record high temperatures!

Well, what about using it to check sources, though? A quote is a quote, isn't it? Not at WP. Nothing you read there is suitable for reproducing in a 'serious' newspaper – even if you might lazily get away with it in a student essay, or in a top-secret dossier for the British government on Iraqi nuclear weapons! Take the view attributed to Benny Peiser, about how he had been wrong to deny that there was a consensus amongst scientists on Global Warming as a settled fact. That's what it says he said on WP! But when Lawrence Solomon checked directly with Peiser, he found that he had said no such thing. The Wikipedia page had misunderstood or distorted his comments. Lawrence Solomon tried to correct the point, but a moment later it was 'reverted' by 'Tabletop', who offered the explanation: "Note that Peiser has retracted this critique and admits that he was wrong".

Despite this, it's not just The Guardian (a paper I used to write occasional articles on Computers and Education for) uncritically regurgitating Wikipedia. All over the world, journalists are writing stories about global warming using the same strategy.

A Day in the Life of an Environment Editor
10.00 am Arrive at desk, switch computer on and have coffee
11.00 am Editorial meeting. Boss says write something (groans all round) about Global Warming.
12.00 pm Lunch
2.30 pm Look at Wikipedia
3.00 pm Ring or email someone mentioned there for comments
4.00 pm Tea and organic chocky biscuits
5.00 pm File 1,000 words using WP and my vegetable patch as sources.

That's why Wikipedia's influence is greater than you might think. If you imagine it is just net-nerds who read Wikipedia, you may be deluding yourself. Quite possibly you get a compulsory dose of it every morning, in regurgitated form, in your newspaper, and watch it every evening on TV.

Only a few media organisations have the 'resources' to do any 'research' into these matters – ones like the New York Times, which is a fervent backer of the cause (could it be in the interests of both the Democratic party and the Carbon Traders of Wall Street?) – and the BBC. But the BBC held a meeting to which several climate experts were invited, to see if there were any doubts or controversies about the climate change science, and these experts said certainly not! So the BBC has no worries. However, just to be on the safe side, it has officially designated the names of the experts it consulted a 'secret'. Like the temperature readings used by the University of East Anglia to arrive at the conclusion that the world is overheating, these sources can never be revealed.

Now the 'science of global warming' – which is to say, the notion that man-made CO2 has caused the planet to warm slightly, and is set increasingly to do so – is certainly not all going the sceptics' way either. But let's not get hung up on that. For any number of reasons, the world 'could be' warming up, just as the theory insists. If it is, we need a rational discussion of the effects, the implications and the possible mitigation strategies.

None of these can start without a full and open exchange of views and evidence. Wikipedia has systematically distorted both - and it continues to do so. 

Here there are no controversies about inaccurate temperature records, manipulated temperature graphs, melting glaciers, African famines, dehydrating rain-forests, or 'complete lists of greenhouse gases' that miss out the one that causes 90% of the greenhouse effect – water vapour (note 3).

Yet even granting the lobby its man-made global warming:
• if temperature records are inaccurate, then remedial activities will be directed to the wrong regions
• if glaciers are not really melting then emergency action to provide replacement fresh water supplies to a billion people in Asia is, to say the least, not necessary
• if the rain-forests are not really dehydrating, then it is still worth preserving them, rather than converting them to 'biofuels', as is the current policy
• if water vapour accounts for virtually all the greenhouse effect, then the economic value and utility of capturing other gases is functionally nil...

One could go on – but why bother? There is no debate, only propaganda. Whether Wikipedia is, as we are asked to believe, just a rudderless ship being tossed here and there on the tides of prevailing opinion, I personally doubt. The bias is careful, subtle and very, very thorough. It involves wholesale abuse of the supposed principles of the site – the right of 'everyone' to edit pages – and the expulsion of those who make changes that are 'off message' (like my new page on sceptical views).

Let's leave the last word to Jimmy Wales, nominally at least the benign dictator controlling the world's most consulted encyclopaedia. I asked him (by email) if anything about the coverage of Climate Change there had worried him, given that it was not neutral at all, and was generated in ways contrary to his claimed principle that 'all editors are equal'. In a characteristically unreflective reply, he wrote:

"There exists a long line of people who, when their extremist agenda is not accepted into Wikipedia, accuse the community of bias."
Jimmy Wales, 15 February 2010
Jimmy may or may not be worried about the goings-on at Wikipedia. But the rest of us should be.



Notes

About those frightening images... The 'source' is the Goddard Institute, and Gavin Schmidt, editor of realclimate.org (set up by the PR company that Al Gore's environmental advisor was a staffer for), and former home of Wikipedia editing supremo William Connolley. Does Wikipedia note that Gavin Schmidt and Michael Mann – of the now discredited 'hockey stick' graph – are both colleagues and chums? Or that the Goddard is run by James Hansen, one of Global Warming Theory's founding fathers, so to speak, who has such an 'extreme' position on the matter that he has fallen out with most of the others in the pro-camp? Quoting them is like quoting the Liverpool Supporters Club on who are the greatest football team. Or maybe like using George Monbiot's vegetable patch as a marker for global climate change. Look at the small print, too: Gavin and co admit that their 'record temperatures' result from spikes in measurements in the Arctic and 'parts' of the Antarctic – data sources considered so poor that the Met Office and other climate centres do not incorporate them into their models at all. But the Goddard not only uses these dubious statistics; as they say themselves, they then mathematically extrapolate them 'over the entire land mass' – obtaining many more record high temperatures!


1. Quotes from Wikipedia pages are from versions downloaded on 16 February 2010. The numbers in square brackets are left in to indicate the Wikipedia footnote gobbledegook.

2. Lawrence Solomon, evidently confused by Wikipedia's jargon, makes some large over-estimates of the influence of Connolley. I'm grateful to the Wikipedia Review for additional details on William Connolley's activities.

3. For those who are interested: the temperature records for Siberia and China have been shown to have been deliberately falsified, while a much-quoted temperature survey, supposedly demonstrating only a small 'urban heat' effect, contained key assertions that were impossible – that is, were flat lies. The key temperature graph of the IPCC report, the so-called Hockey Stick graph, was prominently inserted three times by its inventor in one IPCC report, but then – having been extensively discredited, notably for having 'ironed out' all evidence of past changes in temperature – was not included at all in the next. The IPCC claim that all the ice in the Himalayas would have melted by 2035 was discredited when it was pointed out that it came from just one scientist, linked to the IPCC's chief, who had no evidence to back it up and instead had a personal interest in advancing the claim. The IPCC predictions of massive crop failure in Sub-Saharan Africa and the disappearance of the rainforests due to lack of rain followed the same pattern: one 'partisan' source, not peer-reviewed. Indeed, when spotted, they were flatly rejected by relevant specialists. But that debate has been suppressed – up to now!