[Illustrations, footnotes and references available in PDF version]
“Dangerous Climate Change is Coming”
by Christopher Monckton | April 30, 2009

The Scare:
Two papers published in Nature in spring 2009 say that the rise in global temperature is unlikely to remain below the politically-defined threshold of “dangerous climate change”, if global economic growth continues at its current pace. The papers are based on computer simulations of the climate response to greenhouse-gas emissions.

Policymakers have adopted a goal of keeping the global rise in mean surface temperatures to no more than 2 C° (3.6 F°) above pre-industrial levels.

Myles Allen et al. simulate the mean “global warming” that would result from a given cumulative carbon emission. They conclude that a trillion tonnes of carbon emissions (about 3.7 trillion tonnes of CO2, roughly half of which has already been emitted) produces a “most likely” warming of 2 C° (3.6 F°).

Malte Meinshausen et al. take a slightly different tack by modelling the probability of various global temperature rises across a range of greenhouse-gas emissions scenarios. They find that total emissions of about 1,400 gigatonnes of CO2 between 2000 and 2050 yield a 50% probability of exceeding 2 C° of warming by the end of the 21st century. Emissions over the past seven years alone were almost 250 gigatonnes, implying that, even without any future increase in the rate of CO2 emission, total emissions for 2000-2050 may well exceed the level that carries that 50% probability.
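As a rough check on the arithmetic behind these two papers, the short sketch below converts tonnes of carbon to tonnes of CO2 (the molecular-weight ratio 44/12) and extrapolates the recent emission rate against the 1,400-gigatonne budget. The assumption of a constant future emission rate is made here for illustration only and is not a figure taken from either paper.

```python
# Rough arithmetic behind the two Nature papers (illustrative only).

CO2_PER_C = 44.0 / 12.0            # tonnes of CO2 per tonne of carbon (molecular weights)

# Allen et al.: one trillion tonnes of carbon, expressed as CO2.
trillion_tonnes_carbon = 1.0e12
print(f"1 Tt C = {trillion_tonnes_carbon * CO2_PER_C / 1e12:.2f} Tt CO2")   # ~3.67 Tt CO2

# Meinshausen et al.: ~1,400 Gt CO2 emitted over 2000-2050 gives a 50% chance of exceeding 2 C.
budget_gt      = 1400.0
emitted_gt     = 250.0             # roughly the emissions reported for the first seven years
rate_gt_per_yr = emitted_gt / 7.0  # assume the emission rate simply stays constant

years_to_budget = (budget_gt - emitted_gt) / rate_gt_per_yr
print(f"At a constant rate the 1,400 Gt total is reached after a further {years_to_budget:.0f} years")
```

At that constant rate the 1,400-gigatonne total would be reached in roughly three decades, comfortably before 2050, which is the point the papers' authors are making.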

The Truth:
Nature is one of many “scientific” journals that have openly declared an editorial prejudice in favor of a frankly alarmist viewpoint on the climate. In short, Nature adamantly refuses to publish any paper suggesting – however compelling the evidence and arguments – that anthropogenic “global warming” will not be as significant as the UN’s climate panel suggests. Nature’s selection process is, therefore, openly prejudiced ab initio. In reality, Nature is now a religious rather than a scientific journal.

As is now usual, the two papers foretelling “dangerous climate change” are based not on real-world observations but on computer games. This is the “X-Box 360” method of doing science. Syun-Ichi Akasofu, who first explained the science underlying the aurora borealis and is one of the dozen most-cited scientists in the world, has pointed out that computer models of the climate, such as those relied upon in the two papers in Nature, are instructed from the outset to assume that the temperature response to CO2 enrichment of the atmosphere will be substantial. The Playstations do not tell us that there will be major warming as a result of our activities – we tell the Playstations.

Are we right to tell the models that climate sensitivity will be high? No. Lorenz (1963), in the landmark paper that founded chaos theory, said that because the climate is a mathematically-chaotic object (a point which the UN’s climate panel admits), accurate long-term prediction of the future evolution of the climate is not possible “by any method”. At present, climate forecasts even as little as six weeks ahead can be diametrically the opposite of what actually occurs, even when the forecasts are limited to a small region of the planet. For instance, in April 2007 the UK Met Office predicted that the coming summer would be the hottest, driest and most drought-prone since records began, just weeks before the commencement of the coldest, wettest and most flood-prone summer since records began. In autumn 2008, the Met Office predicted a warmer-than-average winter, just weeks before the coldest winter in two decades began.

Therefore, both in theory and in practice, the predictive skill of computer models of climate has been proven to be limited. With a chaotic object, it is essential to know the complete mathematical description of the object at some chosen starting-point in its evolution. That means knowing the initial value of the millions of variables that define the climate – and knowing them to a precision that is simply not attainable in the real world.

Why, then, does anyone bother with computer models of the climate at all? They are really only useful for very short-term forecasts, a few days ahead at most. Why? Because one of the features common to all chaotic objects is that a very small perturbation in the initial value of just one of the many variables that define the object and determine its behavior can radically alter the object’s future evolution, changing the moment of onset, the duration, the magnitude and even the sign of the “phase transitions”, the sudden departures from previously near-linear behavior that always occur in chaotic objects.
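This extreme sensitivity to initial conditions can be seen in the simple three-variable system from Lorenz (1963) itself. The sketch below integrates two copies of that system from starting points that differ by one part in a million; the step size, run length and size of the perturbation are arbitrary choices for illustration, not values drawn from either Nature paper.

```python
# Two runs of the Lorenz (1963) system from initial states differing by one part in a million.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations (crude, but adequate for a demonstration)."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)      # a tiny perturbation in a single variable

for step in range(1, 5001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print(f"t = {step * 0.01:5.1f}   x_a = {a[0]:9.4f}   x_b = {b[0]:9.4f}   diff = {abs(a[0] - b[0]):.4f}")
```

After a short time the two runs bear no resemblance to one another, even though the model and the physics are identical in both: only the sixth decimal place of one starting value differs.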

It has recently been calculated that to produce an accurate forecast of the climate even as little as ten years ahead would require all of the world’s computers to run not for ten years but for 100 billion billion billion years. The age of the universe is only 13.7 billion years. So anyone who claims that any computer model can produce reliable results 100 years ahead is making a claim that goes not only well beyond the laws of mathematics, as explained by Lorenz, but also well beyond the capacity of today’s computers.

If there is a great deal we cannot do in predicting the climate over the long term, is there anything we can do? Yes. It is known – and remarkably simple to demonstrate mathematically – that enrichment of the atmosphere with greenhouse gases will produce some warming, because outgoing long-wave radiation from the Earth’s surface that would normally escape to space is retained in the climate system, where it interacts with additional molecules of CO2, methane, nitrous oxide, or water vapor (the last being the most important greenhouse gas because of its sheer quantity).

However, it is remarkably difficult to calculate how much warming a given proportionate increase in the atmospheric concentration of even one greenhouse gas, such as CO2, will cause. We can add CO2 to a standard atmosphere in the laboratory and work out how much warming might occur, but translating such experiments from the lab to the real atmospheric column is not possible.
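For orientation, the calculation normally done instead is a radiative one: the widely used logarithmic approximation to CO2 forcing (Myhre et al., 1998), ΔF ≈ 5.35 ln(C/C0) W/m², multiplied by a pre-feedback (Planck) sensitivity of roughly 0.3 K per W/m². The sketch below simply evaluates that standard approximation; neither figure is taken from the two Nature papers, and the concentrations chosen are illustrative.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Standard logarithmic approximation to CO2 radiative forcing, in W/m^2 (Myhre et al., 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

PLANCK_K_PER_WM2 = 0.31    # a commonly quoted pre-feedback (no-feedback) sensitivity

for c in (385, 560):       # roughly today's concentration, and a doubling of the 280 ppm baseline
    f = co2_forcing(c)
    print(f"{c} ppm: forcing ~{f:.2f} W/m^2, pre-feedback warming ~{f * PLANCK_K_PER_WM2:.2f} C")
```

On these assumptions a doubling of CO2 gives a pre-feedback warming of little more than 1 C°; everything beyond that depends on the feedbacks discussed below.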

All of the UN’s models predict, for instance, that if greenhouse-gas enrichment is the driver of warming then the rate of warming in the tropical upper troposphere, about six miles up, will be 2.5-3 times the surface rate of warming. Yet this differential warming rate – the tropical upper-troposphere “hot-spot” – has never been observed in reality (Douglass et al., 2008). Professor Richard Lindzen of MIT, in a 2008 lecture, estimated that this single discrepancy between observation and prediction requires all of the UN’s estimates of the temperature response to CO2 enrichment to be divided by at least 3. In short, this single failure of the models to predict reliably an essential feature of the climate removes any notion of “dangerous” climate change.

How so? Because, as the two papers in Nature assume, a global temperature increase of up to 2 C° (3.6 F°) is generally taken as harmless (and, indeed, beneficial). Currently, the UN’s central estimate is that by 2100 global temperature will have risen by 3.9 C°. Divide this by 3 and the temperature increase to 2100 would be just 1.3 C°, a long way below the 2 C° threshold.

Indeed, there is no sound basis for assuming that a temperature increase of as much as 2 C° over the coming century would be dangerous. For most of the past 10,000 years, global temperature has been at least 2 C° and sometimes 3 C° greater than the present, and catastrophe has not ensued.

It is only by wrenching today’s climate out of its historic context that it becomes possible to suggest that small changes in temperature may produce disastrous consequences.

Recently, Paltridge et al. (2009) have established a further serious problem in the upper troposphere. It is drier than the models had predicted. Why does this matter? Because the UN’s climate panel assumes that positive temperature feedbacks will more than triple the initial warming caused by atmospheric CO2 enrichment.

The most important of the positive temperature feedbacks is that from water vapor because, by the Clausius-Clapeyron relation (one of the very few proven results in climatological physics), the atmosphere is capable of carrying near-exponentially more water vapor as it warms.
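The near-exponential dependence is easy to see from the August-Roche-Magnus formula, a standard empirical approximation to the Clausius-Clapeyron relation for saturation vapor pressure over water; the temperatures used below are arbitrary illustrative values.

```python
import math

def saturation_vapor_pressure_hpa(t_celsius):
    """August-Roche-Magnus approximation to saturation vapor pressure over water, in hPa."""
    return 6.1094 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

for t in (0, 10, 20, 30):
    e0 = saturation_vapor_pressure_hpa(t)
    e1 = saturation_vapor_pressure_hpa(t + 1)
    print(f"{t:2d} C: {e0:6.2f} hPa   (one further degree raises it by ~{100 * (e1 / e0 - 1):.1f}%)")
```

Each additional degree of warming raises the amount of water vapor the air can hold by roughly 6-7%, which is why the water-vapor feedback is treated as the dominant one.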

However, in the troposphere, the expected increase in water vapor concentration has not occurred. Therefore, much of the predicted water vapor feedback cannot be occurring either, substantially reducing the warming effect imagined by the UN’s climate panel. Indeed, Professor Lindzen goes so far as to say that the net effect of all temperature feedbacks is not positive but negative, requiring that, yet again, the predicted effect of CO2 on temperature must be divided by at least 3.
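The arithmetic of feedback amplification is conventionally expressed through the gain relation ΔT = ΔT0 / (1 - f), where ΔT0 is the pre-feedback warming and f is the sum of the (dimensionless) feedback factors. The sketch below evaluates that relation for a few illustrative values of f, positive and negative; the numbers are chosen for illustration and are not taken from Lindzen or from the UN's reports.

```python
def equilibrium_warming(dt0, f):
    """Warming after feedbacks, given pre-feedback warming dt0 (C) and total feedback factor f (f < 1)."""
    return dt0 / (1.0 - f)

dt0 = 1.2    # illustrative pre-feedback warming for a doubling of CO2, in C

for f in (0.65, 0.0, -0.5):    # strongly positive, zero, and net-negative feedback
    print(f"f = {f:+.2f}:  equilibrium warming ~{equilibrium_warming(dt0, f):.1f} C")
```

A feedback sum of about +0.65 roughly triples the pre-feedback warming, which is the UN position described above, whereas a net-negative sum, as Lindzen argues, reduces it below the pre-feedback value.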

In addition to all of the above problems with the official quantification of climate sensitivity, the UN’s value for the Planck parameter, which converts radiative forcings to temperature change before feedbacks are taken into account, is higher than any value in the mainstream literature, requiring a further downward adjustment of at least 50% in the UN’s temperature predictions.

There is a further substantial problem that the two papers in Nature have chosen not to address. For reasons that the UN’s climate panel admits it is unable to explain, even though CO2 is being emitted to the atmosphere at unprecedented rates, right at the top of the UN’s scenario range, the increase in concentration remaining in the atmosphere is only about half of what the UN predicts. This consideration alone dictates that all of the UN’s predictions of temperature increase to 2100 must be divided by approximately 2.

CO2 concentration is rising not exponentially, as the UN predicts, but in a straight line, and at a rate about half of that which the UN predicts. The UN is aware of the discrepancy between its predicted increases in CO2 concentration and what is occurring in reality, but is unable to explain the discrepancy.
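To see how much the choice of growth curve matters, the sketch below extrapolates today's concentration to 2100 in two ways: linearly, at roughly the 2 ppm per year recently observed, and exponentially, at a purely illustrative 1% per year. The starting value and both growth rates are assumptions made here for illustration, not figures from the UN's reports.

```python
# Two illustrative extrapolations of CO2 concentration to 2100 (assumed starting values).

start_year, start_ppm = 2009, 385.0     # approximate concentration at the time of writing
linear_ppm_per_year   = 2.0             # roughly the recently observed annual increment
exponential_rate      = 0.01            # 1% per year, a purely illustrative growth rate

years = 2100 - start_year
linear_2100      = start_ppm + linear_ppm_per_year * years
exponential_2100 = start_ppm * (1.0 + exponential_rate) ** years

print(f"Straight-line growth to 2100:  ~{linear_2100:.0f} ppm")
print(f"Exponential growth to 2100:    ~{exponential_2100:.0f} ppm")
```

The gap between the two curves widens with every decade, which is why the shape of the concentration curve, and not merely today's rate of increase, dominates any projection of warming to 2100.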

Taking all of these factors together, and allowing for overlaps between them, it seems unlikely that the true temperature increase from atmospheric CO2 enrichment over the 21st century will be more than a quarter of the UN’s estimate. In short – you heard it here first – the anthropogenic contribution to global temperature over the whole of the 21st century is very likely to be less than 1 C° (1.8 F°).

Even that much warming may not actually occur. Contrary to the models’ predictions, global temperatures have recently been falling. The longer the current cooling persists, the more ground the temperature will have to make up if even our scaled-down projection of anthropogenic warming by 2100 is to occur.

End of Scare.


