|Temperature Change and CO2 Change - A Scientific Briefing|
|Written by Christopher Monckton|
|Wednesday, 07 January 2009 03:59|
[Illustrations, footnotes and references available in PDF version]
Summary for Policy Makers
THE CHIEF REASON for skepticism at the official position on “global warming” is the overwhelming weight of evidence that the UN’s climate panel, the IPCC, prodigiously exaggerates both the supposed causes and the imagined consequences of anthropogenic “global warming”; that too many of the exaggerations can be demonstrated to have been deliberate; and that the IPCC and other official sources have continued to rely even upon those exaggerations that have been definitively demonstrated in the literature to have been deliberate.
In short, science is being artfully manipulated to fabricate what are in essence political and not scientific conclusions – a conclusion that is congenial to powerful factions whose ambition is not to identify scientific truth but rather to advance the special vested interests with which they identify themselves.
We have demonstrated that, if CO2 concentration continues to rise more slowly than the IPCC had predicted, and if climate sensitivity to CO2 concentration is in any event well below the IPCC’s projected range, the likelihood of any “global warming” >2 °C/century to 2100 is vanishingly small.
We have also demonstrated that official sources have relied upon questionable methods to inflate the observed rate of temperature increase and to diminish earlier, warmer periods in the record.
Temperature change and CO2 change: a scientific briefing
It may be the Sun: a strong anti-correlation between cosmic-ray intensity and radiosonde temperatures over the past 50 years. Source: Svensmark and Friis-Christensen, 2007.
The imagined causes of manmade “global warming” have been exaggerated
The anthropogenic increase in the atmospheric concentration of carbon dioxide and other greenhouse gases since the mid-20th century is said to be the principal driver of the “global warming” thought to have been observed over that period. The present analysis will generally overlook all anthropogenic forcings (positive or negative influences on global temperature) other than that from carbon dioxide, since the UN’s climate panel finds the aggregate of all non-CO2 forcings to be slightly net-negative (see Monckton, 2008, in Physics and Society, July, for an evaluation establishing this uncontroversial point).
The official theory may thus be simplified to state that it is the anthropogenic increase in atmospheric carbon dioxide concentration that, if unchecked, will raise global mean surface temperatures by up to 6.4 degrees Celsius in the century to 2100. The theory depends upon two questionable assumptions: first, that atmospheric carbon dioxide concentration will increase twofold to threefold over the 21st century; secondly, that the effect of carbon dioxide is substantial enough to increase global mean surface temperature on the interval [2, 4.5] °C (central official projection 3.26 °C) for every doubling of the atmospheric concentration. We shall examine these two assumptions seriatim.
Rate of increase in carbon dioxide concentration: Figure 2 shows that, in the eight years since January 2001, global atmospheric CO2 concentration has increased at a near-linear 200 ppmv/century. By 2100, at this rate, CO2 concentration would be ~580 ppmv:
CO2 increases well below the projected path. The blue region shows the IPCC’s currently-projected range of increases in CO2 concentration; the blue curve beneath this region is NOAA’s deseasonalized global trend; the cyan line is the least-squares linear regression on that trend, equivalent to ~200 ppmv/century.
Careful examination of Figure 2 shows that the IPCC’s CO2 projections are exponential curves, so that the IPCC imagines the concentration will reach its projected interval [730, 1040] ppmv by 2100, central projection 836 ppmv. However, the observed trend lies entirely below the IPCC’s predicted path. Furthermore, the residuals of NOAA’s CO2-concentration trend about the linear fit are so small that the trend may itself be near-linear, in which event, even if humankind takes no action at all to curb CO2 emissions, the concentration by 2100 will be little more than 580 ppmv. Note that the IPCC does not even include its estimates of the CO2 concentration by 2100 in its 2007 Summary for Policymakers.
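The linear extrapolation can be sketched in a few lines of code. The ~385 ppmv baseline and the ~200 ppmv/century slope are the figures quoted in this briefing, so this is illustrative arithmetic rather than a reanalysis of the NOAA data; rounding of the baseline and slope accounts for the small gap between this result and the ~580 ppmv quoted above.

```python
# Linear extrapolation of CO2 concentration to 2100, using the figures
# quoted in the text: ~385 ppmv today (2009) and a least-squares slope
# of ~200 ppmv per century.  Illustrative arithmetic only.

def extrapolate_co2(baseline_ppmv, baseline_year, slope_per_century, target_year):
    """Project CO2 concentration assuming a constant linear trend."""
    centuries = (target_year - baseline_year) / 100.0
    return baseline_ppmv + slope_per_century * centuries

projection_2100 = extrapolate_co2(385.0, 2009, 200.0, 2100)
print(f"Linear projection for 2100: {projection_2100:.0f} ppmv")  # ~567 ppmv
```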
Climate sensitivity: the effect of CO2 on temperature: At its simplest, the IPCC’s guess is that the effect of changes in CO2 concentration on temperature is logarithmic: i.e. a multiple of the natural logarithm of the proportionate increase in CO2 concentration:
ΔTs = c ln(C/C0),
where the bracketed term is the proportionate increase. From the fact that the IPCC’s projected temperature change in response to a doubling of CO2 concentration is [2, 4.5] °C, central projection 3.26 °C, we may calculate that the coefficient c in the CO2-change-to-temperature-change equation falls on the interval [2.9, 6.5], central projection 4.7.
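The coefficient interval follows directly from the quoted sensitivities: at a doubling, ln(C/C0) = ln 2, so c = ΔTs / ln 2. A minimal arithmetic check of the figures quoted above:

```python
import math

# Recover the coefficient c in dT = c * ln(C/C0) from the quoted
# equilibrium sensitivities at CO2 doubling, where ln(C/C0) = ln 2.
for dt_doubling in (2.0, 3.26, 4.5):
    c = dt_doubling / math.log(2.0)
    print(f"sensitivity {dt_doubling} degC  ->  c = {c:.1f}")
# -> c = 2.9, 4.7 and 6.5 respectively
```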
If, ad argumentum, the IPCC’s central projection of the influence of CO2 on global temperature were correct, and taking today’s CO2 concentration as 385 ppmv, then at our projected 580 ppmv in 2100, we might instantly derive the corresponding increase in mean global surface temperature compared with today, thus –
ΔTs = c ln(C/C0) ≈ 4.7 ln(580/385) ≈ 1.9 °C.
This value is little more than half the 3.6 °C that would result by 2100 if CO2 concentration were to increase at the IPCC’s central rate, giving 836 ppmv rather than 580 ppmv by 2100.
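Both warming figures follow from the same logarithmic formula. A short sketch, taking the central coefficient c = 4.7 and today’s 385 ppmv as given in the text, and comparing the observed-trend path with the IPCC’s central path:

```python
import math

def warming(c, conc_2100, conc_today=385.0):
    """Temperature rise dT = c * ln(C/C0) for a given 2100 concentration."""
    return c * math.log(conc_2100 / conc_today)

# Observed-trend path (~580 ppmv) versus the IPCC central path (836 ppmv),
# both evaluated at the central coefficient c = 4.7.
print(f"580 ppmv: {warming(4.7, 580.0):.1f} degC")   # ~1.9 degC
print(f"836 ppmv: {warming(4.7, 836.0):.1f} degC")   # ~3.6 degC
```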
However, there are good grounds to doubt the IPCC’s current estimates of the effect of changes in CO2 concentration on temperature. Remarkably, none of the IPCC’s quinquennial climate assessments gives any account of the theoretical or empirical methods by which the impact of CO2 on temperature has been evaluated. Were laboratory experiments conducted? We are not told. By what method? We are not told.
What steps were taken to replicate the original experiments on which the IPCC’s values are predicated? We are not told. What are the methods by which each of the three key variables whose product is final climate sensitivity are evaluated? We are not told. How did the IPCC evaluate these methods theoretically or validate them empirically? We are not told. Can any of the three key variables – the CO2 forcing, the no-feedbacks climate sensitivity parameter, and the temperature-feedback multiplier – be directly measured? No.
In short, the chain of theoretical, empirical, and mathematical reasoning without which no scientist would regard the IPCC’s estimates of the effect of CO2 on temperature as having been reliably established in accordance with the long-established principles of the scientific method is largely absent from the official, quinquennial reports of the IPCC.
Neither of the most recent IPCC quinquennial assessments bothers even to devote a complete chapter to what is, after all, the central question in the entire debate – how does one evaluate the magnitude of the imagined effect of changes in CO2 concentration on temperature?
Disjointed and often mutually-contradictory particles of information are scattered about the IPCC’s documents. For instance, the 2001 report says a typical value of the final-climate-sensitivity parameter is λ = 0.5, yet the central estimate of λ implicit in the 2007 report is nearly double this value, and there is no discussion of the discrepancy.
The peer-reviewed literature is full of papers questioning the IPCC’s estimates of climate sensitivity to changes in CO2 concentration. Schwartz (2007); Wentz et al. (2007); Chylek et al. (2004, 2007); Lindzen (2008); Khilyuk & Chilingar (2007); and Monckton (2008) all find final climate sensitivity to be <1 °C at CO2 doubling, for different reasons. Indeed, low, harmless, beneficial climate sensitivity is almost becoming a consensus in the scientific literature.
The likelihood of an anthropogenic temperature increase >2 °C by the year 2100 is vanishingly small, since so high an increase would require not only an exponential rate of increase in CO2 concentration very considerably in excess of the recently-linear increase that is observed in the real world, but also an effect of CO2 on temperature for whose overstated magnitude no theoretical demonstration or empirical verification is provided anywhere in the climate assessments of the IPCC.
Global mean surface temperature is not rising as fast as the IPCC has predicted
The maximum rate of global temperature increase – equivalent to ~1.8 °C/century – occurred in the 1920s-30s, when humankind could not have had very much to do with it. Since 1980, when data reliability was enhanced by satellite data series, the rate of increase has been equivalent to ~1.5 °C per century, twice the increase of 0.74 °C/century from 1900-2000.
However, for the past eight full years (see Figure 3), global mean surface temperatures have been falling on a trend equivalent to >1 °C/century.
Eight straight years’ global temperature downtrend: The authoritative SPPI composite index of global mean surface temperature anomalies, taking the mean of two surface and two satellite datasets, shows a pronounced downtrend for eight full years. Not one of the climate models relied upon by the IPCC had predicted this downturn. The pink region shows the IPCC’s projected rates of temperature increase: the thick red straight line shows the least-squares linear regression on the composite temperature anomalies.
A few years’ downtrend cannot be naively extrapolated. However, taken with the fact that the 30-year uptrend was at a rate below the uptrend observed in the 1920s and 1930s, the current downtrend, notwithstanding the continuing increase in CO2 concentration, indicates a growing likelihood that CO2 cannot be influencing surface temperatures to the extent imagined by the IPCC.
No correlation, so no causation: Neither the global-temperature trend (red line) nor the global-CO2 trend (cyan line) falls within the regions that encompass the IPCC’s projected intervals. Furthermore, there is a startling absence of correlation between the CO2-concentration trend and the temperature trend, necessarily implying that – at least in the short term – there is little or no causative link between the two.
At present, therefore, there is remarkably little reason to suppose that global temperatures will rise in the 21st century at a rate any greater than the 0.74 °C/century observed in the 20th. The probability that temperatures will rise throughout the 21st century at a rate much above the 1.5 °C/century observed since 1980 (see Figure 5) is vanishingly small.
Behind the curve: For 29 years, temperatures have not risen as the IPCC had predicted.
The official, global temperature records overstate the true warming rate: McKitrick (2007) has demonstrated that the global-temperature datasets are unduly influenced by the directly-exothermic industrial activities of humankind. He has established a statistically-significant correlation between regional rates of temperature change and regional rates of economic growth. If there were no “urban heat-island” contamination of the global-temperature datasets, there would be no statistically-significant correlation. His startling conclusion is that, at least over land, the temperature datasets show temperatures rising at approximately double the true rate of increase – a 100% exaggeration of the true trend since 1980.
For many years, Meteorologist Anthony Watts has studied the placement of automatic temperature-monitoring stations. He has found that many of the stations are in formerly-rural areas that are now industrialized. Some stations are close to car parks, heating systems, air-conditioning outlets and other locations that cause artificial enhancement of the temperature readings. He has also studied the processing of the data from these stations to establish whether the correct adjustments for urbanization are being made. To ensure compatibility of recent with older data, the previous data should be held at their originally-recorded values, but, at sites that have become urbanized since earlier data were recorded, more recent data should be adjusted downward to compensate for the heat-island effect of surrounding urbanization.
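The adjustment principle described above can be sketched as a simple function: readings taken before a site became urbanized are held at their originally-recorded values, while later readings are reduced by an estimated heat-island offset. The urbanization date and the 0.5 °C offset below are hypothetical illustrative inputs, not measured quantities.

```python
def adjust_for_urbanization(records, urbanized_year, uhi_offset_degc):
    """Leave pre-urbanization readings untouched; reduce later readings
    by an estimated urban heat-island offset (hypothetical value)."""
    return [
        (year, temp - uhi_offset_degc if year >= urbanized_year else temp)
        for year, temp in records
    ]

# Illustrative station record: (year, annual mean degC); the site is
# assumed (hypothetically) to have become urbanized around 1990.
raw = [(1935, 14.2), (1975, 14.4), (2005, 15.1)]
print(adjust_for_urbanization(raw, urbanized_year=1990, uhi_offset_degc=0.5))
```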
However, in one notoriously-inaccurate dataset administered by a politicized scientist known to have close and long-established financial and political links with Al Gore and John Kerry, the reverse policy is followed. Older data are tampered with to reduce temperatures in the 1930s (which were in reality greater than today’s temperatures), while today’s readings (which ought to have been reduced to compensate for the effects of recent urbanization surrounding the stations) have been increased.
An instance of this tampering is the Santa Rosa station in the US, originally on a deserted beach by a lake but now surrounded by a busy boatyard, with the thermometer very close to a dark-painted, upturned boat. This change in the local circumstances, however, has not been taken into account in the temperature readings (Figures 6a, 6b) –
Figures 6a (above) and 6b (below)
Blink comparator: Lift Figures 6a and 6b out of this document and drop them into a PowerPoint presentation as successive slides, then toggle them using the arrow-keys so that first one and then the other repeatedly appears. Using the computer thus as a blink comparator, see the distortion of the raw Santa Rosa data after processing to decrease the 1930s temperature readings and to enhance the current readings to make today seem artificially warmer than the 1930s.
How to turn Santa Rosa from cooling to “warming”: just rejig the data.
The question arises: does this tampering with the data influence the global temperature dataset? The answer will be found by blink-comparison of Figures 7a and 7b, which compare the output of the NASA GISS global-temperature dataset as it was in 1999 to the output of the same dataset today –
Figures 7a (above) and 7b (below)
Blink comparator 2: compare these two successive PowerPoint slides, nine years apart, to see how more recent data have been tampered with to a greater extent than earlier data, so as to increase still further the apparent (but not real) rate of growth in global mean surface temperatures. The NASA GISS global-temperature dataset does not form part of the SPPI composite anomaly index.
We conclude that even the apparently-definitive, real-world global temperature datasets cannot be safely relied upon. Even if the tampering is justifiable, the amount of processing is very great and, accordingly, the datasets may – to an unknown but considerable extent – be measuring not the true change in temperature but rather the biases in processing.
Recent temperature changes in perspective
It is often said in official circles that the recent rate of increase in global mean surface temperatures is unprecedented, supposedly implying a clear anthropogenic signal (Fig. 8).
The endpoint fallacy: Dr. Rajendra Pachauri likes to use the above IPCC graph to demonstrate his contention that global temperatures are rising at an ever-increasing rate. The deception relies upon the careful selection of endpoints for each successive linear regression.
However, a glance at the IPCC’s own graph of global temperature changes over the past 150 years demonstrates that this is not the case (Figure 9):
Pre-anthropogenic precedents: From 1850-1880 and from 1910-1940, the warming rate was identical to that from 1970-2000. The two earlier periods of warming occurred before humankind could have had any appreciable influence. Note how the IPCC graph is truncated to conceal the post-1998 downtrend.
To illustrate the endpoint fallacy, we shall show the global mean surface temperature trends since 25 years ago, 20 years ago, 15 years ago, 10 years ago and 5 years ago –
The successive trends show an accelerating decline, so that in the past five years global temperatures have been falling at a rate equivalent to >3 °C/century. The table illustrates the extreme sensitivity of the temperature trend to the choice of endpoints.
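The endpoint sensitivity described above can be illustrated numerically. The anomaly series below is synthetic (the real SPPI composite data are not reproduced here): a mild long-term rise followed by a short recent decline. Fitting least-squares trends over shorter and shorter trailing windows changes both the size and the sign of the apparent trend.

```python
# Endpoint-fallacy illustration with a SYNTHETIC anomaly series:
# a mild 20-year rise followed by a 5-year decline.  The fitted
# "trend" depends heavily on the chosen start point.

def least_squares_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1984, 2009))                    # 25 synthetic years
anoms = [0.01 * (y - 1984) for y in years[:-5]] + \
        [0.20 - 0.03 * i for i in range(5)]        # recent 5-year decline

for window in (25, 10, 5):
    ys, ts = years[-window:], anoms[-window:]
    slope = least_squares_slope(ys, ts) * 100      # degC per century
    print(f"last {window:2d} years: {slope:+.1f} degC/century")
# -> roughly +0.7, -0.7 and -3.0 degC/century respectively
```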
Plainly, a longer perspective is needed. Akasofu (2008) finds that for 300 years, between the end of the 70-year Maunder Minimum in 1700 and the end of the 70-year solar Grand Maximum in 2000, global temperatures rose at 0.5-0.7 °C/century.
For 275 years of this 300-year period, humankind cannot have had any significant influence over the trend. In the past 25 years there has been no anthropogenic acceleration of this long-running trend. The increase in temperatures over the past 300 years parallels, and was probably chiefly caused by, a very rapid increase in solar activity, as measured by sunspot numbers (Fig. 10):
Solar activity, as measured by sunspot numbers, increased sharply between the end of the sunspotless Maunder Minimum in 1700 and the peak of the 70-year Solar Grand Maximum in the early 1960s. During the Grand Maximum, solar activity was greater, and for longer, than during almost any similar previous period in the 11,400 years since the end of the last Ice Age. Source: Hathaway et al., 2004.
Travelling still further back, temperatures throughout the world were warmer during the mediaeval warm period a millennium ago than they are at present. The IPCC’s 1990 report showed the warm period clearly in a graph, but by the 2001 report a new graph had been substituted, in which the mediaeval warm period had been made to disappear.
This deception was achieved by the use of defective bristlecone-pine temperature proxies, which the IPCC had earlier recommended should not be used for pre-instrumental temperature reconstructions because the breadth of the tree-rings is sensitive not so much to changes in temperature as to changes in carbon dioxide concentration and in rainfall. Though other proxies were also used in the calculation of the IPCC’s 2001 graph, those proxies that showed the clearest apparent increase in temperature in the late 20th century were given almost 400 times as much weighting as those that revealed the presence of the mediaeval warm period.
Furthermore, the authors of the graph removed all of the proxy data for the mediaeval warm period itself, while writing that they had included it, and substituted some “estimates” of their own without admitting that that was what they had done. The effect of the “estimates” was to erase the mediaeval warm period. The true data were then hidden on the computer of the graph’s compilers in a file labeled “CENSORED_DATA”. If the “estimates” were replaced by the censored data, the mediaeval warm period promptly reappeared. If the defective bristlecone-pine proxies were removed and the remaining proxies used on their own, the mediaeval warm period reappeared.
Furthermore, the algorithm that generated the defective graph could be relied upon to eradicate the mediaeval warm period and to show unprecedented warming at the end of the 20th century even if random red noise were fed into it rather than genuine proxy temperature data.
McIntyre & McKitrick (2005) exposed some of these defects in a paper published in Geophysical Research Letters. Thereupon an investigation was mounted by the US National Academy of Sciences, which found that the graph had “a validation skill not significantly different from zero”: however, this conclusion – to the effect that the graph was worthless – did not appear in the press release issued by the NAS, which instead said the graph’s finding was “plausible”.
An investigation was also conducted by three statisticians acting for the US House of Representatives. This investigation, too, found the graph valueless, and also noted that a suspiciously large number of papers apparently supporting the graph’s conclusions, that had appeared in the journals following McIntyre & McKitrick’s paper, had been written by former co-authors of those who had fabricated the defective graph.
Recently, the graph’s inventors have published a further paper purporting to provide evidence that their original graph was correct. Once again, however, if the unsuitable bristlecone-pine tree-rings are removed, the remaining proxies reveal that the mediaeval warm period was real, was global, and was warmer than the present.
Notwithstanding the defects in the graph, which has been more completely exposed and shown to be false than almost any scientific result over the past century, the IPCC continues to rely upon it, and to deny the existence of the mediaeval warm period.
The concentration on the past one or two thousand years that has arisen from the controversy over the defective graph purporting to abolish the mediaeval warm period has concealed a fact that swiftly re-establishes the correct perspective.
For if one considers the entire Holocene period since the end of the last Ice Age 11,400 years ago, temperatures have been warmer than the present almost throughout (e.g. Curry & Clow, 1997: figure 11):
Warmer than today: most of the period since the end of the last Ice Age has been warmer than the present by several degrees Celsius.
Seen in the geological perspective of the last 17,000 years, the 300 years of recent warming, nearly all of which must have been natural, for we could not have had any significant influence except in the past 25 years, are manifestly insignificant.
One should also ask whether each of the recent interglacial periods was warmer than the present. Interglacial periods have occurred with apparent regularity approximately every 125,000 years. The record, from Petit et al. (1999), is in Figure 12:
Today’s temperatures are unexceptional: it was warmer in each of the past four interglacial periods than it is today. Note the close correlation between CO2 concentration and temperature: but it was temperature that changed first.
We have briefly considered, and treated as intriguing but peripheral, Professor Lindzen’s infrared-iris theory and Dr. Svensmark’s cosmic-ray theory. We have then demonstrated that, if CO2 concentration continues to rise more slowly than the IPCC had predicted, and if climate sensitivity to CO2 concentration is in any event well below the IPCC’s projected range, the likelihood of any “global warming” >2 °C/century to 2100 is vanishingly small.
We have also demonstrated that official sources have relied upon questionable and occasionally downright dishonest methods to inflate the observed rate of temperature increase, to create the false impression that the rate of increase is itself rising when an identical argument can be used to demonstrate that it is falling, to diminish earlier and warmer temperatures in this century, to abolish the mediaeval warm period, and to divert attention away from the fact that throughout almost all of the Holocene, and throughout all four previous interglacial periods, surface temperatures were considerably warmer than they are today.
For reasons of length, the present paper cannot consider the numerous and flagrant official as well as unofficial distortions, inflations and exaggerations of the supposed consequences of “global warming”: the present analysis has been confined only to the analysis of its imagined causes. This note should, however, be sufficient to convince the open-minded and diligent reader that, if so many artful steps have been and are being taken to falsify and exaggerate the scientific truth, perhaps the truth is not as those who are so ingeniously and persistently tampering with the science and the data would have us believe.
Monckton of Brenchley
Carie, Rannoch, Scotland, PH17 2QJ
30 December 2008