James Annan on 2.5 deg C

January 3, 2008

SPPI

I’ve been seeking an engineering-quality exposition of how 2.5 deg C is derived from doubled CO2 for some time. I posted up Gerry North’s suggestion here, which was an interesting article but hardly a solution to the question. I’ve noted that Ramanathan and the Charney Report in the 1970s discuss the topic, but these are hardly up-to-date or engineering quality. Schwartz has a recent journal article deriving a different number and, again, this is hardly a definitive treatment. At AGU, I asked Schwartz after his presentation for a reference setting out the contrary point of view, but he did not provide one. I’ve emailed Gavin Schmidt asking for a reference and got no answer.

James Annan, a thoughtful climate scientist (see link to his blog in left frame), recently sent me an email trying to answer my long-standing inquiry. While it was nice of him to offer these thoughts, an email hardly counts as a reference in the literature. Since James did not include a relevant reference, I presume that he feels that the matter is not set out in existing literature. Secondly, a two-page email is hardly an “engineering quality” derivation of the result. By “engineering quality”, I mean the sort of study that one would use to construct a mining plant, oil refinery or auto factory – smaller enterprises than Kyoto.

Part of the reason that my inquiry seems to fall on deaf ears is that climate scientists are so used to the format of little Nature and Science articles that they seem not to understand what an engineering-quality exposition would even look like.

Anyway, on to James, who writes:

I noticed on your blog that you had asked for any clear reference providing a direct calculation that climate sensitivity is 3C (for a doubling of CO2). The simple answer is that there is no direct calculation to accurately prove this, which is why it remains one of the most important open questions in climate science.

We can get part of the way with simple direct calculations, though. Starting with the Stefan-Boltzmann equation,

S(1-a)/4 = s T_e^4

where S is the solar constant (1370 Wm^-2), a the planetary albedo (0.3), s (sigma) the S-B constant (5.67×10^-8) and T_e the effective emitting temperature, we can calculate T_e = 255K (from which we also get the canonical estimate of the greenhouse effect as 33C at the surface).
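
(Not part of James’s note, but to make the arithmetic easy to check, here is a minimal sketch of this step in Python; the ~288K surface temperature used for the 33C figure is my assumption, not his.)

```python
# Minimal sketch of the energy-balance step quoted above (editor's illustration).
S = 1370.0        # solar constant, W m^-2
a = 0.3           # planetary albedo
sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

# Absorbed solar per unit area equals emitted longwave: S(1-a)/4 = sigma * T_e^4
T_e = (S * (1 - a) / (4 * sigma)) ** 0.25
print(round(T_e))               # ~255 K effective emitting temperature

T_surface = 288.0               # assumed canonical global-mean surface temperature, K
print(round(T_surface - T_e))   # ~33 C greenhouse effect at the surface
```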

The change in outgoing radiation as a function of temperature is the derivative of the RHS with respect to temperature, giving

dF/dT = 4 s T_e^3 ≈ 3.76 Wm^-2 K^-1

This is the extra Wm^-2 emitted per degree of warming, so if you are prepared to accept that we understand purely radiative transfer pretty well and thus the conventional value of 3.7Wm^-2 per doubling of CO2, that conveniently means a doubling of CO2 will result in a 1C warming at equilibrium, *if everything else in the atmosphere stays exactly the same*.
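
(Again, not part of James’s note: a minimal sketch in Python of the no-feedback step, taking the 255K value from above and the conventional 3.7 Wm^-2 forcing as given.)

```python
# Minimal sketch of the no-feedback ("everything else stays the same") step
# quoted above (editor's illustration, not from the email).
sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_e = 255.0       # effective emitting temperature, K

# Derivative of sigma*T^4: extra outgoing W m^-2 per degree of warming
dF_dT = 4 * sigma * T_e ** 3
print(round(dF_dT, 2))            # ~3.76 W m^-2 per K

# Conventional forcing for doubled CO2, taken as given in the text
delta_F = 3.7                     # W m^-2
print(round(delta_F / dF_dT, 1))  # ~1.0 C no-feedback equilibrium warming
```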

But of course there is no strong reason to expect everything else to stay exactly the same, and at least one very good argument why we might expect a somewhat increased warming: warmer air can hold more water vapour, and I’m sure all your readers will be quick to mention that water vapour is the dominant greenhouse gas anyway. We don’t know the size of this effect precisely, but a constant *relative* humidity seems like a plausible estimate, and GCM output also suggests this is a reasonable approximation (AIUI observations are generally consistent with this, I’m not sure how precise an estimate they can provide though), and sticking this in to our radiation code roughly doubles the warming to 2C for the same CO2 change. Of course this is not a precise figure, just an estimate, but it is widely considered to be a pretty good one. The real wild card is in the behaviour of clouds, which have a number of strong effects (both on albedo and LW trapping) and could in theory cause a large further amplification or suppression of AGW-induced warming. High thin clouds trap a lot of LW (especially at night when their albedo has no effect) and low clouds increase albedo. We really don’t know from first principles which effect is likely to dominate, we do know from first principles that these effects could be large, given our current state of knowledge. GCMs don’t do clouds very well but they do mostly (all?) suggest some further amplification from these effects. That’s really all that can be done from first principles.

If you want to look at things in the framework of feedback analysis, there’s a pretty clear explanation in the supplementary information to Roe and Baker’s recent Science paper. Briefly, if we have a blackbody sensitivity S0 (~1C) when everything else apart from CO2 is held fixed, then we can write the true sensitivity S as

S = S0 / (1 - f_1 - f_2 - … - f_n)

where the f_i are the individual feedback factors arising from the other processes. If f_1 for water vapour is 0.5, then it only takes a further factor of 0.17 for clouds (f_2, say) to reach the canonical S=3C value. Of course to some extent this may look like an artefact of the way the equation is written, but it’s also a rather natural way for scientists to think about things and explains how even a modest uncertainty in individual feedbacks can cause a large uncertainty in the overall climate sensitivity.
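
(Not part of James’s note: the arithmetic in that paragraph is easy to check. A minimal sketch in Python, also illustrating his point that a modest spread in the feedback factors produces a large, skewed spread in S; the particular cloud values below are mine, purely for illustration.)

```python
# Minimal sketch of the feedback-factor arithmetic quoted above
# (editor's illustration, not from Roe and Baker or the email).
S0 = 1.0  # blackbody sensitivity, C per doubling of CO2

def sensitivity(feedbacks):
    """True sensitivity S = S0 / (1 - sum of feedback factors f_i)."""
    return S0 / (1 - sum(feedbacks))

print(round(sensitivity([0.5, 0.17]), 1))  # ~3.0 C with f_1=0.5 (water vapour), f_2=0.17 (clouds)

# A modest spread in the assumed cloud feedback gives a large, skewed spread in S
for f_cloud in (0.0, 0.1, 0.17, 0.25, 0.3):
    print(f_cloud, round(sensitivity([0.5, f_cloud]), 1))
# 0.0 -> 2.0, 0.1 -> 2.5, 0.17 -> 3.0, 0.25 -> 4.0, 0.3 -> 5.0
```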

On top of this rather vague forward calculation there are a wide range of observations of how the climate system has responded to various forcing perturbations in the past (both recent and distant), all of which seem to match pretty well with a sensitivity of close to 3C. Some analyses give a max likelihood estimate as low as 2C, some are more like 3.5, all are somewhat skewed with the mean higher than the maximum likelihood.

There is still plenty of argument about how far from 3C the real system could plausibly be believed to be. Personally, I think it’s very unlikely to be far either side and if you read my blog you’ll see why I think some of the more “exciting” results are seriously flawed. But that is a bit of a fine detail compared to what I have written above. Assuming I’ve not made any careless error, I think what I’ve written is entirely uncontentious among mainstream climate scientists (I certainly intended it that way).
Feel free to post and/or pick at as you please (maybe you’d like to LaTeX the maths first).

James

A Few Comments

As noted above, James’s note contains only one (not very useful) reference and so fails to answer my request for something in the literature.

Annan says:

if you are prepared to accept that we understand purely radiative transfer pretty well and thus the conventional value of 3.7Wm^-2 per doubling of CO2

I do accept that we know radiative transfer of CO2 “pretty well”. I’m not as convinced that all the details of water vapor are understood as well. IPCC TAR GCMs all used a HITRAN version that included an (undisclosed) clerical error in water vapor NIR that amounted to about 4 Wm^-2. This error had been identified prior to IPCC TAR, but not in time to re-do the GCMs. The error was not disclosed in IPCC TAR. The water vapor continuum seems to have a certain amount of hair on it yet.

Worse, as far as I’ve been able to determine, radiative transfer theory is not itself sufficient to yield the “conventional value of 3.7 Wm^-2 per doubling of CO2”. Getting to that value requires assumptions about the atmosphere, lapse rates and things like that – I’m not saying that any of these calculations are poorly done or incorrect, only that they are not simply a matter of radiative transfer.
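
The 3.7 Wm^-2 figure itself is usually quoted from simplified expressions fitted to detailed radiative transfer calculations for an assumed atmospheric profile, such as the logarithmic form of Myhre et al. (1998). A minimal sketch of that expression (my illustration, not Annan’s):

```python
import math

# Simplified logarithmic CO2 forcing expression (Myhre et al. 1998):
# a fit to detailed radiative transfer calculations for an assumed
# atmospheric profile, not a first-principles result.
def co2_forcing(C, C0, alpha=5.35):
    """Radiative forcing in W m^-2 for a CO2 change from C0 to C (ppm)."""
    return alpha * math.log(C / C0)

print(round(co2_forcing(560.0, 280.0), 2))  # ~3.71 W m^-2 for a doubling
```
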
Next, James identifies a second important assumption in the modern calculations:

constant *relative* humidity seems like a plausible estimate and GCM output also suggests this is a reasonable approximation

It may well be a “plausible estimate” but something better than this is required. I cannot imagine someone saying this in an engineering study. Lots of things “seem plausible” but turn out to be incorrect. That’s why you have engineers.

Annan goes on to say “GCM output also suggests this is a reasonable approximation”. I’m not entirely sure what he means by this, as he did not provide any references. I interpret the statement to mean that GCMs use the constant relative humidity assumption and yield plausible results. Could one vary the constant relative humidity assumption and still get reasonable results from a GCM or a re-tuned GCM? I don’t know. Have people attempted to do so and failed? I don’t recall seeing references to such null experiments in AR4 or elsewhere, but I might have missed the discussion as it’s not a section that I’ve read closely so far.

In an interesting Crowley paleoclimate article (that I’ve not discussed yet but will at some point), he questions this particular assumption on the basis that allowing for varying lapse rates could explain otherwise puzzling paleo data.

Obviously, in an engineering-quality study, the constant relative humidity assumption would need to be thoroughly aired. I think that this is probably a very important topic and might take dozens of pages (if not a few hundred). The couple of sentences offered here by Annan amount to arm-waving through the problem.

Clouds

Annan says quite candidly:

The real wild card is in the behaviour of clouds, which have a number of strong effects (both on albedo and LW trapping) and could in theory cause a large further amplification or suppression of AGW-induced warming. High thin clouds trap a lot of LW (especially at night when their albedo has no effect) and low clouds increase albedo. We really don’t know from first principles which effect is likely to dominate, we do know from first principles that these effects could be large, given our current state of knowledge. GCMs don’t do clouds very well but they do mostly (all?) suggest some further amplification from these effects. That’s really all that can be done from first principles.

If we go back to the Charney Report in 1979, clouds were even then identified as the major problem. Given the seeming lack of progress in nearly 30 years, one wonders whether GCMs are really the way to go in trying to measure CO2 impact and whether irrelevant complications are being introduced into the assessment. There was an interesting discussion of cloud feedbacks at RC about a year ago, in which Isaac Held expressed astonishment when a lay commenter observed to him that cloud feedbacks in the models were all positive – Held apparently expecting the effects to be randomly distributed between positive and negative.
James says:

We really don’t know from first principles which effect is likely to dominate, we do know from first principles that these effects could be large

This is a pretty disquieting statement. If we don’t know this and if this is needed to assess doubled CO2, how does one get to an engineering-quality study?

As far as I’m concerned, James’ closing paragraph about feedbacks is tautological: if you know the feedback ratio, you know the result. But you don’t know the feedback ratios, so what has James done here other than re-state the problem?

Thus, James’ exposition, while kindly meant, is not remotely close to answering my question. So the search for an engineering-quality exposition continues.

As I’ve said on many occasions, I do not jump from the seeming absence of a reference to the conclusion that such an exposition is impossible – a jump that readers make much too quickly in my opinion. Murray Pezim, a notorious Vancouver stock promoter, actually had a couple of important mineral discoveries (e.g. Hemlo). I do think that the IPCC has been seriously negligent in failing to provide such an exposition. Well before the scoping of IPCC AR4, I corresponded with Mike MacCracken and suggested that IPCC AR4 should include an exposition of how doubled CO2 leads to a 2.5-3 deg C overall temperature increase – the sort of exposition that readers here are thirsting for.

He undertook to pass the suggestion on to Susan Solomon. However, this idea was apparently rejected somewhere along the process. The first chapter of AR4 consists instead of a fatuous and self-congratulatory history of climate science that has no place whatever in a document addressed to policy-makers.

A side-effect of this IPCC failure is perhaps the dumbing down of the AGW debate, giving rise to shallow and opportunistic expositions like An Inconvenient Truth, in which we get polar bears, hockey sticks and Katrina, all artfully crafted to yield a promotional message. This places thoughtful climate scientists in a quandary: by and large, they agree with AIT’s conclusions but not its presentation and details, and so they have tended to stay mute on AIT.

Source:
Climate Audit http://www.climateaudit.org/?p=2528#more-2528