Climate Sensitivity and Politics

James Annan has a post about his recent paper with J. C. Hargreaves1, in which they combine three available climate sensitivity estimates using Bayesian probability to get a better-constrained estimate of the sensitivity of global mean climate to a doubling of CO2.

For the record, what they've done is take estimates of sensitivity from studies which were

  1. trying to recreate 20th century warming, which they characterise as (1,3,10) - most likely 3C, but with the 95% limits lying at 1C and 10C,

  2. evaluating the cooling response to volcanic eruptions - characterised as (1.5,3,6), and

  3. recreating the temperature and CO2 conditions associated with the last glacial maximum - (-0.6,2.7,6.1).

The functional shapes are described in the paper, and they use Bayes' theorem to come up with a constrained prediction of (1.7,2.9,4.9), going on to state that they are confident the true upper limit is probably lower still. (A later post uses even more data to bring the upper limit of the 95% confidence interval down to 3.9C.)
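For non-Bayesians, here is a minimal sketch of the machinery involved (this is not the authors' code, and the two-piece Gaussian likelihood shapes below are illustrative stand-ins for the functional forms actually used in the paper): treat each study as an independent likelihood over sensitivity, multiply them together under a flat prior, normalise, and read off the percentiles.

    import numpy as np

    # Grid of candidate climate sensitivities (deg C per doubling of CO2)
    S = np.linspace(0.0, 12.0, 1201)
    dS = S[1] - S[0]

    def two_piece_gaussian(s, low, mode, high):
        # Illustrative likelihood shape (not the paper's): two half-Gaussians
        # joined at the mode, with widths chosen so the quoted bounds sit
        # roughly two sigma either side of it.
        sigma = np.where(s < mode, (mode - low) / 2.0, (high - mode) / 2.0)
        return np.exp(-0.5 * ((s - mode) / sigma) ** 2)

    # The three (5%, mode, 95%) triplets quoted above
    constraints = [(1.0, 3.0, 10.0),   # 20th century warming
                   (1.5, 3.0, 6.0),    # volcanic cooling
                   (-0.6, 2.7, 6.1)]   # last glacial maximum

    # Bayes' theorem with a flat prior on the grid: multiply, then normalise
    posterior = np.ones_like(S)
    for low, mode, high in constraints:
        posterior *= two_piece_gaussian(S, low, mode, high)
    posterior /= posterior.sum() * dS

    # Read off percentiles from the cumulative distribution
    cdf = np.cumsum(posterior) * dS
    for q in (0.05, 0.5, 0.95):
        print(f"{q:.0%} quantile: {np.interp(q, cdf, S):.1f} C")

The point is not the particular numbers this toy produces, but that each additional independent constraint narrows the posterior, which is why the combined estimate is tighter than any of the three inputs.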

In the comments to the first post, Steve Bloom asks the question:

Here's the key question policy-wise: Can we start to ignore the consequences of exceeding 4.5C for a doubling? What percentage should we be looking for to make such a decision? And all of this begs the question of exactly what negative effects we might get at 3C or even lower. My impression is that the science in all sorts of areas seems to be tending toward more harm with smaller temp increases. Then there's the other complicating question of how likely it is we will reach doubling and if so when.

This is pretty hard to answer, because it's all bound up in risk. As I said, quoting the Met Office, when I first started reporting on James' predictability posts:

as a general guide one should take action when the probability of an event exceeds the ratio of protective costs to losses (C/L) ... it's a simple betting argument.

So, rather than directly answering Steve's question, for me the key issue is: what is the response to a climate sensitivity of at least 1.7C? We're pretty confident (95% sure) that we have to deal with at least that much, so we already know we have to do something. What to do next boils down to evaluating protective costs against losses.
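To make the betting argument concrete, here is a toy version of the C/L rule (the numbers are invented purely for illustration, not estimates of real costs or losses):

    # Toy expected-loss comparison for the "act when P > C/L" rule.
    # All numbers are invented for illustration only.
    protective_cost = 1.0   # cost of acting now (arbitrary units)
    potential_loss = 10.0   # loss incurred if the event happens and we did nothing
    p_event = 0.95          # e.g. our confidence that sensitivity is at least 1.7C

    threshold = protective_cost / potential_loss   # C/L
    print(f"C/L = {threshold}, P = {p_event}, act = {p_event > threshold}")

With a probability of 0.95, protection only has to cost less than 95% of the loss for action to be the rational bet.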

Unfortunately it seems easier to quantify the costs of doing something than it is to quantify the losses from doing nothing, so people use that as an excuse for inaction. The situation is exacerbated by the fact that we're going to find it hard to evaluate either without accurate regional predictions of climate change (and concomitant uncertainty estimates), and regrettably, in their current state, we simply don't have enough confidence in our regional models. They need better physics, higher resolution, more ensembles, and more analysis. So while it sounds like more special pleading ("give us some more money and we'll tell you more"), that's where we're at ...

... but that's not an excuse to do nothing; it just means we need to parallelise (to borrow the computer-geek term) doing something (adaptation and mitigation) with refining our predictions and our estimates of costs and potential losses.

1: I'll replace the link with the DOI of the original when a) it appears, and b) I notice. (ret).

Categories: climate environment

This page last modified Tuesday 07 March, 2006
DISCLAIMER: This is a personal blog. Nothing written here reflects an official opinion of my employer or any funding agency.