Limiting global warming to 2 °C – why Victor and Kennel are wrong

Ethical Markets posts this debate on the 2 °C limit while we continue to focus on what all 191 UN member countries already agree: to shift away from fossil energy and non-renewables to cleaner, inclusive, sustainable green economies worldwide. This is what we track in our Green Transition Scoreboard®, now showing $5.7 trillion of private investments in green sectors worldwide already in the pipeline. Our mid-2014 update, “Green Bonds Growing Green Infrastructure,” is now gaining media attention, as in Institutional Investor. – Hazel Henderson, Editor


Limiting global warming to 2 °C – why Victor and Kennel are wrong

— Stefan Rahmstorf @ 1 October 2014

In a comment in Nature titled Ditch the 2 °C warming goal, political scientist David Victor and retired astrophysicist Charles Kennel advocate just that. But their arguments don’t hold water.

It is clear that the opinion article by Victor & Kennel is meant to be provocative. But even when making allowances for that, the arguments which they present are ill-informed and simply not supported by the facts. The case for limiting global warming to at most 2°C above preindustrial temperatures remains very strong.

Let’s start with an argument that they apparently consider especially important, given that they devote a whole section and a graph to it. They claim:

The scientific basis for the 2 °C goal is tenuous. The planet’s average temperature has barely risen in the past 16 years.

They fail to explain why short-term global temperature variability would have any bearing on the 2 °C limit – and indeed this is not logical. The short-term variations in global temperature, despite causing large variations in short-term rates of warming, are very small – their standard deviation is less than 0.1 °C for the annual values and much less for decadal averages (see graph – this can just as well be seen in the graph of Victor & Kennel). If this means that due to internal variability we’re not sure whether we’ve reached 2 °C warming or just 1.9 °C or 2.1 °C – so what? This is a very minor uncertainty. (And as our readers know well, picking 1998 as start year in this argument is rather disingenuous – it is the one year that sticks out most above the long-term trend of all years since 1880, due to the strongest El Niño event ever recorded.)

Global-mean surface temperature 1880-2013 (NASA GISS data). Grey line shows annual values, the blue line a LOESS smooth to highlight the long-term evolution. The latter is well reproduced by climate models when driven by all the known forcings (see Fig. TS-9 of the IPCC AR5). Note that the annual values typically stray by only ~0.1 °C from this smooth evolution due to natural variability such as the El Niño – Southern Oscillation. The year 1998 sticks out more than any other year above the blue line – even so, 2010 and 2005 are the warmest years on record. Values are given relative to a preindustrial baseline, the exact definition of which may be debated but only adds a minor uncertainty – here it was chosen as the mean temperature of 1880-1900.
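The ~0.1 °C figure is easy to verify numerically. The sketch below uses synthetic data (an assumption – not the actual GISS series) and a centered moving average as a crude stand-in for a proper LOESS fit; it smooths a noisy warming series and measures how far the annual values stray from the smooth:

```python
import math
import random

random.seed(0)

# Synthetic stand-in for an annual temperature series (not real data):
# a slow warming trend plus ~0.1 °C of year-to-year noise.
years = list(range(1880, 2014))
trend = [0.8 * ((y - 1880) / (2013 - 1880)) ** 2 for y in years]  # smooth rise to ~0.8 °C
annual = [t + random.gauss(0, 0.1) for t in trend]

def moving_average(series, window=11):
    """Centered moving average as a crude stand-in for a LOESS smooth."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

smooth = moving_average(annual)
residuals = [a - s for a, s in zip(annual, smooth)]
sd = math.sqrt(sum(r * r for r in residuals) / len(residuals))
print(f"std dev of annual values about the smooth: {sd:.3f} °C")  # ~0.1 °C
```

The residual scatter comes out near 0.1 °C, the same order as the natural variability in the real record – small compared with the 2 °C scale of the limit itself.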

The logic of 2 °C

Climate policy needs a “long-term global goal” (as the Cancun Agreements call it) against which the efforts can be measured to evaluate their adequacy. This goal must be consistent with the concept of “preventing dangerous climate change” but must be quantitative. Obviously it must relate to the dangers of climate change and thus result from a risk assessment. There are many risks of climate change (see schematic below), but to be practical, there cannot be many “long-term global goals” – one needs to agree on a single indicator that covers the multitude of risks. Global temperature is the obvious choice because it is a single metric that is (a) closely linked to radiative forcing (i.e. the main anthropogenic interference in the climate system) and (b) the quantity on which most impacts and risks depend. In practical terms this also applies to impacts that depend on local temperature (e.g. Greenland melt), because local temperatures to a good approximation scale with global temperature (that applies in the longer term, e.g. for 30-year averages, but of course not for short-term internal variability). One notable exception is ocean acidification, which is not a climate impact but a direct impact of rising CO2 levels in the atmosphere – it is to my knowledge currently not covered by the UNFCCC.


From emissions to impacts.

Once an overall long-term goal has been defined, it is a matter of science to determine what emissions trajectories are compatible with this, and these can and will be adjusted as time goes by and knowledge increases.

Why not use limiting greenhouse gas concentrations to a certain level, e.g. 450 ppm CO2-equivalent, as the long-term global goal? This option has its advocates and has been much discussed, but it is one step further removed from the actual impacts and risks we want to avoid along the causal chain shown above, so an extra layer of uncertainty is added. This uncertainty is that in climate sensitivity, where the overall range spans a factor of three (1.5-4.5 °C) according to the IPCC. This would mean that as scientific understanding of climate sensitivity evolves in coming decades, one might have to re-open negotiations about the “long-term global goal”. With the 2 °C limit that is not the case – the strategic goal would remain the same; only the specific emissions trajectories would need to be adjusted in order to stick to this goal. That is an important advantage.
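To see how much uncertainty a concentration goal inherits, one can apply the standard logarithmic relation between CO2 concentration and equilibrium warming. This is a rough illustration with round numbers chosen for the example (the 450 ppm target and preindustrial 280 ppm are assumptions for the sketch, not figures from the text):

```python
import math

# Sketch: the factor-of-three spread in climate sensitivity (1.5-4.5 °C
# per CO2 doubling) turns a single concentration target into a wide
# range of warming outcomes, via deltaT = S * log2(C / C0).
C0 = 280.0   # preindustrial CO2-equivalent, ppm (assumed round number)
C = 450.0    # hypothetical concentration target, ppm

for sensitivity in (1.5, 3.0, 4.5):  # °C per doubling, IPCC range
    warming = sensitivity * math.log(C / C0, 2)
    print(f"S = {sensitivity} °C/doubling -> equilibrium warming ~{warming:.1f} °C")
```

For these numbers the same 450 ppm goal implies anywhere from roughly 1 °C to over 3 °C of equilibrium warming – which is exactly why a concentration target would have to be renegotiated as sensitivity estimates evolve, while a temperature limit would not.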

2 °C is feasible

Victor & Kennel claim that the 2 °C limit is “effectively unachievable”. In support they offer only a self-citation to a David Victor article, but in fact they disagree with the vast majority of the scientific literature on this point. The IPCC has only this year summarized this literature, finding that the best estimate of the annual cost of limiting warming to 2 °C is a reduction of consumption growth by 0.06 percentage points (1). This implies just a minor delay in economic growth: if consumption would normally grow by, say, 2% per year, the cost of the transformation would reduce this to 1.94% per year. This can hardly be called prohibitively expensive. When Victor & Kennel claim holding the 2 °C line is unachievable, they are merely expressing a personal, pessimistic political opinion. This political pessimism may well be justified, but it should be expressed as such and not be confused with a geophysical, technological or economic infeasibility of limiting warming to below 2 °C.
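The arithmetic behind that “minor delay” is easy to check. A small sketch, using the hypothetical 2%-per-year example above compounded over a century:

```python
import math

# Sketch of the cost arithmetic: shaving 0.06 percentage points off a
# 2 %/yr consumption growth rate, compounded over 100 years.
years = 100
baseline_growth = 0.02      # 2.00 %/yr without mitigation (assumed example rate)
mitigated_growth = 0.0194   # 1.94 %/yr with mitigation

baseline = (1 + baseline_growth) ** years
mitigated = (1 + mitigated_growth) ** years
print(f"consumption multiple without mitigation: {baseline:.2f}x")
print(f"consumption multiple with mitigation:    {mitigated:.2f}x")

# Extra years until the mitigated path reaches the baseline's level:
delay = math.log(baseline) / math.log(1 + mitigated_growth) - years
print(f"delay in reaching the same consumption level: ~{delay:.1f} years")
```

Consumption still grows roughly sevenfold on either path; the mitigated economy simply reaches the same level about three years later – a useful way to see why “effectively unachievable” is a political judgment rather than an economic one.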

Because Victor & Kennel complain about policy makers “chasing an unattainable goal”, they apparently assume that their alternative proposal of focusing on specific climate impacts would lead to a weaker, more easily attainable limit on global warming. But they provide no evidence for this, and most likely the opposite is true. One needs to keep in mind that 2 °C was already devised based on the risks of certain impacts, as epitomized in the famous “reasons of concern” and “burning embers” (see the IPCC WG2 SPM page 13) diagrams of the last IPCC reports, which lay out major risks as a function of global temperature. Several major risks are considered “high” already for 2 °C warming, and if anything, many of these assessed risks have increased from the 3rd to the 4th to the 5th IPCC reports, i.e. may arise already at lower temperature values than previously thought.

One of the rationales behind 2 °C was the AR4 assessment that above 1.9 °C global warming we start running the risk of triggering the irreversible loss of the Greenland Ice Sheet, eventually leading to a global sea-level rise of 7 meters. In the AR5, this risk is reassessed to start already at 1 °C global warming. And sea-level projections of the AR5 are much higher than those of the AR4.

Even since the AR5, new science is pointing to higher risks. We have since learned that parts of Western Antarctica probably have already crossed the threshold of a marine ice sheet instability (it is well worth reading the commentaries by Antarctica experts Eric Rignot or Anders Levermann on this development). And that significant amounts of potentially unstable ice exist even in East Antarctica, held back only by small “ice plugs”. Regarding extreme events, we have learnt that record-breaking monthly heat waves have already increased five-fold above the number in a stationary climate. (These are heat waves like in Europe in 2003, causing ~70,000 fatalities.)

And we should not forget that after 2 °C warming we will be well outside the range of temperature variation of the entire Holocene; the planet will be hotter than anything experienced during human civilisation.

If anything, there are good arguments to revise the 2 °C limit downward. Such a possible revision is actually foreseen in the Cancun Agreements, because the small island nations and least developed countries have long pushed for 1.5 °C, for good reasons.

Uncritically adopted?

Victor & Kennel claim the 2 °C guardrail was “uncritically adopted”. They appear to be unaware of the fact that it took almost twenty years of intense discussions, both in the scientific and the policy communities, until this limit was agreed upon. As soon as the world’s nations agreed at the 1992 Rio summit to “prevent dangerous anthropogenic interference with the climate system”, the debate started on how to specify the danger level and operationalize this goal. A “tolerable temperature window” up to 2 °C above preindustrial was first proposed as a practical solution in 1995 in a report by the German government’s Advisory Council on Global Change (WBGU). It subsequently became the climate policy guidance of first the German government and then the European Union. It was formally adopted by the EU in 2005.

Also in 2005, a major scientific conference hosted by the UK government took place in Exeter (covered at RealClimate) to discuss and describe scientifically what “avoiding dangerous climate change” means. The results were published in a 400-page book by Cambridge University Press. Not least there are the IPCC reports as mentioned above, and the Copenhagen Climate Science Congress in March 2009 (synthesis report available in 8 languages), where the 2 °C limit was an important issue discussed also in the final plenary with then Danish Prime Minister Anders Fogh Rasmussen (‘Don’t give us too many moving targets – it is already complex’).

After further debate, 2 °C was finally adopted at the UNFCCC climate summit in Cancun in December 2010. Nota bene as an upper limit. The official text (Decision 1/CP.16 Para I(4)) pledges

to hold the increase in global average temperature below 2 °C above pre-industrial levels.

So talking about a 2 °C “goal” or “target” is misleading – nobody in their right mind would aim to warm the climate by 2 °C. The goal is to avoid just that, namely keeping warming below 2 °C. As an upper limit it was also originally proposed by the WBGU.

What are the alternatives?

Victor & Kennel propose to track a bunch of “vital signs” rather than global-mean surface temperature. They write:

What is ultimately needed is a volatility index that measures the evolving risk from extreme events.

As anyone who has ever thought about extreme events – which by definition are rare – knows, the uncertainties relating to extreme events and their link to anthropogenic forcing are many times larger than those relating to global temperature. It is rather illogical to complain about ~0.1 °C variability in global temperature, but then propose a much more volatile index instead.

Or take this proposal:

Because energy stored in the deep oceans will be released over decades or centuries, ocean heat content is a good proxy for the long-term risk to future generations and planetary-scale ecology.

It seems that the authors are not getting the physics of the climate system here. The deep oceans will almost certainly not release any heat for at least a thousand years to come; instead they will continue to absorb heat while slowly catching up with the much greater surface warming. It is also unclear what the amount of heat stored in the deep ocean has to do with risks and impacts at our planet’s surface – if deep ocean heat uptake increases (e.g. due to a reduction in deep water renewal rates, as predicted by IPCC), how would this affect people and ecosystems on land?

Vital Signs

The idea to monitor other vital signs of our home planet and to keep them within acceptable bounds is of course neither bad nor new. In fact, in addition to the 2 °C warming limit the WBGU has also proposed to limit ocean acidification to at most 0.2 (in terms of reduction of the mean pH of the global surface ocean) and to limit global sea-level rise to at most 1 meter. And there is a high-profile scientific debate about further planetary boundaries which Victor & Kennel don’t bother mentioning, although the 2009 Nature paper A safe operating space for humanity by Rockström et al. has already clocked up 932 citations in Web of Science. The key difference from Victor & Kennel, apart from the better scientific foundation of these earlier proposals, is that these bounds are intended as additional and complementary to the 2 °C limit, not as a replacement for it.

If one wanted to sabotage the chances for a meaningful agreement in Paris next year, towards which the negotiations have been ongoing for several years, there’d hardly be a better way than restarting a debate about the finally-agreed foundation once again, namely the global long-term goal of limiting warming to at most 2 °C. This would be a sure recipe to delay the process by years. That is time which we do not have if we want to prevent dangerous climate change.

Footnote

(1) According to IPCC, mitigation consistent with the 2°C limit involves annualized reduction of consumption growth by 0.04 to 0.14 (median: 0.06) percentage points over the century relative to annualized consumption growth in the baseline that is between 1.6% and 3% per year. Estimates do not include the benefits of reduced climate change as well as co-benefits and adverse side-effects of mitigation. Estimates at the high end of these cost ranges are from models that are relatively inflexible to achieve the deep emissions reductions required in the long run to meet these goals and/or include assumptions about market imperfections that would raise costs.

Links

The Guardian: Could the 2°C climate target be completely wrong?

We have covered just the main points – a more detailed analysis [pdf] of the further questionable claims by Victor and Kennel has been prepared by the scientists from Climate Analytics.

Climate Progress: 2°C Or Not 2°C: Why We Must Not Ditch Scientific Reality In Climate Policy

Carbon Brief: Scientists weigh in on two degrees target for curbing global warming. Lots of leading climate scientists comment on Victor & Kennel (none agree with them).

Source: http://www.realclimate.org/index.php/archives/2014/10/limiting-global-warming-to-2-c-why-victor-and-kennel-are-wrong/

2°C Or Not 2°C: Why We Must Not Ditch Scientific Reality In Climate Policy

by Joe Romm, posted on October 1, 2014 at 5:01 pm

Global-mean surface temperature 1880-2013. Grey line shows annual values, and the smoothed blue line highlights the long-term evolution. (Via RealClimate using NASA data)

A new Comment piece in Nature argues we should “Ditch the 2 °C warming goal” as a basis for climate change policy. Here’s why the authors, political scientist David Victor and retired astrophysicist Charles Kennel, are wrong — and why “their prescription is a dangerous one,” as a top climatologist told me.

Their core argument, as Nature sums it up, is “Average global temperature is not a good indicator of planetary health. Track a range of vital signs instead.”

I’ll discuss below why our global temperature is a perfectly reasonable indicator of planetary health — or rather, of planetary sickness, since we have a fever. First, let’s dispense with the notion that tracking a “range of vital signs” would somehow make it easier for humanity to avoid catastrophe.

Consider that way back in 2009, “a group of 28 internationally renowned scientists identified and quantified a set of nine planetary boundaries within which humanity can continue to develop and thrive for generations to come. Crossing these boundaries could generate abrupt or irreversible environmental changes.” Unfortunately, we’ve already crossed some key ones, including climate change and rate of biodiversity loss:

The inner green shading represents the proposed safe operating space for nine planetary systems. The red wedges represent an estimate of the current position for each variable.

Oops. The thing is, five years ago Nature actually published a major article (and responses) on these “planetary boundaries”. The key takeaways: First, the planet has already overshot multiple boundaries, including climate change (and is close to doing so in a couple more, including CO2-driven ocean acidification).

Second, adding more vital signs just gives people more things to argue about, so it is hardly a recipe for faster or more streamlined international action. Indeed, the whole Victor and Kennel approach would be an excuse for more dawdling. They don’t just want to ditch the 2°C limit; they want to replace the entire effort aimed at developing a global plan to stay below that limit, culminating in the December 2015 Paris climate conference. Instead, they write, “New indicators will not be ready for the Paris meeting, but a path for designing them should be agreed there.”

Yes, instead of trying to get the world’s leading governments to agree on the commitments needed to avoid crossing the 2°C target, let’s just ask them to agree on a “path for designing” some new targets. So long Miami, New Orleans, and other coastal cities — it’s been good to know you!

As Michael Mann, director of the Earth System Science Center at Pennsylvania State University, wrote me:

Giving up on the 2C warming limit, after so much work has been done to motivate this objective and meaningful target for defining dangerous climate change amounts to kicking the can down the road. It simply provides a crutch for those looking for yet another excuse for not doing the tough but necessary work to stabilize greenhouse gas concentrations below dangerous levels. Sure, it’s possible that we will fail to stabilize temperatures below 2C warming even given concerted efforts to lower our carbon emissions, but simply discarding this goal would make failure almost certain.

I’m sure the authors mean well, but their prescription is a dangerous one in my view.

TWIMC: The scientific reality is that we are already in overshoot!

Homo sapiens already use the equivalent of 1.5 Earths to support our consumption. (Global Footprint Network via WWF)

So what exactly is wrong with the 2°C target that it should be ditched? Sadly, Victor and Kennel offer a bunch of beyond-dubious arguments:

The scientific basis for the 2°C goal is tenuous. The planet’s average temperature has barely risen in the past 16 years

These statements are not merely dubious, they are “disingenuous,” to use the word of Stefan Rahmstorf, Co-Chair of Earth System Analysis at the Potsdam Institute for Climate Impact Research, in his debunking post on RealClimate.

It is truly unfortunate that Victor and Kennel perpetuate the myth that there has been some sort of a pause in warming. Nearly a year ago, a journal article explained that a key reason there appears to be a pause is that one of the major temperature data sets ignores all warming in the Arctic (see here). Also, as Rahmstorf notes, it is very widely known that “picking 1998 as start year in this argument is rather disingenuous – it is the one year that sticks out most above the long-term trend of all years since 1880, due to the strongest El Niño event ever recorded.”

What’s even more bewildering is that, even to the extent there has been a slowdown in surface air temperature warming during this cherry-picked period, it has no bearing on the argument Victor and Kennel are making, as Rahmstorf shows:

They fail to explain why short-term global temperature variability would have any bearing on the 2 °C limit — and indeed this is not logical. The short-term variations in global temperature, despite causing large variations in short-term rates of warming, are very small — their standard deviation is less than 0.1 °C for the annual values and much less for decadal averages (see graph — this can just as well be seen in the graph of Victor & Kennel). If this means that due to internal variability we’re not sure whether we’ve reached 2 °C warming or just 1.9 °C or 2.1 °C — so what? This is a very minor uncertainty.

The argument by Victor and Kennel that there’s no “scientific basis” for the 2°C limit or that it was “uncritically adopted” by governments is thoroughly debunked at length by Rahmstorf (see also Mann’s “Defining Dangerous Anthropogenic Interference”).

Rahmstorf points out that the work of the Intergovernmental Panel on Climate Change (IPCC) underscores the need for the 2°C limit through its reviews of the climate science literature in 2007 (AR4) and over the last year (AR5):

One needs to keep in mind that 2 °C was already devised based on the risks of certain impacts, as epitomized in the famous “reasons of concern” and “burning embers” (see the IPCC WG2 SPM page 13) diagrams of the last IPCC reports, which lay out major risks as a function of global temperature….

One of the rationales behind 2 °C was the AR4 assessment that above 1.9 °C global warming we start running the risk of triggering the irreversible loss of the Greenland Ice Sheet, eventually leading to a global sea-level rise of 7 meters. In the AR5, this risk is reassessed to start already at 1 °C global warming. And sea-level projections of the AR5 are much higher than those of the AR4.

The argument by Victor and Kennel is especially disingenuous because all the most recent scientific observations and analysis point to the fact that a truly rational species would keep as far away as possible from 2°C warming.

Even since the AR5, new science is pointing to higher risks. We have since learned that parts of Western Antarctica probably have already crossed the threshold of a marine ice sheet instability (it is well worth reading the commentaries by Antarctica experts Eric Rignot or Anders Levermann on this development).

If anything, there are good arguments to revise the 2°C limit downward. Such a possible revision is actually foreseen in the Cancun Agreements, because the small island nations and least developed countries have long pushed for 1.5 °C, for good reasons.

The required response in the face of scientific reality is not “The Paris agreement should call for an international technical conference on how to turn today’s research measurements into tomorrow’s planetary vital signs.” It is, as Rignot notes, “Holy Shit” because “the time to act is now; Antarctica is not waiting for us.”

Finally, Victor and Kennel argue that a “nasty political problem” has emerged about the 2°C limit: “the goal is effectively unachievable.” The fatal flaw in that argument is the scientific and economic literature as summarized by the IPCC and agreed to by every major government in the world! I discussed this in my April post on the AR5 “mitigation” report. Indeed, the “cost” of the 2°C path is to reduce the median annual growth of consumption over this century by a mere 0.06 percentage points.

Now Victor and Kennel are certainly entitled to their political opinion that the world isn’t going to adopt the 2°C path. But no matter how credible that opinion is, it should “not be confused with a geophysical, technological or economic infeasibility of limiting warming to below 2°C,” as Rahmstorf puts it. I’ll end where he ends:

If one wanted to sabotage the chances for a meaningful agreement in Paris next year, towards which the negotiations have been ongoing for several years, there’d hardly be a better way than restarting a debate about the finally-agreed foundation once again, namely the global long-term goal of limiting warming to at most 2°C. This would be a sure recipe to delay the process by years.

That is time which we do not have if we want to prevent dangerous climate change.

Paul H. Ray