by Judith Curry

Two heavyweight climate scientists have published very different ideas about how much the Earth is going to warm in the coming decades. – Washington Weather Gang

A post last December by the Washington Weather Gang, "Studies differ on climate change and warming severity, researchers trade jabs," lays out the controversy:

Two heavyweight climate scientists have published very different ideas about how much the Earth is going to warm in the coming decades. And neither has much regard for the other’s estimate – casting light on a long-standing, thorny issue in climate science.

Future warming is likely to be on the high end of predictions, says Kevin Trenberth of the National Center for Atmospheric Research, who has been a lead author for the United Nations Intergovernmental Panel on Climate Change (IPCC).

But Michael Schlesinger, who heads the Climate Research Group within the Department of the Atmospheric Sciences at the University of Illinois, has just published a study with his group finding warming will be at the low end of projections.

In response to the recent Economist article (see this previous post), Michael Mann and Dana Nuccitelli wrote an opinion piece for the ABC entitled How the Economist got it wrong. Excerpts:

It should be a red flag that an estimate of climate sensitivity would change by a factor of two based only on the addition of a decade of data. In reality, the climate sensitivity now is not half what it was a decade ago. So where did the Norwegian study go wrong?

One likely culprit is that the role of natural climate variability, which is particularly important on timescales of a decade or less, was not properly accounted for in the analysis. One recent article published in the Journal of Geophysical Research found that internal natural variability (for example, natural oscillations in the climate like those associated with the El Niño phenomenon) can result in a sizable discrepancy (errors approaching 1°C) between the true climate sensitivity and the value of climate sensitivity derived from the instrumental record alone.

Yet another recent study published in the journal Geophysical Research Letters has argued that previously unaccounted-for effects of low-level volcanic eruptions may have offset more of the warming than scientists realised over the past decade.

And still another study published recently in Geophysical Research Letters suggests that any slowing of surface warming during the past decade may have been associated with a recent accelerated penetration of heat into the deeper oceans.

It is further unfortunate that the piece provided so little of the larger scientific context necessary for readers to appreciate the current state of scientific knowledge about climate sensitivity. Most critically, the article didn’t address why it is that the consensus estimate of climate sensitivity remains around 3°C.

The instrumental temperature record alone, it turns out, is an especially poor constraint on climate sensitivity because it is so short, and because there are multiple natural and human factors at work over the past century. For this reason, there is an extremely wide spread of estimates of climate sensitivity when only information from the instrumental record is used. That spread includes estimates that are both lower and higher than the mid-range (around 3°C) estimate (see figure above).

However, there is a wealth of other sources of information that scientists have used to try to constrain climate sensitivity (see for example this discussion at the site RealClimate). That evidence includes the paleoclimate record of the past thousand years, the specific response of the climate to volcanic eruptions, the changes in global temperature during the last ice age, the geological relationship between climate and carbon dioxide over millions of years, and more.

When the collective information from all of these independent sources of information is combined, climate scientists indeed find evidence for a climate sensitivity that is very close to the canonical 3°C estimate. That estimate still remains the scientific consensus, and current generation climate models — which tend to cluster in their climate sensitivity values around this estimate — remain our best tools for projecting future climate change and its potential impacts.
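The combining step Mann and Nuccitelli invoke can be sketched numerically. In the toy example below each line of evidence is idealized as an independent Gaussian constraint on sensitivity; the means and widths are illustrative assumptions, not values from any of the studies discussed, but they show how precision-weighting independent estimates narrows the combined PDF:

```python
import numpy as np

# Hypothetical (mean, 1-sigma) Gaussian constraints on ECS in deg C,
# one per line of evidence -- the numbers are illustrative assumptions.
constraints = {
    "instrumental record":  (2.0, 1.5),   # deliberately wide
    "volcanic response":    (3.0, 1.2),
    "last glacial maximum": (2.8, 0.9),
    "paleo CO2-climate":    (3.2, 1.0),
}

means = np.array([m for m, _ in constraints.values()])
precisions = np.array([1.0 / s**2 for _, s in constraints.values()])

# Multiplying independent Gaussian likelihoods adds precisions and
# precision-weights the means.
combined_precision = precisions.sum()
combined_mean = (precisions * means).sum() / combined_precision
combined_sigma = combined_precision ** -0.5

print(f"combined estimate: {combined_mean:.2f} +/- {combined_sigma:.2f} C")
```

The combined interval (about 2.9 ± 0.5°C with these assumed inputs) is both narrower than any single constraint and pulled toward the better-constrained lines of evidence — the qualitative behaviour the excerpt describes.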

JC comment: At issue here is a plethora of new papers that use different methods to estimate climate sensitivity. This range of methods, and the range of resulting estimates, highlights the meta-uncertainty in the determination of climate sensitivity. By synthesizing and assessing these recent papers, can we increase our understanding of the limitations of the various methods? Can we make any inferences as to whether Schlesinger is correct (climate sensitivity is on the low end of the IPCC range) or Trenberth is correct (it is on the high end)?

Let's take a look at some of these papers.

Schlesinger side of the debate

Here is the paper from Schlesinger’s group:

Causes of the global warming observed from the 19th century

M.J. Ring, D. Lindner, E.F. Cross, M.E. Schlesinger

Abstract. Measurements show that the Earth’s global-average near-surface temperature has increased by about 0.8°C since the 19th century. It is critically important to determine whether this global warming is due to natural causes, as contended by climate contrarians, or by human activities, as argued by the Intergovernmental Panel on Climate Change. This study updates our earlier calculations which showed that the observed global warming was predominantly human-caused. Two independent methods are used to analyze the temperature measurements: Singular Spectrum Analysis and Climate Model Simulation. The concurrence of the results of the two methods, each using 13 additional years of temperature measurements from 1998 through 2010, shows that it is humanity, not nature, that has increased the Earth’s global temperature since the 19th century. Humanity is also responsible for the most recent period of warming from 1976 to 2010. Internal climate variability is primarily responsible for the early 20th century warming from 1904 to 1944 and the subsequent cooling from 1944 to 1976. It is also found that the equilibrium climate sensitivity is on the low side of the range given in the IPCC Fourth Assessment Report.

Citation: M. J. Ring, D. Lindner, E. F. Cross and M. E. Schlesinger, “Causes of the Global Warming Observed since the 19th Century,” Atmospheric and Climate Sciences, Vol. 2 No. 4, 2012, pp. 401-415. doi: 10.4236/acs.2012.24035. [link to full manuscript]

Specifically with regards to climate sensitivity:

Additionally, our estimates of climate sensitivity using our SCM and the four instrumental temperature records range from about 1.5°C to 2.0°C. These are on the low end of the estimates in the IPCC’s Fourth Assessment Report. So, while we find that most of the observed warming is due to human emissions of LLGHGs, future warming based on these estimations will grow more slowly compared to that under the IPCC’s “likely” range of climate sensitivity, from 2.0°C to 4.5°C. This makes it more likely that mitigation of human emissions will be able to hold the global temperature increase since pre-industrial time below 2°C, as agreed by the Conference of the Parties of the United Nations Framework Convention on Climate Change in Cancun.
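The arithmetic behind that last claim can be sketched with the standard logarithmic CO2-forcing relation, ΔT ≈ ECS · log2(C/C0). This is a back-of-envelope equilibrium calculation, not the paper's own method, and it ignores transient effects and slow feedbacks:

```python
C0 = 280.0  # assumed preindustrial CO2 reference, ppm

def co2_for_target(ecs, target=2.0):
    """CO2 (ppm) whose *equilibrium* warming over preindustrial equals target (K)."""
    # Invert dT = ecs * log2(C / C0)  ->  C = C0 * 2**(dT / ecs)
    return C0 * 2.0 ** (target / ecs)

for ecs in (1.5, 2.0, 3.0):
    print(f"ECS = {ecs:.1f} C -> 2 C reached near {co2_for_target(ecs):.0f} ppm CO2")
```

With ECS near 3°C, equilibrium warming reaches 2°C around 445 ppm; at the paper's low-end 1.5°C it is not reached until roughly 700 ppm, which is why a low sensitivity makes the Cancun target look more attainable.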

Nic Lewis

I just heard from Nic Lewis that his climate sensitivity paper, “An objective Bayesian improved approach for applying optimal fingerprint techniques to estimate climate sensitivity,” has been published in the Journal of Climate. Congratulations, Nic! A preview of some of the material was provided in this recent guest post by Nic.

Abstract. A detailed reanalysis is presented of a ‘Bayesian’ climate parameter study (Forest et al., 2006) that estimates climate sensitivity (ECS) jointly with effective ocean diffusivity and aerosol forcing, using optimal fingerprints to compare multi-decadal observations with simulations by the MIT 2D climate model at varying settings of the three climate parameters. Use of improved methodology primarily accounts for the 90% confidence bounds for ECS reducing from 2.1–8.9 K to 2.0–3.6 K. The revised methodology uses Bayes’ theorem to derive a probability density function (PDF) for the whitened (made independent using an optimal fingerprint transformation) observations, for which a uniform prior is known to be noninformative. A dimensionally-reducing change of variables onto the parameter surface is then made, deriving an objective joint PDF for the climate parameters. The PDF conversion factor from the whitened variables space to the parameter surface represents a noninformative joint parameter prior, which is far from uniform. The noninformative prior prevents more probability than data uncertainty distributions warrant being assigned to regions where data responds little to parameter changes, producing better-constrained PDFs. Incorporating six years of unused model-simulation data and revising the experimental design to improve diagnostic power reduces the best-fit climate sensitivity. Employing the improved methodology, preferred 90% bounds of 1.2–2.2 K for ECS are then derived (mode and median 1.6 K). The mode is identical to those from Aldrin et al. (2012) and (using the same, HadCRUT4, observational dataset) Ring et al. (2012). Incorporating forcing and observational surface temperature uncertainties, unlike in the original study, widens the 90% range to 1.0–3.0 K.

Nic has just posted at BishopHill explaining the new Bayesian method.
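The core of Lewis's method — a uniform prior on the whitened observations converted, via the Jacobian of the change of variables, into a far-from-uniform noninformative parameter prior — can be illustrated with a one-dimensional toy model. The saturating forward model, observation, and noise level below are assumptions chosen for illustration, not anything from the paper:

```python
import numpy as np

theta = np.linspace(0.5, 10.0, 2000)   # parameter grid (think: sensitivity, K)
dtheta = theta[1] - theta[0]

def forward(t):
    """Hypothetical saturating forward model: observable vs parameter."""
    return 1.0 - np.exp(-t / 2.0)

y = forward(theta)
jacobian = np.gradient(y, theta)       # |dy/dtheta| -> 0 where data barely respond

# Gaussian likelihood for one noisy observation of y
y_obs, sigma = 0.78, 0.1
like = np.exp(-0.5 * ((y - y_obs) / sigma) ** 2)

def normalize(p):
    return p / (p.sum() * dtheta)

post_uniform = normalize(like)             # uniform prior in the parameter
post_jacobian = normalize(like * jacobian) # uniform prior in the observable

def upper_tail(p, cut=6.0):
    return p[theta > cut].sum() * dtheta

print("P(theta > 6): uniform prior:", round(upper_tail(post_uniform), 3),
      "| Jacobian prior:", round(upper_tail(post_jacobian), 3))
```

Because the observable barely responds to the parameter at high values, the Jacobian prior assigns that region little weight, so the posterior's upper tail is much tighter than under a uniform parameter prior — the effect the abstract describes as "producing better-constrained PDFs."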

Troy Masters

In press at Climate Dynamics [link]:

Observational estimate of climate sensitivity from changes in the rate of ocean heat uptake and comparison to CMIP5 models

Troy Masters

Abstract. Climate sensitivity is estimated based on 0–2,000 m ocean heat content and surface temperature observations from the second half of the 20th century and first decade of the 21st century, using a simple energy balance model and the change in the rate of ocean heat uptake to determine the radiative restoration strength over this time period. The relationship between this 30–50 year radiative restoration strength and longer term effective sensitivity is investigated using an ensemble of 32 model configurations from the Coupled Model Intercomparison Project phase 5 (CMIP5), suggesting a strong correlation between the two. The mean radiative restoration strength over this period for the CMIP5 members examined is 1.16 W m−2 K−1, compared to 2.05 W m−2 K−1 from the observations. This suggests that temperature in these CMIP5 models may be too sensitive to perturbations in radiative forcing, although this depends on the actual magnitude of the anthropogenic aerosol forcing in the modern period. The potential change in the radiative restoration strength over longer timescales is also considered, resulting in a likely (67%) range of 1.5–2.9 K for equilibrium climate sensitivity, and a 90% confidence interval of 1.2–5.1 K.

Troy has a blog post [here]. This is my first visit to Troy’s blog; I’ve just added it to my blogroll, and I definitely want to follow what he is doing.
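The link between the quoted radiative restoration strengths and sensitivity can be sketched with the usual energy-balance relation ECS ≈ F2x/λ, assuming the standard CO2-doubling forcing F2x ≈ 3.7 W m−2. This is a back-of-envelope conversion only; Masters's own range also accounts for how λ may change on longer timescales:

```python
F_2X = 3.7  # W m-2 forcing per CO2 doubling (standard assumed value)

def ecs_from_lambda(lam):
    """ECS (K) implied by a constant radiative restoration strength lam (W m-2 K-1)."""
    return F_2X / lam

print(f"CMIP5 mean   (1.16 W m-2 K-1): ECS ~ {ecs_from_lambda(1.16):.1f} K")
print(f"observations (2.05 W m-2 K-1): ECS ~ {ecs_from_lambda(2.05):.1f} K")
```

This recovers roughly 3.2 K for the CMIP5 mean and 1.8 K for the observational value, consistent with the abstract's conclusion that the models look too sensitive relative to the observations.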

Trenberth side of the debate

The IPCC AR4 consensus range is:

Equilibrium climate sensitivity is likely to be in the range 2°C to 4.5°C with a most likely value of about 3°C, based upon multiple observational and modelling constraints. It is very unlikely to be less than 1.5°C.

The rationale for this range is summarized in a recent RealClimate post. The most recent two papers that I have found that are arguing for sensitivity on the high end are:

A less cloudy future: the role of subtropical subsidence in climate sensitivity

John Fasullo, Kevin Trenberth

Abstract. An observable constraint on climate sensitivity, based on variations in mid-tropospheric relative humidity (RH) and their impact on clouds, is proposed. We show that the tropics and subtropics are linked by teleconnections that induce seasonal RH variations that relate strongly to albedo (via clouds), and that this covariability is mimicked in a warming climate. A present-day analog for future trends is thus identified whereby the intensity of subtropical dry zones in models associated with the boreal monsoon is strongly linked to projected cloud trends, reflected solar radiation, and model sensitivity. Many models, particularly those with low climate sensitivity, fail to adequately resolve these teleconnections and hence are identifiably biased. Improving model fidelity in matching observed variations provides a viable path forward for better predicting future climate.

A blog post on the Fasullo-Trenberth paper appeared at Real Climate. Excerpt:

So how cool is it then that the recent paper by Fasullo and Trenberth estimates the net climate sensitivity without getting into the details of the cloud feedback? Quite cool.

The Fasullo and Trenberth paper identified a relationship between the modeled seasonal change in relative humidity in the subtropical dry zones (the downwelling branch of the Hadley circulation, centered around 20-30°N and S) and the long-term feedback behavior of clouds in models. This is a very promising methodology because, if the relationship holds, we could evaluate climate models using observations of the seasonal cycle of relative humidity (which are much easier to obtain than cloud measurements). We don’t actually have to observe clouds at all! Fasullo and Trenberth use satellite data to estimate the present-day (1980-1990) May through August relative humidity and find that the CMIP3 models that best match the observations have strong moist zones in the tropical lower troposphere, strong dry zones in the subtropical upper troposphere, and high climate sensitivities. Thus, Fasullo and Trenberth conclude that the relative humidity observations are most consistent with higher climate sensitivities (around 4°C for a doubling of CO2).

Published in Science, [link] to abstract.
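The logic of the Fasullo-Trenberth approach — an "emergent constraint," regressing sensitivity on an observable across an ensemble and reading the fit off at the observed value — can be sketched with synthetic numbers. Everything below (the index, the inter-model relation, the "observed" value) is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble: an assumed inter-model relation between a
# relative-humidity-based index and climate sensitivity, plus scatter.
n_models = 16
true_slope, true_intercept = -0.08, 6.0        # assumed, drier index -> higher ECS
rh_index = rng.uniform(20.0, 50.0, n_models)   # hypothetical subtropical RH index, %
ecs = true_intercept + true_slope * rh_index + rng.normal(0.0, 0.3, n_models)

# Cross-ensemble regression of sensitivity on the observable
slope, intercept = np.polyfit(rh_index, ecs, 1)

rh_observed = 22.0                             # hypothetical satellite value (dry)
ecs_constrained = intercept + slope * rh_observed

print(f"ensemble mean ECS:               {ecs.mean():.2f} K")
print(f"observationally constrained ECS: {ecs_constrained:.2f} K")
```

Because the synthetic "observed" index falls on the dry side of the ensemble, the constrained estimate lands above the ensemble mean — mirroring how Fasullo and Trenberth arrive at ~4°C. The catch, as discussed further below, is that this only works if the cross-ensemble relationship reflects real physics rather than shared model heritage.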

Climate sensitivity, sea level, and atmospheric CO2

James Hansen, Makiko Sato, Gary Russell, Pushker Kharecha

Abstract. Cenozoic temperature, sea level and CO2 co-variations provide insights into climate sensitivity to external forcings and sea level sensitivity to climate change. Climate sensitivity depends on the initial climate state, but potentially can be accurately inferred from precise paleoclimate data. Pleistocene climate oscillations yield a fast-feedback climate sensitivity 3 ± 1°C for 4 W/m2 CO2 forcing if Holocene warming relative to the Last Glacial Maximum (LGM) is used as calibration, but the error (uncertainty) is substantial and partly subjective because of poorly defined LGM global temperature and possible human influences in the Holocene. Glacial-to-interglacial climate change leading to the prior (Eemian) interglacial is less ambiguous and implies a sensitivity in the upper part of the above range, i.e., 3-4°C for 4 W/m2 CO2 forcing. Slow feedbacks, especially change of ice sheet size and atmospheric CO2, amplify total Earth system sensitivity by an amount that depends on the time scale considered. Ice sheet response time is poorly defined, but we show that the slow response and hysteresis in prevailing ice sheet models are exaggerated. We use a global model, simplified to essential processes, to investigate state-dependence of climate sensitivity, finding an increased sensitivity towards warmer climates, as low cloud cover is diminished and increased water vapor elevates the tropopause. Burning all fossil fuels, we conclude, would make much of the planet uninhabitable by humans, thus calling into question strategies that emphasize adaptation to climate change.

Sources of uncertainty

Beyond the meta-uncertainty issue surrounding methodology, several recent papers have discussed sources of uncertainty in determination of climate sensitivity. The basic sources of uncertainty are uncertainties in observations including forcing, uncertainties in model response, and natural internal variability.

What is the effect of unresolved internal climate variability on climate sensitivity estimates?

R. Olson, R. Sriver, W. Chang, M. Haran, N.M. Urban, K. Keller

Abstract. Many studies have attempted to estimate the equilibrium climate sensitivity (CS) to the doubling of CO2 concentrations. One common methodology is to compare versions of Earth Models of Intermediate Complexity (EMICs) to spatially and/or temporally averaged historical observations. Despite the persistent efforts, CS remains uncertain. It is, thus far, unclear what is driving this uncertainty. Moreover, the effects of the internal climate variability on the CS estimates obtained using this method have not received thorough attention in the literature.

Using a statistical approximator (“emulator”) of an EMIC, we show in an observation system simulation study, that unresolved internal climate variability appears to be a key driver of CS uncertainty (as measured by the 68% credible interval). We first simulate many realizations of pseudo-observations from an emulator at a “true” prescribed CS, and then re-estimate the CS using the pseudo-observations and an inverse parameter estimation method.

We demonstrate that a single realization of the internal variability can result in a sizable discrepancy between the best CS estimate and the truth. Specifically, the average discrepancy is 0.84°C, with the feasible range up to several °C. The results open the possibility that recent climate sensitivity estimates from global observations and EMICs are systematically considerably lower or higher than the truth, since they are typically based on the same realization of climate variability. This possibility should be investigated in future work. We also find that estimation uncertainties increase at higher climate sensitivities, suggesting that a high CS might be difficult to detect.

In press, Journal of Geophysical Research – Atmospheres, [link] to abstract. Excerpts:

Our results suggest that the process driving unresolved internal climate variability is a key factor behind the current uncertainty in climate sensitivity estimates. This suggests that CS is likely to remain uncertain in the world of error-free models and perfect observations, due to the confounding effect of the unresolved internal climate variability. The variability also appears to be a key factor in the second-order uncertainty in climate sensitivity. This uncertainty represents the sensitivity of estimated CS pdfs to different realizations of the unresolved climate noise, and is measured by the mean deviation of estimated CS modes. Overall, our results suggest that internal climate variability presents a substantial obstacle to estimating climate sensitivity. It is thus far an open question whether this hurdle can be overcome with alternative approaches that perform joint state and parameter estimation [e.g., Annan et al., 2005; Evensen, 2009; Hill et al., 2012]. Switching from uniform to informative priors substantially reduces the CS uncertainty.

Historical observational constraints on climate sensitivity (e.g., global average upper ocean heat content, and surface temperature) are based on a single realization of the internal climate variability process. Not considering the effects of the observational and model errors, this realization alone can introduce a considerable discrepancy between the best CS estimate and the true value. Given that scientific models often share similar assumptions and might not be independent, it is possible that the bias due to the internal variability can be in the same direction in studies using different models. As a result, current EMIC-derived CS estimates from these datasets may be systematically higher or lower than the true value. A way forward might be to use independent constraints from other time periods or information from a wider variety of spatially resolved datasets and reanalyses.

I am impressed by the inclusion of a good section on caveats regarding the limitations of their study.
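The Olson et al. experimental design can be mimicked with a toy Monte Carlo: fix a known "true" sensitivity, generate pseudo-observations perturbed by red noise standing in for unresolved internal variability, and re-estimate the sensitivity from each realization. The forward model and noise parameters below are assumptions; the point is only that a single noise realization can pull the best estimate a long way from the truth:

```python
import numpy as np

rng = np.random.default_rng(42)

true_cs = 3.0
years = np.arange(100)

def ar1_noise(n, rho=0.95, sigma=0.1):
    """Red (AR1) noise standing in for unresolved internal variability."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = rho * x[i - 1] + rng.normal(0, sigma)
    return x

signal = 0.008 * years          # assumed warming per unit of sensitivity, K
truth = true_cs * signal        # noise-free "true" record

def estimate_cs(pseudo_obs):
    """Recover CS by least squares against the known signal shape."""
    return float(signal @ pseudo_obs / (signal @ signal))

estimates = np.array([estimate_cs(truth + ar1_noise(len(years)))
                      for _ in range(500)])
print(f"mean |estimate - truth| = {np.abs(estimates - true_cs).mean():.2f} K")
print(f"spread (std) of estimates = {estimates.std():.2f} K")
```

Even in this idealized setup with a perfect model and a known signal shape, individual realizations of the red noise scatter the recovered sensitivity around the truth — the same mechanism the paper identifies as a floor on CS uncertainty.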

Continuing on the theme of the impact of natural internal variability, on a previous thread I discussed the following paper

Tung, KK and J Zhou, 2013: Using data to attribute episodes of warming and cooling in instrumental records. PNAS, [link].

Key excerpts:

The presence of multidecadal internal variability superimposed on the secular trend gives the appearance of accelerated warming and cooling episodes at roughly regular intervals. Quantitatively, the recurrent multidecadal internal variability, often underestimated in attribution studies, accounts for 40% of the observed recent 50-y warming trend.
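The Tung and Zhou point can be reproduced schematically: a steady secular trend plus a multidecadal (AMO-like) oscillation yields a 50-year fitted trend well above the secular rate. The amplitude, period, and phase below are illustrative choices, not the paper's fitted values:

```python
import numpy as np

years = np.arange(1900, 2011)
secular_rate = 0.008                                    # K/yr, assumed underlying trend
amo = 0.12 * np.sin(2 * np.pi * (years - 1985) / 70.0)  # K, assumed 70-yr oscillation
temperature = secular_rate * (years - years[0]) + amo

recent = years >= 1961                                  # the most recent 50 years
fitted_rate = np.polyfit(years[recent], temperature[recent], 1)[0]

# Fraction of the fitted 50-yr trend contributed by the oscillation alone
inflation = (fitted_rate - secular_rate) / fitted_rate
print(f"secular rate: {secular_rate:.4f} K/yr, fitted 50-yr rate: {fitted_rate:.4f} K/yr "
      f"({100 * inflation:.0f}% from the oscillation)")
```

With this particular (assumed) phasing, the rising limb of the oscillation falls inside the 50-year window and contributes a sizable share of the fitted trend, illustrating how multidecadal variability can masquerade as accelerated secular warming.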

Model error and climate sensitivity with present day observations through model weighting

Daniel Klocke, Robert Pincus, Johannes Quaas

Abstract. The distribution of model-based estimates of equilibrium climate sensitivity has not changed substantially in more than 30 years. Efforts to narrow this distribution by weighting projections according to measures of model fidelity have so far failed, largely because climate sensitivity is independent of current measures of skill in current ensembles of models. This work presents a cautionary example showing that measures of model fidelity that are effective at narrowing the distribution of future projections (because they are systematically related to climate sensitivity in an ensemble of models) may be poor measures of the likelihood that a model will provide an accurate estimate of climate sensitivity (and thus degrade distributions of projections if they are used as weights). Furthermore, it appears unlikely that statistical tests alone can identify robust measures of likelihood. The conclusions are drawn from two ensembles: one obtained by perturbing parameters in a single climate model and a second containing the majority of the world’s climate models. The simple ensemble reproduces many aspects of the multimodel ensemble, including the distributions of skill in reproducing the present-day climatology of clouds and radiation, the distribution of climate sensitivity, and the dependence of climate sensitivity on certain cloud regimes. Weighting by error measures targeted on those regimes permits the development of tighter relationships between climate sensitivity and model error and, hence, narrower distributions of climate sensitivity in the simple ensemble. These relationships, however, do not carry into the multimodel ensemble. This suggests that model weighting based on statistical relationships alone is unfounded and perhaps that climate model errors are still large enough that model weighting is not sensible.

This paper is published in Journal of Climate [link].

In context of the main topic of this post, the following excerpt is of particular significance, cutting to the heart of the argument by Fasullo and Trenberth:

[W]hile the motivation to narrow the distribution of climate sensitivity estimates is strong, our results dramatize the danger of focusing exclusively on this goal. Relationships between sensitivity and model fidelity in any ensemble emerge from an unknown mix of underlying similarity in model representation and error, statistical sampling error, and physical relationships also present in the natural world. This means that arbitrarily chosen error measures may arise from underlying similarity not present in the physical climate system. We argue that, because metrics developed from the full multimodel ensemble alone cannot be falsified by comparison to more general ensembles, they cannot be justified as a model likelihood purely on the basis of the strength of the statistical connection between that metric and climate sensitivity. Indeed, where observations have been used successfully to constrain model response statistical metrics have been bolstered by physical arguments. Much depends on the way weights are chosen since incorrect weighting (i.e., weighting not related to true model likelihood) can substantially reduce the benefits of using an ensemble of projections.
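The hazard Klocke et al. describe can be made concrete with a synthetic ensemble in which an "error" metric correlates with sensitivity purely by construction — standing in for shared model heritage — so that skill-weighting shifts the sensitivity distribution without making it any more accurate:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 24
ecs = rng.normal(3.0, 0.7, n)                 # synthetic ensemble sensitivities, K

# An "error" metric built to correlate with ECS inside the ensemble
# (by construction -- i.e., shared heritage), not with closeness to
# any true value of sensitivity.
error = 1.0 - 0.2 * (ecs - 3.0) + rng.normal(0.0, 0.1, n)

weights = np.exp(-0.5 * (error / 0.3) ** 2)   # Gaussian skill weighting
weights /= weights.sum()

print(f"unweighted mean ECS: {ecs.mean():.2f} K")
print(f"weighted mean ECS:   {(weights * ecs).sum():.2f} K")
```

Because lower "error" was wired to higher sensitivity, the weighting systematically shifts the distribution upward, yet nothing about the weighted estimate is closer to any physical truth — the statistical relationship alone cannot justify the weights, which is the paper's central caution.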

JC synthesis

Mann and Nuccitelli state:

When the collective information from all of these independent sources of information is combined, climate scientists indeed find evidence for a climate sensitivity that is very close to the canonical 3°C estimate. That estimate still remains the scientific consensus, and current generation climate models — which tend to cluster in their climate sensitivity values around this estimate — remain our best tools for projecting future climate change and its potential impacts.

The Economist article stated:

If climate scientists were credit-rating agencies, climate sensitivity would be on negative watch. But it would not yet be downgraded.

The combination of the articles by Schlesinger, Lewis, and Masters (not mentioned in the Economist article) adds substantial weight to the negative watch.

In support of estimates on the high end, we have the Fasullo and Trenberth paper, which in my mind is refuted by the combination of the Olson et al., Tung and Zhou, and Klocke et al. papers. If a climate model underrepresents the multidecadal modes of climate variability yet agrees well with observations during a period of warming, the inference is that the model’s sensitivity is too high.

That leaves Jim Hansen’s as yet unpublished paper among the recent research that provides support for sensitivity on the high end.

On the RealClimate thread, Gavin made the following statement:

In the meantime, the ‘meta-uncertainty’ across the methods remains stubbornly high with support for both relatively low numbers around 2ºC and higher ones around 4ºC, so that is likely to remain the consensus range.

In weighing the new evidence, especially improvements in the methodology of sensitivity analysis, it is becoming increasingly difficult not to downgrade the estimates of climate sensitivity.

And finally, it is a major coup for the freelance/citizen climate scientist movement to see Nic Lewis and Troy Masters publish influential papers on this topic in leading journals.