Climate models are equations that describe climatically relevant processes and are solved on supercomputers. In addition to being invaluable tools for testing scientific hypotheses, these models have long provided societally important forecasts. The first climate models to numerically describe an evolving and interacting atmosphere, ocean and land surface on a grid covering the entire Earth date back to the 1970s (for example, refs 1–3). Since then, the planet’s surface has warmed, in large part because of increased emissions of greenhouse gases. Writing in Geophysical Research Letters, Hausfather et al.4 retrospectively assessed the forecasting skill of climate models published between 1970 and 2007. Their results show that the physics in these early models accurately predicted the global surface warming that was subsequently observed.

A key point emphasized by the authors is that the forecasting ability of climate models is limited by unknowable future climate drivers. Many major drivers, such as increased concentrations of carbon dioxide in the atmosphere caused by the burning of fossil fuels, result from human activities and decisions. Early climate modellers included estimates for future climate drivers in their forecasts. However, they could not know, for example, how the world would industrialize, or the CO2 emissions that would result.


Hausfather and colleagues developed a method for evaluating the forecasts of early climate models without penalizing the models for their inaccurate estimates of unknowable future climate drivers. The authors examined 17 projections of global mean surface temperature (GMST) from 14 models. Before applying their method, they found that 10 projections were consistent with observations. But when inaccuracies in the estimates of climate drivers were taken into account, the authors discovered that 14 projections agreed with the data. Of the three that did not, two predicted higher-than-observed surface warming and one predicted lower-than-observed warming.
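The logic of this adjustment can be illustrated with a toy sketch (synthetic numbers, not the authors' code or data): a projection can look wrong against observations simply because its assumed future forcing was wrong, so comparing warming per unit of radiative forcing, rather than warming per unit of time, isolates the model's physics from its driver estimates.

```python
import numpy as np

# Toy illustration (synthetic data, not from Hausfather et al.):
# a model with correct physics but an overestimated forcing scenario.

years = np.arange(1970, 2008)

# "Observed" world: forcing grows linearly; temperature responds at
# an assumed 0.5 K per W m^-2 (noise-free for clarity).
obs_forcing = 0.03 * (years - 1970)   # W m^-2
obs_temp = 0.5 * obs_forcing          # K

# Model: identical physics (0.5 K per W m^-2), but the modeller assumed
# emissions would grow 50% faster than they actually did.
mod_forcing = 0.045 * (years - 1970)
mod_temp = 0.5 * mod_forcing

def trend(y, x):
    """Least-squares slope of y against x."""
    return np.polyfit(x, y, 1)[0]

# Raw temperature-versus-time comparison penalizes the driver estimate:
ratio_time = trend(mod_temp, years) / trend(obs_temp, years)

# Temperature-versus-forcing comparison evaluates only the physics:
ratio_forcing = trend(mod_temp, mod_forcing) / trend(obs_temp, obs_forcing)

print(round(ratio_time, 2))     # model warms 1.5x too fast per year
print(round(ratio_forcing, 2))  # but 1.0x per unit forcing: physics agrees
```

In this contrived case the model's warming trend looks 50% too high against the calendar, yet exactly right against forcing, which is the sense in which the authors find 14 of 17 projections agree with observations.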

Developing credible climate models through an understanding of climatically relevant processes, observations and well-formulated equations is a considerable scientific and computational challenge. The equations that describe climate are complex and require substantial computing power to solve. As a result, climate models have always been run on the fastest supercomputers available. It is especially impressive that the earliest models assessed by Hausfather et al. produced accurate GMST forecasts, given the extremely limited computing power available then compared with that used today (Fig. 1).

Figure 1 | A Univac 1108 computer, from 1972. Hausfather et al.4 demonstrate that climate models published over the past five decades accurately predicted subsequently observed changes in Earth’s global mean surface temperature. These models include ones reported in the 1970s that used supercomputers, such as the Univac 1108, that had extremely limited power relative to those used today. Credit: CSU Archives/Everett Collection/Alamy

Although the authors’ findings show that climate models can accurately predict GMST, these forecasts are insufficient for understanding and preparing for the effects of ongoing climate change. For instance, regional climate change is especially subject to unpredictable climate variability, which greatly limits forecasting potential — even on decadal timescales when the climate drivers are known5. Moreover, on the basis of GMST forecasts alone, it is hard to predict, for example: to what extent sea level will rise; how ocean acidification caused by uptake of atmospheric CO2 will influence marine ecosystems; and the frequency and magnitude of future fires, droughts and floods.

Scientists will have to continue to improve climate modelling and to increase their understanding of the effects of climate change, while keeping in mind the tension between the need for increased model resolution, greater representation of climatically relevant processes, and more simulations to characterize unpredictable climate variability. The successful forecasting of GMST by early climate models is impressive, but leaves much work to be done — as scientists, policymakers and stakeholders are all well aware.


Numerical models based on scientific equations describing the atmosphere are used daily to make decisions that save lives and money. As the climate continues to change owing largely to human activities, scientists need to use, improve and communicate the value of numerical models and the equations and knowledge that underlie them. Hausfather and colleagues’ work demonstrates that the physics in climate models has been providing accurate forecasts of GMST under increasing amounts of atmospheric CO2 for decades. Such predictions are useful for estimating the maximum amount of CO2 that can be released into the atmosphere over time to keep surface warming to a specified level.
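The budget calculation rests on the finding that warming scales roughly linearly with cumulative CO2 emissions. A back-of-envelope sketch (with illustrative values for the scaling factor, realized warming and warming limit — these numbers are assumptions for demonstration, not from the paper):

```python
# Back-of-envelope carbon budget (illustrative numbers, not from the paper).
# Warming is approximately linear in cumulative CO2 emitted, so a warming
# limit implies a remaining emissions budget.

TCRE = 0.45           # assumed warming per 1,000 Gt of CO2 emitted (K)
warming_so_far = 1.1  # assumed warming already realized (K)
warming_limit = 1.5   # chosen warming target (K)

remaining_budget = (warming_limit - warming_so_far) / TCRE * 1000  # Gt CO2
print(f"Remaining budget: {remaining_budget:.0f} Gt CO2")
```

The point is not the specific numbers but the structure: accurate model physics pins down the warming-per-emissions factor, and the remaining budget then depends chiefly on the human decisions the article identifies as the dominant unknowable.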

Crucially, the authors’ results also show that a major source of uncertainty in GMST forecasts comes from climate drivers. And, of these drivers, it is emissions of greenhouse gases from human activity that will largely determine future surface warming. The findings indicate the usefulness of climate-model predictions of GMST in response to increasing greenhouse-gas emissions, despite unknowable future climate drivers. But scientists must also continue to develop climate models in concert with everything else available to them, to plan for a changed climate that requires much more than forecasts of surface warming.