March 24, 2012 — andyextance

Scientists have taken an important step this week towards allaying any remaining concerns about how reliable computer models of the Earth’s climate can be. US, UK and Japanese scientists have tested climate models against measurements from fossils that show what sea temperatures were like three million years ago. US Geological Survey scientist Harry Dowsett and his colleagues have often tested models one at a time with such ‘proxy’ measurements. But now they are looking as far back in time as possible, and studying a wide range of models.

“What sets this work apart is that we have carefully evaluated the confidence we have in our proxy data and have begun to look at multiple models,” Harry told Simple Climate. “This is an ongoing process, but these preliminary results using just four models show them to be in good general agreement with each other and the proxy data based estimates.”

Science relies on experiments to test its ideas. If the test’s results don’t fit the theory, then it’s back to the drawing board. Yet our climate is a system enveloping a whole planet. We can’t create a twin planet with slightly higher levels of CO2, for example, to test what will happen if we continue carelessly burning fossil fuels. Today, the next best tool to a twin planet is a computer model. Anyone with everyday experience of computers would be excused for being wary about how much we rely on them. So the sensible thing to do is check they’ve done their job right. But then you run into the problem of planet-scale experiments again.

PRISM lights up climate history

Instead of the fantastical challenge of creating another Earth, scientists can test the models against past climate – as long as those measurements are reliable in the first place. A period called the Pliocene is the best-measured past worldwide warm spell, Harry and his colleagues noted in a paper in the research journal Nature Climate Change on Sunday. So, as part of the Pliocene Research, Interpretation and Synoptic Mapping, or PRISM, project, the scientists looked at how good sea surface temperature data are for a 300,000-year period starting 3,300,000 years ago. During that time, temperatures were around 2°C higher than they are today, with similar CO2 levels in the atmosphere.

The scientists studied tiny fossils contained in tubes of rock drilled out of coastal land and the floor of the ocean at 95 sites across the world. “Planktonic foraminifers are single celled organisms whose distribution in the present day ocean has a very high correlation to surface temperature,” Harry explained. “That relationship was just as strong in the Pliocene. We also use other fossil groups and geochemical techniques where possible to get multiple sea surface temperature proxies at each locality investigated.”

Harry’s team considered the number and quality of samples at each site, and how easy it was to distinguish the desired period there. They also looked at how temperature was measured and how well those techniques performed, and how many fossils were preserved and how well. Based on this information, they found 27 of the 95 sites gave an estimate of sea surface temperature with very high confidence, and 32 with high confidence. They had a medium level of confidence in 33 estimates, and low confidence in just one.

The next stage was then to get modelling groups across the world to set up and run identical simulations of the same period, an effort known as the Pliocene Model Intercomparison Project (PlioMIP). To analyse their accuracy, Harry and his colleagues looked at the model outputs at each of the 95 sites. “If the model and data plot together, life is good,” he said. “If they do not, we either have a problem with the temperature and need to go back and look carefully at the confidence we place in that particular estimate, or we need to look at the climate model and understand why it isn’t giving the expected result.”
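The site-by-site check Harry describes can be pictured as a simple screening step. The sketch below is purely illustrative: the site names, temperatures, confidence levels and tolerances are invented for the example and are not PRISM or PlioMIP values – the idea is just that a model–proxy mismatch larger than what the proxy’s confidence level would allow gets flagged for a closer look.

```python
# Illustrative sketch only: flag sites where a model's sea surface
# temperature (SST) disagrees with the proxy estimate by more than a
# tolerance tied to how confident we are in that proxy estimate.
# All values below are invented for illustration.

# (site, proxy SST estimate in °C, confidence level, model SST in °C)
sites = [
    ("site_A", 28.1, "very high", 27.6),
    ("site_B", 14.3, "high",      16.9),
    ("site_C",  9.8, "medium",    10.4),
]

# Allow a wider tolerance where confidence in the proxy estimate is lower.
tolerance = {"very high": 1.0, "high": 1.5, "medium": 2.0, "low": 3.0}

def flag_mismatches(records):
    """Return the sites where model and proxy disagree beyond tolerance."""
    flagged = []
    for name, proxy_sst, confidence, model_sst in records:
        if abs(model_sst - proxy_sst) > tolerance[confidence]:
            flagged.append(name)
    return flagged

print(flag_mismatches(sites))  # site_B differs by 2.6 °C, beyond 1.5 °C
```

A flagged site then prompts exactly the two questions in the quote above: revisit the confidence placed in that proxy estimate, or work out why the model misses it.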

Experts play for high stakes

While the four models generally agreed with the PRISM data, they did vary from each other and the proxy record in the North Atlantic, but Harry thinks even that could be useful. “It’s good news that the models all produce very similar outputs that are in good overall agreement with the data,” he said. “The mismatch in the North Atlantic can be attributed to the high degree of variability in that setting today. We want to be able to understand the nuances of the different models on a regional scale so this initial result is a great experiment.”

He gives much of the credit for the positive results to the quality of the data and the expertise of the modellers. “Beyond that, I’d credit the good agreement between models and data to the fact that the mid Pliocene has a strong and robust climate signal,” he said. “The fact that the signal is from a similar amount of warming to what’s expected by the end of the 21st century makes the work that much more important.”

This is just an early stage in the PlioMIP project, Harry underlined, as other models involved have yet to finish their runs. Analysing the results in depth will be a “Herculean effort”, he added, and follow-up experiments are already being planned. But the scientist emphasised that the outcome will be highly valuable.

“This is collaborative science at its best,” Harry said. “We have the world’s experts in different types of climate proxy data and reconstruction working with international experts in the field of climate modelling. The data and model people do not always speak the same languages, but in PlioMIP we have made a truly integrated group and work very hard to understand the strengths and weaknesses of both the models and the data. It’s extremely challenging and at the same time rewarding work. The stakes are high – we are ultimately talking about the future of our planet.”