Some people who reject climate science seem to think climate scientists have never heard that the climate has changed in the past—as if scientists weren’t the ones who discovered those events in the first place. In reality, researchers are intensely interested in past climates because there is a lot to learn from them. You can see how sensitive Earth’s climate is to changes, for example, or how variable things can be even when the long-term average temperature is steady.

(“Climate has changed without humans before, so humans can’t be changing it now” is not a logically valid argument, FYI. It's the equivalent of arguing that we can't cause forest fires, since fires occurred before we were around.)

Searching the past

Some historical records show evidence of a cooler period between the 1400s and early 1800s, which has come to be called the “Little Ice Age.” We know that glaciers from the Rockies to the Alps expanded during this time period, leaving piles of rocks behind when they eventually retreated. But was this really a global event? Or were there just regional downturns in temperature that we erroneously link together out of a desire for a simple story?

There are a number of similar events, including a Medieval Warm Period, Dark Ages Cold Period, Roman Warm Period, and even a copy-cat Late Antique Little Ice Age. Researchers have often attached caveats to these names over the years, noting that records based on proxies like tree rings show different patterns in different locations. A major project recently produced a comprehensive global database of climate records for the last 2,000 years, which enabled a thorough test of all these events.

Using a handful of different statistical techniques (to make sure their results were solid), the researchers analyzed the global pattern of temperatures during each of these time periods. For each location, they identified the coldest or warmest 50-year stretch within that period, allowing us to easily see whether temperature changed synchronously around the globe.
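That windowed-extremum step is simple enough to sketch. The function below is purely illustrative (the names, the 50-year width default, and the synthetic data are my own, not the study's code): given a yearly temperature series for one location, it slides a 50-year window along the series and reports where the coldest stretch falls.

```python
def coldest_window(temps, start_year, width=50):
    """Return (first_year, mean) of the coldest `width`-year window.

    temps: list of yearly temperature anomalies for one location.
    start_year: calendar year of temps[0].
    Illustrative sketch only -- not the method's actual implementation.
    """
    best_start, best_mean = None, float("inf")
    # Slide the window one year at a time and keep the coldest mean.
    for i in range(len(temps) - width + 1):
        mean = sum(temps[i:i + width]) / width
        if mean < best_mean:
            best_start, best_mean = start_year + i, mean
    return best_start, best_mean

# Synthetic example: a flat record with a cold dip starting in 1600.
temps = [0.0] * 200 + [-0.5] * 60 + [0.0] * 140
print(coldest_window(temps, 1400))  # → (1600, -0.5)
```

Running this per location and then comparing where the coldest windows land is, in spirit, how one can test whether a cold period was synchronous: if the events were global, the windows should cluster in the same decades everywhere.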

The results showed that only one period was a truly global event—the modern warming caused by human activities. More than 98% of the globe experienced the warmest temperatures of the last 2,000 years during the 20th century. The Little Ice Age comes closest, but there were clearly significant regional forces at work. The Eastern Equatorial Pacific saw the coldest temperatures in the 1400s, while much of Western Europe and the United States were coldest in the 1600s. Everywhere else around the planet, it was the early 1800s that were coolest.

Using climate model simulations, the researchers found that the regional patterns in all these events—except the human-caused warming trend—are consistent with natural variability. (A separate study published at the same time, by the way, demonstrates that climate models seem to simulate the same magnitude of natural variability that the paleoclimate records show happens in the real world.)

It could be that these are simply variations caused by things like oscillating ocean currents, or it could be that these are regional responses to outside factors like solar activity or volcanic eruptions. But there were no outside factors strong enough to cause planet-wide change—which also helps explain why it has been hard to settle on start and end dates for these periods.

Ashes to ashes (to ashes)

The Little Ice Age, for example, overlaps with a period of low solar activity as well as a remarkable stretch of volcanic eruptions, which produce short-lived, sunlight-reflecting particles in the atmosphere. Previous studies have concluded that the volcanoes were likely the bigger factor, with some particularly large eruptions in the early part of the period kicking off the cooling. A third new study published in this week’s set takes a look at the later part of the Little Ice Age, finding a big role for volcanoes there, too.

This team used the paleoclimate records, a reconstruction of global atmospheric conditions based on those records, and climate models to examine a run of major eruptions from 1800 to the 1830s. An individual eruption typically lowers temperatures for only the next two or three years, but they found that these eruptions occurred close enough together that their effects compounded into a larger cooling.

Through an influence on the ocean’s uptake of heat, the effects built on one another. Africa experienced two decades of drought, and the Indian monsoon weakened, as well. Europe, on the other hand, received extra precipitation—dumping snow on those glaciers in the Alps and helping them expand.

The upshot is that the late 1830s and 1840s saw temperatures recovering in many places, as the world’s volcanoes finally gave it a rest. That’s actually a little inconvenient for us today, because it’s roughly when humans started burning fossil fuels with gusto. We often talk about global warming in terms of temperature increases above pre-industrial times, but the pre-industrial baseline temperature is a bit of a moving target given the volcanic goings-on. The difference isn’t huge—averages of different time periods only vary by 0.1°C or so—but it does make it difficult to anoint one number as truly preindustrial.

Still, the past holds more lessons than just defining the start of humanity’s great climate experiment. Decades of hard work collecting paleoclimate records is paying off by making some of those lessons pretty clear.

Nature, 2019. DOI: 10.1038/s41586-019-1401-2

Nature Geoscience, 2019. DOI: 10.1038/s41561-019-0400-0 & 10.1038/s41561-019-0402-y
