From the perspective of a living organism, the history of the Earth has included an odd combination of radical change and fortunate stability. Continents have wandered from polar to equatorial homes, mountain ranges have risen and crumbled away, and entire oceans have opened and disappeared. But even through ice ages and ice-free times, conditions have stayed within a fairly narrow window that allowed the continued existence of life.

That relative stability occurred despite the fact that our Sun has not been entirely helpful. The Sun’s brightness has slowly increased as it has matured. About 1,368 watts of solar radiation now reach each square meter at the top of Earth’s atmosphere, but the figure was only around 1,000 watts when the Earth was a young planet. A little over 400 million years ago, before the first fish developed legs to explore the world above sea level, sunshine was about 50 watts per square meter weaker than it is today. A stronger greenhouse effect (driven by higher atmospheric carbon dioxide concentrations) seems to have compensated for the weaker Sun, since the most recent few million years were actually among the coolest.
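Those numbers line up with a standard approximation for the Sun's slow brightening (Gough's 1981 formula). A quick sketch, assuming a present solar age of 4.57 billion years:

```python
# Gough's (1981) approximation for solar luminosity over time:
# L(t)/L_now = 1 / (1 + (2/5) * (1 - t/t_now)), with t in Gyr since formation.
T_NOW = 4.57      # assumed age of the Sun, in billions of years
S_NOW = 1368.0    # present top-of-atmosphere solar flux, W/m^2 (from the article)

def solar_constant(t_gyr):
    """Approximate top-of-atmosphere solar flux (W/m^2) at solar age t_gyr."""
    return S_NOW / (1.0 + 0.4 * (1.0 - t_gyr / T_NOW))

# Young Earth: roughly 980 W/m^2, in line with "around 1,000 watts"
print(round(solar_constant(0.1)))
# 400 million years ago: a deficit of roughly 46 W/m^2, close to "about 50 watts"
print(round(S_NOW - solar_constant(T_NOW - 0.4)))
```

This is only an illustrative consistency check on the article's figures, not the reconstruction the researchers used.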

But now we're heading toward an unprecedented pairing: a Sun brighter than it has ever been, combined with greenhouse gas levels not seen for millions of years. What might that combination produce?

To investigate the historical context for our global experiment with increasing atmospheric CO₂, a team of researchers led by the University of Southampton’s Gavin Foster created a compilation of over a thousand records of past carbon dioxide levels from 112 different studies. While Antarctic ice cores have provided us with preserved samples of air going back almost a million years, the rest of the past has to be revealed through trickier “proxy” records in rock cores based on things like carbon isotope signatures.

Going down

The results show a general pattern that comes as no surprise, but the compilation offers a little more precision by combining everything together. Other than a dip around 300 million years ago (an ice age period), CO₂ has been on a gradual decline over the 420-million-year record. Every million years, CO₂ dropped by about 3.4 parts per million, on average. (For comparison, it has increased roughly 80 parts per million since 1960.)
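The contrast between those two rates is worth making explicit. A back-of-the-envelope comparison using only the numbers quoted above (and assuming the modern rise is measured through 2017, the year of the study):

```python
# Geological decline: 3.4 ppm per million years, converted to ppm per year.
geo_rate = 3.4 / 1e6

# Modern rise: roughly 80 ppm since 1960, through 2017 (an assumption).
modern_rise = 80.0
years = 2017 - 1960
modern_rate = modern_rise / years   # about 1.4 ppm per year

# The modern rise is several hundred thousand times faster than the
# long-term geological trend.
print(f"modern rate is ~{modern_rate / geo_rate:,.0f}x the geological trend")
```

The exact multiplier depends on the years chosen, but the point survives any reasonable choice: the modern change is orders of magnitude faster.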

In fact, the long-term cooling influence of declining CO₂ was slightly greater than the influence of the brightening Sun, nudging Earth toward ever-cooler, ice-dominated climates. That's why subtle cycles in Earth’s orbit have been able to produce rhythmic ice ages over the last few million years.

Why would the CO₂ in Earth’s atmosphere change to roughly counteract a strengthening Sun, you ask? It is probably the result of the weathering of bedrock into sediment, the researchers answer. Some common minerals react chemically with carbon dioxide dissolved in rainwater, pulling CO₂ out of the air and converting it, eventually, into carbonate sediment and rock. Critically, this chemical reaction depends on climate conditions—weathering increases in a warmer climate (lowering atmospheric CO₂) and slows in a cooler climate (allowing atmospheric CO₂ to build up). It’s a sort of natural thermostat that helps stabilize Earth’s climate if you give it enough time.
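The thermostat behavior falls out of that feedback almost automatically. Here is a deliberately toy model—all numbers are illustrative, not calibrated to the real Earth system—in which volcanic outgassing adds CO₂ at a fixed rate while weathering removes it faster in a warmer (higher-CO₂) climate:

```python
# Toy weathering thermostat: constant volcanic source, temperature-dependent
# weathering sink. Units and constants are arbitrary illustrations.
def run_thermostat(co2, steps=2000):
    outgassing = 1.0                        # CO2 added per step (constant)
    for _ in range(steps):
        warming = co2 / 500.0               # warmer when CO2 is higher (toy)
        weathering = 0.002 * co2 * warming  # weathering speeds up with warmth
        co2 += outgassing - weathering
    return co2

# Very different starting points converge toward the same equilibrium (~500
# in these toy units), where outgassing and weathering balance.
print(round(run_thermostat(200.0)), round(run_thermostat(2000.0)))
```

The key property, as in the real system, is that the stabilization is slow: the toy model takes many steps to settle, and the real thermostat takes hundreds of thousands of years.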

Over the last 400 million years, life moved out of the oceans and onto dry land, repainting our planet with green to complement the blue of the oceans. That massive expansion of the biosphere would have had a significant impact on the weathering of bedrock—a process plants and animals accelerate. And more weathering means a stronger drawdown of atmospheric CO 2 .

The departures from the overall trend—like the “icehouse” conditions around 300 million years ago or the “hothouse” world around 200 million years ago—could be rooted in plate tectonics. Variations in the number of active volcanoes change the amount of CO₂ released from within the solid Earth. And the assembly of a “supercontinent” like Pangaea pushes up a large area of mountains, which rapidly weather and remove atmospheric CO₂.

Future tense

Turning from the past to the future, you can make some interesting comparisons between greenhouse gas emissions scenarios and this historical record. The worst-case, “business as usual” scenario used in the last IPCC report results in greenhouse gas concentrations equivalent to about 1,370 parts per million in the year 2100, before finally peaking around 2,000 parts per million in 2250.

You’d probably have to go back 200 million years to find the last time the carbon dioxide concentration was in the neighborhood of 2,000 parts per million. But because the Sun was slightly dimmer then, even that isn’t a direct comparison. The researchers calculated that the combination of sunshine and CO₂ at the end of this century would already be equivalent to the Eocene climate 50 million years ago, the warmest period since the dinosaurs reigned. (Though it would take time for temperatures to rise enough to match that Eocene warmth.)
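Combining sunshine and CO₂ into one number is done in terms of radiative forcing. A rough sketch using the standard simplified expressions—the common 5.35·ln(C/C₀) approximation for CO₂ and a planetary-albedo-weighted solar term—which is an assumption here, not necessarily the paper's exact method:

```python
import math

def total_forcing(co2_ppm, solar_const, co2_ref=280.0, s_ref=1368.0, albedo=0.3):
    """Approximate combined forcing (W/m^2) relative to a preindustrial-like
    baseline; constants are common textbook values, not the paper's."""
    f_co2 = 5.35 * math.log(co2_ppm / co2_ref)          # CO2 greenhouse forcing
    f_sun = (solar_const - s_ref) * (1 - albedo) / 4.0  # absorbed solar change
    return f_co2 + f_sun

# A dimmer Sun partially offsets high CO2: the same CO2 level produces
# less total forcing under ancient sunshine than under today's.
print(total_forcing(2000.0, 1360.0) < total_forcing(2000.0, 1368.0))
```

This is why a given ancient CO₂ level isn't directly comparable to the same level today: the solar term keeps growing, so matching total forcing in the future requires less CO₂ than it did in the deep past.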

If we were to allow CO₂ to continue rising all the way to 2,000 parts per million, the researchers say this, together with the modern Sun, “exceeds what is recorded in the geological record for at least 99.9 percent of the last 420 million years.” Fortunately, we already seem to be bending the trajectory of greenhouse gas emissions below that scenario, but it’s a remarkable thought, nonetheless.

As interesting as these comparisons may be, it’s important to remember that the primary threat of global warming is the rate of these changes. Warm climates have existed in the past, but they reached those temperatures due to geological processes that took hundreds of thousands or millions of years, giving life time to adapt. The rapid changes we could make to Earth’s climate are more analogous to mass extinction events than the gradual arrival of the “hothouse” the early dinosaurs knew.

Nature Communications, 2017. DOI: 10.1038/ncomms14845 (About DOIs).