It seems straightforward: Iron-rich dust floating on the wind falls into the sea, where it nourishes organisms that suck carbon dioxide from the air. Over time, so much of this greenhouse gas disappears from the atmosphere that the planet begins to cool. Scientists have proposed that such a process contributed to past ice ages, but they haven’t had strong evidence—until now.

“This is a really good paper, a big step forward in the field,” says Edward Boyle, a marine geochemist at the Massachusetts Institute of Technology in Cambridge. The research doesn’t directly measure how much dissolved iron the dust added to surface waters in previous eras, Boyle says, but “they provide a much better case for what [nitrogen levels] have done in the past”—information that can reveal the ebb and flow of ancient life.

The notion that iron-rich dust could boost the growth of microorganisms that pull carbon dioxide from the air took hold in the late 1980s. During ice ages, when sea levels are low and broad areas of now-submerged coastal shallows are exposed, sediments rich in iron and other nutrients would dry out, the thinking went. Then, strong winds would loft that fine-grained, dehydrated dust and carry it far offshore, where it would nourish carbon dioxide–sucking phytoplankton at the base of the ocean’s food chain. Previous analyses of sediments that accumulated on sea floors during past millennia suggest that increases in iron-rich dust falling into surface waters boost biological productivity there, but those studies provide only a correlation in timing, says Alfredo Martínez-García, a paleoclimatologist at ETH Zurich in Switzerland.

Now, Martínez-García and his colleagues have developed a new way to probe past seafloor sediments. In core samples of bottom mud, they looked at the organic material bound to the carbonate skeletons of one particular species of the free-floating microorganisms called foraminifera. (That particular species is relatively large and easy to identify, so its remains are simple to separate from those of other “forams.”) The researchers were particularly interested in nitrogen, which the microorganisms would have consumed as nitrate dissolved in seawater. The heavier the overall mix of nitrogen isotopes in a sample, the more thoroughly organisms would have consumed the available nitrate—and thus the more the surface waters above that site would have been thriving with life. Dating of the sediment samples provided an age for each layer of the core.

Applying the new method, the researchers looked at a more-than-5-meter-long sediment core, representing about 160,000 years of accumulation, drilled from the deep sea floor off the southwestern coast of South Africa. Prevailing winds would have carried dust there from the eastern coast of South America when sea levels were low during ice ages, and from Patagonian deserts during interglacial periods, Martínez-García says. So, he notes, sediment accumulation at this site should provide a good test of the iron fertilization hypothesis.

Results show strong links among the amount of dust deposited in the region, biological productivity at the sea surface, and the amount of dissolved nitrate consumed in surface waters, as recorded by the forams, the researchers report online today in Science. Those relationships held during the peaks of the last two ice ages, as well as during centuries-long spates of colder-than-normal climate at other times in the past 160,000 years, Martínez-García says.

The biochemical fingerprint that the team has identified accounts for only about half of the carbon dioxide variation that occurred between past glacial and interglacial periods, says Andrew Watson, a climate scientist at the University of Exeter in the United Kingdom. So, although iron fertilization may be a major factor influencing Earth’s climate, it doesn’t fully explain the coming and going of ice ages. Nevertheless, he notes, “this is the nicest data that I’ve seen yet.”