"Fifteen hundred years ago everybody knew the Earth was the center of the universe. Five hundred years ago, everybody knew the Earth was flat, and fifteen minutes ago, you knew that humans were alone on this planet. Imagine what you'll know tomorrow."

Men In Black's Agent Kay isn't exactly a great public philosopher, but I think he does a good job of summing up why some people don't like the idea of applying the science of climate change to the realm of political policy. Science changes, after all. Who's to say that 100 years from now we won't find the results of 21st-century climate modeling as ridiculous as a map of a flat Earth?

This argument isn't totally off-base. Scientific theories are frequently overturned by new evidence. But, just as often, the new evidence changes one part of a theory while leaving the big picture intact. That's because scientists use the same word–"theory"–to describe two very distinct classes of ideas. Gravity is a theory. But so is the existence of Gliese 581g–a wobble in the light given off by a distant star which may or may not turn out to be a planet. One of these things is not like the other. Of the two, new evidence is much more likely to disprove the existence of Gliese 581g.

Scientists still study what gravity is and how it works. It's a living theory, not a cold, unchanging edifice. In fact, there are a lot of weird little anomalies that tell us we don't have this gravity thing totally figured out just yet. But as new evidence comes in, it tends to refine the details, not demolish everything we thought we knew. Einstein revolutionized the theory of gravity, but he didn't make apples start to fall up.

With that in mind, I want to tell you a story. There are a lot of climate myths out there—misconstrued facts and frank deceptions used to discredit good climate science. But one of those myths is particularly interesting to me, because it's a very good example of the difference between little lowercase "theories" and uppercase "Theories". The myth of global cooling is the kind of thing that happens when people get the two mixed up.

Let's start at the beginning, with a quick summary of the myth itself.

According to the standard version of this story, everybody in the 1970s thought that the Earth was actually getting colder, and that we were in for a new Ice Age. Animals like armadillos were migrating southward, fleeing the encroaching cold. The Arctic ice pack was unexpectedly thick. Scientists warned of massive crop failures, and wrung their hands over the fate of the millions who would die in our frozen future. They urged governments to take action, either by stockpiling food, or with more disturbingly drastic measures–such as intentionally melting the polar ice caps. All the same people who, today, tell us that the Earth is heating up were, once upon a time, singing a very different tune. The implicit message about scientists that people get from this story: You just can't trust 'em.

It would be nice if the myth of global cooling were a fringe belief. But it's not.

Influential, big-name talkers push the story. Lots of average people listen to them. The author Michael Crichton worked it into his novel State of Fear. Senator James Inhofe told the tale in Congress. Rush Limbaugh believes in the myth. So does George Will. And, consequently, so does at least one of my uncles.

But they're all wrong.

In reality, global cooling was never a broadly accepted Theory. It's reasonable to assume that a good chunk of Americans never heard about it at all. And global cooling never had the support of most climate scientists, let alone scientists in other disciplines, like biology and public health, which are linked to climate change in many important ways today.

We know all of this thanks to the work of two scientists, Thomas Peterson and William Connolley, and a journalist, John Fleck. In 2008, they published a detailed history of this myth in the Bulletin of the American Meteorological Society. So that's another thing that makes the myth of global cooling stand out from the pack. Unlike a lot of myths, the path from fact to fiction is very well-documented.

A Myth is Born

The truth is, for a short period in the mid-1970s, the idea of global cooling was somewhat trendy–as measured in newspaper and magazine stories, not in the scientific evidence.

In the mid-1970s, both Newsweek and Time ran articles about the coming Ice Age–Time in 1974, Newsweek in 1975. In 1976, National Geographic published a more detailed story about climate science in general. It touched on global cooling as one of several possibilities for the future of climate.

But all of these stories were based on the same small handful of peer-reviewed papers. In fact, Peterson, Connolley, and Fleck found that, between 1965 and 1979, only 7 peer-reviewed papers were published supporting the idea of global cooling. (In contrast, during that same time period, 44 published peer-reviewed papers found that the Earth was getting warmer. And 20 were neutral on the subject.)

All those papers were the work of scientists who were, for the most part, trying to understand the basics of how the climate system worked, not expanding and refining an already accepted big idea. These were, in other words, lowercase "theories".

Cause and Effect

The issue was inputs.

These are the variable factors–like levels of greenhouse gases, or particles of dust and soot in the atmosphere–that can impact how the natural processes of the climate system play out. In the 1970s, scientists didn't understand variable inputs very well. They knew, based on ice cores and tree rings, that the Earth was probably coming due for a cold snap. In fact, the Northern Hemisphere had been cooler than average between 1940 and 1970. And they knew that particulate matter–the smoke of volcanoes, the soot of factories, the obvious air pollution–could reflect light from the sun and have a cooling effect.

But they also knew about the greenhouse effect.

This is the almost 200-year-old idea at the heart of the Theory of climate change. For a quick refresher, the greenhouse effect describes the cycle of heat transfer that keeps our planet from becoming a frigid ball of dirt, no more habitable than Mars. First, energy from the Sun passes through our atmosphere. Some is absorbed by the ground and oceans, which warm up and radiate heat back towards space as infrared. But the gases in our atmosphere don't let all that infrared escape. Instead, they absorb it and re-emit much of it back down again. It's kind of like turning on a laser pointer in a hall of mirrors. Because of the greenhouse effect, Earth is able to trap enough heat to sustain life-as-we-know-it. We've known about this effect since 1824.

Climate change is really just an exaggeration of the greenhouse effect. Carbon dioxide is better than a lot of other gases at bouncing heat back down to Earth. The more carbon dioxide in the atmosphere, the more heat gets trapped, and the higher our global average temperature rises. We've known that rising carbon dioxide levels enhance the greenhouse effect since 1896.
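The basic arithmetic behind all of this is simple enough to sketch. Here's a minimal zero-dimensional energy-balance model–my own illustration, using textbook values for the solar constant, albedo, and infrared absorptivity, none of which come from the article–showing how an atmosphere that traps outgoing heat raises the planet's equilibrium temperature:

```python
# A zero-dimensional energy-balance sketch of the greenhouse effect.
# Illustrative only: all constants below are standard textbook values,
# not figures from the 1970s papers discussed in the article.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant at Earth's orbit, W m^-2
ALBEDO = 0.3       # fraction of sunlight reflected straight back to space

# Sunlight absorbed, averaged over the whole spherical surface.
absorbed = S * (1 - ALBEDO) / 4

# No greenhouse effect: the surface radiates freely to space,
# so absorbed = SIGMA * T^4 at equilibrium.
t_bare = (absorbed / SIGMA) ** 0.25

# Single-layer "greenhouse" atmosphere that absorbs a fraction eps of the
# infrared leaving the surface and re-emits half of it back down.
eps = 0.78  # effective infrared absorptivity, tuned to match observations
t_greenhouse = (absorbed / (SIGMA * (1 - eps / 2))) ** 0.25

print(f"Without greenhouse effect: {t_bare - 273.15:.0f} C")       # about -19 C
print(f"With greenhouse effect:    {t_greenhouse - 273.15:.0f} C")  # about 15 C
```

Without that heat-trapping layer, the model lands near -19 °C, a frozen world; with it, near the familiar 15 °C global average. Raising eps–which is roughly what adding carbon dioxide does–pushes the equilibrium temperature higher still.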

By the 1970s, climate scientists knew cars, power plants, and other aspects of modern energy use were releasing unprecedented amounts of greenhouse gases into the atmosphere. The question everybody was trying to answer: Which input was more powerful? In other words, would particulate matter beef up a natural cooling trend to the point that the greenhouse effect was merely a pleasant distraction? Or, would the impact of carbon dioxide and its greenhouse gas cousins outpace both natural and anthropogenic cooling, and take us to a warmer world?

The Way It's Supposed To Work

In the 1970s, nobody really had a solid answer to those questions. In a given year, one scientist would publish a paper that supported cooling, while another two or three would publish results that favored warming. And journalists would report on all those papers.

Peterson, Connolley, and Fleck found that in 1975, the same year Newsweek warned of a coming Ice Age, The New York Times actually ran two climate science stories. The first was titled "Scientists Ask Why World Climate is Changing; Major Cooling May be Ahead." The second: "Warming Trend Seen in Climate; Two Articles Counter View that Cold Period is Due."

If you saw both Times stories, you'd have a pretty good idea that scientists weren't totally in agreement on this issue. But not all journalists provided that kind of context. Every peer-reviewed climate science paper was like a part of a mountain range. The only way to make sense of the topography was to zoom out, and look at the whole thing. But, some journalists had a tendency to report on each new study that came out as though it were an isolated hill of fact in the middle of an empty plain.

One group actually did review the big picture of climate science in 1975. That was the U.S. National Academy of Sciences. The NAS is sort of like a cross between a professional organization and a medieval court adviser. Not all the scientists in the United States are members. Instead, current members elect new ones, based on the quality, importance, and influence of their research. Think of it as the Science Hall of Fame. Getting in is a big deal. But it's more than just symbolic. That's because the NAS plays a role in American politics. Most politicians aren't trained scientists. Even if they are, they can't be expected to be experts on everything. So, instead, when politicians need to know what's going on in a particular field of science, they turn to the actual experts at the NAS. Every year, the Academy puts together many reports summarizing the state of scientific research on a wide array of topics and offers its advice about what politicians should do with that information.

The 1975 NAS report on climate science reflects the confusion that surrounded the field at that time. In fact, the introduction flat out says, "…we do not have a good quantitative understanding of our climate machine and what determines its course. Without the fundamental understanding, it does not seem possible to predict climate…" There wasn't anything close to a scientific consensus on climate in 1975. But that was about to change rapidly. Over the next five years, research methods improved, more papers were published, and all those little theories began to add up to a single big Theory–the Earth was getting hotter.

By 1979, it was already clear that the warming effect of greenhouse gases outweighed the cooling effect of dust particles. When the NAS came back to the subject of climate science that year, the state of research had changed enough that their summary was now very different. Instead of uncertainty, the 1979 NAS report emphasized a message that was, basically, the same as what we still hear today: The Earth is warming, and that fact should not be ignored. The popular press liked the story of global cooling. But their interest in that story didn't reflect what scientists were actually thinking. There was no flip-flop of science here.

Instead, what happened in the 1970s was that science worked the way it's supposed to work.

Researchers identified an important question. They studied it. They figured out how to study it better. And, slowly, between roughly 1970 and 1980, they came up with a broad, generalized answer. They went from a jumble of lowercase theories to an uppercase Theory.

Since then, the uppercase Theory hasn't changed. No new evidence has surfaced to challenge it. Instead, researchers have busied themselves with the details—studying the lowercase theories within climate change to try and make that big Theory more specific. What they've learned has made them more and more certain that the big Theory is correct. So, in a way, the scientific consensus certainly has changed since 1975. But it changed from "We don't know" to "Climate change is definitely happening."

Image: Frozen World, a Creative Commons Attribution (2.0) image from laszlo-photo's photostream