"Where did we come from?" It's a central human question, one that drives us to wonder about origins—of humans, of life, of the Earth, of the Universe. The age of the Earth is central to that question, and human cultures have grappled with it for millennia. But only in the last couple of centuries have we obtained the means to unequivocally determine that age from actual evidence. The road was a long one.

In the late 1700s, geology was in its infancy. Rock layers (of any type) were only starting to be recognized as something other than deposits from a catastrophic, world-wide flood. James Hutton, a Scottish scientist, became enthralled with the fantastic histories he saw recorded in the rocks of his homeland. At a now-famous seaside outcrop on the eastern coast of Scotland, he saw nearly horizontal layers of red sandstone on top of completely vertical layers of a much different, gray sedimentary rock. He was the first to grasp the significance of that spatial relationship.

The gray rock must have started out as a sedimentary deposit, gotten buried, lithified into rock, been scrunched up so the layers became vertical, risen to the surface, and then eroded down like a folded phone book cut horizontally. On top of this mess, some sand was deposited, buried, lithified into rock, and brought back up to the surface yet again. The sheer time all this must have required boggled Hutton's mind. Everywhere he looked he saw the evidence of vast expanses of time. "We find no vestige of a beginning," he wrote, "no prospect of an end." His colleague and friend John Playfair commented that "the mind seemed to grow giddy by looking so far into the abyss of time."

Cooling with age

A French contemporary of Hutton's, Georges-Louis Leclerc, had the idea the temperature of the Earth might tell us its age. People knew that the Earth produced heat—they were well acquainted with the hot conditions in deep mines—and Leclerc thought that perhaps it was cooling from an original, molten state. He experimented (a bit crudely) with hot iron spheres of varying sizes, and eventually estimated that the Earth must be between 75,000 and 168,000 years old. The suggestion that the Earth could be so old was scandalous and got him in a bit of trouble. Author Bill Bryson writes in A Short History of Nearly Everything, "A practical man, he apologized at once for this thoughtless heresy, then cheerfully repeated the assertions throughout his subsequent writings."

The 1800s saw gargantuan strides in the field of geology, and it became well accepted in the scientific community that the Earth was probably a few million years old, and perhaps a few tens of millions. Charles Darwin, as well as others like T.H. Huxley and Charles Lyell, insisted the age of the Earth must be in the hundreds of millions. William Thomson (better known as Lord Kelvin) took issue with that idea and, armed with advances in physics, revisited Leclerc's idea of a cooling planet. He calculated an age that started at 98 million years but gradually crept down to 24 million years by 1897. He admitted that his calculation could be wrong if there were unknown sources of heat, but he didn't give that possibility much thought.
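The spirit of Kelvin's calculation can be illustrated with the standard conductive-cooling result for a half-space: a body cooling from an initially uniform hot temperature develops a surface temperature gradient that shallows over time, so today's geothermal gradient implies an age. The sketch below uses the textbook formula with illustrative input values I've chosen myself, not Kelvin's exact figures:

```python
import math

# Half-space cooling: T(z, t) = T0 * erf(z / (2 * sqrt(kappa * t)))
# Surface gradient:   dT/dz at z=0 equals T0 / sqrt(pi * kappa * t)
# Solving for the cooling time: t = T0**2 / (pi * kappa * G**2)

T0 = 2000.0     # initial temperature excess over the surface, in K (assumed)
G = 0.036       # present-day geothermal gradient, K/m, ~36 K per km (assumed)
kappa = 1.2e-6  # thermal diffusivity of rock, m^2/s (typical textbook value)

t_seconds = T0**2 / (math.pi * kappa * G**2)
t_years = t_seconds / 3.156e7  # seconds per year

print(f"Conductive cooling age: {t_years / 1e6:.0f} million years")
```

With these inputs, the formula lands in the tens of millions of years, the same ballpark as Kelvin's later estimates. The missing ingredient, as it turned out, was an internal heat source.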

Just one year earlier, however, someone had discovered a potential source of heat. In 1896, Henri Becquerel had inadvertently discovered radioactivity when he left some uranium ore in a drawer on top of an undeveloped photographic plate, on which the uranium left a "shadow." Becquerel handed the phenomenon off to his student Marie Curie and her husband Pierre. What they found won the trio the 1903 Nobel Prize in physics.

Enter Ernest Rutherford, who had some key insights. He realized that radioactivity was energy released during the "disintegration" of atoms, and he discovered what is known as the half-life—the amount of time it takes half of the radioactive atoms in a sample to decay. He knew he had found Lord Kelvin's unknown source of heat. The release of energy from radioactive elements deep in the Earth would have greatly increased the time needed for the planet to cool to its current temperature—by hundreds of millions of years, at least, he said. Better still, it became apparent to him that one could use radioactivity as a clock, with the half-life of a decay series enabling us to calculate the age of a rock.
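The clock works like this: if a mineral starts out with parent atoms and no daughter product, the daughter-to-parent ratio measured today fixes how many half-lives have elapsed. A minimal sketch (the U-238 half-life is a real figure; the sample ratio is made up for illustration):

```python
import math

def age_from_ratio(daughter_parent_ratio, half_life_years):
    """Age of a closed-system mineral from its daughter/parent ratio.

    N_parent(t) = N0 * 2**(-t / T_half), and each decayed parent atom
    ends up as one daughter atom, so D/P = 2**(t / T_half) - 1.
    Inverting: t = T_half * log2(1 + D/P).
    """
    return half_life_years * math.log2(1 + daughter_parent_ratio)

U238_HALF_LIFE = 4.468e9  # years; U-238 decays (eventually) to Pb-206

# A hypothetical mineral with equal numbers of daughter and parent atoms
# has been a closed system for exactly one half-life:
print(age_from_ratio(1.0, U238_HALF_LIFE) / 1e9)  # -> 4.468 (billion years)
```

Real radiometric dating must also account for daughter atoms present at formation and for any loss of material since, which is what makes the technique hard in practice.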

The next step came from Arthur Holmes, who set out to date some rocks with Rutherford's new technique. By 1913 he had pushed the formation of the Earth back to 1.6 billion years; by 1946 it was over 3 billion years.

Plate tectonics and an accurate age

By 1946, however, the battle over continental drift was in full swing. Alfred Wegener had found considerable evidence to support the idea that the continents had, in fact, formed a supercontinent in the past and had since split apart, but he couldn’t explain how it had happened. Some viewed it as a great puzzle, but many dismissed his hypothesis as nonsense. Of course the continents don’t go cruising around—that’s ridiculous!

Arthur Holmes proposed that heat in the interior of the Earth, some of it generated by radioactive decay, could drive convection currents that drag the continents around at the surface. It would be a long time before plate tectonics was widely accepted (exploration of the seafloor in the 1950s and 1960s would push the theory over the edge), but with a possible mechanism in place, the debate became a little more serious.

In 1948, a young graduate student named Clair Patterson, working under Harrison Brown at the University of Chicago, began a project to determine the age of the Earth once and for all. The problem at this point was not with the dating method; the problem was finding the right thing to date. In order to have the final word on the matter, you needed to date the oldest rock on the planet. How would you know when you found it? Patterson found a way around this. Instead of dating something in the ground, he dated something from the sky—a meteorite.

Most of the meteorites that fall to Earth were expected to be left over from the birth of the solar system, so they should have formed at the same time as the Earth. He chose a piece of the Canyon Diablo meteorite, which struck Arizona about 50,000 years ago. In 1953 he presented his results: the Earth, he said, was 4.55 billion years old. He did fine work—the discussion these days centers on the second or third decimal place of that number.

Apportioning the heat

That may be settled, but there’s still quite a bit we don’t know about the thermal workings of the Earth’s interior, including the original question raised by Lord Kelvin’s work: how much heat is left over from the origin of the planet (known as "primordial" heat)? The answer could inform big questions about the nature of the convection in the Earth’s mantle, as well as in the outer core, where the motion that creates our planet’s magnetic field takes place. Estimates have been made based on the expected chemical composition of the interior, but we’ve never actually measured the radiogenic contribution.

Until now. A paper published last week in Nature Geoscience reports the results of recent work using neutrino detectors to quantify the radiogenic production of heat. The KamLAND detector in Japan can detect the anti-neutrinos generated by the decay of uranium and thorium—the two elements responsible for most of the radioactive heating of the Earth. Combining this with data gathered from other sources on potassium—the other major heat-producing element—the researchers estimate that radiogenic heat accounts for about half of the heat produced by the Earth. Lord Kelvin was missing half the story.

The results actually compared pretty well to the earlier estimate based on chemical composition. That model estimated that uranium and thorium (together) accounted for about 16 terawatts of the approximately 44 terawatts of heat produced by the Earth. The calculation from the neutrino work yields 20 terawatts for uranium and thorium. (Potassium produces about 4 terawatts.)
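The bookkeeping behind "about half" is simple enough to check directly from the figures quoted above:

```python
# Heat budget figures quoted in the article (all in terawatts)
total_heat = 44.0       # approximate total heat output of the Earth
uranium_thorium = 20.0  # radiogenic heat from U and Th (neutrino result)
potassium = 4.0         # radiogenic heat from K (from other sources)

radiogenic = uranium_thorium + potassium
fraction = radiogenic / total_heat
print(f"Radiogenic: {radiogenic:.0f} TW, {fraction:.0%} of the total")
```

That works out to roughly half the budget, with the remainder presumably primordial heat left over from the planet's formation.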

The authors note that additional neutrino detectors around the world, along with improvements in detector sensitivity, could determine whether the production of these "geoneutrinos" varies from place to place. If it does, a network of detectors could give us a more detailed picture of the thermal structure of the mantle.

It may be some time before we can answer all these questions, but really, we’ve made pretty good progress—we’ve only been at it for 0.00001 percent of the Earth’s existence.

Nature Geoscience, 2011. DOI: 10.1038/NGEO1205 (About DOIs).