Gout—a disease of red, painful, swollen joints—has an unfair reputation as a disease that only affects the wealthy after a lifetime of overindulgence. In reality, it’s the legacy of evolutionary changes that took place more than 20 million years ago, which we’re still paying for now. Gout was once called the “king of diseases and the disease of kings”. It could equally be the “disease of apes”.

The substance responsible for the condition is uric acid, which is normally expelled by our kidneys, via urine. But if there’s too much uric acid in our blood, it doesn’t dissolve properly and forms large insoluble crystals that build up in our joints. That explains the painful swellings. High levels of uric acid have also been linked to obesity, diabetes, and diseases of the heart, liver and kidneys.

Most other mammals don’t have this problem. In their bodies, an enzyme called uricase converts uric acid into other substances that can be more easily excreted.

Uricase is an ancient invention, one that’s shared by bacteria and animals alike. But for some reason, apes have abandoned it. Our uricase gene has mutations that stop us from making the enzyme at all. It’s a “pseudogene”—the biological version of a corrupted computer file. And it’s the reason that our blood contains 3 to 10 times more uric acid than that of other mammals, predisposing us to gout.

How did it come to this? Why did we do away with such an important enzyme? And when?

To find out, a group of scientists led by Eric Gaucher at the Georgia Institute of Technology resurrected long-gone editions of uricase that haven’t been seen for millions of years.

Team members James Kratzer, Miguel Lanaspa and Michael Murphy compared the uricases in modern mammals to infer the sequences of ancestral varieties. “It’s like what a historical linguist does, when they study modern languages and try to understand how ancient ones were pronounced,” says Gaucher. Then, the team actually built these ancient enzymes in their labs, and compared how well each one processed uric acid.
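To get a feel for the idea, here is a deliberately simplified sketch in Python. The real study used maximum-likelihood phylogenetic methods over a tree of mammal species; this toy version just takes the most common amino acid at each position of an alignment as a crude guess at the ancestral state. The sequences are invented for illustration, not real uricase fragments.

```python
# Toy illustration of ancestral sequence inference: a majority vote per
# alignment column. This is a stand-in for the maximum-likelihood,
# tree-aware methods the researchers actually used.
from collections import Counter

def infer_ancestor(aligned_seqs):
    """Return a naive ancestral sequence: the majority residue per column."""
    ancestor = []
    for column in zip(*aligned_seqs):
        residue, _count = Counter(column).most_common(1)[0]
        ancestor.append(residue)
    return "".join(ancestor)

# Hypothetical five-residue fragments from four modern species
modern = [
    "MAHYR",  # species A
    "MAHYK",  # species B
    "MTHYR",  # species C
    "MAHFR",  # species D
]
print(infer_ancestor(modern))  # prints "MAHYR"
```

A real reconstruction weights each species by its position on the evolutionary tree, so a residue shared by two distantly related lineages counts for more than one shared by close cousins; the majority vote here ignores that, which is why it is only a sketch.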

The oldest version, which was wielded by the last common ancestor of all mammals 90 million years ago, was the most active one. It outperformed all its modern descendants. Since that ancient heyday, things have gradually gone downhill.

Throughout mammalian evolution, and especially during primate evolution, uricase has picked up mutations that made for progressively less efficient enzymes. In the last common ancestor of all apes, uricase had already been hobbled to the point of near-uselessness. The ape-specific mutations that turned our uricases into broken pseudogenes merely finished off something that was already FUBARed.

Why this slow, creeping decline? Gaucher suspects that the answer involves fruit.

The biggest drops in uricase’s efficiency coincided with a time when the Earth’s climate was cooling. The ancient fruit-eating primates of Europe and Asia faced a glut of food in the summer, but risked starving in winter when fruit was unavailable.

Here’s where uric acid comes in. Our cells produce the stuff when they break down fructose, the main sugar in fruit. In turn, uric acid stimulates the build-up of fat—a process that uricase counters. Indeed, when Gaucher’s team dosed human cells with the ancient, efficient uricases, the cells became worse at making fat when exposed to fructose. With the later, inactive uricases, they produced substantially more fat.

So, disable uricase and you risk building up high levels of uric acid, but you also become a champion at turning fruit into fat. For ancient primates facing an increasingly seasonal food supply, that trade-off may have been worth it.

It’s a nice story, although it only explains the final act of uricase’s downfall. Other factors almost certainly played a role in the enzyme’s gradual decline. For example, Michael Hershfield from Duke University notes that early primates lived in rainforests, had easy access to water, and could make a lot of urine—all the better for getting rid of surplus uric acid. He speculates that these conditions might have reduced the need for uricase enough to allow the enzyme to accumulate disabling mutations.

Still, Gaucher’s results provide some support for an old but unproven idea called the thrifty gene hypothesis. Proposed in 1962, it says that humans have genes that suited our ancestors during times of scarce food, but predispose us to diabetes and obesity in the modern age of free-flowing calories. Uricase is the first good example. Our broken version may have helped our primate ancestors to thrive but it leaves us prone to gout and other illnesses linked to uric acid, whose rates have soared in recent years.

The team’s resurrected enzymes may be able to help with that too.

For over 20 years, pharmaceutical companies have tried to develop treatments for gout by using working versions of uricase from other mammals. But you can’t simply inject a pig uricase into a human patient—our immune reaction would go nuts in the presence of such a foreign enzyme.

Hershfield’s group developed a workaround by fusing the pig uricase with the baboon version. The pig bit does the heavy metabolic lifting, and the baboon bit cloaks it from our immune system. In 2010, the US Food and Drug Administration approved this chimeric enzyme, known as Krystexxa, for treating severe chronic gout.

Gaucher thinks that we can find better solutions by looking to the past. His team found that the oldest of their resurrected enzymes is both more efficient than the raw pig-baboon chimera, and lasts longer in rats. And despite its ancient nature, it’s a closer match for human uricase than even the baboon version, so it might be even less provocative to the immune system. The team have now filed a patent for the ancient uricases and formed a start-up company to turn them into an actual drug.

Hershfield anticipates bumps along the way. “It took about 17 years from the time I conceived of developing a recombinant uricase for treating refractory gout to the time it received FDA approval,” he says. “I wish [them] success, but I suspect I may not be around to witness approval of their drug.” Gaucher counters that the existing drug has already paved the way for FDA approval: “We’ll either jump through fewer hoops or we won’t have to jump as high.”

He undoubtedly has a long way to go, but it’s an enticing notion that an enzyme that hasn’t been around since the dinosaurs ruled the world might help gout sufferers in the future. As Belinda Chang from the University of Toronto says in a related commentary, “We are all prisoners of our history, but perhaps we can find better solutions for the future by learning from the past.”