Part I of III in a series on the evolution of aging. (Part II, Part III)

Here’s an interesting experiment that I don’t recommend trying at home. Take a strain of mice, and choose what age you’d like them to live to. The average lifespan for mice in captivity is around two years, so let’s say you give them six months. Kill the first generation after six months, and do that with many successive generations, and after 50 or 100 generations you’ll end up with mice that die naturally at that age. Sounds rather Lamarckian, doesn’t it?

In fact, that’s not too far from how the senescence “set point” evolved for our own species, and for everyone else as well. To see how, imagine that the strain of mice you plan to experiment on is immortal and never senesces, and instead of killing them by a certain age, you kill them at random. The risk of being victimized by your tyrannical and arbitrary rule is the same at every stage of life, young or old, but nevertheless you’d end up with a population dominated by young mice, since a younger mouse has been exposed to less cumulative risk than an older one. You’d also find that there was a certain age that the mice essentially never reached.
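The claim is easy to check numerically. Here’s a minimal sketch, where the per-year hazard and population size are made-up illustrative numbers, of mice that never senesce but face the same risk of death at every age:

```python
import random

random.seed(0)
HAZARD = 0.3   # chance a mouse is killed in any given year (illustrative number)
N = 100_000    # mice to simulate

# Each mouse survives a given year with probability 1 - HAZARD, so the
# chance of reaching age t at all is (1 - HAZARD)**t: even though the
# risk never changes with age, the population is dominated by the young.
ages_at_death = []
for _ in range(N):
    age = 0
    while random.random() >= HAZARD:
        age += 1
    ages_at_death.append(age)

mean_age = sum(ages_at_death) / N
young = sum(a < 5 for a in ages_at_death) / N
print(f"mean age at death: {mean_age:.2f}")        # analytically (1 - HAZARD)/HAZARD, about 2.33
print(f"fraction dead before age 5: {young:.2f}")  # analytically 1 - 0.7**5, about 0.83
print(f"oldest mouse: {max(ages_at_death)}")
```

With a constant hazard, survival to age t falls off geometrically, which is why a hard ceiling appears even though no individual mouse ever ages.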

Since there are so many more of them, young mice contribute more to the gene pool, meaning that the genes really duking it out on the battlefield of natural selection tend to be the ones that matter in youth. Evolution will work more slowly on genes affecting the smaller population of older mice, which gives harmful ones cover to slither in undetected, and before too long your strain of mice is dying of old age.

Now let’s say you halve the rate at which you randomly kill mice. Within a few generations, your mice will be living longer, because genes that grant them longer reproductive spans will now have a fitness advantage–but only up to a point. You’d have to stop killing mice altogether if you wanted them to become immortal again, and maybe that would be for the best, suggest some friends concerned about your peculiar “hobby”.
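There’s simple arithmetic underneath this. In a continuous-time version of the thought experiment with a constant hazard h, expected lifespan is 1/h (a standard property of the exponential distribution), so halving the hazard doubles the expectation, but no finite hazard yields immortality. This sets aside the selection dynamics, but it shows why the ceiling moves rather than vanishes:

```python
def expected_lifespan(hazard: float) -> float:
    # With a constant hazard h, survival to age t is exp(-h * t),
    # and the expected lifespan works out to exactly 1 / h.
    return 1.0 / hazard

for h in (0.5, 0.25, 0.125):
    print(f"hazard {h}: expected lifespan = {expected_lifespan(h):.1f} years")
```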

All of the forces in the environment which conspire to do an organism in–in the case of your poor mice, that’s just you–produce a “hazard factor”, and over time this works to shape species to pursue certain strategies and not others. If a species occupies a niche where they can expect not to have much time on this earth, be it from predation, disease, strikes of lightning, or amateur scientists, it suddenly becomes very important that they reproduce early and often–they have no choice but to live fast, because they’ll probably die young. Even if they don’t, evolution will have forgotten them by old age.

By the same token, species in safer niches can afford to take things at a slower pace, strolling leisurely through life, spreading their resources out over many years of growth and development. At some point, they can even acquire the luxury of parenting.

Eat to live, live to eat

Safety and danger can take a lot of forms in the wild, and sometimes they might be better described as abundance or lack. For instance, while it’s thought that the very earliest primates evolved long lives thanks to the protection of the trees they lived in, the development that brought us the yet longer-lived monkeys (anthropoids) was all about food. Instead of eating insects exclusively, these primates broadened their palates to include plants, which required improved visual processing, as well as learning and memory to distinguish the edible from the toxic. But because these monkeys could now rely on a more diverse (and therefore more resilient) food source, the investment in bigger brains was worth it.

The hominoid branch leading to apes developed yet bigger brains to selectively pursue ripe fruit, a scarcer but richer source of nutrients. It’s likely that the gains in intelligence now enabled these creatures to enter a cozier niche, where social relationships could flower and group living could provide protection from predators. A limited amount of hunting and high-skill foraging also began at this point.

Then the world changed radically: the air cooled, the Mediterranean Sea dried up, and the forests retreated to make way for vast grasslands and the herds of long-legged beasts that feasted on them. Plucking juicy fruit right from the trees was no longer a strategy that could support the entire population, but a new and profitable vista had opened just beyond the edge of the jungle. Our hominid ancestors developed long strides to match those of their prey, which they followed across the savannas on two legs, covering distances hundreds of times longer than their nearest ape relatives could. And that’s when things really took off.

Growing by leaps and bounds

Hunting turned out to be very complex. It required well-informed judgments about weather and seasonal conditions, an understanding of animal behavior, and navigational skills just to locate prey at all, and then creativity and teamwork to make the kill, a daunting task for an animal whose only native weapons were a pair of hands. But since meat was so incredibly nutrient dense compared to plant foods, and kills were often large mammals with lots of flesh to spare, the rewards were ample. Once again, bigger brains started to look like a sound investment.

This dietary shift changed our lifestyle and decreased mortality, especially for children, in a few important ways. Only through cooperation could big game be brought down reliably, and as hunting parties grew larger, so did our living groups, making us less vulnerable to predation. And our energy-dense food source enabled surplus, which in turn fostered sharing–an unheard-of practice in the realms of our closest relative, the chimpanzee–and made specialization possible.

Adult men in modern hunter-gatherer tribes can bring in roughly twice as many calories as they consume by hunting. This means that, unlike female chimps, who have to produce more food while pregnant and lactating, hunter-gatherer women can decrease their food production. It’s likely that this effect explains why humans end up with higher infant survival. And while juvenile chimps are entirely responsible for their own food, human children don’t make substantial contributions until their late teens, shielding them from the high risk inherent in pursuing food on their own.

Our safe and profitable new lifestyles allowed yet more possibilities to emerge. Now kids didn’t have to grow up so fast, and development could take longer. While there were certainly costs to a longer development period, like delayed productivity and reproduction, the returns to longer brain development and more learning were substantial later in life. At this point, with mortality low and a large early investment in development that was rewarded handsomely in adulthood, the most sensible thing for us to do was to simply start living longer, squeezing as much out of ourselves as we possibly could. And that’s just what we did.

Evolution has brought us to where we are now, living as much as twice as long as even the oldest chimps. It’s conceivable that it could take us even farther, given the right conditions and a very long time. But are any of us alive today really in a position to wait for evolution to hand us something new, when we could use what it’s already given us to find a solution ourselves?