Bias against inexperience

A researcher’s track record is traditionally judged by the number of publications. Because funding depends so heavily on this single output, scientists are pressured to publish anything they can, no matter how incremental or limited the finding. The well-known ‘publish or perish’ rule creates many problems, but few have been more damaging to the future of science and humanity than what it has done to young researchers over the past 30 years.

Long track records of prior publication are far easier to judge than new researchers with unorthodox, unvalidated ideas. So there has been a massive shift in government funding toward older, more established researchers with proven histories of publication. Young researchers, no matter how creative their ideas, simply can’t compete in the current system.

The roots of the problem trace back to the doubling of the NIH's budget between 1998 and 2003, which drew so many new faculty into the field that competition for grant money became intense once funding flattened out. Since the early 1980s, the proportion of NIH funding going to researchers under the age of 37 has dropped six-fold; just three percent of the budget now goes to these younger researchers. The hundreds of billions of extra taxpayer dollars injected into the NIH since then have gone to senior, experienced researchers.

Dr. Elias Zerhouni, director of the NIH from 2002 to 2007, admitted he was shocked by the trend away from younger researchers. “I couldn’t believe the data,” he said during a Baker Institute event in 2008. Zerhouni went so far as to call it “probably the greatest long-term policy issue that will threaten America’s future competitiveness.” If current trends continue, by 2020 the NIH will fund more investigators over 70 years of age than under 40.

Youth equals creativity

So what is causing this massive shift?

“We have a natural bias as a society in favoring what is established and proven, a track record of great papers, Academy of Science appointments and so on, [not] a 34-year-old [with no track-record]," Zerhouni concluded.

This huge age bias goes well beyond a lack of opportunity for new researchers. Historically, younger, less experienced scientists have generated the most creative and revolutionary breakthroughs in science. “A person who has not made his great contribution to science before the age of thirty will never do so,” said Albert Einstein. If so, the shift away from supporting them is almost certainly lowering the creative capacity of the research community.

Is Einstein correct? Do originality and creativity in science tend to come from younger researchers?


Benjamin Jones of the Kellogg School of Management at Northwestern University has analyzed the ages of all Nobel laureates in physics, chemistry, and medicine dating back to 1901. Of the 550 prize winners, about 40 percent were between the ages of 20 and 35 when they made their groundbreaking discoveries. Scientists between 35 and 45 accounted for another 42 percent. In total, then, about 82 percent of the biggest discoveries were made by scientists younger than 45, typically less than two decades after finishing a PhD. Over the last century or so, creativity and innovation have tended to come from youth, not long track records.

Although unconventional and risky research can be pursued at any age, it seems to come much more easily to younger scientists. That may be because they have more time to devote to a single idea and are less susceptible to the “curse of knowledge,” the cognitive bias by which experience stifles one’s ability to come up with, or accept, new, unconventional, or creative ideas.

The 30- to 40-year-old period has often been described as “the golden years” for creative discovery, a perfect mix of time, enthusiasm, naivety, and just enough experience to produce optimal creativity. They’re not the golden years for funding anymore, though. The average age to receive NIH research grants has gone from 38 in 1980 to 51 today.

The benefits of risk

In the early 1970s, Roger Kornberg, a 27-year-old Stanford PhD, was working at the Laboratory of Molecular Biology in Cambridge, England. With a modest post-doctoral salary, Kornberg was given freedom to explore untried and risky areas of research. This would ultimately allow him to make a revolutionary discovery about how DNA is copied in cells.

Kornberg won the Nobel Prize in chemistry in 2006 for that work. Yet he told The Washington Post that he’s convinced his groundbreaking, Nobel Prize-winning idea would never be funded today.

"If the work that you propose to do isn't virtually certain of success, then it won't be funded,” he said. “Of course, the kind of work that we would most like to see take place, which is groundbreaking and innovative, lies at the other extreme."

There has never been a more important time for science to leverage its most creative minds in solving our global challenges. Despite massive funding increases over the last few decades, the ideas and researchers rewarded by the current peer-review system have tended to be safer, more incremental, and more established. If we want science at its most innovative, the task isn’t to find brilliant, passionate, creative scientists; it’s to support the ones we already have.

Ben McNeil is a research scientist and founder of thinkable.org, a platform that connects researchers with sponsors who want to fund breakthrough thinking.