(Photo: Temple University)

Scientists working in the United States today are as active as they've ever been, but there's been a dramatic shift in how exactly they conduct their research. Or, more precisely, who's paying for that research.

A recent study tracking the careers of more than 100,000 scientists across five decades found that half of university-hired scientists leave academic life after just five years. That's a dramatic change from earlier eras: According to the study, which was published in the Proceedings of the National Academy of Sciences, academic scientists in the 1960s stayed in the ivory tower for an average of 35 years.

How to explain this "revolving door," as lead author Staša Milojević, a professor at Indiana University, calls it? Money, for one thing: The more profitable lure of industry is simply hard to resist for many would-be academics. In robotics, where job opportunities outside academia are most bountiful, scientists left faster than in astronomy, where industry demand is somewhat lower. Even in astronomy, though, external opportunities strongly influenced the dropout rate from academia.

Another reason for this uptick in university exits is the emergence of a "temporary workforce" of so-called supporting scientists in the academy. Rather than replacing retiring faculty with full-time faculty scientists, university departments are creating adjunct-like positions and filling them with scientists who, while fully credentialed, work as postdoctoral researchers, lab techs, and research associates. Without the security of tenure or the benefit of being credited for their research (the study found a 35 percent increase in scientists never credited as a study's primary author), these scientists see little reason to build a career in such a compromised academic setting, doing work more appropriate for graduate students.

A final explanation involves the simple principle of supply and demand: Universities are churning out far more Ph.D.s than there are tenure-track positions to fill. Because graduate-student labor is so valuable to university departments (it allows research scientists to focus on their own projects rather than on teaching and grading undergraduates), the admission of Ph.D. students (and the granting of degrees) not only exceeds the number of legitimate academic positions but also ignores what Milojević called the "warning regarding possible scientific workforce bubbles."

The implications of this revolving door are difficult to predict. But what's certain is that academic and industry science are notably different pursuits. Industry research tends to explore ideas that have the short-term promise of generating profit; it's driven by the expectation of tangible outcomes and measurable results. Academic research is more open to investigating questions for their own sake, inspired by curiosity and perhaps scholarly prestige, and more comfortable with long-term applications that may not be obvious in the moment but can pay off down the line (or not). At the risk of overstating the distinction, one is science for the sake of science; the other is science for the sake of economic gain.

From the scientists' perspective, there are advantages to both pursuits. Klodjan Stafa, a former academic neuroscientist who left to work for Estee Lauder and now counsels other academics on how to make the transition to industry research, argues that working as an industry scientist is more meritocratic, collaborative, and rewarding of personal initiative (and, of course, more profitable) than the comparatively isolated and seemingly arbitrary nature of much academic research. By contrast, Nick Feamster, a computer scientist at Princeton University, notes that "being a professor is perhaps the best job one can have": Professors are free to take research risks, mentor (and learn from) students, and gain easy access to cross-currents of scientific expertise, all of it endemic to work in the academy.

From the public's perspective, who benefits as scientists leave the academy for the private sector remains to be seen. It's difficult to complain about industry research when it yields life-saving drugs, or even when it produces skin-care products that make us feel healthier and more attractive. But industry doesn't nurture the likes of Vannevar Bush, the Massachusetts Institute of Technology scientist who, in 1945, published an essay in The Atlantic lamenting the private sector's obsessive research into methods of mass destruction. In response, he imagined a machine that could store a wealth of collective knowledge in a device he likened to "a piece of furniture." People at the time thought he was downright feather-headed, but, as evidenced by this story, his whimsical—and university-supported—curiosity has endured.