For much of the 20th century, many large US firms ran their own research labs. The idea was that they would conduct wide-ranging scientific research that, eventually, might lead to new products and technologies.

Those labs often made huge contributions to science. AT&T's Bell Labs racked up eight Nobel Prizes for breakthroughs like the invention of the transistor and the discovery of cosmic microwave background radiation. IBM's Zurich Research Laboratory was renowned for its work on superconductors. Xerox's PARC created a graphical user interface that later inspired the Macintosh. Even if they didn't always lead to new products right away, many of these discoveries were broadly valuable.

But those glory days are fading. The big corporate labs are downsizing or vanishing. Nowadays, as a new NBER paper details, large US companies are focusing more of their R&D efforts on the later stages of development, rather than on basic research:

US firms are focusing less on basic and applied research

The NBER paper, by Ashish Arora, Sharon Belenzon, and Andrea Patacconi, documents a shift away from scientific research by large companies since the 1980s. It's a trend that could have big implications for the future of innovation and economic growth — especially since the US government has also been cutting back on R&D in recent years.

"There's been anecdotal evidence that scientific research at large companies was on the decline," says Arora, a professor at Duke University's Fuqua School of Business. "You see all these stories about the closing of the research labs. But we wanted to see if it was systematic."

Company scientists are publishing fewer papers than they once did

The authors first looked at private-sector R&D and found that it has held steady in recent decades. But the fraction devoted to basic or applied research has been shrinking (see chart). In other words, companies are still investing as much as ever in developing new products and harnessing existing research. They're just investing less in producing the basic science that's often considered the core of innovation.

The NBER authors found, for example, that scientists at large companies are publishing fewer papers in peer-reviewed journals than they did a few decades ago. (Notably, the drop is largest in basic science, suggesting that this doesn't just reflect firms being more secretive about their applied research.)

The authors also studied acquisitions. It's possible that large firms are simply "outsourcing" basic research to smaller start-ups that then get bought up. The 1980 Bayh-Dole Act made it easier for universities to license and commercialize their research — and for scientists to start their own smaller companies. So maybe it's just a shift in which companies are doing the science.

But even when you factor in acquisitions, the authors found that corporate investment in basic research and scientific capacity still seems to be declining. The trend appears to be real. So what's behind it? And is it worth worrying about?

Why are companies investing less in science?

One possibility is that companies no longer find science quite as valuable as they did 30 years ago. If today's companies are focused more on writing software programs and less on inventing new airplanes (say), then maybe they just have less need for basic research programs.

The authors of the NBER paper can't rule this explanation out, though they're skeptical. They note that companies in the US and Europe are still patenting lots of things. And those patents still rely heavily on cutting-edge scientific research. It's just that companies are doing less of that research themselves.

As the authors put it: "Large firms appear to value the golden eggs of science (as reflected in patents) but not the golden goose itself (the scientific capabilities)."

Another possibility is that the huge boom in basic research by US corporations over the past century was something of an aberration — fueled by particular historical circumstances.

In the early 20th century, universities weren't doing as much cutting-edge research themselves (though they were producing a flood of PhDs). And, at the same time, antitrust policy was forcing firms like DuPont and Eastman Kodak to diversify into new markets. So these companies started hiring scientists and investing in R&D to develop innovative new products.

But while all this corporate research was hugely beneficial to society, it didn't always produce huge returns for shareholders. Many companies simply failed to commercialize their own discoveries. Xerox's PARC famously invented the graphical user interface — only to see Apple and Microsoft get rich off it.

Perhaps learning those lessons, modern-day tech companies seem to focus far more on commercial applications than on basic research. As The Economist reported in 2007, the vast R&D budgets of companies like Microsoft, IBM, and Hewlett Packard now mostly go "into making small incremental improvements and getting new ideas to market fast." (One notable exception is Google, which famously tinkers with moon-shot ideas like self-driving cars or Google Glass.)

Other changes in the economy helped drive the shift. The companies of old with big corporate labs were often highly diversified — like GE or DuPont — and sold a wide variety of products. It made some sense for these companies to invest broadly in basic science that could have all sorts of unforeseen applications. "In the glory days of DuPont," Arora says, "if you did high-pressure chemistry, that might be useful for fertilizers, or it might be useful for nylon."

But US companies have become more specialized and narrower in scope since the 1980s, the NBER authors observe. And as that happens, it becomes less economically rational to invest in basic research that might mostly benefit others. Firms that are narrower in scope, the authors find, tend to publish less in scientific journals.

The authors also suggest that globalization may be playing a role: companies in sectors with a higher share of Chinese imports have seen the biggest drop in investment in basic research. Competition from China may simply reduce the amount of money a company can devote to science. (It was easier for companies with less external competition, like AT&T or IBM in the 1950s and 1960s, to pour money into basic research that might or might not pay off.)

Can other sources of research pick up the slack?

So how big a problem is this? That's the big question. The optimistic case is that much of that old corporate research was inefficient, and it's actually a good thing that firms are focusing more heavily on commercializing products while leaving the basic research to universities and government funding.

'Unless public funding can make up the deficit, technical progress will slacken'

The pessimistic take is that all that corporate-funded research in the 20th century was hugely beneficial, and it will be missed. Here's how the authors put it: "Established companies can no longer emulate firms such as DuPont, AT&T, or Merck, whose investments in research in the past have significantly advanced the frontiers of human knowledge. Unless public funding can make up the deficit, technical progress will slacken and eventually reduce productivity growth."

If this pessimistic story is right, that could be a problem. Because another big source of money for scientific research in the United States — the federal government — is also stagnating. Federal funding for basic and applied research grew in the 1980s and 1990s, but that growth has stopped, and it's set to decline in the years ahead due to budget caps:

If these trends hold, both federal and industry research appear to be on pace to decline over time. That would leave states, universities, and research foundations (the "other" in the chart below) to pick up the slack:

I've written before about the coming decline in federal spending on R&D — it's set to stagnate in the years ahead thanks to budget caps imposed by Congress. When combined with the decline in corporate-funded science, it points to a potential stagnation in US basic research.

Some of the experts I talked to for that earlier piece offered a few different ways to look at this. The sanguine view is that other countries, like China and South Korea, are now devoting more of their own money to scientific research — in effect, supplementing the US. And that will have positive spillover effects. If China invents a cure for cancer, we all benefit.

Others worry, however, that the US economy could suffer if a greater share of research happens elsewhere. A 2012 report by the National Science Foundation, for instance, found that US firms were shifting much of their R&D work overseas. And the United States has recently developed a trade deficit in high-technology goods, after running surpluses during the 1990s.

Many lawmakers seem to take the more alarmed view. President Obama in particular has insisted that the United States can't fall behind on R&D: "We've got to make sure that we've got the best science and research in the world," he's said. Yet getting R&D funding back to where it was at the height of the space race — as Obama has suggested — would take a big shift in policy.

Further reading: The coming R&D crash

Update: I've updated the charts in the last section. The first now shows AAAS's estimate of federal basic and applied research in absolute dollar amounts. The second shows NSF's estimate of research from various sources as a portion of the economy.