It's easy to knock the telcos nowadays, especially in the wake of the Net neutrality debates. AT&T and Verizon are lumbering old dinosaur behemoths from the post-war era who are steadily being displaced by the smaller, more agile, more innovative mammals of the Internet age, or so the standard narrative goes. But before we write off the telcos in general and AT&T in particular, I think it's worth looking at their legacy of innovation and how that legacy has evolved. It's also worth asking whether the best metaphor for their troubles in the fast-moving innovation economy is really "dinosaurs being replaced by mammals." For my part, I think a better metaphor might be "old people blowing money in an effort to compete with the young, fabulous, and deeply in debt."

Making fun of the grandparents

When you're young and fabulous, it's easy to make fun of old people. Take this Business Week article, which looks at the state of research and development at the nation's two largest telcos, AT&T and Verizon, and concludes that said R&D is practically nonexistent. These old-line companies, relics of the bygone telecommunications revolution, are now threatened by technological innovation, the author argues, because their business model is centered on extracting tolls from their existing pipes. Hence the noises about double-charging companies like Google (once for bandwidth, and then again for access to customers).

AT&T's Project Lightspeed, where the company plans to use its pipes to offer its customers moving pictures beamed directly into the home (this looks suspiciously like "television"), was allegedly put forth to Business Week by executives as the premier example of the company's innovative prowess. This was supposed to counter the notion that AT&T's only innovations nowadays are aimed at coming up with new ways to squeeze young Internet upstarts. The article's "greedy, out-of-touch old pensioners trying to nickel and dime the kids" narrative is attractive, but the real story is both more complicated and more troubling.

While AT&T may have shown some Business Week author an IPTV demo to pitch him on what the company is doing in the consumer services market, the company has by no means abandoned R&D. AT&T Labs is still open, and it still has multiple active research programs. In particular, AT&T researchers are working in the areas of voice recognition, network traffic analysis and shaping, the use of graphics processing units for nongraphics DSP algorithms, data mining, information security, wireless networking, and the list goes on (and on and on). The lab remains one of the largest and most productive in the country, in spite of numerous high-profile splits over the years and quite a bit of downsizing.

The real problem is that what AT&T is doing today is not your grandfather's R&D, and neither is the work coming out of Google's labs, or Microsoft's, or the labs of any of the other information economy wunderkinds.

Of pocket protectors and unlimited budgets

The Cold War, with its "Pentagon socialism", combined with large corporate monopolies that were expected to provide lifetime employment and pensions, made for something of a golden age for American technological innovation. This is the era that brought us the transistor and the predecessor to the Internet, an era where all the seeds of today's "information economy" were sown and carefully cultivated at great private and public expense.

The great labs of this era (Bell Labs, Xerox PARC, and IBM's labs) were places with massive budgets, where the world's top scientists were invited to pursue "blue sky" research into areas with no immediately apparent commercial applications. The facilities were state-of-the-art, and there was no pressure from management or shareholders to do anything but science for science's sake. To be able to fund such a lab was a mark of corporate prestige, and the labs themselves, along with their public counterparts like NASA, were major sources of national pride. For a company like Xerox or AT&T, what it meant to have a blue sky research lab was very much like what it means for a city to host a winning sports team; it was a source of pride and an anchor of collective identity. So much like the science that they produced, these labs were ends in themselves.

You might think of these private and public laboratories, with their hordes of young, energetic PhDs and blue-sky research programs, as producers of a kind of scientific capital. This painstakingly built fund of scientific capital that the postwar era left us was what the later generation of engineers, the fabled "two guys in a garage in Silicon Valley," drew on to produce the information revolution that began to burgeon even as the Soviet menace was disintegrating.

Dropping out of school and spending the family fortune

It's no coincidence that many of those who went on to lead the information revolution were dropouts from either PhD programs or top-notch undergraduate programs. Even those who finished their doctoral work didn't end up doing open-ended research at the new companies they either founded or joined. The information economy demanded go-getters who would put their energy towards turning basic science into marketable products, and that economy rewarded those who opted out of more traditional research careers with a mix of world-altering power and cold, hard cash. Thus many of the truly ambitious adjusted their career aspirations away from the blue sky research labs where their parents might have dreamed of working and focused instead on the new brass ring: the profitable start-up. Start-ups aimed not at producing scientific capital but at turning it into technological wizardry, and from there into real money (or, rather, into stock value).

Now, I think it's important not to oversimplify things too much, or to caricature anyone. It's not that everyone was suddenly lured away from doing science by the promise of instant wealth; rather, the more agile start-ups played an important structural role in making pure research careers less attractive. The competitive pressure that start-ups and new industries put on established businesses ultimately combined with trust-busting, structural changes in the economy, social shifts, and an array of other factors to turn expensive prestige items like research labs into unaffordable luxuries. Thus, to one extent or another, all of the aforementioned labs have been downsized and/or transformed over the years into places where research programs must now yield commercial fruit.

In today's more agile economy, where workers hop from job to job and businesses spring up from nowhere to dominate an industry in the span of half a decade, there's no longer anything in the private sector like the enduring safety of the Ma Bell monopoly to lavishly support a blue sky research lab. The closest we have today is Google's "20 percent time," where engineers are encouraged to spend 20 percent of their time working on whatever research project strikes their fancy. But 20 percent isn't 100 percent.

With today's short-term corporate focus on maximizing shareholder value by inflating the stock price at all costs, the pressure to innovate comes from the boardroom and the marketing department. Hence all the men and women in R&D have to be able to make a case for the eventual marketability of what they're working on or risk being downsized. We've come a long way from men with pointy glasses and pocket protectors who spent decades just doing pure science on the corporate dime.

There's no doubt that the information economy continues to create a lot of wealth, but I think it's fair to ask if it's also creating enough science to replenish the stock of scientific capital that it's still burning through. I think it's clear that chaotic, market-driven change is a good way to bring ideas quickly and efficiently from concept to profitable product. However, such a rapid churning of the institutional and cultural landscape ultimately may not be conducive to the kind of steady, expensive, long-term investment in fundamental research that produces the really big ideas that somewhere, at some completely unforeseeable point in the future, change the world.

(And no, before you suggest it, the academy isn't all that insulated from rapidly changing market pressures anymore. Grant money is doled out to academics by private-sector corporations that are looking for a return on their investment. But this issue would take up a whole other article.)

Not your grandfather's Soviet competition, either

I think it's also worth asking ourselves if certain aspects of our current competitive environment aren't setting us up for a future drubbing by other countries. These are countries that are not only willing to spend the kind of money, in both the private and the public sectors, to fund the blue sky research that America once funded in her generation-long effort to out-innovate the Soviets, but that also lack some of the structural problems that threaten to keep the two guys in the garage from ever bringing a product all the way to market. The South Koreans and the Chinese in particular have no qualms about building public broadband infrastructure and pumping state money into shiny new research facilities, and while their IP laws are (thankfully) tightening, they are not yet headed in the direction of an IP regime that actively stifles innovation and keeps new players out of the market. Meanwhile, back in America, a perfect storm of rent-seeking behavior by entrenched players, a broken patent system, a lack of substantial corporate oversight, and old-fashioned executive greed threatens to drown the fabled "two entrepreneurs in a garage" just as surely as those two guys helped sink the blue sky research labs of the Cold War era.

In sum, I worry that not only is the information economy not replenishing the fund of scientific capital that it inherited from the great Cold War-era research labs, but that new start-ups are being actively locked out of the market by means of patent and trade secrets litigation so that a combination of old and new interests can fight over what's left of the shrinking pie.