Excitement about the approach of the Federal Communications Commission's National Broadband Plan, due March 17, is inspiring ever more dramatic calls for greater high-speed Internet connectivity in the United States. This month, FCC Chairman Julius Genachowski declared that the agency wants 100 million American households hooked up to 100 Mbps broadband by 2020. Not to be outdone, the Media and Democracy Coalition says that by that same year consumer access to "world-class networks" should equal the present rate of telephone adoption (90%+).

As these calls for ever higher benchmarks reach a fever pitch, it's worth remembering some of the grand proclamations of yesteryear. Take, for example, the TechNet group's 2002 recommendation that the government commit to a goal of 100 Mbps to 100 million homes and small businesses by the end of the decade—in other words, now. The consortium included CEOs and executives from Cisco, Microsoft, and Hewlett-Packard.

Principle number one, they declared, was that the US "should foster innovation and reduce regulations—especially with respect to broadband applications and services."

But in case you didn't notice, 100Mbps x 100 million didn't happen. About 75 to 77 million Americans currently access some kind of broadband, according to the latest data. That's only if you accept 200Kbps as a flavor of "high speed Internet," however. And a huge chunk of the population (over 30 percent) never goes online at all, less often because they're retired and uninterested than because they can't afford the prices.

So why this shortfall of progress, especially compared to other countries? Some argue that everything is going fine. The US is just too spread out, that's all—and we'll catch up in due time. Others contend that we just haven't spent enough government or private sector money on the problem. But the big thesis these days is that we missed the boat by curtailing wholesale access to the big telcos' and cable ISPs' networks. By making it more expensive for smaller providers to link to AT&T, Verizon, Comcast, or Time Warner Cable in order to build out their own middle-mile systems, the government condemned most consumers to two ISP choices, at best.

The FCC's own recently commissioned study by Harvard's Berkman Center declared that "there is extensive evidence to support the position, adopted almost universally by other advanced economies, that open access policies, where undertaken with serious regulatory engagement, contributed to broadband penetration, capacity, and affordability in the first generation of broadband."

We're not going to categorically proclaim that this is indeed the solution to the nation's broadband woes. But there's no question that the policy of the FCC for the last dozen years has been to make it more expensive and even harder for businesses and competitive service providers to get Internet or telephone access (which are increasingly the same thing) at regulated rates.

When the FCC announced it was letting Berkman do that survey, the Commission's National Broadband Plan coordinator Blair Levin declared that in so doing, the agency didn't want to "reinvent the wheel." But let's hold onto that wheel metaphor and review the extent to which the US has rolled back open access over the last dozen years. As you'll see, on just about every available platform, businesses, smaller telcos, and alternative ISPs have been relegated to the back seat.

Dedicated access

In 1996, when Congress passed its Telecommunications Act, everybody was jazzed about the dot-com boom. Policy makers assumed that investors would pour capital into building out the nation's middle mile broadband capacity, making it affordable for big corporations and wireless companies to rent lines for enterprise computing and backhaul—the circuits that link cell phone towers to network switches.

Instead, the boom fizzled. The FCC, however, kept working under the assumption that deregulation would encourage the construction of more capacity. It issued an order that gave the green light to the dismantling of "special access" price caps under certain conditions: if enough competing telecommunications infrastructure had "aggregated" or "colocated" in an urban area with more than 50,000 people, the agency would regard this as a sign of significant competition and lift the caps.

In addition, in 2000 the big carriers asked for, and got, yearly reductions in price cap levels based on agreed-upon percentages: three percent in 2000, and 6.5 percent for each of the next three years. Four incumbents—AT&T, BellSouth, Qwest, and Verizon—received full price deregulation in over 100 major metropolitan areas. One of those companies, BellSouth, is now part of AT&T.

But five years later, the Government Accountability Office audited 16 metropolitan areas and found few signs of growth in facilities-based competition (indeed, some signs of its shrinkage), along with higher special access prices in various cities. The GAO concluded that the FCC "does not regularly monitor and measure the development of competition, which will affect how FCC responds to emerging trends, and the actions it takes to encourage and foster such competition."

Fast forward to now, and Sprint told us the company pays something like seven times as much for one of the thousands of special access lines it needs as consumers pay for a single, much faster residential broadband account. Meanwhile, a report issued last year concluded that special access charges now represent a huge chunk of incumbent telco business. The National Association of Regulatory Utility Commissioners found that in 1996, interstate special access represented less than five percent of Qwest's, Verizon's, and AT&T's total revenue. By 2007 it represented almost 30 percent of Qwest's revenue, nearly 25 percent of Verizon's, and close to a fifth of AT&T's.