A DOZEN years have passed since David Isenberg, then a distinguished engineer at AT&T Labs, wrote his seminal essay “The Rise of the Stupid Network”. In it, he outlined how a new philosophy and architecture were changing the communications business, and pointed to some of the cataclysms ahead.

Dr Isenberg argued that future networks, far from being a scarce resource used intermittently, would be “always on”, with their intelligence located in the end-user's equipment rather than within the network itself. They would make no fancy routing or traffic-management decisions; they would just “deliver the bits”.

Unlike the telephone circuits of the day, whose built-in smarts determined where messages were to be delivered, such networks would let the data tell them where they wanted to go. In short, the data would be boss.


Uncloggable artery?

The stupid network Dr Isenberg had in mind was, of course, the internet we know today. Central to his vision was the radical notion that end-users—or customers—would be free to do as they pleased, and the network would make no assumptions about the kind or content of data being transmitted.

The engineering community applauded the idea. The phone companies (AT&T especially) thought it stank. And Dr Isenberg wound up working for himself.

Dr Isenberg is likely to be watching this week's deliberations by the Federal Communications Commission (FCC) with interest. As your correspondent was scribbling away, the FCC was preparing for its second hearing on network-management practices. The meeting, held at Stanford University on April 17th, concerns whether internet service providers (ISPs) should be allowed to shape, filter or even block content travelling over their networks.

The hearing stems from a complaint filed last autumn alleging that Comcast, America's largest cable-TV company and one of its biggest ISPs, was blocking a perfectly legal file-sharing program called BitTorrent. Ever since, Comcast has been scrambling to prevent the FCC from rewriting its rules about peer-to-peer (P2P) software like BitTorrent, which is widely used to download video and other large multimedia files.

Comcast argues that throttling P2P traffic is justified because such traffic is swamping its network. Unlike conventional networks, where central servers dish out files to peripheral client computers, a P2P network is an ad hoc collection of individual computers all acting as servers and clients simultaneously, sharing bits of a large file among themselves until each has a complete copy and leaves the party.
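The swarm behaviour described above can be sketched in a few lines of Python. This is a toy simulation for illustration only, not the real BitTorrent protocol, and all the names and parameters are invented:

```python
import random

def simulate_swarm(num_peers=5, num_chunks=8, seed=1):
    """Toy P2P swarm: one seeder plus empty peers swapping file chunks."""
    random.seed(seed)
    full = set(range(num_chunks))
    # The first peer is a seeder with the complete file; the rest start empty.
    peers = [set(full)] + [set() for _ in range(num_peers - 1)]
    rounds = 0
    while not all(p == full for p in peers):
        rounds += 1
        for i, have in enumerate(peers):
            wanted = full - have
            if not wanted:
                continue  # peers with complete copies only upload from now on
            # Every other peer is a potential server as well as a client.
            sources = [p for j, p in enumerate(peers) if j != i and p & wanted]
            giver = random.choice(sources)
            have.add(random.choice(sorted(giver & wanted)))
    return rounds

# Each incomplete peer fetches one chunk per round, so an 8-chunk file
# spreads through the whole swarm in 8 rounds.
print(simulate_swarm())  # -> 8
```

Real BitTorrent adds piece-rarity heuristics and tit-for-tat incentives, but the essential point survives the simplification: every peer uploads as well as downloads.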

All very friendly and helpful, save for one thing. By definition, the peers are all on the periphery of the network—in users' homes and offices at the end of the so-called “last mile”.

The last mile is where bandwidth is hugely asymmetric: connections were designed almost exclusively for downloading files, not uploading them as well. For instance, the maximum upload speed of AT&T's premium 6mbps (megabits per second) service dawdles at 768kbps (kilobits per second).
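A back-of-the-envelope calculation shows what that asymmetry means in practice. The file size below is an assumption (roughly a feature-length video); the line rates are the AT&T figures quoted above:

```python
def transfer_time(size_megabytes, rate_kbps):
    """Seconds to move a file at a given line rate, using decimal units."""
    bits = size_megabytes * 8 * 1_000_000
    return bits / (rate_kbps * 1000)

size = 700  # megabytes -- an assumed, video-sized file
down_min = transfer_time(size, 6000) / 60  # 6mbps downstream
up_min = transfer_time(size, 768) / 60     # 768kbps upstream
print(f"download: {down_min:.0f} min, upload: {up_min:.0f} min")
# -> download: 16 min, upload: 122 min
```

A quarter of an hour coming down becomes two hours going back up — which is why a neighbourhood full of seeding BitTorrent clients strains the upstream side first.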

P2P presented no problem before the web, when it was used for sharing articles on Usenet. Then Napster, Kazaa and Gnutella came along and changed everything.

And not just because of the popularity of such file-sharing programs with music fans. The sizes of the files they handled increased dramatically. Music tracks and podcasts used to be offered for streaming at 128kbps; versions at 256kbps or even 320kbps are now common.

Video has an impact, too. Though online video rental and distribution have only recently begun in earnest, all those HDTV sets sold over the past few years will shortly make high-definition downloads the norm. Meanwhile, waiting in the wings is “4k video”, which promises four times the resolution of today's HDTV and needs a whopping 6gbps (gigabits per second) to fill the screen.
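The 6gbps figure is easy to sanity-check from first principles. The frame size, colour depth and frame rate below are assumptions (4k was not yet standardised), but they give the uncompressed bit-rate:

```python
width, height = 4096, 2160   # one common "4k" frame size (assumption)
bits_per_pixel = 24          # 8 bits per colour channel
frames_per_second = 30
rate_bps = width * height * bits_per_pixel * frames_per_second
print(f"{rate_bps / 1e9:.1f} gbps uncompressed")  # -> 6.4 gbps uncompressed
```

Compression brings the delivered rate down by an order of magnitude or more, but even so 4k dwarfs anything today's last mile was built for.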

Once again, alarmists are issuing dire warnings about the internet collapsing under the weight of its traffic. But that's nothing new: they've been doing so since the 1990s.

Bob Metcalfe, who invented the Ethernet protocol for local area networks, once claimed that the internet was about to be overwhelmed by e-mail traffic. That was in 1996. A year later, Dr Metcalfe not only admitted the error of his doomsday prediction, but literally ate his own words—grinding a printed copy of his prediction with liquid in a blender and quaffing the lot to cheers from his audience.

The latest panic started with a scare-mongering story in the Wall Street Journal last year, which concerned the rise of internet video and the inability of the network to handle it, especially at network edges where the internet enters the home. The author talked of the “coming exaflood”, referring to the exabytes (ie, billions of gigabytes) of HD video users would soon be downloading.

Others in the industry have continued to fan the flames, with cable companies like Comcast wafting the hardest.

Unlike the individual DSL lines that telephone companies use to deliver broadband, cable operators provide services using a single loop shared by up to 450 households at once. Comcast claims it takes only about a dozen people simultaneously using a bandwidth-hogging program like BitTorrent for others on the loop to find their web activities grinding to a halt.
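Comcast's dozen-user figure is roughly consistent with the arithmetic of a shared upstream channel. Both numbers below are assumptions for illustration — a shared upstream capacity of about 9mbps (a plausible value for the cable plant of the day) and clients uploading flat-out at 768kbps each:

```python
upstream_capacity_kbps = 9000  # assumed shared upstream channel (~9mbps)
per_user_upload_kbps = 768     # assumed flat-out BitTorrent upload rate
users_to_saturate = upstream_capacity_kbps / per_user_upload_kbps
print(f"about {users_to_saturate:.0f} heavy uploaders fill the channel")
# -> about 12 heavy uploaders fill the channel
```

On those figures, a dozen enthusiastic seeders out of 450 households is indeed enough to leave everyone else's packets queuing.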

While neither the DSL nor the cable companies have beefed up their local connections as fast as the internet backbone operators have boosted their capacity, there's still enough bandwidth over the last mile for current traffic. And soon there will be a whole lot more—at least for Verizon, Sprint and even Comcast.

Verizon is spending $18 billion pushing its FiOS fibre network out into neighbourhoods. Despite its woes, Sprint is pressing ahead with WiMAX, the faster, longer-range version of WiFi. And Comcast is making good progress with the latest cable protocol, DOCSIS 3.0, which should allow it to offer 100mbps within a year or two.

Comcast has also made its peace with BitTorrent, and the two companies are to collaborate on addressing some of the problems caused by P2P. Comcast has teamed up, too, with an outfit called Pando Networks, which has a nifty traffic-management technology known as P4P. The technology helps P2P file-sharers find one another more selectively, thus boosting their download speeds.
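The idea behind P4P can be illustrated with a simple locality-aware peer picker. This is a sketch only; the function, the data structure and the network labels are hypothetical, not the real P4P interface:

```python
import random
from typing import NamedTuple

class Peer(NamedTuple):
    addr: str
    network: str  # hypothetical label an ISP might supply about a peer

def pick_peers(candidates, my_network, k=3):
    """Prefer peers on the same network before reaching across ISPs."""
    local = [p for p in candidates if p.network == my_network]
    remote = [p for p in candidates if p.network != my_network]
    random.shuffle(local)
    random.shuffle(remote)
    return (local + remote)[:k]  # fill the slots with local peers first

swarm = [Peer("10.0.0.2", "comcast"), Peer("10.0.0.3", "comcast"),
         Peer("172.16.0.9", "verizon"), Peer("192.168.1.5", "att")]
print([p.network for p in pick_peers(swarm, "comcast", k=2)])
# -> ['comcast', 'comcast']
```

Keeping transfers within one ISP's network spares the expensive links between carriers — which is precisely why the operators like the scheme.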

Comcast and others hope Pando's P4P can persuade the FCC to back away from implementing stronger net-neutrality rules. That could well forestall any further legislation on Capitol Hill that might force them to keep their networks open to all comers in the way Dr Isenberg intended.

Following a number of aborted bills over the past few years, the Internet Freedom Preservation Act that's currently working its way through committee has the express backing of Senators Barack Obama, Hillary Clinton and other congressional bigwigs.

The bill specifically bans blocking or degrading lawful content, and forbids ISPs from charging more for downloading things like video. The present political climate gives it the best chance yet of making it on to the statute book.