
Researchers at one of the world’s oldest universities, Cambridge, have come up with a prototype for a possible future internet infrastructure that does away with the need for servers. This could help solve the network capacity problems that arise out of the profusion of bulky online content such as video.

The way the internet currently works, content is mostly delivered to client devices such as PCs and smartphones from powerful computers called servers, which are generally housed in data centers. This represents a centralization of computing power and storage that some argue is becoming outdated, what with the beefy processors and (sometimes) capacious storage devices we carry around in our pockets these days.

The Cambridge University prototype would represent a dramatic revamp of that way of doing things. Part of a wider EU-funded project called Pursuit, the putative protocol operates more like the popular filesharing mechanism BitTorrent, in that users share information directly with one another, rather than through a server. Simplistically put, Person B might receive content from Person A’s device, then become a source for that data so Person C could then download it, and so on.

Fragments of the same data might be replicated all over the place, in order to make re-assembly as quick and efficient as possible. So, for example, if you want to watch a TV show online, you would get its fragments from people nearby who have already downloaded and watched it, rather than from the provider’s server or content delivery network.

What is particularly interesting about the Pursuit system is the way in which it would identify content. Instead of using web addresses as is currently the norm, the data would be “fingerprinted” to show the authenticity of its source. The Pursuit team has already shown proof-of-concept applications that can find content in this way.
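The article doesn't detail Pursuit's exact fingerprinting scheme, but self-certifying names in information-centric designs are typically derived from cryptographic hashes of the content itself. Here's a minimal sketch in Python, assuming a SHA-256-based label (the function names are illustrative, not part of Pursuit):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Derive a self-certifying label from the content itself."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, label: str) -> bool:
    """Any receiver can check a copy against its label, no matter
    which peer supplied it -- the name itself proves the content."""
    return fingerprint(data) == label

chunk = b"frame 0042 of some video stream"
label = fingerprint(chunk)

assert verify(chunk, label)            # a genuine copy passes
assert not verify(b"tampered", label)  # an altered copy fails
```

The point of such a scheme is that trust moves from the server to the data: it no longer matters who hands you the bytes, only that they match the label you asked for.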

I’ve actually been having quite a few discussions with people in the industry about this sort of idea, ever since the Edward Snowden revelations demonstrated how the centralization of data in the cloud can allow greater control over users. And one major issue that has been raised during these chats is that of the shift to mobile — this makes computing more ubiquitous, but it raises problems that don’t apply to the desktop context, such as relatively limited storage and bandwidth, and of course data caps and battery life.

Dirk Trossen, the technical manager for Pursuit, pointed out to me via email on Wednesday that today’s mobile storage can greatly exceed the desktop storage of 10 years ago. Also noting the profusion of connected storage devices, he said the increasing potential for distributed storage was key to Pursuit’s design:

“Each information item is individually addressable with items being possibly very large objects or (more likely) smaller chunks of something bigger. Hence, I can assemble larger objects by collecting the chunks, akin to P2P systems albeit realised at the level of the current internetworking protocol (making it very efficient). With that, I can take any storage into account that has (even a chunk of) my information that I want. Whether some storage is smaller than others does not really matter.”
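Trossen's chunk-assembly idea can be sketched roughly in application-level Python, with peers modelled as simple dictionaries. This is an illustration only: the names are hypothetical, and Pursuit would do this at the internetworking layer rather than in application code.

```python
import hashlib

CHUNK_SIZE = 4  # unrealistically small, just for illustration

def split(data: bytes) -> dict[str, bytes]:
    """Break an object into individually addressable chunks,
    each keyed by the fingerprint of its own bytes."""
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    return {hashlib.sha256(c).hexdigest(): c for c in chunks}

def assemble(order: list[str], sources: list[dict[str, bytes]]) -> bytes:
    """Rebuild the object by pulling each chunk from whichever
    source happens to hold it; how big each store is doesn't matter."""
    result = b""
    for label in order:
        chunk = next(s[label] for s in sources if label in s)
        # verify every chunk against its label before accepting it
        assert hashlib.sha256(chunk).hexdigest() == label
        result += chunk
    return result

data = b"hello pursuit world"
table = split(data)
labels = list(table)  # chunk labels in object order

# scatter the chunks across two unequal "devices"
peer_a = {k: table[k] for k in labels[::2]}
peer_b = {k: table[k] for k in labels[1::2]}
assert assemble(labels, [peer_a, peer_b]) == data
```

As in the quote, the object is reconstructed from whichever stores hold its pieces, large or small, with each chunk verifiable on arrival.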

So what about data caps and battery life? There, Trossen pointed out that it’s not necessary to seed data to others all the time, as data can come from alternative sources. “This ‘diffusion’ is easy to implement with simple rules in each end device, while the network at a very low level (compared to P2P systems) would take care of the delivery itself,” he said.
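A "diffusion" rule of the kind Trossen describes could indeed be a few device-local checks. The policy below is purely hypothetical, not Pursuit's actual implementation; it just shows how simple such a rule can be:

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    battery_pct: int        # remaining battery, 0-100
    on_metered_link: bool   # True if a data cap applies
    replicas_nearby: int    # copies the network already knows about

def should_seed(state: DeviceState, min_replicas: int = 3) -> bool:
    """Serve chunks to others only when it costs the device little
    and the data isn't already well diffused elsewhere."""
    if state.battery_pct < 20 or state.on_metered_link:
        return False
    return state.replicas_nearby < min_replicas

assert should_seed(DeviceState(80, False, 1))       # cheap to help, few copies
assert not should_seed(DeviceState(15, False, 0))   # battery too low
assert not should_seed(DeviceState(90, True, 0))    # capped connection
```

Because any replica can satisfy a request, a device that bows out costs the network nothing; the delivery layer simply fetches from another source.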

Pursuit promises advantages for network efficiency and security, too: without central servers, traditional denial-of-service attacks lose their target, because there is simply no centralized point to overload. But, circling back to the issue that made me think about the potential of such distributed systems, what about surveillance and censorship?

According to Trossen, this comes down to the way in which Pursuit is deployed:

“Similar to today, if you designed the deployment appropriately, censorship and surveillance would become very difficult (using encryption, ‘hiding’ behind labels without using meaningful names, or changing the name-to-label association rapidly). However, censorship and surveillance can also become easy by centralising the main components. All this, however, is similar to today’s internet.

“The surveillance unearthed by Snowden was enabled at large by the centralisation of main components of today’s internet (in U.S. jurisdiction). There are certain architectural measures one can take to circumvent that but it’s hard nonetheless. I don’t think that it would be much different in a Pursuit world, if you don’t have the societal push for reduced surveillance. In short: censorship and surveillance is a policy/society problem.”

Pursuit is certainly an ambitious project, because — unlike PARC’s similar “content centric networking” idea — it’s a replacement for the TCP/IP internetworking protocols, not something designed to run alongside them. For that reason alone, any widespread implementation is likely to be a good way off, but with the amount of online content exploding like it is, it does seem increasingly likely that the future internet will be a whole lot more distributed than it is today.