This month we focus on data centers built to support the Cloud. As cloud computing becomes the dominant form of IT, it exerts a greater and greater influence on the industry, from infrastructure and business strategy to design and location. Webscale giants like Google, Amazon, and Facebook have perfected the art and science of cloud data centers. The next wave is bringing the cloud data center to enterprise IT... or the other way around!

Continuing its long tradition of data center experimentation in the name of efficiency, Microsoft announced it has been testing an unusual new data center concept: placing servers underwater out in the ocean.

Close to half of the world’s population lives near large bodies of water, and since physical distance sets the ultimate speed limit for transferring data, storing data under the sea, close to major population centers, is a logical way to optimize delivery of cloud services.

“Half of the world’s population lives within 200 km of the ocean, so placing data centers offshore increases the proximity of the data center to the population, dramatically reducing latency and providing better responsiveness,” Microsoft said on the website dedicated to the research effort, called Project Natick.
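The latency argument comes down to simple physics: light in optical fiber travels at roughly 200,000 km/s (about two-thirds of its speed in a vacuum), so every kilometer between user and server adds unavoidable round-trip delay. A minimal sketch, using that assumed fiber speed:

```python
# Approximate speed of light in optical fiber (~2/3 of c in vacuum).
C_FIBER_KM_PER_S = 200_000

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over a fiber path
    of the given one-way distance, ignoring routing and processing."""
    return 2 * distance_km / C_FIBER_KM_PER_S * 1000

# A coastal data center ~200 km away vs. an inland one ~2,000 km away:
print(min_rtt_ms(200))    # 2.0 ms minimum round trip
print(min_rtt_ms(2000))   # 20.0 ms minimum round trip
```

Real-world latency is higher once routing hops and processing are added, but the floor set by distance is why proximity to coastal populations matters.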

Microsoft hasn’t shied away from experimenting with novel ideas for data center infrastructure in the past. In Wyoming, for example, the company tested a data center powered by fuel cells that converted methane from a waste processing plant to electricity. In another experiment, Microsoft researchers tested small fuel cells installed directly into IT racks.

What the company learned from Project Natick may lay the groundwork for deploying data center capacity underwater at scale, cooled by seawater and potentially even powered by tidal energy. “While every data center on land is different and needs to be tailored to varying environments and terrains, these underwater containers could be mass produced for very similar conditions underwater, which is consistently colder the deeper it is,” the company said.

Another potential benefit is quick deployment. It took 90 days to build and deploy the test system, much faster than the typical process of permitting, designing, and building a brick-and-mortar facility.

Project Natick server rack being placed inside the shell for underwater deployment (Photo: Microsoft)

Around August of last year, Microsoft researchers deployed the test system off the coast of California. It was a rack of standard servers sitting in a cylindrical steel shell (10 feet by 7 feet). Heat exchangers outside the shell provided the servers with free cooling. In December, the 38,000-pound container was pulled out of the water and returned to the company’s campus in Redmond, Washington.

Microsoft has not released any results of the experiment, saying only that they were “promising.” At this stage, the project is more about collecting data than developing a specific solution. There are still major hurdles to actually implementing something like this.

“At first I was skeptical, with a lot of questions. What was the cost? How do we power? How do we connect? However, at the end of the day, I enjoy seeing people push limits,” Christian Belady, general manager for data center strategy at Microsoft, said in a statement. “The reality is that we always need to be pushing limits and trying things out. The learnings we get from this are invaluable and will in some way manifest into future designs.”
