Prineville, Oregon. 2011.

In August of 2011, Jay Parikh, the Vice President of Infrastructure Engineering at Facebook, received a call. As Parikh recounted to The Register in June of 2013, he remembered the conversation going something like this:

“Jay, there’s a cloud in the data center.” “What do you mean, outside?” “No, inside.” … It was raining in the data center.

Data centers house servers and other networked computer equipment in large warehouses, storing a large percentage of the information on the internet. They also provide the computational power necessary to support ‘cloud computing,’ a system of distributed resources that lets users offload computational tasks to remote machines. Predictably, data centers produce a remarkable amount of heat, with power use densities over 100 times that of a normal office building. The cost of air conditioning alone can be immense, but without full-time climate control the racks of equipment would critically overheat in a matter of minutes.

The Prineville, Oregon Facebook facility was new, and had been built with a chiller-less air conditioning system, which promised to be more energy efficient than traditional chiller-based systems by cooling with outside air.

From the official Facebook report:

This resulted in cold aisle supply temperature exceeding 80°F and relative humidity exceeding 95%. The Open Compute servers that are deployed within the data center reacted to these extreme changes. Numerous servers were rebooted and few were automatically shut down due to power supply unit failure.

Data centers are arranged in alternating ‘hot’ and ‘cold’ rows, with the cold rows generally serving as human-access points, while the hot rows are generally for fan exhaust. The ‘extreme changes’ described in the report above were caused by an accidental feedback loop of high-temperature, low-humidity air from the hot rows entering a water-based evaporative cooling system. When this air returned to the servers on the cold rows, it was so saturated with moisture that it condensed. A cloud was raining on the cloud.
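The report’s figures make the condensation plausible on the back of an envelope: at just over 80°F (about 27°C) and 95% relative humidity, the dew point sits barely a degree below the air temperature, so any surface even slightly cooler than the supply air will collect water. A rough sketch using the Magnus approximation for dew point (the function name and exact constants here are illustrative, not drawn from the Facebook report):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (°C) via the Magnus formula."""
    a, b = 17.27, 237.7  # Magnus constants, valid roughly 0-60 °C
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

# Conditions from the report: cold-aisle air over 80 °F (~26.7 °C), >95% RH
air_temp_c = 26.7
dp = dew_point_c(air_temp_c, 95.0)
print(f"dew point: {dp:.1f} °C ({dp * 9 / 5 + 32:.1f} °F)")
# Anything cooler than the dew point -- a chassis, a duct -- sweats.
print(f"margin before condensation: {air_temp_c - dp:.1f} °C")
```

At 95% relative humidity the margin comes out to under one degree Celsius, which is why near-saturated supply air condensing inside the cold aisles is not an exotic outcome but a near certainty.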

Parikh continued, “For a few minutes, you could stand in Facebook’s data center and hear the pop and fizzle of Facebook’s ultra-lean servers obeying the ultra-uncompromising laws of physics.”

There are multiple reasons for the formation of the cloud (and the subsequent failure of servers), and Facebook went on to amend its official guidelines to mandate a lower indoor humidity, and to recommend a rubber seal around all power supplies — effectively water-proofing them against any future weather systems. But managing humidity in data centers has always been a puzzle, and it seems unlikely that Facebook entirely overlooked the added complexity that comes from using outside air.

This weather event was precipitated by the use of a new type of cooling system (the chiller-less system) which had broken the hermetic seal of the server farm. Suddenly the building was not set apart from the outside environment — instead, it was breathing, exchanging air with the local climate. The Prineville facility was built complete with intake and exhaust vents, which granted it a porosity nearly unique in 2011. It may not have been the first data center to be tethered to the outside world in such a way (chiller-less cooling was new, but not unheard of) but it was the biggest. It is almost certainly the only one to reproduce a local weather pattern from outside humidity, and so it is perhaps also among the first to be located so severely in its geography.

Of course, no data center is truly absent from its locale. The Prineville facility is also connected to the Prineville electric grid; Prineville locals drive their cars to work and park under the Prineville sky before heading in to monitor the stacks; the data center is visible from Google Maps as distinctly inside of Prineville (at the time of writing, it had a respectable 4.3 star rating on the service). But every data center is also everywhere, serving data that escapes its geographic confines at a dizzying speed, populating packets that circle the globe with little regard for their origin. The information that comprises our websites, email storage, and personal photographs — as well as the computational power that generates our map routes, friend requests, and predictive text — doesn’t feel like it is coming from Prineville, or Switzerland, or the Faroe Islands. It feels present, palpably right there — always.