In 1989 Microsoft shipped the first version of its SQL Server database, bootstrapped a multimedia division and promoted a guy named Steve Ballmer to the role of senior vice president. It also built its first data center. By today's standards it was a modest operation: an 89,000-square-foot facility in Building 11 of its Redmond, Washington, campus.

Since then the company has spent more than $15 billion building far larger facilities that power its internet-based products such as Bing, Skype and Windows Azure.

Microsoft is one of a handful of internet giants that are reinventing the way data centers work. Along with Google, Amazon, Facebook and a few others, it's building massive, highly reliable data centers that do the back-end computing for billions of PCs and mobile devices. These companies design their own computers, write their own networking protocols and even build their own switches. They experiment with state-of-the-art cooling and power generation systems.

Much of this work is a collection of jealously guarded trade secrets. But every now and then these companies give us a peek behind the curtain.

At Microsoft, the large-scale build-outs really started in 2006, when the software company began work on a 500,000-square-foot data center in Quincy, Washington. Up until that time, Building 11 aside, Microsoft had mostly rented space in large data-center co-location facilities. With Quincy, though, it designed everything from the ground up.

"Our footprint was growing significantly and we were in the height of the online services and web services shift," says David Gauthier, Microsoft's director of datacenter architecture & design management. "We realized, man we're going to have to figure out how to build and design our own data centers."

Since then, it's built facilities in Chicago; Dublin; San Antonio, Texas; and Boydton, Virginia. And in a few months, it's set to open a brand-new $112 million facility in Cheyenne, Wyoming. That's where the company is also tinkering with a $5.5 million experiment called the Data Plant: a portable data center powered by methane harvested from the local sewage plant.

Experiments like the Data Plant are fun, but Microsoft's big bet has been in using software to rejigger the way that its data centers are built, Gauthier says.

Take the Chicago data center. On the first floor there, Microsoft has dropped several dozen server containers, called ITPACs.

The ITPACs are a unique Microsoft design. They can hold as many as 2,500 servers apiece, and the latest versions are cooled with high-tech misters. Microsoft can drop them wherever it likes: near a sewage treatment plant, as in Cheyenne, or on an outdoor lot, as in Boydton. At the Chicago data center, the ITPACs don't use backup generators.

"Those containers and the tens of thousands of servers behind them have no generators behind them," says Gauthier. "When we have any sort of blip or issue on that, we're able to fail those loads over to another data center without really changing the user impact or the user experience at all. It's happened a few times; everything's worked to plan."

Now, Microsoft doesn't want to run every kind of service on these generator-less containers, but figuring out which types of programs can handle this kind of failover and which can't is where folks like David Gauthier earn their pay.

That means writing software that does things like moving computing work from a data center threatened by a big storm to another data center that isn't at risk of a blackout.
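To make the idea concrete, here's a minimal sketch of that kind of failover decision, with all names, numbers and structures invented for illustration; Microsoft's actual systems are far more sophisticated and aren't public:

```python
# Hypothetical sketch of datacenter failover routing (not Microsoft's
# actual software). Sites flagged as at risk, say from a storm warning,
# are excluded; work goes to the healthy site with the most spare capacity.

DATACENTERS = {
    "chicago":     {"at_risk": True,  "load": 0.6},  # storm warning, avoid
    "quincy":      {"at_risk": False, "load": 0.4},
    "san_antonio": {"at_risk": False, "load": 0.8},
}

def pick_failover_target(datacenters):
    """Return the healthy site with the most spare capacity, or None."""
    healthy = {name: dc for name, dc in datacenters.items()
               if not dc["at_risk"]}
    if not healthy:
        return None
    # Spare capacity is 1.0 minus current utilization.
    return max(healthy, key=lambda name: 1.0 - healthy[name]["load"])

target = pick_failover_target(DATACENTERS)
print(target)  # quincy: healthy, and less loaded than san_antonio
```

The real problem, of course, is deciding which workloads can tolerate being shuffled around like this at all, which is exactly the question Gauthier's team works on.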

In a world where Microsoft's 1 billion customers expect its web services to never go down, Microsoft is clearly tinkering with the hardware. But the company's big bet is that it can keep everything humming along using really smart software. "We're moving the availability into software," says Gauthier.

We'd expect nothing less from Microsoft.