WARREN, Michigan—General Motors has gone through a major transformation since emerging from bankruptcy three years ago. Now cashflow-positive, the company is in the midst of a different transformation—a three-year effort to reclaim its own IT after 20 years of outsourcing.

The first physical manifestation of that transformation is here at Warren, where GM has built the first of two enterprise data centers. The $150 million Warren Enterprise Data Center will cut the company's energy consumption for its enterprise IT infrastructure by 70 percent, according to GM CIO Randy Mott. If those numbers hold up, the energy savings and other savings from the consolidation will let the center pay for itself within three years.

Mott recently announced that GM's efforts to make the Warren data center's construction eco-friendly—including its energy-saving measures, solar-powered electric car charging stations in the parking lot, and recycling of 99 percent of construction waste—have earned the center a LEED "gold" certification from the US Green Building Council. Less than five percent of data centers have LEED certification.

The data center is part of a much larger "digital transformation" at the company, Mott said. GM is consolidating its IT operations from 23 data centers scattered around the globe (most of them leased) and hiring its own system engineers and developers for the first time since 1996. Within the next three to five years, GM expects to hire 8,500 new IT employees, 1,600 of them in Warren. "We're already at about the 7,000 mark for internal IT from our start point of about 1,700," Mott said.

On September 5, Jeff Liedel, GM's Executive Director and CIO for Infrastructure Engineering, gave Ars a personal tour of the Warren data center. While GM's needs are far different from those of big Internet companies such as Amazon, Facebook, or Google, it's clear that the automaker has cribbed from their notes on how to build data centers and how to create spaces that drive collaboration.

While it will be years before GM reaches the capacity of the Warren data center, the company has already begun construction on another data center at the company's Milford proving grounds. Mott reiterated the expectation that the Warren site will pay for itself within three years. The savings won't come just from the 70 percent reduction in energy consumption; there's also a "holistic" effect from closing the 23 data centers around the world left over from GM's days of IT outsourcing.

"In the IT business, you only get to build a data center once every 20 years or so," Liedel said. "We needed to get this right."

Building a cloud, under one roof

The first step in that transformation, Liedel said, was converting everyone running its IT operations to GM employees. Next came centralizing control over the company's widely-scattered IT assets.

So far, three of the company's 23 legacy data centers have been rolled into the new Warren data center. That's eliminated a significant chunk of the company's wide-area network costs. "We have 8,000 engineers at [the Vehicle Engineering Center] here," Liedel said. And those engineers are pushing around big chunks of data—the "math" for computer-aided design, computer-aided manufacturing, and a wide range of high-performance computing simulations.

"Now with the data center on the same campus, we're not paying for the WAN bandwidth we had before," Liedel explained. "We've got dark fiber here on the campus, and the other major concentration of engineers is at Milford at the Proving Ground." Milford and Warren are connected over fiber via dense wavelength-division multiplexing, providing 10 channels of 10-gigabit-per-second bandwidth.
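For a sense of what those 10 channels buy, here's a quick back-of-envelope calculation. The 1 TB dataset size is a hypothetical stand-in for a large CAD or simulation workload, not a figure from GM:

```python
# Back-of-envelope for the Warren-Milford DWDM link: 10 channels x 10 Gb/s.
# The 1 TB dataset is a hypothetical example, not a number from GM.

CHANNELS = 10
CHANNEL_GBPS = 10
aggregate_gbps = CHANNELS * CHANNEL_GBPS   # 100 Gb/s raw capacity

dataset_tb = 1.0                           # hypothetical CAD/simulation dataset
dataset_gigabits = dataset_tb * 8_000      # 1 TB = 8,000 gigabits (decimal units)

seconds = dataset_gigabits / aggregate_gbps
print(f"Aggregate link: {aggregate_gbps} Gb/s")
print(f"1 TB transfer at line rate: {seconds:.0f} s")
```

At line rate, a terabyte crosses the campus-to-Milford link in under a minute and a half—the kind of transfer that racks up real costs on a leased WAN.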

GM has also centralized control of all of its data centers at Warren in a new 180,000-square-foot IT Operations and Command Center adjacent to the data center. From the OCC, GM's IT operations team oversees the entire GM global network—including the growing cloud infrastructure within the Warren data center itself.

"Cloud is an integral part of the data center," Liedel said. "We offer compute infrastructure as a service to the developers in the IT organization. I've been in IT for 25 years now, and I think 'We're waiting for a server' is a really lame excuse for being late with an IT rollout. The long pole should be the requirements, code or testing."

For GM, cloud means automated provisioning of virtual servers within its existing computing hardware (which is largely made up of Hewlett-Packard blade servers in HP C7000 enclosures), using tools from HP, IBM, and VMware. The automated provisioning system GM has put in place allows Liedel's team to turn around requests for new servers in about two hours. That's not as fast as Amazon, but GM has some needs that require a bit more time to handle.

"We build firewall rules, set which protocol ports are open, and configure load balancers and do an active-active replication between data centers," Liedel explained. "And we perform security scans and signoffs. There are things you need to handle with applications, like 'do you need sticky bits on the load balancer, so it hits the same server every time you ask for something in a session' and 'do you need SSL setups?' Some of those things, the tools don't do that yet. But a lot of what we do is automated."
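The "sticky bits" Liedel mentions refer to session affinity: the load balancer pins all requests in a user session to the same backend server, so session state doesn't have to be shared. A minimal sketch of one way to do that—hash-based affinity—is below. The server names are hypothetical, and real load balancers typically implement this with cookies or source-IP hashing rather than application code:

```python
import hashlib

# Minimal sketch of load-balancer session affinity ("sticky" sessions):
# requests carrying the same session ID always land on the same backend.
# Backend names are hypothetical, not GM's actual configuration.

BACKENDS = ["app-01", "app-02", "app-03"]

def pick_backend(session_id: str) -> str:
    # Hash the session ID so the request-to-server mapping is stable
    digest = hashlib.sha256(session_id.encode()).hexdigest()
    return BACKENDS[int(digest, 16) % len(BACKENDS)]

# Every request in a session hits the same server
first = pick_backend("session-abc123")
assert all(pick_backend("session-abc123") == first for _ in range(100))
```

The tradeoff is the one Liedel hints at: affinity simplifies the application, but it's per-application configuration that provisioning tools can't yet fully automate.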

There are currently about 2,500 virtual servers running in the Enterprise Data Center, 88 percent of which run on GM's two standard provisioned operating systems: SUSE Linux and Microsoft Windows 2008 R2. Where the cloud provisioning system breaks, said Liedel, is in that other 12 percent "where they leave the cloud and we have to build custom infrastructure."

GM isn't using software-defined networking as part of its cloud. "We use a lot of VPNs and a lot of LAN segments on the same hardware for traffic control as well as security," Liedel said. That's particularly true for a group of systems Liedel showed me racked up alongside each other in one of the data center's thermally isolated "pods"—GM's Enterprise Data Warehouse.

GM's data warehouse consists of a cluster of IBM servers running a Hadoop cluster, a Teradata analytics "appliance," and another rack of HP compute servers running visualization and reporting tools on VMware virtual server instances. (That includes server software from Cognos.) Liedel said that GM uses Hadoop to normalize its masses of data and then uses the Teradata analytical engine to perform deep analysis on it.

Keeping it cool

Aside from shedding the leases and leased lines, one of the biggest sources of savings at Warren is expected to be its energy efficiency. Part of that comes from the design of the data center itself, and part of it comes from being in Michigan.

"We take advantage of the cold weather we have eight to nine months a year," Liedel explained. The data center's cooling system includes three "waterfall" evaporation chillers, with three monster-sized refrigerating chillers as backup that sometimes heat the water up. "Sometimes when it's colder, we have to heat the water to 53 degrees," Liedel said—because at lower temperatures, the water would cause moisture in the data center's air to form condensation.
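The 53-degree floor makes sense when you check it against the dew point of the room air: chilled water running below the dew point would sweat condensation onto the piping. The sketch below uses the standard Magnus approximation for dew point; the 72°F / 50-percent-relative-humidity room conditions are illustrative assumptions, not figures from GM:

```python
import math

# Why heat the chilled water to 53 F? If the water runs below the room
# air's dew point, condensation forms on the cold surfaces. Magnus
# approximation for dew point; the 72 F / 50% RH room conditions are
# assumptions for illustration, not figures from GM.

A, B = 17.27, 237.7  # Magnus coefficients (temperatures in degrees C)

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    alpha = math.log(rel_humidity_pct / 100.0) + A * temp_c / (B + temp_c)
    return B * alpha / (A - alpha)

room_c = (72 - 32) * 5 / 9           # 72 F room air in Celsius
dew_c = dew_point_c(room_c, 50.0)    # ~11 C
dew_f = dew_c * 9 / 5 + 32           # ~52 F

print(f"Dew point: {dew_f:.1f} F -> 53 F water stays just above it")
```

Under those assumed conditions the dew point lands around 52°F, so 53-degree water sits just above the condensation threshold.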

Two chillers are enough to handle the current capacity of the data center, but there are three for redundancy. There's also infrastructure in place to add more chillers as the capacity of the data center expands. Alongside the evaporative waterfalls, there are huge "thermal storage" tanks. The tanks hold a reserve of 30 minutes worth of cooling water in the event the cooling system loses power—it can take up to 30 minutes to get cooling water back to the right temperature if the system goes offline, Liedel said.

Another part of the data center's cooling efficiency comes from the "in-row" coolers. Six of these coolers—essentially chill-water cooled air conditioning units—are installed in each of the data center's thermal containment pods. These pods are groups of racks arranged in a rectangular block with a common "hot aisle" behind them. Each pod has a heat containment "roof," a retractable plastic sheet that covers the hot aisle.

"Servers will adjust their current draw and their cooling fans based on their load," said Liedel. In data centers with central HVAC, that can play havoc with cooling, since some areas will be hotter than others. But with individual in-row coolers only three racks at most away from a set of servers, it's easier to deliver cooling just to the areas where it's needed and run the HVAC for the whole facility much more efficiently. If cooling fails in one of the data center's pods, its roof automatically retracts to allow the general HVAC to pick up the load.

Almost no batteries required

Aside from its energy efficiency, GM's Warren Data Center picks up green cred in the way it handles its emergency power. Instead of using an array of lead-acid batteries to provide current in the event of an interruption of power, the data center is equipped with uninterruptible power supplies from Piller that use 15,000-pound flywheels spinning at 3,300 revolutions per minute.

The momentum stored in the flywheels can turn a generator, providing power for up to 15 seconds while the data center's two giant 5,800-horsepower diesel generators come online. There are also two 6,400-gallon diesel fuel tanks—enough for the generators to produce three megawatts of power for up to 48 hours. These tanks can be continuously refueled to keep the data center up and running indefinitely.
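Those figures hang together on a back-of-envelope check. The article gives the flywheel's mass and speed but not its radius, so the half-meter solid-disk geometry below is purely an illustrative assumption:

```python
import math

# Sanity-checking the backup-power figures. The flywheel's radius is not
# given; the 0.5 m solid-disk assumption below is purely illustrative.

LB_TO_KG = 0.453592

# Flywheel kinetic energy: E = 1/2 * I * omega^2, with I = 1/2 * m * r^2
mass_kg = 15_000 * LB_TO_KG
radius_m = 0.5                               # assumed, not from GM
omega = 3_300 * 2 * math.pi / 60             # rpm -> rad/s
inertia = 0.5 * mass_kg * radius_m ** 2
energy_j = 0.5 * inertia * omega ** 2        # ~51 MJ stored

load_w = 3_000_000                           # three-megawatt load
ride_through_s = energy_j / load_w           # ~17 s, near the quoted 15 s

# Diesel endurance: two 6,400-gallon tanks lasting 48 hours
burn_gal_per_h = (2 * 6_400) / 48            # ~267 gal/h at full load

print(f"Flywheel ride-through: ~{ride_through_s:.0f} s")
print(f"Diesel burn rate: ~{burn_gal_per_h:.0f} gal/h")
```

Under that assumed geometry, each flywheel stores roughly 51 megajoules—about 17 seconds of ride-through at a three-megawatt load, comfortably in line with the 15-second figure quoted for the diesels' startup window.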

All of the energy coming into the data center, both from the utility lines and from the backup generators, is used to power "motor generators." Rather than being passed directly into the power grid of the data center, the outside power is used to run these internal generators, mostly to prevent outside spikes or variations from being passed on to the power distribution system.