When it comes to computing, the "cloud" may rain efficiency benefits.

Researchers at Lawrence Berkeley National Laboratory and Northwestern University unveiled a modeling tool yesterday that estimates the energy savings of moving local network software and computing into the server farms that make up the cloud.

The tool, available to the public online, is called the Cloud Energy and Emissions Research Model (CLEER). It aims to give scientists a better understanding of how energy use changes as the world moves away from storing and processing information in local networks and moves toward outsourcing these tasks to centralized facilities.

Though the word "cloud" evokes images of a clean, simple and environmentally friendly process, the systems that support it are massive industrial facilities, densely packed with processors and hard drives, that devour energy by the megawatt. Data centers use between 1 and 2 percent of the world's electricity, and as paper gives way to magnetic disks, energy use, and consequently emissions, from the Internet are poised to surge further (ClimateWire, Jan. 9).

Nonetheless, moving to the cloud could still save huge amounts of energy.

In a case study using the CLEER simulation, researchers found that if all American businesses moved their email programs, spreadsheet applications, customer management software and the like to centralized off-site servers, companies would shrink their computing energy footprints by 87 percent, enough to satisfy the 23 billion kilowatt-hour annual appetite of the city of Los Angeles.
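A quick back-of-envelope check shows what those figures imply (this is illustrative arithmetic, not CLEER's methodology): if the 87 percent saving equals Los Angeles' roughly 23 billion kilowatt-hours per year, the pre-cloud baseline follows directly.

```python
# Illustrative check of the reported figures, not CLEER's actual calculation:
# an 87% saving equal to ~23 billion kWh/year implies the baseline below.

savings_fraction = 0.87      # reported reduction in computing energy footprint
savings_kwh = 23e9           # ~annual electricity appetite of Los Angeles

baseline_kwh = savings_kwh / savings_fraction    # implied pre-cloud footprint
remaining_kwh = baseline_kwh - savings_kwh       # energy still used after the move

print(f"Implied baseline:     {baseline_kwh / 1e9:.1f} billion kWh/year")
print(f"Remaining after move: {remaining_kwh / 1e9:.1f} billion kWh/year")
```

In other words, the study implies businesses currently burn roughly 26 billion kilowatt-hours a year on these tasks, with only a few billion left over after consolidation.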

"The main gains in cloud computing come from consolidation," explained Lavanya Ramakrishnan, a scientist at Berkeley Lab who co-authored the study. Many businesses have servers and computing hardware on-site, which are often inefficient and underused, soaking up electricity while sitting idle.

Pooling these resources in a central location means companies can effectively buy computing power in bulk, and each server spends more of its time doing actual work, reducing the overall number of machines needed.

A counterintuitive finding

It seems to make intuitive sense, but researchers said they had some difficulty confirming their suspicions that the cloud saves energy. "There is a gap here where there is not enough data," Ramakrishnan said.

Another issue is that there are so many variables at play. "The savings are really going to vary depending on the system you're studying and what your baseline is," said Eric Masanet, another co-author and professor at Northwestern's McCormick School of Engineering.

Whether streaming a video uses less energy than watching it on DVD depends on your computer, how you use it, the quality of your Internet connection and server loads, along with a host of other factors.

"The analyses that we need for understanding the net implications of these new technologies can be quite complex," Masanet said. "There are a lot of moving parts that determine whether it's a good or bad thing."

The CLEER Model starts to chip away at this problem, aggregating available models of how data moves through the Internet. It then calculates the energy used to deliver the ones and zeros as well as the carbon intensity behind it, since not all electrons are created equal; electricity from dirty fuels like coal or from renewable sources like solar and wind changes the overall environmental impact of the cloud.
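The kind of accounting described above can be sketched in a few lines (the function and every parameter value here are illustrative assumptions, not CLEER's actual structure or data): energy to move and serve the bits, multiplied by the carbon intensity of the grid supplying the electricity.

```python
# Minimal sketch of energy-and-emissions accounting. The function and all
# numbers are illustrative assumptions, not taken from the CLEER model.

def service_emissions_kg(data_gb, kwh_per_gb_transport, server_kwh,
                         grid_kg_co2_per_kwh):
    """Energy for network transport plus server work, times grid intensity."""
    energy_kwh = data_gb * kwh_per_gb_transport + server_kwh
    return energy_kwh * grid_kg_co2_per_kwh

# The same workload on two different grids: emissions scale with intensity.
coal_heavy = service_emissions_kg(500, 0.06, 120, 0.9)   # coal-heavy mix
renewables = service_emissions_kg(500, 0.06, 120, 0.1)   # renewable-heavy mix
print(f"Coal-heavy grid:      {coal_heavy:.0f} kg CO2")
print(f"Renewable-heavy grid: {renewables:.0f} kg CO2")
```

The point of the comparison is that identical workloads can have very different footprints depending on where the electrons come from, which is why the model tracks carbon intensity alongside raw energy use.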

There are also some limitations to CLEER. "We didn't include things like cost and latency and other things that come into play when you're making a business decision," Masanet said. However, researchers can rebuild and reconfigure the model from the bottom up as better data come in.

Eventually, Internet companies could use tools like CLEER to tell you just how efficient their data centers are, increasing energy transparency and letting consumers shop for the most efficient option.

Reprinted from Climatewire with permission from Environment & Energy Publishing, LLC. www.eenews.net, 202-628-6500