SAN FRANCISCO — Data centers’ seemingly unquenchable thirst for electricity has been tempered by the global recession and by a combination of new power-saving technologies, according to an independent report on data center power use from 2005 to 2010.

The report, by Jonathan G. Koomey, a consulting professor in the civil and environmental engineering department at Stanford University, found that the actual number of computer servers fell significantly short of 2010 forecasts. Demand for computing was dampened by the financial crisis of 2008 and by the emergence of technologies like more efficient computer chips and computer server virtualization, which allows fewer servers to run more programs.

The slowing of growth in consumption contradicts a 2007 forecast by the Environmental Protection Agency that the explosive expansion of the Internet and the computerization of society would lead to a doubling of power consumed by data centers from 2005 to 2010.

In the new study, prepared at the request of The New York Times, Mr. Koomey found that electricity used by data centers worldwide grew significantly from 2005 to 2010, but only by about 56 percent rather than doubling. In the United States, power consumption increased by 36 percent over the same period, according to Mr. Koomey’s report, titled “Growth in Data Center Electricity Use 2005 to 2010.”