Still, they are the standard setters in terms of the amount of electricity needed for a computing task. “The public thinks these massive data centers are energy bad guys,” said Eric Masanet, the lead author of the study. “But those data centers are the most efficient in the world.”

The study findings were published on Thursday in an article in the journal Science. It was a collaboration of five scientists at Northwestern University, the Lawrence Berkeley National Laboratory and an independent research firm. The project was funded by the Department of Energy and by a grant from a Northwestern alumnus who is an environmental philanthropist.

The new research stands in stark contrast to often-cited predictions that energy consumption in the world’s data centers is on a runaway path, perhaps set to triple or more over the next decade. Those worrying projections, the study authors say, are simplistic extrapolations and what-if scenarios that focus mainly on the rising demand for data center computing.

By contrast, the new research is a bottom-up analysis that compiles information on data center processors, storage, software, networking and cooling from a range of sources to estimate actual electricity use. Enormous efficiency improvements, they conclude, have allowed computing output to increase sharply while power consumption has been essentially flat.

“We’re hopeful that this research will reset people’s intuitions about data centers and energy use,” said Jonathan Koomey, a former scientist at the Berkeley lab who is now an independent researcher.