In the lifetime of a college freshman, laptops and all kinds of mobile devices have shrunk from boxy, heavy machines into sleek gadgets. Everything has gotten tiny, including the amount of electricity they need to compute.

In fact, the improvements in the electrical efficiency of computing are nothing short of astonishing. More importantly, they are the necessary precondition for the mobile world of computing you know and love. Let me run through a quick thought experiment so you can see why.

Imagine you've got a shiny computer that is identical to a MacBook Air, except that it has the energy efficiency of a machine from 20 years ago. That computer would draw so much power that you'd get a mere 2.5 seconds of battery life out of the Air's 50-watt-hour battery, instead of the seven hours the Air actually gets. Put another way, you'd need 10,000 Air batteries to run our hypothetical machine for seven hours. There's no way you'd fit a beast like that into a slim mailing envelope.
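The arithmetic behind this thought experiment can be sketched in a few lines, assuming efficiency doubles roughly every year and a half (the exact figures shift a bit depending on which baseline year you pick):

```python
# Back-of-the-envelope sketch of the battery-life thought experiment.
# Assumption: electrical efficiency of computing doubles every 1.5 years,
# so a 20-year-old machine is roughly 2**(20/1.5) times less efficient.

DOUBLING_PERIOD_YEARS = 1.5
YEARS_BACK = 20

efficiency_ratio = 2 ** (YEARS_BACK / DOUBLING_PERIOD_YEARS)  # ~10,000x

modern_runtime_hours = 7  # a MacBook Air on its 50 watt-hour battery
old_runtime_seconds = modern_runtime_hours * 3600 / efficiency_ratio

print(f"Efficiency gap: ~{efficiency_ratio:,.0f}x")
print(f"Battery life at 20-year-old efficiency: ~{old_runtime_seconds:.1f} s")
print(f"Batteries needed for seven hours: ~{efficiency_ratio:,.0f}")
```

Dividing seven hours by a factor of roughly ten thousand lands you in the neighborhood of two and a half seconds, which is where the figure in the paragraph above comes from.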

This is one fascinating consequence of a trend that Stanford consulting professor Jonathan Koomey has discovered in the history of computing. For the last 60 years, "the electrical efficiency of computation has doubled roughly every year and a half," according to Koomey's latest paper in the IEEE Annals of the History of Computing.
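Compounded over the 60-year span Koomey studied, a doubling every year and a half adds up to a staggering cumulative gain. A quick sketch, assuming the 1.5-year doubling period holds across the whole period:

```python
# Cumulative efficiency gain implied by Koomey's trend.
# Assumption: one doubling every 1.5 years, sustained for 60 years.

DOUBLING_PERIOD_YEARS = 1.5
SPAN_YEARS = 60

doublings = SPAN_YEARS / DOUBLING_PERIOD_YEARS  # 40 doublings
gain = 2 ** doublings                           # 2**40, about 1.1e12

print(f"Doublings: {doublings:.0f}")
print(f"Cumulative efficiency gain: ~{gain:.1e}x")
```

Forty doublings works out to roughly a trillion-fold improvement in the number of computations you can do per kilowatt-hour.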

Of course, Koomey told me that my thought experiment is ultimately unrealistic. It's difficult to separate the energy efficiency gains in computation from the overall improvements in chips, because many of the changes that make chips more powerful also make them more efficient.

Still, in a world where people treat energy efficiency as a bonus rather than a necessity, it's worth remembering that using less electricity to do the same task makes entirely new kinds of products possible. Apple wouldn't be making magical devices if the entire industry hadn't done the yeoman's work of increasing computing's electrical efficiency.
