The energy efficiency of computing is doubling every 18 months

IN 1965 Gordon Moore, a co-founder of Intel, first observed that integrated circuits, better known as silicon chips, seemed to conform to a predictable law: since their invention in 1958, the density of components on each chip had doubled every year, and this trend was, he suggested, likely to continue for at least a decade. In 1975 Dr Moore revised his prediction, observing that component density was doubling every two years. In practical terms, the result is that personal-computer performance has doubled every 18 months or so, and has done so for decades, a corollary popularly known as Moore's law.

As computers have become mobile devices, however, their users are increasingly concerned about battery life as well as raw performance. So they will welcome a new analysis, by Jonathan Koomey of Stanford University and his colleagues, which seems to have uncovered a deeper law relating to the energy efficiency of computers, dating back to the era of vacuum tubes. The researchers found that the electrical efficiency of computing has doubled every 1.6 years since the mid-1940s. "That means that for a fixed amount of computational power, the need for battery capacity will fall by half every 1.6 years," observes Dr Koomey. This trend, he says, "bodes well for the continued explosive growth in mobile computing, sensors and controls."

Some researchers are already building devices that run on "ambient" energy harvested from light, heat, vibration or TV transmitters. As the energy efficiency of computing continues to improve, this approach will become more widespread. Dr Koomey's team published their results in IEEE Annals of the History of Computing, a scholarly journal. Inevitably, industry observers are already calling the new finding "Koomey's law".
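The arithmetic behind Dr Koomey's claim is a simple exponential: efficiency grows by a factor of 2^(t/1.6) after t years, so the battery capacity needed for a fixed workload shrinks by the same factor. A minimal sketch (the 1.6-year doubling period is the paper's figure; the time horizons are illustrative):

```python
def efficiency_multiple(years, doubling_period=1.6):
    """Factor by which computations-per-unit-energy grows after `years`,
    given a fixed doubling period (Koomey's law uses 1.6 years)."""
    return 2 ** (years / doubling_period)

# For a fixed amount of computation, required battery capacity
# falls by the same factor that efficiency rises.
for years in (1.6, 8, 16):
    factor = efficiency_multiple(years)
    print(f"After {years:4.1f} years: {factor:.0f}x more efficient, "
          f"battery need falls to 1/{factor:.0f}")
```

Eight years is five doubling periods, so efficiency improves 32-fold; sixteen years gives ten doublings, or roughly a thousandfold.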