Moore's Law: The Future Is Energy Efficiency

As the industry continues to shrink chips, energy efficiency is one way to expand Moore's Law beyond its current form. Let's explore how.

In the first 20 years of microprocessor development, it was all about running as fast as possible. However, for the last 10 years, the industry has focused on doing nothing as efficiently as possible.

That sounds funny until you realize that there is more power to be saved when the machine is idle than when it is running.

Here's an example to think about: When the Apollo 13 mission crew ran into trouble, the key to getting the astronauts home alive was turning off all non-essential systems to save energy and turning systems on only for the precise time they were needed. That happens automatically in modern microprocessor and system-on-a-chip (SoC) designs: We don't wait for the emergency call from Mission Control.

The mantra in modern silicon is to run fast, then shut down, and to do it intelligently, with the ability to enable and manage power one functional block at a time. Each block is either mission-critical and running optimized, or it's off and saving power.
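The arithmetic behind race-to-idle is easy to sketch. The power figures below are illustrative assumptions for the sake of the example, not measurements from any real chip:

```python
# Illustrative "race to idle" comparison. All wattages here are made-up
# numbers chosen only to show the shape of the trade-off.

def energy_joules(active_w, idle_w, work_seconds, window_seconds):
    """Energy over a fixed wall-clock window: do the work, then sit idle."""
    return active_w * work_seconds + idle_w * (window_seconds - work_seconds)

WINDOW = 10.0  # seconds in which the task must finish

# Strategy A: run fast at high power, then power-gate down to near zero.
race = energy_joules(active_w=4.0, idle_w=0.05, work_seconds=2.0,
                     window_seconds=WINDOW)

# Strategy B: stretch the work across the whole window at moderate power,
# never reaching the deep-idle state.
steady = energy_joules(active_w=1.5, idle_w=1.5, work_seconds=WINDOW,
                       window_seconds=WINDOW)

print(f"race to idle: {race:.1f} J")    # 4*2 + 0.05*8 = 8.4 J
print(f"run slowly:   {steady:.1f} J")  # 1.5*10      = 15.0 J
```

With a sufficiently low idle power, sprinting and shutting off wins even though the sprint itself burns more watts, which is why aggressive power gating of idle blocks pays off.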


What does this have to do with Moore's Law, the observation that the number of transistors on a chip doubles roughly every two years? While there has been plenty of debate as to whether Moore's Law is running out as the industry continues to shrink chips, energy efficiency is one way to expand it beyond its current form. How can this happen? Let's explore the idea.

Internet of Things: more power

As we obsessively check our handheld devices, not many of us think about the amazing IT infrastructure that it takes to put all of that information into our hands.

On another level, even fewer people think of all the power it takes to run this infrastructure. The closest many of us come to considering energy efficiency is hoping that our device battery will make it to the end of the day.

From streaming video and music to sharing photos, social media, tracking our workouts, and reviewing restaurants, we are more connected than ever before. Smartphones took the world by storm starting with the iPhone in 2007. As developers made applications for anything we could think of, our smartphones became an indispensable part of our lives. For example, as reported in 2012, an astonishing 90% of 18- to 29-year-olds sleep with their smartphones.

Next on the horizon are the wearable technologies, such as Google Glass, smartwatches, and an assortment of fitness and health monitoring devices. They are all vying for a foothold in the market.

All of this is happening at the dawn of the super-connected Internet of Things (IoT), with its multitude of Internet-connected devices and appliances. A related trend, "Surround Computing," describes a world in which we are immersed in computational power that anticipates our needs and seamlessly delivers information relevant to our environment. Surround Computing is really a superset of IoT, because it also describes how we will interact naturally with the technology. But as technology enables us in these new and exciting ways, it raises important questions about the energy needed to power this growing infrastructure.

How much energy powers IT?

"Worldwide, 3 billion personal computers use more than 1% of all energy consumed, and 30 million computer servers use an added 1.5% of all electricity at an annual cost of $14-18 billion. Expanded use of the Internet, smartphones, and the network to connect everything is causing all of those numbers to escalate," according to the MIT Energy Initiative.

Similarly, the US Department of Energy estimates that US "IT and telecommunications facilities annually consume roughly 120 billion kilowatt hours of electricity -- or 3% of all US electricity use."
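As a rough sanity check, the two DOE figures quoted above are mutually consistent (the calculation below simply restates them; it adds no new data):

```python
# Back-of-the-envelope check of the DOE figures quoted above.
it_use_kwh = 120e9   # 120 billion kWh consumed by IT/telecom facilities
share = 0.03         # stated as 3% of all US electricity use

implied_total_kwh = it_use_kwh / share
print(f"implied total US use: {implied_total_kwh / 1e12:.1f} trillion kWh")
```

The implied total of about 4 trillion kWh is in line with total annual US electricity consumption, so the two numbers hang together.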

The good news is that, "as the performance of computers has shown remarkable and steady growth, doubling every year-and-a-half since the 1970s, the electrical efficiency of computing (the number of computations that can be completed per kilowatt-hour of electricity used) has also doubled every year-and-a-half since the dawn of the computer age," according to an MIT Technology Review article by Dr. Jonathan Koomey, an author and consulting professor at Stanford University.

I believe passively cooled laptops, mobile phones, and tablets are part of this trend, which has led to rapid reductions in the power consumed by battery-powered computing devices.

Koomey also noted that "the power needed to perform a task requiring a fixed number of computations has been observed to fall by half every 1.5 years (or a factor of 100 every decade)."
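Those two phrasings are the same claim, compounded: a decade holds 10/1.5 ≈ 6.7 halvings, and 2 raised to that power is roughly 100. A quick check:

```python
# Compounding check for the efficiency trend quoted above:
# energy per fixed task halves every 1.5 years.
halving_period_years = 1.5
decade = 10.0

halvings_per_decade = decade / halving_period_years  # ~6.67
improvement = 2 ** halvings_per_decade

print(f"{halvings_per_decade:.2f} halvings per decade")
print(f"efficiency factor per decade: {improvement:.0f}x")  # ~102x, i.e. "a factor of 100"
```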

If this sounds familiar, it should: it brings us back to Moore's Law and the observations Gordon Moore made in 1965.

Moore's Law accurately predicted that the number of transistors on a CPU would double every two years. Peak-power efficiency follows the same pattern: as we pack more transistors into a processor, the distance electricity has to travel through the device shrinks, switching speeds increase, and the amount of power needed to perform a given unit of computing falls.
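One way to make this concrete is the standard first-order model for CMOS dynamic power, P ≈ C·V²·f, so energy per cycle is roughly C·V². This model, and the scaling numbers below, are textbook background and illustrative assumptions, not figures from the article:

```python
# First-order CMOS dynamic-energy model (textbook approximation):
# energy per switching cycle E ~ C * V^2.
def energy_per_cycle(c_farads, v_volts):
    return c_farads * v_volts ** 2

# Hypothetical process shrink: switched capacitance drops 30% and the
# supply voltage can be lowered from 1.0 V to 0.9 V.
old = energy_per_cycle(c_farads=1.0e-9, v_volts=1.0)
new = energy_per_cycle(c_farads=0.7e-9, v_volts=0.9)

print(f"energy per cycle falls to {new / old:.0%} of its old value")  # 57%
```

Because voltage enters squared, even modest voltage reductions from smaller transistors compound with lower capacitance to cut energy per operation sharply.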

But, this near-steady pace of improvement in energy efficiency has actually been slowing over the last decade and now significantly trails the Moore's Law prediction. The question now is how best to get back on track.


Mark Papermaster is chief technology officer and senior vice president at AMD, responsible for corporate technical direction, and AMD's intellectual property and system-on-chip product research and development.

