Rice professor's discovery may save your iPhone battery

Professor works to revolutionize computer chips

Rice University professor Krishna Palem created a microchip that uses 30 times less electricity while running seven times faster than today’s technology. Photo: Michael Paulsen, Chronicle

Engineers have long lived by a simple, seemingly obvious rule when designing new computers: The machines have to deliver correct answers.

If asked to compute 2+2, a computer must answer 4. But what if computers didn’t always have to answer correctly?

Nearly a decade ago, a Houston computer scientist posed this heretical question. Today, it has led to a movement dubbed "probabilistic computing," which he believes will revolutionize the future of computing.

On Sunday, speaking at a computer science meeting in San Francisco, Krishna Palem announced the results of the first real-world test of his probabilistic computer chip: The chip, which thrives on random errors, ran seven times faster than today’s best technology while using just one-thirtieth of the electricity.

Just think: One need never again worry about draining an iPhone battery in a day or even a week.

“The results were far greater than we expected,” said Palem, a Rice University professor who envisions his chips migrating to mobile devices in less than a decade.

Results in the lab often don’t translate into real-world applications. But Palem believes his most recent results will go a long way toward muting skepticism about probabilistic computing, which at one time was nearly universal among computer scientists.

Al Barr, a computer scientist at the California Institute of Technology, said he and a number of experts are now eager to pursue applications of probabilistic computing in areas such as computer-generated graphics. “Initially there was definitely a lot of skepticism,” he admitted.

Around 2000, when Palem began thinking about the future of computer chip technology, power consumption wasn’t a big consideration. Only speed mattered.

But today, the energy consumed by information technology — a January news story likened the energy used in just two Google searches to boiling a kettle of tea — has become a major consideration. “Today, the global carbon footprint of information technology is non-negligible,” Palem said.

Filling up the chips

Conventional computer chips are power hungry by the nature of their design.

To feed the ever-greater processing demands of modern computer applications, engineers have continually squeezed more and more transistors onto semiconductor computer chips. This allows computers to process more calculations at once and, therefore, perform faster.

The trend of a regular doubling of the number of transistors that can be placed on an integrated circuit was first noted by Intel co-founder Gordon E. Moore in 1965 and has since been dubbed Moore’s Law.

But most computer scientists believe they will soon hit a wall where they can no longer cram more transistors into ever tinier spaces.

The high density of transistors on existing chips also leads to a lot of background “noise.” To compensate, engineers increase the voltage applied to computer circuits to overpower the noise and ensure precise calculations.
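The tradeoff described above can be sketched numerically. The model below is purely illustrative, not Palem’s actual device physics: it assumes the chance of a noise-induced bit error falls off exponentially as supply voltage rises above the noise level, while the well-known CMOS relation puts dynamic switching energy proportional to the square of the supply voltage.

```python
import math

def bit_error_probability(v_supply, v_noise=0.1):
    """Illustrative model: the chance that noise overwhelms a logic
    transition falls off exponentially as the supply voltage rises
    relative to the noise amplitude. Not a real device model."""
    return math.exp(-v_supply / v_noise)

def switching_energy(v_supply, capacitance=1.0):
    """CMOS dynamic switching energy scales with the square of the
    supply voltage (E ~ C * V^2 per transition)."""
    return capacitance * v_supply ** 2

# Lowering the voltage cuts energy quadratically but raises error odds.
for v in (1.0, 0.7, 0.5):
    print(f"V={v:.1f}  P(error)={bit_error_probability(v):.2e}  "
          f"energy={switching_energy(v):.2f}")
```

Under this toy model, halving the voltage cuts switching energy by a factor of four while the per-bit error probability grows by roughly two orders of magnitude, which is the bargain probabilistic computing tries to strike.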

Palem began wondering how much a slight reduction in the quality of calculations might improve speed and save energy. He soon realized that some information was more valuable than other information.

For example, in calculating a bank balance of $13,000.81, getting the “13” correct is much more important than the “81.” Producing an answer of $13,000.57 is much closer to being correct than $57,000.81.
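The bank-balance example can be made concrete with a short sketch. This is a hypothetical illustration of the underlying idea, not Palem’s chip design: storing the balance in cents, a single-bit hardware error in a low-order bit shifts the answer by pennies, while the same one-bit error in a high-order bit shifts it by thousands of dollars.

```python
def flip_bit(value, bit):
    """Flip one bit of an integer, simulating a single random
    hardware error at that bit position (illustrative helper)."""
    return value ^ (1 << bit)

balance_cents = 1_300_081  # $13,000.81 stored as cents

# An error in a low-order bit barely moves the answer...
low_error = abs(flip_bit(balance_cents, 2) - balance_cents)
# ...while the same single-bit error high up is catastrophic.
high_error = abs(flip_bit(balance_cents, 20) - balance_cents)

print(f"low-order bit error:  {low_error} cents")       # 4 cents
print(f"high-order bit error: ${high_error / 100:,.2f}")  # $10,485.76
```

The asymmetry is the whole point: if errors can be steered toward the bits that matter least, the answer stays useful even when it is not exactly right.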

While Palem’s technology may not have a future in calculating missions to Mars, it probably has one in such applications as streaming music and video on mobile devices, he said.

Much as the brain automatically fills in missing words in incomplete sentences, Palem said, it compensates for a few errant pixels on a mobile phone’s video screen. “In effect, we are putting a little more burden on the CPU in our heads and a little less burden on the CPU in our pockets.”
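A small simulation suggests why a viewer barely notices. The frame size and error rate below are assumptions chosen for illustration, not measurements from Palem’s chip: even corrupting pixels at random, a low error rate leaves the overwhelming majority of each frame untouched.

```python
import random

random.seed(0)

WIDTH, HEIGHT = 320, 240
ERROR_RATE = 0.001  # assumed: 0.1% of pixels corrupted per frame

# A flat mid-gray frame of 8-bit pixel values.
frame = [[128] * WIDTH for _ in range(HEIGHT)]

errant = 0
for y in range(HEIGHT):
    for x in range(WIDTH):
        if random.random() < ERROR_RATE:
            frame[y][x] = random.randrange(256)  # random wrong value
            errant += 1

total = WIDTH * HEIGHT
print(f"{errant} of {total} pixels errant ({100 * errant / total:.2f}%)")
```

At that rate, only a few dozen pixels out of more than 75,000 are wrong in any given frame, a level of damage the eye tends to smooth over on a moving image.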

All of this worked well enough in mathematical theory and simulations, but Palem couldn’t prove his concept until he built and tested a chip. The first test results came back late last year.

“At first, I almost couldn’t believe them,” he said. “I spent several sleepless nights verifying the results.”

Probabilistic computer chips have already caught the attention of industry, especially with the end of Moore’s Law looming for conventional chips.

“This logic will prove extremely important, because basic physics dictates that future transistor-based logic will need probabilistic methods,” said Shekhar Borkar, director of Intel’s Microprocessor Technology Lab.