Bartholemew Cooke

Kwabena Boahen stands five feet ten. His head is shaved clean, revealing delicate veins that creep like vines up the base of his cranium. Inside, his brain weighs three pounds and runs on twenty watts of power — a third of the power of your average desk lamp. Compare his brain with a laptop computer and the brain wins hands down: Bit for bit, whether you're doing arithmetic or solving crosswords, the brain is a million times more power efficient than your computer.

Boahen, forty-six, knows this. He grew up in Ghana, got his first computer when he was sixteen. Parked it on a desk under the eave of his family's painted concrete house, beside a courtyard of banana and mango trees. He programmed it to encrypt words. But as soon as he read about how the computer worked, he was disgusted. Doing even the simplest task required thousands of energy-heavy transistors to spew out digital 1's and 0's. It was just the kind of invention that would come from America, where energy was cheap and limitless.

Still, he wanted to build computers. So while many of today's chip engineers sat in movie theaters drinking down the futuristic promises of Star Trek, Boahen read "how it works" books with diagrams of diesel engines and dismantled tape recorders. His parents called his pastime konkaka — a word, in his father's local Twi dialect, for the banging of hammers.

Today, in his Stanford lab, far from the Ghanaian university where he learned Fortran by scrawling computer programs in paper notebooks, he spends days staring at mazes of purple, green, and yellow lines: the floor plans of chips that his neuromorphic engineering group is designing. A single transistor, laid out in silicon and metal oxide, fills the computer screen. In real life that transistor is one-fortieth as wide as a red blood cell. The chip contains twenty-three million of them. In fact, a red blood cell is an apt metaphor, because Boahen is not simply building a new type of computer. He's reinventing it as a brain.

To understand why, you need to understand the computer on your desk. Its chips contain up to a billion transistors. When one misfires, Microsoft Word might hiccup, or a spreadsheet might suddenly be corrupted. Your computer's performance is wholly dependent upon accuracy. Yet it does amazingly well — its transistors misfire maybe once in every 100 quadrillion operations. But we pay dearly for that accuracy in energy.

Rather than 1's and 0's, your brain uses something messier called population coding of information, in which intelligence emerges from the capacity of hundreds of thousands of extremely low-energy brain cells to synchronize their pulses, like spectators at a stadium doing the wave. Some wave early, some late, some not at all — but the overall wave is unmistakable and extremely energy efficient.
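The stadium-wave idea, many unreliable units averaging into one reliable signal, can be sketched in a few lines. This is a hypothetical toy model of population coding, not Boahen's actual circuitry; every name and parameter here is invented for illustration:

```python
import random

random.seed(0)

def population_wave(n_neurons=1000, phase=1.0, jitter=0.5, dropout=0.2):
    """Average the noisy 'votes' of many unreliable units.

    Each unit reports the true phase plus Gaussian noise; a dropout
    fraction stays silent entirely. The population mean still recovers
    the signal -- some wave early, some late, some not at all, but the
    overall wave is unmistakable.
    """
    votes = []
    for _ in range(n_neurons):
        if random.random() < dropout:   # this unit never fires
            continue
        votes.append(phase + random.gauss(0.0, jitter))
    return sum(votes) / len(votes)

estimate = population_wave()
print(estimate)  # lands close to the true phase of 1.0 despite the noise
```

Any single unit here is off by half a unit on average, yet the mean of roughly eight hundred of them pins the signal down to a few hundredths: reliability bought with redundancy rather than precision.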

To emulate this, Boahen straddles the worlds of biology and electronics. He follows the results coming out of neurobiology labs around the world, where researchers are tracing cell-by-cell neural circuits in different brain areas. His students sometimes do experiments themselves: One afternoon a monkey stares at a computer screen. It watches, waits for something to move. Its eyes suddenly shift and a hair-thin electrode planted in its brain picks up the neural chatter. Boahen's students eavesdrop on the electric staccato that the neurons sputter. Then they test their ideas by designing neural chips.

Boahen's neural chips are built of the same transistors as your Pentium. But unlike your Pentium, they are expected to misfire one time in ten — not one in 100 quadrillion. So far, his lab has built silicon chips that mimic the retina and several brain areas, including the visual cortex. A former classmate of Boahen's from Caltech, Tobi Delbrück, has built a neural vision chip that allows a robot to play soccer goalie. These gadgets are already a hundred to a thousand times as efficient as standard computers, and improving. But understanding intelligence, the ultimate goal, will require more than these baby steps. It will require building larger neural computers than have ever been built before, with millions of silicon neurons.

To Boahen, making computers more efficient isn't about being green. It's about removing the limits of what computers can do for society and solving practical problems. Sixty years ago, Alan Turing, the father of artificial intelligence, predicted that we would soon ask computers questions, and they would answer us as people do. It hasn't happened, largely because of one huge problem: A computer with as much number-crunching ability as the brain would devour around sixty million watts of electricity — roughly the output of a hydroelectric power plant. "Unless we are way more efficient, there's no way we can do it," says Boahen. "Even if we knew how to program it, it's just physically not possible."

It's a problem that Krishna Shenoy, an engineer turned biologist who sits thirty feet from Boahen at Stanford, knows intimately. Shenoy studies a different set of monkeys, whose brains have been implanted with computer chips, allowing them to move a cursor on a computer screen. The hope is that such chips will one day lead to neural implants for severely paralyzed people who can move only their eyes. But today's chips don't always provide enough control. And Shenoy can't add much more computing power, because the chips would use too much electricity and give off too much heat. A computer chip that replaced just 1 percent of the cells in your brain would disgorge more heat than a propane grill.

The energy problem impinges on more mundane areas, too. Take, for example, the next generation of supercomputers needed to maintain our nuclear stockpile and study muons, bosons, and the universe — they could consume up to 100 million watts of electricity. That's $100 million of electricity per year, devoted to one computer. And if you want to cool it so it doesn't burn its building down, then throw in at least another $30 million of electricity per year.
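That $100 million figure checks out with back-of-envelope arithmetic. The electricity rate below is my assumption (roughly a U.S. industrial price per kilowatt-hour), not a number from the article:

```python
# A supercomputer drawing 100 megawatts around the clock, billed at
# an assumed ~$0.11 per kilowatt-hour.
power_watts = 100e6
hours_per_year = 24 * 365
kwh_per_year = power_watts / 1000 * hours_per_year  # watts -> kWh
annual_bill = kwh_per_year * 0.11
print(annual_bill)  # on the order of $100 million per year
```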

And then there's something as simple as an Internet search. Google your ex-girlfriend and you expend as much energy as you do burning a match. It adds up. Worldwide, industrial data centers devour more than 150 billion kilowatt-hours of electricity a year — equal to almost fourteen million households — an amount that has doubled in just five years. And this doesn't even take into account the more intuitive search services that Google might develop down the line. Pasting a photo into Google and commanding "Find me houses that look like this in Austin" could eat up a thousand matchsticks in a single click.
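Both comparisons are easy to sanity-check. The per-household and per-search figures below are assumptions drawn from commonly published estimates, not from the article:

```python
# 150 billion kWh a year spread across households at an assumed
# ~11,000 kWh per U.S. household per year.
data_center_kwh = 150e9
households = data_center_kwh / 11_000
print(households / 1e6)  # close to 14 (million households)

# One search vs. one match: an assumed ~0.3 watt-hours per search
# against the ~1 kilojoule of heat from a wooden match.
search_joules = 0.3 * 3600  # 0.3 Wh converted to joules
match_joules = 1_000
print(search_joules / match_joules)  # close to 1: about a match per search
```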

In fact, some epic challenges lie ahead for the computer industry in general. For more than forty years, the number of transistors on a computer chip doubled every couple of years, from twenty-three hundred in 1971 to two billion today. But that amazing run is bogging down as engineers spend more time solving technical problems caused by heat and quantum effects. Boahen hopes that neural computing can provide an answer to that looming train wreck.
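The doubling claim is internally consistent: start from the 2,300 transistors of 1971 (the Intel 4004) and double every two years through roughly 2010, which I am assuming is the article's "today" given the two-billion figure:

```python
# 2,300 transistors doubling every two years for about 39 years.
transistors = 2300 * 2 ** ((2010 - 1971) / 2)
print(f"{transistors:.1e}")  # on the order of two billion
```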

On his desk sits a green printed circuit board traced with silver lines — a neuromorphic computer called Neurogrid. Sixteen chips, each as wide as a nickel, are plugged into the board. It is designed to mimic the cortex, the wrinkled fabric of gray matter that covers the brain. Neurogrid contains a million silicon neurons. It is lighter than a dinner plate and fits in a coat pocket. It sips just two watts of power. It emanates about as much warmth as you feel when you hold a newborn's hand. Boahen will use this neural supercomputer to probe the basic but mysterious calculations that the brain uses to do everything from recognizing faces to comprehending speech.

It took Boahen three years to make Neurogrid. And as proud as he is of it, he knows it's only the beginning. "What if you could build a fly and have a computer that's small and powerful enough to control it?" he says. "A fly can go all day with a single grain of sugar — so that's pretty damn efficient."

A fly has 250,000 neurons. A honeybee, with its navigation skills, has a million. A rat has fifty-five million. This is where Boahen needs to be. But the bigger challenge is understanding the brain's messy arithmetic — conferring upon those inert silicon wafers the miracle of common sense. It will take decades of hammering. But if you did that, then you could create anything. Neuro-prostheses. Brain chips for robots. Maybe even flies.
