A thermal image of IBM's so-called TrueNorth computer chip (left) next to other chips feeding data to the brainlike TrueNorth chip.

IBM's latest brainlike computer chip may not be "smarter than a fifth-grader," but it can simulate millions of the brain's neurons and perform complex tasks using very little energy.

Researchers for the computer hardware giant have developed a postage-stamp-size chip, equipped with 5.4 billion transistors, that is capable of simulating 1 million neurons and 256 million neural connections, or synapses. In addition to mimicking the brain's processing by themselves, individual chips can be connected together like tiles, similar to how circuits are linked in the human brain.

The team used its "TrueNorth" chip, described today (Aug. 7) in the journal Science, to perform a task that is very challenging for conventional computers: identifying people or objects in an image.

"We have not built a brain. What we have done is learn from the brain's anatomy and physiology," said study leader Dharmendra Modha, manager and lead researcher of the cognitive computing group at IBM Research - Almaden in San Jose, California.

Modha gave an analogy to explain how the brainlike chip differs from a classical computer chip. You can think of a classical computer as a left-brained machine, he told Live Science; it's fast, sequential and good at crunching numbers. "What we're building is the counterpart, right-brain machine," he said.

Right-brained machine

Classical computers — from the first general-purpose electronic computer of the 1940s to today's advanced PCs and smartphones — use a model described by Hungarian-American mathematician and inventor John von Neumann in 1945. The von Neumann architecture contains a processing unit, a control unit, memory, external storage, and input and output mechanisms. Because of its structure, the system cannot retrieve instructions and perform data operations at the same time.

IBM's TrueNorth chip can simulate millions of the brain's neurons. (Image credit: IBM Research)

In contrast, IBM's new chip architecture resembles that of a living brain. The chip is composed of computing cores that each contain 256 input lines, or "axons" (the cablelike part of a nerve cell that transmits electrical signals) and 256 output lines, or "neurons." Much like in a real brain, the artificial neurons only send signals, or spikes, when electrical charges reach a certain threshold.
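That threshold-and-spike behavior can be illustrated with a minimal integrate-and-fire sketch. This is a simplified, hypothetical model for illustration only; the function name, weights and threshold value below are not TrueNorth's actual design.

```python
# Minimal integrate-and-fire sketch: an artificial neuron sums weighted
# input spikes and fires only once its accumulated charge crosses a threshold.
# All names and values are illustrative, not TrueNorth's real parameters.

def step(charge, input_spikes, weights, threshold=1.0):
    """Advance one neuron by one time step; return (new_charge, fired)."""
    charge += sum(w for spike, w in zip(input_spikes, weights) if spike)
    if charge >= threshold:
        return 0.0, True   # fire a spike and reset the charge
    return charge, False   # stay silent and keep accumulating

# Example: a neuron with three input axons
weights = [0.4, 0.3, 0.5]
charge = 0.0
charge, fired = step(charge, [True, False, False], weights)  # charge reaches 0.4, no spike
charge, fired = step(charge, [True, True, False], weights)   # 0.4 + 0.7 = 1.1, crosses 1.0
print(fired)  # True
```

Because a neuron stays silent until its threshold is crossed, most of the chip does no work at any given moment, which is part of why this style of computing uses so little energy.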

The researchers connected more than 4,000 of these cores on a single chip, and tested its performance with a complex image-recognition task. The computer had to detect people, bicyclists, cars and other vehicles in a photo, and identify each object correctly.

The project was a major undertaking, Modha said. "This is [the] work of a very large team, working across many years," he said. "It was a multidisciplinary, multi-institutional, multiyear effort."

The Defense Advanced Research Projects Agency (DARPA), the branch of the U.S. Department of Defense responsible for developing new technologies for the military, provided funding for the $53.5 million project.

After the team constructed the chip, Modha halted work for a month and offered a $1,000 bottle of champagne to any team member who could find a bug in the device. But nobody found one, he said.

The new chip is not only much more energy efficient than conventional computer chips but also produces far less heat, the researchers said.

Today's computers — laptops, smartphones and even cars — suffer from visual and sensory impairment, Modha said. But if these devices can function more like a human brain, they may eventually understand their environments better, he said. For example, instead of moving a camera image onto a computer to process it, "the [camera] sensor becomes the computer," he said.

Building a brain

IBM researchers aren't the only ones building computer chips that mimic the brain. A group at Stanford University developed a system called "Neurogrid" that can simulate a million neurons and billions of synapses.

But while Neurogrid requires 16 chips linked together, the IBM chip can simulate the same number of neurons with only a single chip, Modha said. In addition, Neurogrid's memory is stored off-chip, but the new IBM system integrates both computation and memory on the same chip, which minimizes the time needed to transmit data, Modha said.

Kwabena Boahen, an electrical engineer at Stanford who led the development of the Neurogrid system, called the IBM chip "a very impressive achievement." (Several of Boahen's colleagues on the Neurogrid project have gone on to work at IBM, he said.)

The IBM team was able to fit more transistors onto a single chip, while making it very energy efficient, Boahen told Live Science. Greater energy efficiency means you could compute things directly on your phone instead of relying on cloud computing, the way Apple's voice-controlled Siri program operates, he said. That is, Siri outsources the computation to other computers via a network instead of performing it locally on a device.

IBM created the chip as part of DARPA's SyNAPSE program (short for Systems of Neuromorphic Adaptive Plastic Scalable Electronics). The goal of this initiative is to build a computer that resembles the form and function of the mammalian brain, with intelligence similar to a cat or mouse.

"We've made a huge step forward," Modha said. The team mapped out the wiring diagram of a monkey brain in 2010, and produced a small-scale neural core in 2011. The current chip contains more than 4,000 of these cores.

Still, the IBM chip is a far cry from a human brain, which contains about 86 billion neurons and 100 trillion synapses. "We've come a long way, but there's a long way to go," Modha said.

Editor's Note: This article was updated at 4:18 p.m. ET Oct. 2. The estimated number of neurons in the human brain is 86 billion, not 86 trillion.

Original article on Live Science, by Tanya Lewis.