
IBM’s pursuit of a chip far more powerful, smaller and lighter than the existing options takes its inspiration from a familiar source: the human brain. The company revealed the second generation of its SyNAPSE chip today that lead researcher Dharmendra Modha described as “a supercomputer the size of a stamp, the weight of a feather.” And it consumes the same amount of power as a hearing aid.

“I’m holding in my hand a new machine for a new era,” Modha said in an interview. “When we started the SyNAPSE project many people thought it was impossible. As we look to the future, possible can become real. This could open up a whole new frontier of scientific exploration and commercial exploitation.”

The chip, which appeared on the cover of the journal Science today, contains 1 million programmable neurons and 256 million programmable synapses, and is capable of 46 billion synaptic operations per second per watt. Neurons represent computation while synapses take the place of integrated memory.
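To put that efficiency figure in perspective, here is a rough back-of-the-envelope estimate. The 46 billion operations per second per watt comes from the article; the ~70 milliwatt power draw is an assumption based on a typical hearing-aid power budget, not a number IBM has stated here.

```python
# Rough throughput estimate for the SyNAPSE chip.
# Efficiency figure is from the article; the 70 mW draw is an
# assumed, hearing-aid-scale power budget (not an IBM-stated figure).
ops_per_second_per_watt = 46e9
assumed_power_watts = 0.070  # ~70 mW, typical hearing-aid range (assumption)

synaptic_ops_per_second = ops_per_second_per_watt * assumed_power_watts
print(f"{synaptic_ops_per_second:.2e} synaptic operations per second")
# ≈ 3.2e9 per second under these assumptions
```

Under those assumptions the chip would sustain on the order of three billion synaptic events per second on a power budget too small to light an LED.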

The SyNAPSE chip can also be scaled up — just attach more chips to each other and they become more powerful.

IBM achieved these feats by throwing out the old model for chip structure and acting more like a brain. As my colleague Stacey Higginbotham described an earlier SyNAPSE chip in 2011:

Today’s chips run into a problem called the Von Neumann bottleneck, which is when the chip cannot feed the data in the memory to the processing core fast enough. Without the data the chip idles and the incredible clock speeds we’ve built into chips are somewhat wasted. The neurosynaptic chip throws that model away and relies instead on tracking relationships between events and determining if those events lead to action. When the “neurons” on the chip fire, it sets off a binary response that the processors in each neuron evaluate. When enough feedback comes from that neuron or neurons nearby, the system then “understands” where that information fits in, and the chip can make a decision to react to that series of stimuli. Fundamentally, this chip learns.
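The event-driven behavior described above — a neuron accumulating binary spike inputs and firing once enough have arrived — can be sketched with a toy leaky integrate-and-fire model. This is purely illustrative, not IBM’s actual neurosynaptic core logic; the threshold and leak values are arbitrary choices.

```python
# Toy leaky integrate-and-fire neuron: accumulates binary spike inputs,
# leaks a little each tick, and fires when a threshold is crossed.
# Illustrative only -- not IBM's actual TrueNorth core logic.

def run_neuron(spike_train, threshold=3.0, leak=0.5):
    """Return the tick indices at which the neuron fires."""
    potential = 0.0
    fired_at = []
    for tick, inputs in enumerate(spike_train):
        potential += sum(inputs)          # integrate binary spike inputs
        if potential >= threshold:        # threshold crossed: fire
            fired_at.append(tick)
            potential = 0.0               # reset after spiking
        else:
            potential = max(0.0, potential - leak)  # leak toward rest
    return fired_at

# Three input synapses per tick; 1 = incoming spike, 0 = silence.
train = [(1, 0, 0), (1, 1, 0), (0, 0, 0), (1, 1, 1), (0, 1, 0)]
print(run_neuron(train))  # → [3]
```

The key contrast with a Von Neumann design is that nothing happens unless spikes arrive: computation and memory sit together in the neuron, and the chip only burns power when there are events to process.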

So IBM has an impressive chip on its hands. But the real challenge will be getting people to use it. IBM has created a programming language and curriculum to aid the chip’s adoption, but faces competition from other emerging designs. Modha could not reveal when the chip will be available commercially, but said IBM is working on it.

If the chip is adopted, Modha sees applications as diverse as cloud computing and wearable electronics. Small devices like robots could roam long distances crunching large amounts of data without needing to recharge. Autonomous cars and mobile devices could use the chip to see surroundings in real time.

Modha clarified that the chip is not “any brain, the brain or a brain.” With current technology, a chip that fully replicated the brain would consume as much energy as several cities. A true computer brain is still years away.

“I want to see this used,” Modha said. “I want to see the next generation of it. We want to do more and really keep pushing the boundary farther and farther.”