Brain limits

Energy Limits to the Computational Power of the Human Brain

This article first appeared in Foresight Update No. 6, August 1989.

A related article on the memory capacity of the human brain is also available on the web.

The Brain as a Computer

A second possible basic operation is inspired by the observation that signal propagation is a major limit. As gates become faster, smaller, and cheaper, simply getting a signal from one gate to another becomes a major issue. The brain couldn't compute if nerve impulses didn't carry information from one synapse to the next, and propagating a nerve impulse using the electrochemical technology of the brain requires a measurable amount of energy. Thus, instead of measuring synapse operations per second, we might measure the total distance that all nerve impulses combined can travel per second, i.e., total nerve-impulse-distance per second.

Other Estimates

A second approach is to estimate the computational power of the retina, and then multiply this estimate by the ratio of brain size to retinal size. The retina is relatively well understood, so we can make a reasonable estimate of its computational power. The output of the retina--carried by the optic nerve--comes primarily from retinal ganglion cells that perform center-surround computations (or related computations of roughly similar complexity). If we assume that a typical center-surround computation requires about 100 analog adds and is done about 100 times per second [3], then computing the axonal output of each ganglion cell requires about 10,000 analog adds per second. There are about 1,000,000 axons in the optic nerve [5, page 21], so the retina as a whole performs about 10^10 analog adds per second. There are about 10^8 nerve cells in the retina [5, page 26], and between 10^10 and 10^12 nerve cells in the brain [5, page 7], so the brain is roughly 100 to 10,000 times larger than the retina. By this logic, the brain should be able to do about 10^12 to 10^14 operations per second (in good agreement with the estimate of Moravec, who considers this approach in more detail [4, pages 57 and 163]).
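The retina-scaling arithmetic above can be reproduced step by step. This is only a restatement of the article's own figures in code; nothing here is independently measured.

```python
# Back-of-the-envelope retina-scaling estimate, using the figures from the text.
adds_per_computation = 100      # analog adds per center-surround computation [3]
computations_per_sec = 100      # computations per second per ganglion cell [3]
optic_nerve_axons = 1_000_000   # axons in the optic nerve [5]

# Total analog adds per second performed by the retina.
retina_adds_per_sec = adds_per_computation * computations_per_sec * optic_nerve_axons

# Scale up by the ratio of brain neurons to retinal neurons.
retina_neurons = 10**8
brain_neurons_low, brain_neurons_high = 10**10, 10**12
scale_low = brain_neurons_low // retina_neurons    # 100
scale_high = brain_neurons_high // retina_neurons  # 10,000

print(f"retina: {retina_adds_per_sec:.0e} adds/s")
print(f"brain:  {retina_adds_per_sec * scale_low:.0e} "
      f"to {retina_adds_per_sec * scale_high:.0e} ops/s")
```

Running this prints the 10^10 retinal figure and the 10^12 to 10^14 whole-brain range quoted above.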

The Brain Uses Energy

The total energy consumption of the brain is about 25 watts [2]. Since a significant fraction of this energy is unlikely to be used for useful computation, we can reasonably round this down to 10 watts.

Nerve Impulses Use Energy

A nerve cell has a resting potential--the outside of the nerve cell is 0 volts (by definition), while the inside is about -60 millivolts. There is more Na+ outside a nerve cell than inside, and this chemical concentration gradient effectively adds about 50 extra millivolts to the voltage acting on the Na+ ions, for a total of about 110 millivolts [1, page 15]. When a nerve impulse passes by, the internal voltage briefly rises above 0 volts because of an inrush of Na+ ions.
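From the figures just given, we can sketch the energy dissipated when a single Na+ ion crosses the membrane: charge times voltage. The 110-millivolt total driving potential comes from the text; the elementary-charge constant is standard physics, not a figure from this excerpt.

```python
# Energy dissipated by one Na+ ion crossing the total driving potential:
# ~60 mV resting potential plus ~50 mV concentration-gradient equivalent,
# per the text. Energy = charge * voltage.
ELEMENTARY_CHARGE = 1.602e-19  # coulombs (standard physical constant)
driving_potential = 0.110      # volts (60 mV + 50 mV, from the text)

energy_per_ion = ELEMENTARY_CHARGE * driving_potential  # joules
print(f"{energy_per_ion:.2e} J per Na+ ion")  # prints 1.76e-20 J per Na+ ion
```

This per-ion figure is what ties the membrane voltages to the nerve-impulse energy budget discussed in the next section.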

The Energy of a Nerve Impulse

To translate Ranvier ops (1-millimeter jumps) into synapse operations we must know the average distance between synapses, which is not normally given in neuroscience texts. We can estimate it: a human can recognize an image in about 100 milliseconds, a span that allows at most 100 one-millisecond synapse delays. A single signal probably travels about 100 millimeters in that time (from the eye to the back of the brain, and then some). If it passes 100 synapses in 100 millimeters, then it passes one synapse every millimeter--which means one synapse operation is about one Ranvier operation.
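The inter-synapse estimate above is just three divisions; spelled out in code, using only the figures given in the text:

```python
# Estimating the average distance between synapses from the article's figures.
recognition_time_ms = 100  # time to recognize an image
synapse_delay_ms = 1       # delay per synapse
path_length_mm = 100       # eye to the back of the brain, roughly

max_synapses = recognition_time_ms // synapse_delay_ms  # at most 100 synapses
mm_per_synapse = path_length_mm / max_synapses          # 1.0 mm

print(f"~{mm_per_synapse:.0f} mm between synapses: "
      f"1 synapse op is roughly 1 Ranvier op (a 1 mm jump)")
```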

Discussion

While the software remains a major challenge, we will soon be able to build hardware powerful enough to perform more such operations per second than can the human brain. There is already a massively parallel multi-processor being built at IBM Yorktown with a raw computational power of 10^12 floating point operations per second: the TF-1. It should be working in 1991 [6]. When we can build a desktop computer able to deliver 10^25 gate operations per second and more (as we will surely be able to do with a mature nanotechnology) and when we can write software to take advantage of that hardware (as we will also eventually be able to do), a single computer with abilities equivalent to a billion to a trillion human beings will be a reality. If a problem might today be solved by freeing all humanity from all mundane cares and concerns, and focusing all their combined intellectual energies upon it, then that problem can be solved in the future by a personal computer. No field will be left unchanged by this staggering increase in our abilities.
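The "billion to a trillion" multiplier follows from simple division. Note that the 10^13 and 10^16 brain figures used below are assumptions read back from that claim (a plausible range of whole-brain estimates); this excerpt's retina-based estimate alone gives 10^12 to 10^14.

```python
# Dividing a 10^25 gate-op/s desktop machine by assumed whole-brain estimates.
desktop_ops = 10**25
brain_ops_low = 10**13    # assumed lower-end brain estimate
brain_ops_high = 10**16   # assumed upper-end brain estimate

print(f"{desktop_ops // brain_ops_high:.0e} brain-equivalents")  # a billion
print(f"{desktop_ops // brain_ops_low:.0e} brain-equivalents")   # a trillion
```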

Conclusion

References

1. Ionic Channels of Excitable Membranes, by Bertil Hille, Sinauer, 1984.

2. Principles of Neural Science, by Eric R. Kandel and James H. Schwartz, 2nd edition, Elsevier, 1985.

3. Tom Binford, private communication.

4. Mind Children, by Hans Moravec, Harvard University Press, 1988.

5. From Neuron to Brain, second edition, by Stephen W. Kuffler, John G. Nichols, and A. Robert Martin, Sinauer, 1984.

6. "The switching network of the TF-1 Parallel Supercomputer," by Monty M. Denneau, Peter H. Hochschild, and Gideon Shichman, Supercomputing, Winter 1988, pages 7-10.

7. Myelin, by Pierre Morell, Plenum Press, 1977.

8. "The production and absorption of heat associated with electrical activity in nerve and electric organ," by J. M. Ritchie and R. D. Keynes, Quarterly Review of Biophysics 18, 4 (1985), pages 451-476.

Acknowledgements