Inspired by the human brain, UC San Diego scientists have constructed a new kind of computer that stores information and processes it in the same place. This prototype “memcomputer” solves a problem involving a large dataset more quickly than conventional computers, while using far less energy, the scientists say in a study.

The memcomputer prototype is a specialized proof of concept, but it could be developed into a general-purpose computer, say researchers led by Massimiliano Di Ventra, a UCSD professor of physics. Such memcomputers could equal or surpass the potential of quantum computers, they say, but because they don’t rely on exotic quantum effects, they are far more easily constructed.

Besides solving extremely complex problems involving huge amounts of data, memcomputers can potentially teach us more about how the brain operates, Di Ventra said. While the brain is often compared to a computer, the two are organized and operate much differently.

The study was published Friday in the journal Science Advances. Di Ventra was senior author; first author was Fabio Lorenzo Traversa, also of UCSD.


Conventional computers store data in one location designated for memory, and transfer it to processors located elsewhere to compute answers. The human brain, by contrast, combines storage and processing in one place, treating them as a single entity.

First proposed a few years ago, memcomputers likewise combine the storage and processing functions. Their interconnected memory units, or memprocessors, together produce a complex signal called a “collective state,” which actually contains the problem’s solution and which, in theory, can be easily extracted. The prototype demonstrates this can be accomplished.

Memcomputers theoretically surpass conventional computers in their ability to handle certain very large datasets. That’s because as such a dataset grows in a linear fashion, the difficulty of solving the problem grows exponentially for conventional computers. But for memcomputers, the difficulty grows only in a linear fashion as well.

The difference between exponential and linear growth can be illustrated by comparing the numbers 10 and 100. The number of digits grows linearly: the first has two digits, the second three. But the value those digits represent grows exponentially: each added digit multiplies the value tenfold.
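The comparison above can be sketched in a few lines of code (a generic illustration, not taken from the study): digit count climbs one step at a time while the value represented multiplies tenfold at each step.

```python
# Digit count grows linearly; the value represented grows exponentially.
for n in range(1, 6):
    value = 10 ** n          # 10, 100, 1000, ...
    digits = len(str(value)) # n + 1 digits: 2, 3, 4, ...
    print(f"{digits} digits -> value {value}")
```

Each pass adds just one digit, yet the printed value is ten times larger than the one before it.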


Problems that involve exponential increases in complexity quickly become impractical for conventional computing.
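The article doesn’t name a specific problem, but a generic brute-force combinatorial search sketches why exponential scaling quickly becomes impractical (a hypothetical illustration, not the method used in the study): every item added to the dataset doubles the number of candidate subsets a conventional search must examine.

```python
# Hypothetical example: a brute-force search over all subsets of
# n items must examine 2**n candidates, doubling with each item.
def brute_force_candidates(n: int) -> int:
    return 2 ** n

for n in (10, 20, 40, 60):
    print(f"n={n:2d}: {brute_force_candidates(n):,} candidates")
```

At n = 10 there are about a thousand candidates; at n = 60 there are more than a quintillion, far beyond what brute force can check in practice.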

Networks of memcomputing units behave less like a computer than like a brain, in that they can function well even when some parts don’t work, Di Ventra said by email.

Neurons often fail to function properly at the individual level, but the network can tolerate many errors and still function. By contrast, just one error in a program or a component can cause a computer to crash.

Studying this fault-tolerant property could teach us more about how brains work, and how they break down, Di Ventra said.


“From memcomputing we can learn for instance the ability of the network of interconnected memprocessors in bypassing broken connections, namely how robust is such a network to damage of its units while still able to compute specific tasks,” Di Ventra said. “This could possibly translate in our understanding of the maximum amount of damage to neurons done by degenerative diseases, like Alzheimer’s, before we lose specific functions.”

The study represents a significant advance in the field, said Yuriy V. Pershin, another researcher who has collaborated with Di Ventra and Traversa, but did not take part in this study.

“To the best of my knowledge, this is the first time people have used the collective state approach to solve exponentially hard problems,” said Pershin, an associate professor in the department of physics and astronomy at the University of South Carolina.

“This work opens the way of solving complex problems in one step in the analog fashion, potentially much faster (for large problems) compared to the time it takes to find the solution of the same problem by conventional computer algorithms,” Pershin said by email.


The prototype memcomputer is limited because it is analog, not digital, Di Ventra said, also by email. Analog computing is especially susceptible to interference from noise, which limits the ability to scale up the numbers of memprocessors in one computer.

“However, memcomputers can be made also digital (namely to process 0s and 1s like our present computers) therefore less susceptible to noise and hence they are scalable to a large number of units,” Di Ventra said. “These digital memcomputers then hold great promise to complement present computers in those tasks in which they are not efficient, such as the combinatorial problems we have considered in our paper.”

Alternatively, the paper stated, noise can be reduced by adding error-correcting codes.