So what would it take for a computer brain to retain what it knows, even as it learns new things? That was the question Clune had when he and his colleagues set out to make an artificial brain act more like a human one. Their central idea: See if you can get a computer to organize, and preserve, what it knows within distinct modules of its artificial brain, rather than overwriting that knowledge every time it learns something new.

"Biological brains exhibit a high degree of modularity, meaning they contain clusters of neurons with high degrees of connectivity within clusters, but low degrees of connectivity between clusters," the team explained in a video about their research, which was published last week in the journal PLoS Computational Biology.

In humans and animals, brain modularity evolved as the optimal way to organize neural connections. That's because natural selection arranges the brain to minimize the costs associated with building, maintaining, and housing broader connections. "It is an interesting question as to how evolution solved this problem," Clune told me. "How did it figure out how to allow animals, including us, to learn a new skill without overwriting the knowledge of a previously learned skill?"

To encourage modularity in a computer's brain, the researchers incorporated what they call "connection costs": a penalty for every connection in the evolving network, which makes sparse, modular wiring win out over densely wired networks that solve the task equally well. Then they measured the extent to which a computer remembered an old skill once it learned a new one.
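The idea of a connection cost can be sketched as a penalty term in the score an evolutionary algorithm maximizes. This is an illustrative sketch, not the code from the PLoS paper; the function names and the cost value are assumptions made up for the example.

```python
# Illustrative sketch (not the researchers' actual code) of folding a
# "connection cost" into evolutionary fitness. COST_PER_CONNECTION is
# a hypothetical penalty charged for every wire in the network.

COST_PER_CONNECTION = 0.05

def fitness(task_performance: float, connections: list) -> float:
    """Reward solving the task, but charge for each connection.

    Under this pressure, sparser (and, the paper argues, more modular)
    wiring patterns out-compete densely wired networks that perform
    the task equally well.
    """
    return task_performance - COST_PER_CONNECTION * len(connections)

# Two networks that solve the task equally well (performance 1.0):
dense = [(i, j) for i in range(4) for j in range(4)]  # 16 connections
sparse = [(0, 1), (1, 2), (2, 3)]                     # 3 connections

# The sparse network scores higher once connections are charged for,
# so evolution drifts toward economical, modular wiring.
print(fitness(1.0, dense))
print(fitness(1.0, sparse))
```

The point of the sketch is only the selection pressure: with everything else equal, each extra connection lowers fitness, so frugal wiring, which in the paper's simulations tended to be modular wiring, is what survives.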

And, indeed, modularity appears to help computers, like humans, compartmentalize what they know. Which means that modularity may be one of the keys to ending catastrophic forgetting. "This paper both illuminates how natural evolution may have solved this problem, and how machine intelligence may learn to overcome it as well: by having modular brains," Clune said. "I believe modularity will be one key focus going forward that will improve AI’s ability to learn a variety of different skills, just like humans can."

"At present, there are more differences between human and computer intelligence than similarities," Clune said, "but we are slowly closing that gap."
