The most nightmare-inducing characteristic of Big Dog, DARPA's robotic military mule, might be the way it moves so stiffly, yet unrelentingly, over treacherous battlegrounds. It turns out the repetitive mechanical gait that calls to mind some coming robopocalypse is also a huge headache for Big Dog's makers, and for many of the big thinkers behind walking bots envisioned for everyday domestic use.

Units like Big Dog move so awkwardly because of their rudimentary brains, which require pre-programming for every little action. A four-legged walking bot could jump smoothly over rocks or weave through trees with the fluid grace and reflexes of a cheetah, if only it had a better brain: one that was more animal-like. Thanks to breakthroughs in understanding how biological brains evolve, a team of robotics researchers says it's close.

“We are working on evolving brains that can be downloaded onto a robot, wake up, and begin exploring their environment to figure out how to accomplish the high-level objectives we give them (e.g. avoid getting damaged, find recharging stations, locate survivors, pick up trash, etc.),” says Jeffrey Clune, Assistant Professor of Computer Science at the University of Wyoming, who is part of the robotics team.


The group began exploring the idea by evolving gaits for robots in an effort to reduce the time it takes to get them operational. Currently, getting any robot to walk or perform other behaviors is extremely time-consuming for engineers. Not only must they manually program every movement, they have to reprogram it for new robots or different versions of the same robot. "The manual approach is too expensive and will not scale to produce many different types of robots," explains Clune. Not to mention the resulting clunky and intimidating robo-swagger might work for the battlefield, but it's not practical for, say, your future butler or home cleaner bot. The challenge is to get all sorts of robots to somehow learn to walk by themselves.

Clune and fellow team members Hod Lipson, Cornell Associate Professor of Mechanical and Aerospace Engineering, and Cornell students Sean Lee and Jason Yosinski began by combining neural networks with evolutionary concepts from developmental biology. Using this new approach, they began growing artificial digital brains that could take a simulated or physical robot body, recognize the type of body (two-legged, four-legged, etc.), and evolve the neural patterns needed to control it. In its first test the software evolved digital brains with neural patterns to make a four-legged robot walk within a few hours. What's more, instead of each leg doing its own thing, the walking patterns it came up with were coordinated and natural.
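The evolutionary loop behind results like this can be pictured as a simple select-and-mutate cycle. The sketch below is a toy illustration, not the team's actual system: per-leg gait parameters (amplitude and phase) stand in for evolved neural wiring, and a hand-written score stands in for the physics simulation that would measure how far the robot actually walks.

```python
import math
import random

random.seed(0)

NUM_LEGS = 4

def make_genome():
    # One (amplitude, phase) pair per leg -- a stand-in for evolved neural wiring.
    return [(random.uniform(0.0, 1.0), random.uniform(0.0, 2.0 * math.pi))
            for _ in range(NUM_LEGS)]

def mutate(genome, sigma=0.2):
    # Small random tweaks play the role of genetic mutation.
    return [(a + random.gauss(0.0, sigma), p + random.gauss(0.0, sigma))
            for a, p in genome]

def fitness(genome):
    # Toy stand-in for "distance walked in simulation": rewards strong leg
    # oscillations whose phases stay coordinated with the first leg's.
    amplitude = sum(a for a, _ in genome)
    phase_spread = sum(abs(p - genome[0][1]) for _, p in genome)
    return amplitude - 0.5 * phase_spread

def evolve(generations=100, pop_size=20):
    population = [make_genome() for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the better half, refill the rest with mutated copies.
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = [mutate(random.choice(parents)) for _ in parents]
        population = parents + children
    return max(population, key=fitness)
```

In a real system the fitness function is replaced by a physics engine (or the physical robot itself), which is exactly where the hours-long evolution time in the first test comes from.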

“Previously if you gave evolution a quadruped, and said ‘make it walk,’ it could do it, but it didn’t really understand that it had a four-legged body,” says Clune. “With developmental biology, it realizes the nature of its body, grows a brain that sees the four legs, creates similar neural wiring patterns for each leg, and thus produces regular gaits that have all four legs working together.”
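A minimal way to picture what "growing" such a brain means: instead of storing one value per connection, the genome is a compact function that generates wiring from each body part's geometric position, so symmetrically placed legs automatically receive matching patterns. The function and coordinates below are invented for illustration only.

```python
import math

def genome(x, y):
    # The "genome" is one compact rule; wiring for every body part is
    # generated from that part's position rather than stored individually.
    return math.sin(3.0 * x) * math.cos(3.0 * y)

# Four legs at mirror-symmetric positions on the body.
leg_positions = {
    "front_left":  (-1.0,  1.0),
    "front_right": ( 1.0,  1.0),
    "back_left":   (-1.0, -1.0),
    "back_right":  ( 1.0, -1.0),
}

leg_weights = {leg: genome(x, y) for leg, (x, y) in leg_positions.items()}
```

Because the rule sees geometry, the two left legs come out identical and the left/right pairs come out mirrored: the kind of regular, repeated wiring per leg that Clune describes, falling out of one small generative rule.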

The process is not an immediate cakewalk, though. Initially the group allowed the evolving digital brains to directly control the quadruped robot. This led to the robot breaking down numerous times, because evolution tried crazy walking patterns. To improve matters, the team let the brains evolve and control a body in simulation for hundreds of generations until they got the walking motion right, and only then transferred control to the actual robot.
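That staged workflow, with rough gaits evolved safely in simulation and only the vetted result handed to hardware, can be sketched as below. Everything here is a stand-in of my own invention: the "simulator" is a trivial scoring function and the "hardware" is just a log, but the control flow mirrors the evolve-then-transfer pattern described above.

```python
import random

random.seed(1)

class SimulatedQuadruped:
    """Stand-in for a physics simulator; scores a controller by 'distance walked'."""
    def trial(self, controller):
        return sum(controller) / len(controller)

def evolve_in_simulation(sim, generations=300):
    # Hill-climb controller weights inside the simulator, where a wild,
    # robot-breaking gait costs nothing.
    best = [random.random() for _ in range(8)]
    for _ in range(generations):
        candidate = [w + random.gauss(0.0, 0.1) for w in best]
        if sim.trial(candidate) > sim.trial(best):
            best = candidate
    return best

def transfer_to_hardware(controller, hardware_log):
    # Only the simulation-vetted controller ever drives the physical robot.
    hardware_log.append(controller)

sim = SimulatedQuadruped()
best_controller = evolve_in_simulation(sim)
hardware_runs = []
transfer_to_hardware(best_controller, hardware_runs)
```

The key design choice is that the expensive, failure-prone exploration happens entirely before anything touches a physical body: the hardware only ever sees the winner.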