Equipped with wearable AI systems and digital eyes that see what human eyes can't, space explorers of the future could be not just astronauts, but "cyborg astrobiologists."

That's the vision of a research team led by Patrick McGuire, a University of Chicago geoscientist who's developed algorithms that can recognize signs of life in a barren landscape.

"When they look at scenery, children gravitate towards the thing that's different from the other things," said McGuire. "That's how I looked at the cyborg astrobiologist."

At the heart of McGuire's system is a Hopfield neural network, a type of artificial intelligence that compares incoming data against patterns it's seen before, eventually picking out those details that qualify as new or unusual.
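The basic idea — store familiar patterns, then flag an input as novel when the network has to "correct" it heavily toward something it already knows — can be sketched in a few lines of Python. This is purely illustrative, not the team's code; the binary pattern encoding and the bit-flip novelty measure here are assumptions for the sake of a small example:

```python
# Illustrative Hopfield-style novelty detection on binary (+1/-1) patterns.
# Not the researchers' implementation; a minimal sketch of the principle.

def train(patterns):
    """Hebbian learning: weights accumulate co-occurrence of units
    across the stored ('familiar') patterns."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=10):
    """Synchronously update units toward a stored pattern."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

def novelty(w, pattern):
    """Fraction of bits the network changes away from the input:
    0.0 means 'seen before'; higher values mean unfamiliar."""
    settled = recall(w, pattern)
    return sum(a != b for a, b in zip(pattern, settled)) / len(pattern)
```

Trained on patterns representing, say, bare rock, the network leaves familiar inputs untouched (novelty 0.0) but pulls a slightly unfamiliar input back toward a stored pattern — and the size of that correction is a crude novelty signal.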

As described in a paper posted Thursday to arXiv, the system successfully differentiates lichen from surrounding rock — a proof-of-principle test that lays the foundation for adding other types of data.

For the last several years, McGuire has worked on CRISM, a Mars-orbiting imager that detects infrared and other wavelengths of light invisible to the human eye, allowing it to identify different types of rock and soil. McGuire envisions the digital eyes of cyborg astrobiologists as scaled-down versions of CRISM, their data perpetually crunched by the Hopfield networks on their hips.

"You would have a very complex artificial intelligence system, with access to different remote sensing databases, to field work that's been done before in the area, and it would have the ability to reason about these in human-like ways," said McGuire.

The lichen tests were conducted in Spain and at Utah's Mars Desert Research Station, where two of the researchers donned spacesuits and spent two weeks living in the field as simulated astronauts. They carried handheld digital microscopes and cell-phone cameras, which sent their data via Bluetooth to netbooks running McGuire's Hopfield network.

The lichen identification was based on color data. McGuire next plans to train the network to process different textures. Ultimately he wants to conduct analysis at different scales, from the microscopic up to landscape-wide.

McGuire cautioned that his team's system is "nowhere near" its ready-for-Mars ideal, and it will likely be decades before people explore the surface of Mars in person. In the meantime, cyborg astrobiologists might search the South Pole for Martian meteorites, and feature-identifying algorithms could be uploaded to Mars-roving robots.

"Then you'd have a robotic astrobiologist, and the humans would be back here on Earth, in Mission Control," he said. "The algorithms help us out, but humans are ultimately responsible."

Images: Patrick McGuire


Citation: "The Cyborg Astrobiologist: Testing a Novelty-Detection Algorithm on Two Mobile Exploration Systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah." P.C. McGuire, C. Gross, L. Wendt, A. Bonnici, V. Souza-Egipsy, J. Ormö, E. Díaz-Martínez, B.H. Foing, R. Bose, S. Walter, M. Oesker, J. Ontrup, R. Haschke, H. Ritter. arXiv, October 29, 2009.

Brandon Keim's Twitter stream and reportorial outtakes; Wired Science on Twitter. Brandon is currently working on a book about ecosystem and planetary tipping points.