AEGIS works with an instrument on Curiosity called the ChemCam, short for chemistry and camera. The ChemCam, a one-eyed, brick-shaped device that sits atop the rover’s spindly robotic neck, fires laser beams at rocks and soil as far as 23 feet away. It then analyzes the light emitted from those impacts to determine the geochemical composition of the vaporized material. Before AEGIS, when Curiosity arrived at a new spot, ready to explore, it fired the laser at whatever rock or soil fell into the field of view of its navigation cameras. This method certainly collected new data, but it wasn’t the most discerning way of doing it.

With AEGIS, Curiosity can search and pick targets in a much more sophisticated fashion. AEGIS is guided by a computer program that developers, using images of the Martian surface, taught to recognize the kinds of rock and soil features that mission scientists want to study. AEGIS examines the images and finds targets that match those parameters, ranking them by how closely they resemble what the scientists asked for. (It’s not perfect; AEGIS can sometimes include a rock’s shadow as part of the object.)
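The find-and-rank step can be pictured as a scoring loop. The sketch below is a heavily simplified, hypothetical illustration, not the actual AEGIS flight software: the feature names (`size_px`, `brightness`, `elongation`) and the distance-based score are assumptions made up for this example.

```python
# Hypothetical sketch of AEGIS-style target ranking. The features and
# scoring rule are illustrative only, not the real flight software.

def score_target(target, desired):
    """Score a candidate by how closely its measured features match
    the profile the scientists requested. Lower = closer match."""
    total = 0.0
    for feature, wanted in desired.items():
        measured = target.get(feature, 0.0)
        total += abs(measured - wanted)  # penalize deviation
    return total

def rank_targets(candidates, desired, keep=2):
    """Return the top candidates, best match first."""
    ranked = sorted(candidates, key=lambda t: score_target(t, desired))
    return ranked[:keep]

# Usage: scientists ask for small, bright, rounded rocks.
desired = {"size_px": 40, "brightness": 0.8, "elongation": 0.2}
candidates = [
    {"size_px": 38, "brightness": 0.75, "elongation": 0.25},  # close match
    {"size_px": 120, "brightness": 0.3, "elongation": 0.9},   # poor match
]
best = rank_targets(candidates, desired, keep=1)
```

In this toy version, the first rock wins because its features deviate least from the requested profile; the real system works from image data and far richer feature descriptors.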

Here’s how Curiosity’s cameras see the Martian landscape with AEGIS. The targets outlined in blue were rejected; those in red are potential candidates. The best targets are filled in with green, and the second-best with orange:

NASA / JPL-Caltech

When AEGIS settles on a preferred target, ChemCam zaps it.

AEGIS also helps ChemCam with its aim. Let’s say operators back on Earth want the instrument to target a specific geological feature they saw in a particular image. And let’s say that feature is a narrow mineral vein carved into bedrock. If the operators’ commands are off by a pixel or two, ChemCam could miss it. They may not get a second chance to try if Curiosity’s schedule calls for it to start driving again. AEGIS corrects ChemCam’s aim both in human-requested observations and in its own searches.
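One way to think about that correction is as snapping a commanded pixel coordinate to the nearest detected target. The sketch below is a hypothetical simplification, assuming a list of target centroids already found in the image and a made-up tolerance (`max_offset_px`); it is not the actual pointing-refinement code.

```python
# Hypothetical sketch of AEGIS-style pointing refinement: if the commanded
# pixel is within a small tolerance of a detected target, snap the aim to
# that target's centroid so a command that is off by a pixel or two still
# hits the feature. Tolerance and coordinates are illustrative.

import math

def refine_aim(commanded_xy, detected_centroids, max_offset_px=5.0):
    """Return the nearest detected centroid if it lies within
    max_offset_px of the commanded point; otherwise keep the command."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    nearest = min(detected_centroids, key=lambda c: dist(commanded_xy, c))
    return nearest if dist(commanded_xy, nearest) <= max_offset_px else commanded_xy

# Usage: the command is two pixels off a narrow vein detected at (101, 202).
corrected = refine_aim((100, 200), [(101, 202), (300, 400)])
```

Here the aim snaps to the nearby vein rather than missing it; a command with no detected target close by would be left unchanged.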

These autonomous activities have allowed Curiosity to do science when Earth isn’t in the loop, says Raymond Francis, the lead system engineer for AEGIS at NASA’s Jet Propulsion Laboratory in California. Before AEGIS, scientists and engineers would examine images from Curiosity, determine further observations, and then send instructions back to Mars. But while Curiosity is capable of transmitting large amounts of data back to Earth, it can only do so under certain conditions. The rover can directly transmit data to Earth for only a few hours a day, because doing so saps power. It can also transmit data to orbiters circling Mars, which then kick it over to Earth, but those spacecraft only have eyes on the rover for about two-thirds of the Martian day.

“If you drive the rover into a new place, often that happens in the middle of the day, and then you’ve got several hours of daylight after that when you could make scientific measurements. But no one on Earth has seen the images, no one on Earth knows where the rover is yet,” Francis says. “We can make measurements right after the drives and send them to Earth, so when the team comes in the next day, sometimes they already have geochemical measurements of the place the rover’s in.”