The Centre Valbio research station, a modern building of stone and glass set in the jungled hills at the edge of Madagascar’s Ranomafana National Park, was starting to look like the third season of The Wire. Big tackboards lined the walls, each one covered with dozens of pinned-up photographs. Some images were grouped together in families, while others floated alone, unconnected. It was 2012, and Rachel Jacobs was using Detective McNulty-style tactics to sort out the associations in a very different kind of crew: the park’s population of red-bellied lemurs.

A biological anthropologist, Jacobs was studying how color vision evolved in lemurs, which meant keeping track of more than 100 animals. She got good at telling them apart. After Jacobs finished her dissertation, her Ranomafana colleagues kept calling her up for lemur ID help—so much so that the Skype pings got overwhelming. So Jacobs started sending emails to every computer vision expert she could find. Last week, after years of working with students and faculty at Michigan State University to train computer vision software on her stash of field photos, Jacobs finally revealed her second set of eyes: LemurFaceID.

The program is a facial recognition system much like the ones Facebook and Google use for people. But instead of looking at facial geometries—like the distance between your eyes, or the length of your nose—LemurFaceID uses 10x10-pixel squares to identify differences in fur texture. (And like much human face recognition software, it works on black-and-white photos.) It’s good enough to correctly identify a lemur out of a known set of individuals 98.7 percent of the time.

Crouse et al. 2017
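To make the patch idea concrete, here’s a toy sketch of patch-based texture matching—not the published LemurFaceID pipeline, which uses more sophisticated texture features. It carves a grayscale image into small square blocks, summarizes each block as an intensity histogram, and identifies a query by nearest-neighbor comparison against a gallery of known individuals. All names and parameters are illustrative:

```python
import numpy as np

def patch_features(img, patch=10, bins=8):
    """Split a grayscale image into patch x patch blocks and return
    a flat vector of normalized per-block intensity histograms."""
    h, w = img.shape
    feats = []
    for y in range(0, h - h % patch, patch):
        for x in range(0, w - w % patch, patch):
            block = img[y:y + patch, x:x + patch]
            hist, _ = np.histogram(block, bins=bins, range=(0, 256))
            feats.append(hist / hist.sum())
    return np.concatenate(feats)

def identify(query, gallery):
    """Return the gallery label whose texture features are closest
    to the query's (nearest neighbor, Euclidean distance)."""
    qf = patch_features(query)
    best_label, best_dist = None, np.inf
    for label, img in gallery.items():
        d = np.linalg.norm(qf - patch_features(img))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```

Because each block is summarized independently, the comparison is sensitive to local fur texture rather than overall face shape—the same intuition behind LemurFaceID’s 10x10-pixel squares.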

Facial recognition software like Facebook’s needs huge amounts of training data—millions of photographs—but Jacobs had only hundreds of lemur pics. So the team had to do some finagling: using not one search image but two fused together, and manually showing the computer where each lemur’s eyes are. “That was a huge wake-up call for us,” says Jacobs. “Anything over 20 individuals is a large dataset to a lemur biologist. To automate this further we’ll need lots more cameras and lots more photographs.”
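Marking the eyes by hand is typically how a face gets aligned before matching: once you know where the eyes are, you can rotate, scale, and shift every photo so the eyes land in the same canonical spots. The article doesn’t spell out LemurFaceID’s preprocessing, so this is a generic sketch of eye-based alignment; the canonical eye positions are made-up values:

```python
import numpy as np

def eye_align_params(left_eye, right_eye,
                     canon_left=(30.0, 40.0), canon_right=(70.0, 40.0)):
    """Compute the similarity transform (2x2 matrix R and offset t)
    that maps hand-marked eye coordinates onto fixed canonical
    positions, so p_aligned = R @ p + t."""
    src = np.array(right_eye, float) - np.array(left_eye, float)
    dst = np.array(canon_right) - np.array(canon_left)
    # Scale so the eyes end up the canonical distance apart,
    # and rotate so the eye line is horizontal.
    scale = np.linalg.norm(dst) / np.linalg.norm(src)
    angle = np.arctan2(dst[1], dst[0]) - np.arctan2(src[1], src[0])
    c, s = np.cos(angle), np.sin(angle)
    R = scale * np.array([[c, -s], [s, c]])
    t = np.array(canon_left) - R @ np.array(left_eye, float)
    return R, t
```

Applying the resulting transform to every photo puts each face in the same pose, which is what lets patch-by-patch texture comparison work even when the original snapshots were taken from different angles.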

That’s the dream. Twenty-two thousand tourists visit Ranomafana every year to see its 12 species of lemurs, most of which are threatened or endangered. That’s a lot of smartphone cameras that could be turned toward the trees. Jacobs and her team are working toward building LemurFaceID into an app tourists could download when they visit, so that the database and the power of the software grow with every snap.

“I don’t think we should ever be wholly reliant on any computer system for identification,” says Jacobs, who is now a postdoc at George Washington University. But it’s certainly a less invasive technique than capturing, drugging, and collaring or tagging. Those processes—while they get you additional data, like health assessments and DNA samples—always carry the risk of injuring the animals or disrupting group dynamics.

Lemurs aren’t the only animals getting the benefit of newer and better computer vision and artificial intelligence systems coming online right now. A group in Germany is starting to do similar facial recognition for chimpanzees. Ecologists in the Congo use computer vision to track zebras based on their unique stripes. And scientists at Dartmouth recently developed a pattern-matching algorithm called Wild-ID to monitor large migrations of wildebeest and giraffes in Tanzania. It works so well for giraffes that they’ve stopped capturing and tagging the animals, even as they conduct the largest-ever study of giraffe demographics.

After the LemurFaceID paper came out, Anil Jain, one of the Michigan State collaborators, started getting emails from biologists all over the globe wanting to know if it was possible to make a system for them too. From grizzly bears in Montana to elephants in India, scientists are clamoring to get more cameras and more computers involved in counting, monitoring, and tracking their wild wards. For now, Jain isn’t taking on any new partnerships, but he’s optimistic about the potential for the field. “What we did with lemurs we did as a side project with no money,” he says. “But you could do a lot more with more time and resources.”

Like, say, an army of aerial drones all equipped with high-res cameras. Or a fleet of underwater robots tricked out with fish-cams. They sure beat a tackboard full of pins and Post-it notes.