The system accurately identified people wearing a cap, scarf and glasses more than half the time. John Powell/REX/Shutterstock

Ditch the hat and scarf – it’s not fooling anyone. Face recognition software can now see through your cunning disguise – even if you are wearing a mask.

Amarjot Singh at the University of Cambridge and his colleagues trained a machine learning algorithm to locate 14 key facial points. These are the points the human brain pays most attention to when we look at someone’s face.

The researchers then hand-labelled 2000 photos of people wearing hats, glasses, scarves and fake beards to indicate the location of those same key points, even if they couldn’t be seen. The algorithm looked at a subset of these images to learn how the disguised faces corresponded with the undisguised faces.
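The paper trains a deep network on those hand-labelled images; as a rough illustration of the idea, the toy sketch below fits a simple regressor that maps image features to the 14 (x, y) key points. All data here is synthetic and the names are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N_POINTS = 14                      # key facial points, as in the article
N_IMAGES, N_FEATURES = 200, 64     # toy dataset sizes (assumptions)

X = rng.normal(size=(N_IMAGES, N_FEATURES))           # stand-in image features
W_true = rng.normal(size=(N_FEATURES, N_POINTS * 2))  # hidden "ground truth" mapping
Y = X @ W_true + 0.01 * rng.normal(size=(N_IMAGES, N_POINTS * 2))  # hand labels

# Least-squares fit: learn to predict all 28 coordinates at once
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def predict_keypoints(features):
    """Return predicted key points as a (14, 2) array of (x, y) pairs."""
    return (features @ W).reshape(N_POINTS, 2)

pts = predict_keypoints(X[0])
print(pts.shape)  # (14, 2)
```

The real system replaces the linear map with a convolutional network, but the input/output contract is the same: an image in, a fixed set of key-point coordinates out, learned from labelled examples.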


The system accurately identified people wearing a scarf 77 per cent of the time, a cap and scarf 69 per cent of the time, and a cap, scarf and glasses 55 per cent of the time. This isn’t as good as systems that recognise undisguised human faces, but it is the best at seeing through disguises, says Singh.

The system only needs to be able to see a fraction of facial key points – most of which are around the eyes and mouth – to be able to guess where the other points are likely to be. Based on that guess, it can identify the person if it has already been shown a map of their key points.
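The matching step described above can be sketched as a nearest-neighbour comparison: whichever key points are visible on the disguised face are compared against each enrolled key-point map, and occluded points are simply left out of the distance. This is a minimal illustration, not the paper's actual matcher; the names and noise levels are assumptions.

```python
import numpy as np

def identify(observed, gallery):
    """observed: (14, 2) array of key points, NaN rows = occluded points.
    gallery: dict name -> (14, 2) enrolled key-point map.
    Returns the name whose map best matches the visible points."""
    visible = ~np.isnan(observed).any(axis=1)   # mask of points we can see
    best, best_dist = None, np.inf
    for name, enrolled in gallery.items():
        # mean Euclidean distance over visible points only
        d = np.linalg.norm(observed[visible] - enrolled[visible], axis=1).mean()
        if d < best_dist:
            best, best_dist = name, d
    return best

rng = np.random.default_rng(1)
gallery = {n: rng.uniform(0, 100, size=(14, 2)) for n in ("alice", "bob")}

probe = gallery["bob"] + rng.normal(0, 1.0, size=(14, 2))  # noisy observation
probe[6:10] = np.nan   # a scarf hides the mouth-region points
print(identify(probe, gallery))  # prints "bob"
```

Because the distance is computed only over visible points, the match degrades gracefully as more of the face is covered, which is consistent with the accuracy drop the researchers report for heavier disguises.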

“In effect, it is able to see through your mask,” says Singh. You can also probably say goodbye to CV Dazzle, the vaunted face recognition camouflage makeup that has been mooted as the way to stay anonymous in a world of face recognition. “This will work very well for this type of camouflage because it works on key points of the face,” he says.

He will present his findings at the International Conference on Computer Vision in Italy in late October.

Singh has plans to take this research even further and see if it’s possible to design an algorithm that can identify someone wearing a rigid plastic mask, like the V for Vendetta masks that are popular at some protests.

The system could be used to identify criminals who are trying to hide their identities, says Singh. But he admits it could also be used by authoritarian governments to identify protesters. “It kind of impinges on the privacy of people,” he says.

Automatic face recognition software is catching on with law enforcement across the world. In August, the UK government said it planned to spend £4.6 million on upgrading face recognition software so algorithms could automatically spot suspects from live video footage.

“There’s always a trade-off between security and privacy,” says Anil Jain at Michigan State University. But he says that people in public spaces are already under constant surveillance by security cameras, so they shouldn’t be too worried about every improvement in the technology.

For now, the system is far from perfect. The fewer facial key points it can see, the worse the software is at recognising a person in a photo. It’s also thrown off by busy backgrounds, so can only identify a person wearing a cap, glasses and scarf 43 per cent of the time if they’re standing in front of a complicated background.

It’s also not clear how well this system would work in the real world, Jain says. The algorithm was only trained on photos of 25 people, which he says isn’t enough to really determine its efficacy.

And anti-surveillance tricks are keeping pace with improvements. Last year, a team of researchers from Carnegie Mellon University found they could trick face recognition software by wearing specially designed glasses. Now might be the time to trade in your fake beard for a pair of jazzy specs.

Reference: https://arxiv.org/pdf/1708.09317.pdf