Facial recognition technology could revolutionize everyday life. But at what cost? Here's everything you need to know:

How does the technology work?

It scans faces, either in person or on a photograph, and measures distinguishing facial features such as eye position, eyebrow shape, and nostril angle. This creates a distinctive digital "faceprint" — much like a fingerprint — which the system then runs through a database to check for a match. Law enforcement agencies have had faces on file for decades; their databases provide them with the identified person's name, age, address, and any criminal history. But facial recognition is increasingly being used by commercial firms too. Facebook's system for "tagging" a photo — identifying who is in the picture — is now as accurate as users doing it themselves. Apple's new iPhone X can be unlocked when its owner simply looks at it. As the technology becomes more widespread, there are growing fears that it will erode privacy and be misused by bad actors. "We need to ask ourselves," says Kelly Gates, author of Our Biometric Future, "whether a world of ubiquitous automated identification is really one we want to build."
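The matching step described above can be sketched in miniature: each face is reduced to a vector of measurements, and identification becomes a nearest-neighbor search over enrolled vectors. Everything below is invented for illustration — the feature names, the numbers, and the match threshold — and real systems use learned embeddings with hundreds of dimensions rather than three hand-picked measurements.

```python
import math

def faceprint_distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_faceprint(probe, database, threshold=0.5):
    """Return the name of the closest enrolled faceprint if it falls
    within the match threshold; otherwise return None (no match)."""
    best_name, best_dist = None, float("inf")
    for name, enrolled in database.items():
        d = faceprint_distance(probe, enrolled)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Toy "database": name -> (eye spacing, eyebrow arch, nostril angle),
# echoing the kinds of features the article mentions. Values are made up.
database = {
    "alice": (0.42, 0.18, 0.31),
    "bob":   (0.55, 0.25, 0.20),
}

print(match_faceprint((0.43, 0.17, 0.30), database))  # alice
print(match_faceprint((0.90, 0.90, 0.90), database))  # None
```

The threshold is the crux in practice: set it too loose and the system returns false matches (the innocent-party problem discussed later in the article); set it too tight and it fails to recognize enrolled faces under different lighting or angles.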

Who uses this technology?

Facial recognition is most common in China, where people can use it to pay for a coffee, visit tourist attractions, and even withdraw cash from ATMs. Several Chinese cities use face-scanning cameras to shame jaywalkers, flashing their names and photographs on public display boards. But the West isn't far behind. In Europe, high-end hotels and retailers use facial recognition cameras to identify VIPs and celebrities as they enter, in order to give them preferential treatment. Several U.S. airlines are looking to replace boarding passes with face scanners. Department stores are using facial recognition to monitor how customers react to certain product displays. And these developments are only the tip of the iceberg.

What else is coming?

Doctors have already started using facial recognition to help them diagnose rare genetic diseases that produce distinctive facial characteristics; as the technology improves, they should be able to do the same for more common conditions, such as autism. Shops will soon be able to identify individual customers as soon as they walk into the store, and try to sell them specific items based on their interests and previous transactions. Dubai International Airport is scrapping one terminal's security clearance counter altogether and replacing it with a short tunnel fitted with 80 face-scanning cameras hidden behind video screens. Law enforcement agencies are also ramping up their facial recognition capacity.

In what way?

Through various state and federal databases, the FBI now has access to photographs of half the U.S. adult population, according to a major 2017 report by the Georgetown Law Center on Privacy and Technology. Eighty percent of these people don't have a criminal record; their faces are on file solely because they have some form of state ID, such as a resident's card or a driver's license. Several police departments, including Los Angeles', have even started using body cams for "real-time" facial recognition of people officers are talking to on the street or during traffic stops. But the system is far from flawless. One in seven of the FBI's searches identifies an innocent party, even when the actual culprit is in the database. And facial recognition has always been less reliable for people with darker skin — because of the way light reflects off it — who are already arrested in disproportionately high numbers. "If you're black, you're more likely to be affected by this technology, and that technology is more likely to be wrong," says Rep. Elijah Cummings (D-Md.). "That's a hell of a combination."

What are the other risks?

The biggest danger is that authoritarian governments will use the technology to surveil and control their populations. Stanford University researchers made an algorithm that guessed someone's sexual orientation from a picture of their face with 81 percent accuracy; humans managed only 61 percent. In countries where homosexuality is illegal, that could be a dangerous weapon. FindFace, a Russian app, can identify strangers by comparing their photo to more than 200 million social media profile pictures, and it's been used to harass people. "Like any new tool," says Nicholas Rule, an associate professor of psychology at the University of Toronto, "if it gets into the wrong hands, it can be used for ill purposes."

Is there any regulation?

European regulators have proposed that all biometric data, including "faceprints," belong to their owner and thus require consent to use. But U.S. lawmakers appear relatively unconcerned: Only Illinois and Texas have laws regulating facial recognition; of 52 police agencies that have acknowledged using the technology, only one obtained legislative approval. Facial recognition still isn't as good as it is in the movies, with computers instantaneously identifying every individual in a huge crowd. But it's not that far off. "From a technological perspective, the ability to successfully conduct mass-scale facial recognition in the wild seems inevitable," says Carnegie Mellon professor Alessandro Acquisti. "Whether as a society we will accept that technology, however, is another story."

Beating the algorithm

Tricking high-end facial recognition systems isn't easy. Wired magazine hired top Hollywood artists to create silicone face masks that would trick an iPhone X into believing it was seeing its owner. They failed. But there are ways to get the better of less-advanced facial recognition systems. While wearing a regular hat, scarf, or pair of glasses makes little difference, particular patterns can confuse the software. Researchers at Carnegie Mellon created oversize colored glasses that not only masked the wearer's identity but also made the software think the person was a celebrity. Others have created patterned scarves that look, to machines, like human faces. Some have even dabbled with face paint — covering up parts of their cheeks with specific blocks of colors — to "dazzle" the scanner. But there's one major flaw in all these anti-surveillance techniques: They make you stick out like a sore thumb. "The very thing that makes you invisible to computers," says tech writer Robinson Meyer in The Atlantic, "makes you glaringly obvious to other humans."