We’re all getting comfortable with face recognition: unlocking our phones, skipping airport lines, even unlocking front doors. But the convenience is blinding us to how risky this technology actually is, and to how it is being used without our realizing it.

I’m Clare Garvie. My job is to research the use of face-recognition technology by law enforcement and to make recommendations about how the technology should be used.

Right now, most Americans are in a perpetual police lineup because they got a driver’s license. After that D.M.V. agent snaps your picture, your face is turned into a face print, a unique series of numbers that a face-recognition system can read and compare with other faces. Now any police officer can run searches against your face for any reason. Who robbed that corner store? Who was jaywalking at 3 a.m.? Who was at this protest? It’s the digital equivalent of police walking through a crowd and yanking each of our I.D.s out of our pockets: you could be picked out, investigated, possibly arrested and sent to jail, all because you got a driver’s license in one of these 32 states. That’s a violation of your privacy and of your Fourth Amendment protection against unreasonable search. And that’s just the tip of the iceberg.

“Nearly half of American adults are in facial-recognition databases.” “It does make our jobs a lot easier. And it also kind of finds that needle in a haystack.” “Photos are pulled from social media images, driver’s licenses and government IDs.”

With face recognition, America is closer to a Chinese-style surveillance state than most of us realize. Maybe you’re thinking, “I’m not afraid of face recognition. I haven’t done anything wrong, so I’ve got nothing to hide.” But wouldn’t you object to police secretly searching your apartment every once in a while, even if you’ve got nothing to hide? Let me tell you the three aspects of face-recognition technology that worry me the most.
First, the way law enforcement uses face recognition violates our right to due process. In New York, police were looking for a suspect who’d stolen socks from Target. They ran a face-recognition search against the surveillance footage and turned up over 200 matches. Authorities never told the suspect, who was arrested and charged, that there were over 200 other possible matches, or that a face-recognition search had been run at all. That information is crucial to mounting a defense and giving the defendant a fair trial. Ultimately, this case was dropped. We pride ourselves in this country on due process. But for thousands of people across the country, face recognition was used to help convict them, and they never knew.

Second, pictures aren’t perfect. They’re a tad grainy. Maybe the subject is squinting, or wearing a hat or a scarf. In such cases, the algorithm has trouble finding anyone and turns up zero matches. To get around that, the N.Y.P.D. went as far as playing celebrity look-alike, running a search with a photo of Woody Harrelson after one detective thought the surveillance-camera picture of the thief resembled the actor. This may sometimes work, but the bottom line is that if you search with what is explicitly the wrong photo, you’re bound to get the wrong results out. And those inaccurate matches will lead to wrongful convictions.

Third, the technology exhibits bias. Very simply put, some of these algorithms think all black people look more alike than white people do. In San Diego, law enforcement agencies found that they were using face recognition between 1½ and 2½ times more often on communities of color than their share of the population would suggest.

If we don’t implement legal restrictions on face recognition, the future looks like a Chinese-style surveillance state, one that violates our right to privacy, our right to anonymity in public and our right to free speech. Congress must first implement a national moratorium on the use of the technology.
Congress can then work to develop legal restrictions, limiting the use and scope of face-recognition technology. Every American’s privacy, First Amendment rights, freedom from unreasonable search and due process are at stake.