Last year, a Russian startup announced that it could scan the faces of people passing by Moscow’s thousands of CCTV cameras and pick out wanted criminals or missing persons. Unlike much face recognition technology — which runs stills from videos or photographs after the fact — NTechLab’s FindFace algorithm has achieved a feat that once seemed possible only in the science-fictional universe of “Minority Report”: It can determine not just who someone is, but where they’ve been, where they’re going, and whether they have an outstanding warrant, immigration detainer, or unpaid traffic ticket.

For years, the development of real-time face recognition has been hampered by poor video resolution, the angles of bodies in motion, and limited computing power. But as systems begin to transcend these technical barriers, they are also outpacing the development of policies to constrain them. Civil liberties advocates fear that the rise of real-time face recognition alongside the growing number of police body cameras creates the conditions for a perfect storm of mass surveillance.

“The main concern is that we’re already pretty far along in terms of having this real-time technology, and we already have the cameras,” said Jake Laperruque, a fellow at the Constitution Project. “These cameras are small, hard to notice, and all over the place. That’s a pretty lethal combination for privacy unless we have reasonable rules on how they can be used together.”

This imminent reality has led several civil liberties groups to call on police departments and legislators to implement clear policies on camera footage retention, biometrics, and privacy. On Wednesday morning, the House Oversight Committee held a hearing on law enforcement’s use of facial recognition technology, where advocates emphasized the dangers of allowing advancements in real-time recognition to broaden surveillance powers.
As Alvaro Bedoya, executive director of the Center on Privacy and Technology at Georgetown Law, told Congress, pairing the technology with body cameras, in particular, “will redefine the nature of public spaces.”

The integration of real-time face recognition with body-worn cameras is further along than lawmakers and citizens realize. A recent Justice Department-funded survey conducted by Johns Hopkins University found that at least nine of 38 body camera manufacturers either currently offer facial recognition capabilities or have built in an option for such technology to be added later. Taser, which leads the market for body cameras, recently acquired two startups that will allow it to run video analytics on the footage its cameras collect, and Taser’s CEO has repeatedly emphasized the development of real-time applications, such as scanning videos for faces, objects, and suspicious activity. A spokesperson for NTechLab, which has pilot projects in 20 countries including the United States, China, the United Kingdom, and Turkey, told The Intercept that its high-performing algorithm is already compatible with body cameras.

Police see the appeal. A captain in the Las Vegas Police Department told Bloomberg in July that he envisions his officers someday patrolling the Strip with “real-time analysis” on their body cameras and an earpiece to tell them, “‘Hey, that guy you just passed 20 feet ago has an outstanding warrant.’”

At least five U.S. police departments, including those in Los Angeles and New York, have already purchased or looked into purchasing real-time face recognition for their CCTV cameras, according to a study of face recognition technology published by Bedoya and other researchers at Georgetown. Bedoya emphasized that it is only a matter of time before the nation’s body-worn cameras are hooked up to real-time systems.
With 6,000 of the country’s 18,000 police agencies estimated to be using body cameras, the pairing would translate into hundreds of thousands of new, mobile surveillance cameras. “For many of these systems, the inclusion of real-time face recognition is just a software update away,” said Harlan Yu, co-author of a report on body camera policies for Upturn, a technology think tank.

Civil liberties experts warn that just walking down the street in a major urban center could turn into an automatic law enforcement interaction. With the ability to glean information at a distance, officers would not need to justify a particular interaction or find probable cause for a search, stop, or frisk. Instead, everybody walking past a given officer on patrol could be subject to a “perpetual line-up,” as the Georgetown study put it.

In Ferguson, Missouri, where a Justice Department investigation showed that more than three-quarters of the population had outstanding warrants, real-time face searches could give police immense power to essentially arrest individuals at will. And in a city like New York, which has over 100 officers per square mile and plans to equip each one of them with body cameras by 2019, the privacy implications of turning every beat cop into a surveillance camera are enormous.

“The inclusion of face recognition really changes the nature and purpose of body cameras, and it changes what communities expect when they call for and pay for cameras with taxpayer dollars,” Yu said. “I think there’s a real fear in communities of color, where officers are already concentrated, that these body-worn cameras will become another tool for surveillance rather than a tool for accountability.”

A Los Angeles police officer wears an AXON body camera during the Immigrants Make America Great March, which protested actions taken by the Trump administration, on Feb. 18, 2017, in Los Angeles, California. Photo: David McNew/Getty Images

A Digital Enemies List

Civil rights groups concur that tracking individuals caught on body cameras — either live or using archival footage — could put a chill on First Amendment-protected activities. “Are you going to go to a gun rights rally or a protest against the president, for that matter, if the government can secretly scan your face and identify you?” Bedoya asked the House committee in his testimony on Wednesday.

These are not far-fetched concerns, given revelations in recent years of the NYPD’s Demographics Unit, tasked with monitoring the activities of Muslim communities, and ongoing surveillance of Black Lives Matter activists in Ferguson, Baltimore, Washington, D.C., and New York. In a 2010 slideshow, the FBI discussed how face recognition could be used to tag individuals at campaign rallies. And law enforcement officials in Memphis revealed last month that they have used surveillance footage of protesters linked to Black Lives Matter to create a “watchlist” that prohibits those individuals from entering Memphis City Hall without an escort.

“It’s not hard to imagine the worst way this could play out today, with a digital version of a J. Edgar Hoover-style ‘enemies list,’” Laperruque said of the use of a real-time watchlist. “Even if we don’t have [a list], the mere threat develops a chilling effect.”

The provisions for such a system are already in place. Other types of real-time searches of biometric databases — such as mobile fingerprinting and rapid DNA tests — are now part of law enforcement routines and face few legal challenges. FBI searches of state driver’s license databases using face recognition software are almost six times more common than federal court-ordered wiretaps, according to the Georgetown study. The databases, too, have already been built.
Georgetown researchers estimated that the faces of one in every two American adults — many of whom have never committed a crime — are captured in searchable federal, state, or local databases. The Department of Defense, the Drug Enforcement Administration, and Immigration and Customs Enforcement are just a few of the federal agencies that can gain access to one or more state or local face recognition systems. Regular interagency data-sharing programs, such as fusion centers, have given officers the ability to track not only people convicted of crimes, but also petty offenders and immigrants.

Immigrants entering and exiting the country with visas have already handed over fingerprints and photos of their faces to the Department of Homeland Security. President Trump has demanded the completion of a biometric system for all travelers at the border, and a new bill introduced Tuesday in the House calls for all ICE agents to wear body cameras.

“I think it is absolutely a concern that face recognition would be used to facilitate deportations,” said Rachel Levinson-Waldman, an expert on policing technology at the Brennan Center for Justice at New York University School of Law. “We’re seeing how this administration is ramping up these deportation efforts. They’re looking much more widely.”

But despite these precedents and possibilities, few departments have outlined policies to limit the pairing of facial recognition technology with body camera footage. In August, Yu and colleagues at Upturn surveyed the major city police departments in the country that have equipped — or will soon equip — officers with body cameras. Out of 50 departments, only six had addressed the use of biometrics such as face recognition with their recordings. Baltimore’s policy appears to be the first to explicitly prohibit using “stored” body camera video with face recognition, but it still leaves the door open for real-time recognition.
Meanwhile, the Boston Police Department limits “technological enhancements” to the cameras themselves, “including, but not limited to, facial recognition or night-vision capabilities.” This policy has the opposite problem of Baltimore’s, Yu pointed out, as it could still allow algorithms to analyze the department’s stored footage retrospectively. He said it is essential that police departments limit how long they keep footage that has no obvious evidentiary value.

“When they have this footage around, it will make it possible for departments to identify all the public places where specific individuals have encountered police over the years,” said Yu. “Given that departments are going down the path of better image recognition and better artificial intelligence technologies, they need to make public promises now that this is not the reason why they want to adopt body cameras.”

A video wall shows New York City police officers an interactive map of the area, security footage from nearby cameras, and whether any threats have been made. Photo: Mary Altaffer/AP