This Tuesday Apple unveiled a new line of phones to much fanfare, but one feature immediately fell under scrutiny: FaceID, a tool that would use facial recognition to identify individuals and unlock their phones.

Jake Laperruque (@jakelaperruque) is senior counsel for privacy and security issues at The Constitution Project. He previously served as a fellow at New America's Open Technology Institute and the Center for Democracy and Technology.

Unsurprisingly, this raised major anxiety about consumer privacy: Consumers are already questioning whether FaceID could be spoofed. It's also possible that police could more easily unlock phones without consent, simply by holding a phone up to its owner's face.

But FaceID should create fear about another form of government surveillance: mass scans to identify individuals based on face profiles. Law enforcement is rapidly increasing its use of facial recognition; one in two American adults is already enrolled in a law enforcement facial recognition network, and at least one in four police departments has the capacity to run face recognition searches. Still, until now, co-opting consumer platforms hasn't been an option. While Facebook has a powerful facial recognition system, it doesn't maintain the operating systems that control the cameras on the phones, tablets, and laptops that stare at us every day. Apple's new system changes that. For the first time, a company will have a single, unified facial recognition system built into the world's most popular devices: the hardware necessary to scan and identify faces throughout the world.

Apple doesn't currently have access to the faceprint data stored on iPhones. But if the government forced Apple to change its operating system (a tactic the FBI tried once already, in the case of the locked phone of San Bernardino killer Syed Rizwan Farook), it could gain that access. And that could theoretically make Apple an irresistible target for a new type of mass surveillance order. The government could issue an order to Apple with a set of targets and instructions to scan iPhones, iPads, and Macs for those targets using FaceID, then provide the government with the targets' locations based on the GPS data of any devices that register a match. Apple has a good record of fighting for user privacy, but there's only so much the company could do if its objections to such an order were turned down by the courts. (On Wednesday Sen. Al Franken (D-Minnesota) released a letter to Apple CEO Tim Cook asking how the company will handle the technology's security and privacy implications.)

Over the last decade the government has increasingly embraced this type of mass scan method. Edward Snowden's disclosures revealed the existence of Upstream, a program under FISA Section 702 (set to expire in just a few months). With Upstream, the NSA scans all internet communications going into and out of the United States for surveillance targets' emails, as well as IP addresses and what the agency has called cybersignatures. And last year Reuters revealed that Yahoo, in compliance with a government order, built custom software to scan hundreds of millions of email accounts for content that contained a digital signature used by surveillance targets.

Many consider these mass scans unconstitutional and unlawful, but that has not stopped the government from pursuing them. Nor have those concerns prevented the secretive FISA Court from approving the government's requests, all too often with the public totally unaware that mass scans continue to sift through millions of Americans' private communications.

Until now text has been the focus of mass scan surveillance, but Apple and FaceID could change that. By generating millions of face prints while simultaneously controlling the cameras that can scan and identify them, Apple might soon face a government order to turn its new unlocking system into the killer app for mass surveillance.

What should Apple, and the rest of us, do to respond to this risk? First, Apple should take every step possible to insulate itself from an overly broad government order to conduct mass scans for faces. It's important that Apple hold to its commitment that face prints generated by FaceID are stored only locally on devices, and that they be fully encrypted with a key that even Apple doesn't possess.