The San Francisco Board of Supervisors voted Tuesday to ban the use of facial recognition by city agencies, a first-of-its-kind measure that has inspired similar efforts elsewhere.

Gregory Barber covers cryptocurrency, blockchain, and artificial intelligence for WIRED.

San Francisco’s ban covers government agencies, including the city police and county sheriff’s department, but doesn’t affect the technology that unlocks your iPhone or cameras installed by businesses or individuals. It’s part of a broader package of rules, introduced in January by supervisor Aaron Peskin, that will require agencies to gain approval from the board before purchasing surveillance tech and will require that they publicly disclose its intended use. In coming weeks, Oakland and Somerville, Massachusetts, are expected to consider facial-recognition bans of their own.

Facial-recognition technology has been used by law enforcement to spot fraud and identify suspects, but critics say that recent advances in AI have transformed the technology into a dangerous tool that enables real-time surveillance. Studies by researchers at MIT and Georgetown have found that the technology is less accurate at identifying people of color and could automate biases already pervasive in law enforcement. Privacy advocates see banning facial recognition as a unique opportunity to prevent the technology from getting too entrenched. “We’re doing it now before the genie gets out of the bottle,” says Brian Hofer, an attorney who heads Oakland’s Privacy Advisory Commission, which spearheaded the legislation in that city.

In San Francisco, the police department says it doesn’t currently use facial recognition, although it tested the technology on booking photos between 2013 and 2017. The sheriff’s department, which is included under the board’s unique city-and-county authority, says it doesn’t either. “We will comply with whatever the requirements are,” says spokeswoman Nancy Crowley, adding that officers are equipped with Axon body cameras that don’t use facial-recognition technology. (Last week, the California State Assembly passed a ban on biometric surveillance in police body cameras.) San Francisco’s ban will not affect federal agencies, including agents at the airport and ports.

There was little organized opposition to the proposal, but one local group, Stop Crime SF, argued a ban would remove a potential deterrent to property crime and hamper the collection of evidence. The legislation was amended to clarify that private individuals can still share tips with law enforcement, although agencies can’t actively solicit information that they know comes from a facial-recognition system. Agencies are also required to ask how information was obtained, in order to track how often facial recognition was involved. “If there’s a huge uptick, then that might mean we’re shoving facial recognition into a less-regulated private sector,” says Lee Hepner, a legislative aide to Peskin.

Joel Engardio, vice president of Stop Crime SF, says he’s largely satisfied with the amended bill. “We agree with the concerns that people have about facial ID technology. The technology is bad and needs a lot of improvement,” he says. The group would have preferred a moratorium, rather than a ban, while the city worked out regulations, but Engardio says he supports the broader set of surveillance rules.

Makers of facial-recognition systems have been notably silent in the local debates thus far. But Benji Hutchinson, vice president of federal operations for NEC, a major supplier of facial-recognition technology, says the industry is watching closely. “I think there’s a little bit too much fear and loathing in the land around facial-recognition technology,” he says. He’s concerned about the potential for “copycat bills” in other cities that could result in a patchwork of local laws. NEC is pushing for a federal law that would preempt local and state laws, require systems to be tested for accuracy by outsiders, and include new rules protecting against bias and civil rights abuses.
In a statement, Daniel Castro, vice president of the Information Technology and Innovation Foundation, a think tank backed by tech companies including Amazon, which makes Rekognition facial-recognition software, called for “safeguards on the use of the technology rather than prohibitions.” Castro called the ban a “step backward for privacy,” because it will leave more people reviewing surveillance video.