The proposal also bars city officials from using any data sourced from facial recognition. If police in a neighboring city wanted to share a list of suspects sourced from facial recognition, the San Francisco Police Department would be prohibited from using it, explains Lee Hepner, a legislative aide who helped draft the bill. He says this is only the first of many steps changing how the city balances policy and technology.

Regulating surveillance technology is difficult because data collected for one purpose can be used for another. That’s not always a bad thing. In New York, for example, city data tracking asbestos complaints were used to predict tenant harassment. In Sacramento, city officials tracked welfare recipients suspected of fraud using the license-plate data that police collect to find stolen vehicles. Peskin’s proposal is something of a “good faith” ordinance: Departments must explain how they plan to use the technology, but aren’t forbidden from using data in other ways.

This flexibility in how departments use data is balanced by the second part of the ordinance, which requires annual reviews. Each year, departments must submit a “surveillance impact report” for board review and public discussion, explaining how the technology was used and why. These impact reports include all uses of the data, costs (including personnel, maintenance, and equipment), and crucially, where the technology was deployed and crime statistics for those locations.

If city officials want to keep using surveillance cameras in a park to prevent car break-ins, for example, each year they would have to submit evidence that the cameras reduced crime.

“We want there to be a justification for use of the technology in the location,” Hepner says. “If the ostensible benefits of any of these surveillance technologies is the prevention of crime, then it’s helpful for the board to be able to track that. Over time, is this technology having a positive impact?”

The ordinance applies retroactively to all the surveillance systems and technologies already in use. Officials will have to submit impact reports annually and disclose the costs of surveillance tech already in operation, hopefully revealing to the public how deeply embedded surveillance already is in their daily lives. And even after a technology is approved, its authorization can be rescinded later based on the annual reviews.

But Peskin’s bill regulates facial recognition as a tool for policing, not for commerce. Consumers use facial recognition to unlock phones, board flights, tag friends in wedding photos and children in summer-camp pictures, and even buy beer at baseball games.

This introduces some ironies: For example, if the bill passes, the SFPD will be barred from using Amazon’s Rekognition software to scan video footage for suspects after a shooting—but a grocery store will be permitted to do the same thing to analyze shopper behavior. Curtailing this kind of retail surveillance—especially given how plainly convenient it often feels—will require an entirely different approach.

The ordinance is currently in a 30-day hold, during which the public can submit comments and concerns. Afterward, it will go in front of the city’s Rules Committee, where Hepner expects it will be ratified quickly.
