San Francisco could become the first city in the nation to ban any city department from using facial recognition, under a proposal that says any benefits of the technology do not outweigh its impact on civil rights. Oakland may not be far behind.

In San Francisco, a Board of Supervisors committee is scheduled to vote Monday on the Stop Secret Surveillance Ordinance, which would make it illegal for any department to “obtain, retain, access or use” any face-recognition technology or information obtained from such technology.

The proposal, introduced by San Francisco Supervisor Aaron Peskin in January, would also require public input and the supervisors’ approval before agencies buy surveillance technology with public funds. That includes the purchase of license plate readers, toll readers, closed-circuit cameras, body cams, and biometrics technology and software for forecasting criminal activity.

“The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring,” the ordinance reads.

Other Bay Area cities and counties, including Berkeley, Palo Alto and Santa Clara County, have similar rules in place about buying surveillance tech, but a San Francisco ban on facial recognition would set a precedent. In Oakland, a proposal to add a ban on facial recognition to city regulations about surveillance tech is set to be considered by Oakland’s Public Safety Committee later this month.

Brian Hofer, the chairman of the Oakland Privacy Advisory Commission, who has helped draft ordinances around the Bay Area, said that, as far as he knows, facial recognition isn’t being used by police in the area.

“That’s the reason we’re trying to prevent that now,” said Hofer. “The genie’s not out of the bottle yet.” Hofer has filed suit against the Contra Costa County Sheriff’s Department, the San Jose Police Department and others; he said he was pulled over last year and handcuffed, with guns pointed at him, after a license plate reader mistakenly identified the rental car he was driving as stolen.

But the ACLU, which also helped draft the ordinances, pointed out that deploying facial recognition would be easy enough.

“The raw materials for face surveillance — data such as mugshots and video feeds from CCTV and body cams — already exist,” said Matt Cagle, technology and civil liberties attorney with the ACLU of Northern California. “With just a few lines of code, existing photo systems can be turned into dangerous dragnet surveillance networks.”

The proposed ordinances come after high-profile examples of the pitfalls of facial recognition, including a report last year that Amazon’s Rekognition software falsely matched the faces of members of Congress with mugshots of people who had been arrested.

The San Francisco Police Department, which said it doesn’t use facial recognition, submitted amendments to the ordinance after talking with other city departments, community groups, neighborhood watch groups and businesses.

“(Our) mission must be judiciously balanced with the need to protect civil rights and civil liberties, including privacy and free expression,” said David Stevenson, spokesman for the San Francisco Police Department. “We welcome safeguards to protect those rights while balancing the needs that protect the residents, visitors and businesses of San Francisco.”

Lee Hepner, legislative aide to Peskin, said the supervisor’s office incorporated some of the SFPD’s requests into the ordinance. If it is approved in committee Monday, the full board will vote May 14.

“Over time, this will build a lot of trust among the community and the police,” he said. “Hopefully in the end it will be a win-win.”

San Francisco Sheriff’s Department spokeswoman Nancy Crowley said her department does not use facial recognition. She added that most of the agency’s work is in non-public spaces, but that if the ordinance is passed “we will comply with the requirements that impact our work.”

The Oakland Police Department did not respond to a request for comment.

Color of Change, a national nonprofit racial justice advocacy group founded in Oakland, supports both ordinances.

“This is an important moment for San Francisco,” said Brandi Collins-Dexter, senior campaign director for the group. She said the city “is positioned to really protect its constituents” and could influence others around the nation.

In a letter urging supervisors to pass the ordinance, Color of Change expressed concern about “high-tech profiling.” The group cited a 2009 incident in which multiple San Francisco police officers pointed their guns at a black woman who was pulled over based on mistaken information from a license plate reader that the car she was driving was stolen. The woman, Denise Green, a former Muni driver, settled her lawsuit against San Francisco in 2015 for $495,000.

Nowadays, Collins-Dexter said, police have access to “technologies the likes of which we’ve never seen.”

AI experts in April urged Amazon to stop selling facial recognition software to law enforcement until safeguards and laws are put into place. (Its technology is now being tested by police in Oregon.) Amazon shareholders are scheduled to vote later this month on a shareholder resolution urging Amazon to stop selling Rekognition.

The companies that make the technologies have also called for limits and regulations: Microsoft late last year called for regulating artificial intelligence, and Amazon followed suit earlier this year.

In addition, the Partnership on AI — whose members include Facebook, Google, Amazon.com, Apple, Microsoft, IBM and academic researchers — last week said law enforcement should not use artificial intelligence algorithms to make decisions about jailing people.