California lawmakers today passed a bill placing a three-year state-wide moratorium on the use of facial recognition technology by law enforcement agencies.

AB 1215, the Body Camera Accountability Act, was introduced earlier this year by Assemblymember Phil Ting, a Democrat. Both San Francisco and Oakland previously passed similar bills preventing the use of facial recognition by law enforcement agencies; now the ban's gone state-wide.

The bill goes into effect on January 1, 2020, and will be reviewed under a "sunset provision" in 2023.

According to an ACLU statement, Ting says the bill will protect Californians:

Without my bill, facial recognition technology essentially turns body cameras into a 24-hour surveillance tool, giving law enforcement the ability to track our every movement. Let’s not become a police state and keep body cameras as they were originally intended – to provide police accountability and transparency.

US citizens have the right to privacy and the reasonable expectation that public surveillance systems are in place to protect us in the event that a crime is committed.

But AI-powered facial recognition systems aren't designed simply to monitor public spaces for crimes. As we've seen in leaked Palantir documents, these systems are meant to connect to databases that give police officers access to the private details of virtually any citizen. Here's a graphic showing the kind of information law enforcement officers have available to them with the Palantir app:

Credit: US DOJ records / image via Vice

In essence, these tools give police officers the kind of information a detective 20 years ago couldn't have gleaned with a search warrant and six months to investigate – today there's literally an app for that.

As Electronic Frontier Foundation Associate Director of Community Organizing Nathan Sheard put it, use of these tools would force citizens to decide “between actively avoiding interaction and cooperation with law enforcement, or having their images collected, analyzed, and stored as perpetual candidates for suspicion.”

Our right to privacy, like our right to keep a well-regulated militia, is meant to protect us from tyranny. Thousands of US law enforcement agents have been outed over the past year as members of online hate groups, and the Supreme Court is currently deciding whether it's okay for an employer to fire someone for being queer. The risk posed to marginalized communities by rogue law enforcement agents is only exacerbated by facial recognition software.

Here’s hoping the rest of the country catches up to California before it’s too late.
