In a recent test of Amazon's facial recognition software, the American Civil Liberties Union of Northern California revealed that the software mistook 26 California lawmakers for people arrested for crimes.

The ACLU used Rekognition, Amazon’s facial recognition software, to evaluate 120 photos of lawmakers against a database of 25,000 arrest photos, ACLU attorney Matt Cagle said at a press conference on Tuesday. One in five lawmaker photographs was falsely matched to a mugshot, exposing the frailties of an emerging technology widely adopted by law enforcement. The ACLU used the default Rekognition settings, which report a match at 80 percent confidence, Cagle said.

Assemblymember Phil Ting was among those whose picture was falsely matched to an arrest photo. He’s also an active advocate for limiting facial recognition technology: in February, he introduced a bill, co-sponsored by the ACLU, that would ban the use of facial recognition and other biometric surveillance on police-worn body cameras.

The ACLU is concerned about the potential for facial recognition to track people without their consent. The organization also worries that police body cameras, intended to help keep tabs on police behavior, will instead be used for mass surveillance. Assemblymember Reggie Jones-Sawyer said at the press conference that the technology would “automate mistaken identity” and reaffirm racial bias in policing.

Even if the facial recognition algorithms were perfectly accurate, the effects of the technology would still be disproportionately borne by vulnerable communities, according to Cagle. The bill also notes that facial recognition in body cameras would affect the rights of people in highly policed communities and could dissuade undocumented people or those with criminal histories from talking to the police.

“This is not the sort of problem that can be solved simply by tweaking an algorithm,” Cagle said at the press conference. “This is the kind of problem that the legislature needs to step up and fix right now to protect all Californians.”

Amazon has said that it encourages police to use 99 percent confidence thresholds for public safety uses: "First, you should use confidence thresholds of 99% or higher to reduce errors and false positives," a guide for law enforcement states.

"When using facial recognition to identify persons of interest in an investigation, law enforcement should use the recommended 99% confidence threshold, and only use those predictions as one element of the investigation (not the sole determinant)," the company wrote in a blog post in February.

But these are simply recommendations. Cagle said in an email that Amazon's 99 percent confidence threshold doesn't reflect how the technology is used in the real world.
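The dispute over thresholds comes down to a single cutoff parameter: the system reports every candidate whose similarity score clears the cutoff, so lowering it from 99 to 80 percent can turn weak resemblances into reported "matches." A minimal sketch of that behavior, using made-up scores (in Amazon Rekognition itself, the analogous setting is the `FaceMatchThreshold` parameter on search calls):

```python
def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the confidence threshold."""
    return [(name, score) for name, score in candidates if score >= threshold]

# Hypothetical similarity scores for one probe photo against a mugshot database.
candidates = [
    ("mugshot_0041", 99.2),
    ("mugshot_0187", 91.5),
    ("mugshot_0320", 84.7),
    ("mugshot_0999", 80.3),
]

# At the 80 percent default, all four surface as matches.
default_hits = filter_matches(candidates, 80.0)
print(len(default_hits))  # 4

# At the 99 percent threshold Amazon recommends for law enforcement,
# only the strongest candidate remains.
strict_hits = filter_matches(candidates, 99.0)
print(len(strict_hits))  # 1
```

The point the sketch illustrates is that the threshold is purely a filter on the same underlying scores: nothing stops a user from setting it low, or from ignoring it altogether, as the police departments cited below reportedly did.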

"In the real world, Amazon does not consider the failure to use a 99 confidence score to be irresponsible and a prominent law enforcement customer for Rekognition has acknowledged lowering and not using any score at all, according to news reports," he said. He cited this Gizmodo piece, in which police who use Rekognition said they do not set a threshold.

If this story sounds strangely familiar, it’s because it’s happened before. Last year, the ACLU ran a similar test of Rekognition on members of Congress, and the software misidentified 28 of them as people arrested for crimes. Those false matches were disproportionately people of color.

“If you get falsely accused of an arrest, what happens?” Ting said at the press conference. “It could impact your ability to get employment, it absolutely impacts your ability to get housing. There are real people who could have real impacts.”