Despite facial recognition's seal of approval from law enforcement agencies across the U.S., recent experiments show the technology is far from infallible.

In a demonstration by the American Civil Liberties Union, 26 California lawmakers were misidentified by face-matching software built by Amazon, a mismatch rate of roughly 1 in 5.

The results mirror a similar test by the advocacy group in 2018, in which Amazon's software, called 'Rekognition', mismatched 28 members of Congress -- many of whom were people of color.

The ACLU says a test of Amazon's facial recognition software misidentified 1 in 5 lawmakers fed into its system

As in the earlier test, the software attempted to match the lawmakers' head shots against a database of known criminals -- a process that has become commonplace for the at least 200 departments across the U.S. that use Rekognition software.

According to the LA Times, the test is fueling calls from California legislators to limit the technology's application in a law enforcement capacity, including its integration with police body cameras.

'The software clearly is not ready for use in a law enforcement capacity,' California Assemblyman Phil Ting told the LA Times.

'These mistakes, we can kind of chuckle at it, but if you get arrested and it’s on your record, it can be hard to get housing, get a job. It has real impacts.'

Increasingly, lawmakers and activists worry that the technology could also enable mass surveillance that infringes on people's civil rights, especially when coupled with tools like home security systems and police body cameras.

'Body cameras were really deployed to build trust between law enforcement and communities,' Ting told the LA Times.

'Instead of trust, what you are getting is 24/7 surveillance.'

A statewide bill banning the technology's use in police-worn body cameras would mark a significant step forward for opponents looking to rein in its use.

Awareness surrounding the use and potential misuse of facial recognition has spread as the technology finds its way into more arenas

While other, more localized laws are now on the books, there is no federal regulation on exactly when and how facial recognition software can be used in policing, despite a first-of-its-kind congressional hearing on the topic earlier this year.

Recently, Oakland, California, became the third city in the country to ban the use of facial recognition software, following major metro areas like San Francisco and smaller cities like Somerville, Massachusetts.

Critics say that use of the technology carries a number of pitfalls, including an increased risk of falsely accusing someone of a crime.

In an unprecedented report released in May, the New York City Police Department was found to be inputting celebrity lookalikes into its facial recognition software in an attempt to match famous faces with pictures of suspects captured on CCTV and elsewhere.