Amazon's facial-recognition software falsely matched 27 professional athletes to mugshots in a law enforcement database in tests conducted by the ACLU of Massachusetts, the organization announced Monday.

New England Patriots safety Duron Harmon, one of the people mistakenly matched, spoke out against the use of facial recognition by law enforcement.

The ACLU is campaigning in support of a Massachusetts bill that would put a moratorium on government agencies' use of facial recognition.

Amazon has argued that the ACLU is using the software incorrectly.


As a New England Patriots safety and three-time Super Bowl winner, Duron Harmon is a familiar face to New England sports fans. That wasn't enough to stop Amazon's facial-recognition software from mistakenly matching him to a criminal's mugshot.

Harmon is one of 27 professional athletes who were falsely matched to mugshots in a criminal database by Amazon's Rekognition software in tests conducted by the ACLU of Massachusetts, the organization announced Monday.

The ACLU chapter was attempting to prove a point: that facial-recognition tech is fallible, and that law enforcement agencies shouldn't rely on the software to identify potential suspects. Massachusetts' state legislature is currently weighing a bill that would implement a moratorium on the state's use of facial recognition.

Harmon joined in the calls for the moratorium in the wake of the ACLU's findings.

"This technology is flawed," he said in a statement provided by the ACLU. "If it misidentified me, my teammates, and other professional athletes in an experiment, imagine the real-life impact of false matches. This technology should not be used by the government without protections."

The ACLU used an 80% similarity threshold for its tests — which it says is Rekognition's default setting — but Amazon has said it recommends law enforcement use a 99% threshold, meaning the software will not match a subject's face to a mugshot unless it's 99% certain that the match is accurate.
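The effect of that threshold can be sketched with a toy example. (This is purely illustrative, with made-up data and function names, not the actual Rekognition API.) Raising the threshold from 80% to 99% discards lower-similarity candidates that would otherwise be reported as matches:

```python
# Hypothetical candidate matches with similarity scores, as a facial-recognition
# system might return them (illustrative data, not Rekognition output).
candidates = [
    {"mugshot_id": "A", "similarity": 0.83},
    {"mugshot_id": "B", "similarity": 0.995},
    {"mugshot_id": "C", "similarity": 0.71},
]

def matches_above(candidates, threshold):
    """Return only the candidates whose similarity meets the threshold."""
    return [c for c in candidates if c["similarity"] >= threshold]

# At an 80% threshold, two candidates are reported as matches.
print(len(matches_above(candidates, 0.80)))  # 2

# At the recommended 99% threshold, only the strongest candidate remains.
print(len(matches_above(candidates, 0.99)))  # 1
```

In other words, the threshold doesn't change how the software compares faces; it only changes how confident a comparison must be before it is surfaced as a match.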

In a statement to Business Insider, an Amazon spokesperson reiterated that argument.

"The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines," the spokesperson said. "When used with the recommended 99% confidence threshold and as one part of a human driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking."

Critics of Amazon Rekognition also say the software could perpetuate racial bias, especially if used by law enforcement. An MIT study in July found that Rekognition has a harder time identifying the faces of women and people of color. Amazon rebutted that study, arguing that its findings related only to the software's ability to spot and analyze faces in footage, not to matching faces against a database.