Axon (formerly known as Taser) has been shifting its business toward body cameras for police officers for the past few years, but today, the company is making a big change. At the recommendation of its AI ethics board, “Axon will not be commercializing face matching products on our body camera,” the company announced in a blog post today.

Axon founded its AI and Policing Technology Ethics Board last April to help advise the company on how to ethically develop products. The board’s first report was published today, with a focus on advising Axon against using facial recognition technology.

“Face recognition technology is not currently reliable enough to ethically justify its use on body-worn cameras.”

According to the board’s report, “Face recognition technology is not currently reliable enough to ethically justify its use on body-worn cameras.” At the very least, the report says, more accurate technology that “performs equally well across races, ethnicities, genders, and other identity groups” would be required — and that assumes facial recognition on police body cameras can ever be considered ethical at all, a question the board has only begun to examine.

The board also recommended that users (i.e., police officers) not be allowed to customize facial recognition software in any future products, in order to prevent misuse, and that any jurisdiction planning to use facial recognition technology should do so through “open, transparent, democratic processes.”

Axon’s decision not to put face matching software on its police body cameras doesn’t go as far as the board’s suggestions. The company’s blog post makes it clear that Axon will continue to research and pursue face matching technology, including an effort to de-bias algorithms in the future. That implies the company still sees this as a case of “when,” rather than “if,” it’s able to add the technology to its products.

The fact that Axon — a major supplier of police cameras in the US — is shying away from the technology is still significant, though, given how contentious it has become. Earlier this year, San Francisco became the first city in the US to ban government agencies from using face matching software, and Microsoft has refused to sell its facial recognition software to law enforcement agencies, citing human rights concerns.

At the same time, Amazon came under fire last year for selling its Rekognition software to police in Orlando and Oregon’s Washington County, while New York City was caught abusing its own facial recognition software earlier this year.