New Data On London Metro Police Facial Recognition Tech Shows It's Still Wrong 96 Percent Of The Time

from the targeting-violent-criminals-with-a-four-percent-success-rate dept

Is this good news or bad news? It's tough to say. The London Metro Police are proud of their many cameras and their cameras' many features, but there doesn't appear to be any improvement in the facial recognition tech they're deploying.

Three Freedom of Information requests sent to the Metro Police last year returned documents showing the tech was reporting almost nothing but false positives. The first response reported a 98% failure rate. A follow-up request generated an admission of a 100% failure rate. Now another set of FOI requests has gathered more data from the Metro Police, and it appears past reports of consistent failure were pretty indicative of future results.

Facial recognition technology used by London’s Metropolitan Police incorrectly identified members of the public in 96 per cent of matches made between 2016 and 2018. Biometric photos of members of the public were wrongly identified as potential criminals during eight incidents across the two-year period, Freedom of Information (FoI) requests have revealed.

This may be a small sample size, but it was enough to subject a 14-year-old student to a police stop after the facial recognition software mistook him for a criminal.

The Metro Police are continuing to use the tech despite its relative uselessness. The Met does claim its deployments over the last couple of years have led to eight arrests, but it needs far more than that to offset the system's apparent desire to see the innocent punished.

As the Metro Police continue beta testing the tech on the general public, they're continuing to amass a collection of non-criminal faces in their facial recognition database. This has drawn some attention from members of Parliament, who have called the practice "unacceptable." There has been some improvement in one area since the last time the Metro Police were queried about their facial recognition tech: they used to hold onto all images for a year. Now, they only hold watchlist images for 30 days and delete all non-hit images immediately.

Unfortunately, this spectacular run of failure hasn't moved Parliament to, you know, discourage use of the tech. And it appears those who publicly refuse the privilege of being misidentified as a criminal will have their complaints addressed by being turned into criminals.

In one incident, a 14-year-old black child in school uniform was stopped and fingerprinted by police after being misidentified by the technology, while a man was fined for objecting to his face being scanned on a separate occasion.

Problem solved. The system is only interested in criminals and only criminals would object to having their faces scanned by the Metro's faulty tech. Self-fulfilling prophecies are just another undocumented feature.


Filed Under: facial recognition, false positives, london metro, london metro police, the tube