Report Confirms Deep Flaws Of Automated Facial Recognition Software In The UK, Warns Its Use In The US Is Spreading

from the mind-the-step-change dept

Techdirt has written many stories about facial recognition systems. But there's a step-change taking place in this area at the moment. The authorities are moving from comparing single images with database holdings, to completely automated scanning of crowds to obtain and analyze huge numbers of facial images in real time. Recently, Tim Cushing described the ridiculously high level of false positives South Wales Police had encountered during its use of automated facial recognition software. Before that, a post noted a similarly unacceptable failure rate of automated systems used by the Metropolitan Police in London last year.

Now Big Brother Watch has produced a report bringing together everything we know about UK police forces' use of automated facial recognition software (pdf), and its deep flaws. The report supplements that information with analyses of the legal and human rights framework for such systems, and points out that facial recognition algorithms often disproportionately misidentify minority ethnic groups and women.

The UK situation is fairly well known. There's been less coverage of automated facial recognition systems in the US, and the Big Brother Watch report offers some comments from experts about what is happening there. For example, Clare Garvie of the Georgetown Law Center on Privacy and Technology writes:

Face recognition surveillance -- identifying people in real-time from live video feeds -- risks being an imminent reality for many Americans. Are we comfortable with a society where face recognition allows police to identify anyone with a driver’s license, without suspicion or consent? Are we comfortable with a society where the government can find anyone, at any time, by continuously scanning the faces of people on the sidewalk? Face recognition fundamentally changes the nature of privacy in public spaces. As government agencies themselves have cautioned, face recognition surveillance 'has the potential to make people feel extremely uncomfortable, cause people to alter their behaviour, and lead to self-censorship and inhibition,' chilling the exercise of the rights protected under the First Amendment and calling into question the scope of protections offered by the Fourth Amendment.

Alongside its report, Big Brother Watch has launched the "Face Off" campaign, calling on UK public authorities to stop using automated facial recognition software with surveillance cameras, and to remove the thousands of images of unconvicted individuals from the UK's Police National Database. Given the UK authorities' world-famous love of CCTV and surveillance, it's unlikely they will take much notice.

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+


Filed Under: face recognition, flaws, uk