A landmark case against police use of automatic facial recognition technology gets underway today in Cardiff, with the claimant saying it breaches human rights.

This case is the first major legal challenge to police use of automatic facial recognition (AFR), with claimant Ed Bridges, a former Lib Dem councillor, arguing that poor regulation of the technology breaches human rights.

Bridges was Christmas shopping in Cardiff in December 2017 when his image was captured by AFR cameras operated by South Wales Police.

The police have defended their use of AFR, but have not publicly commented on the case brought against them. The outcome of this trial could potentially impact the future regulation and use of the technology.

Speaking to BBC News, Bridges said: “I popped out of the office to do a bit of Christmas shopping and on the main pedestrian shopping street in Cardiff, there was a police van.

“By the time I was close enough to see the words ‘automatic facial recognition’ on the van, I had already had my data captured by it. That struck me as quite a fundamental invasion of my privacy.”

Bridges had his image captured a second time while attending a peaceful protest against the arms trade. He argues that AFR breaches his human right to privacy, data protection laws and equality laws.

The use of fingerprints and DNA by police is strictly regulated, but there is a lack of such measures for the use of other forms of biometric data. Currently, there is little governance on how the data is gathered or managed by police and government.

Civil rights group Liberty, which is representing Bridges, has said the use of the tech is equivalent to the unregulated taking of DNA and fingerprints without consent. The group asserts that if the tech breaches human rights then it should not be used.

AFR tech is capable of scanning large numbers of people in public spaces such as shopping centres or football crowds without their knowledge. The captured data can then be compared to the images on the police’s ‘watch lists’ to see if they match.
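In broad terms, systems of this kind reduce each detected face to a numeric embedding and compare it against reference embeddings for people on a watch list. The sketch below illustrates the matching step only; the function names, similarity measure, and threshold are assumptions for illustration, not details of the South Wales Police system.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watch_list(probe, watch_list, threshold=0.8):
    """Return indices of watch-list embeddings whose similarity to the
    probe embedding meets the threshold, i.e. possible matches."""
    return [i for i, ref in enumerate(watch_list)
            if cosine_similarity(probe, ref) >= threshold]
```

The choice of threshold is the key operational decision: lowering it catches more genuine matches but flags more innocent passers-by.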


Chris Phillips, former head of the National Counter Terrorism Security Office, said: “If there are hundreds of people walking the streets who should be in prison because there are outstanding warrants for their arrest, or dangerous criminals bent on harming others in public places, the proper use of AFR has a vital policing role. The police need guidance to ensure this vital anti-crime tool is used lawfully.”

While it is acknowledged that the tech can be used to help prevent serious crimes, such as acts of terrorism, Liberty says that police are also deploying it for petty offences, such as catching pickpockets.

Liberty has raised a number of concerns, including where the watch-list images come from and the fact that the police have not ruled out scraping social media platforms for watch-list images.

Moreover, other civil liberties groups say that studies have shown AFR has a high rate of misidentification and discriminates against ethnic minorities, in particular women of colour.

Megan Goulding, a lawyer for Liberty, said: “If you are a woman or from an ethnic minority and you walk past the camera, you are more likely to be identified as someone on a watch list, even if you are not. This means you are more likely to be stopped and interrogated by the police.

“This is another tool by which social bias will be entrenched and communities who are already over-policed simply get over-policed further.” The group says the frequency of false positives has the potential to change the nature of public spaces.
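The false-positive concern is partly a base-rate effect: when a camera scans thousands of passers-by, almost all of whom are not on any watch list, even a small error rate generates many false alerts. The figures below are hypothetical, chosen only to illustrate the arithmetic, not measurements of any deployed system.

```python
# Illustrative base-rate arithmetic with assumed figures: even a low
# false-positive rate produces many false alerts when crowds are scanned.
faces_scanned = 50_000          # passers-by captured by the camera
false_positive_rate = 0.001     # 0.1% of innocent faces wrongly flagged
true_matches_present = 10       # people in the crowd actually on the list

expected_false_alerts = faces_scanned * false_positive_rate
print(expected_false_alerts)    # 50.0 false alerts vs 10 true matches
```

Under these assumed numbers, most people stopped would be stopped in error, which is the dynamic Liberty argues changes the character of public spaces.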

Last week, amid rising concerns over the technology's reliability and its infringement of personal liberty and privacy, San Francisco became the first US city to ban the use of AFR.

The Home Office, the Information Commissioner and the Surveillance Camera Commissioner have all become involved in the case, reflecting a high level of interest and concern about the legal limits on the use of AFR.
