Ed Bridges (pictured) claimed the cameras were a breach of human rights

A human rights activist has lost the world's first legal challenge over police use of facial recognition technology.

Ed Bridges, 36, brought the challenge at the High Court after claiming his face was scanned while doing Christmas shopping in 2017 and at a peaceful anti-arms protest in 2018.

His lawyers argued the use of automatic facial recognition (AFR) by South Wales Police caused him 'distress' and violated his privacy and data protection rights by processing an image taken of him in public.

But his case was dismissed on Wednesday by two leading judges, who said the use of the technology was not unlawful.

Civil liberties campaigners were outraged at the decision and claimed the 'dystopian' technology was a threat to the public's freedoms.

After the ruling, Mr Bridges vowed to continue his fight against what he called 'sinister technology' and said he would appeal the judges' decision.

He added: 'South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent.

'This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.'

Lord Justice Haddon-Cave, sitting with Mr Justice Swift, said as he made the ruling: 'We are satisfied both that the current legal regime is adequate to ensure appropriate and non-arbitrary use of AFR Locate, and that South Wales Police's use to date of AFR Locate has been consistent with the requirements of the Human Rights Act and the data protection legislation.'

South Wales Police piloted facial recognition technology along with the Met and one other force

The judges said they were told by lawyers during a three-day hearing in May that Mr Bridges' case was the first time any court in the world had considered the use of AFR.

At the start of the ruling, Lord Justice Haddon-Cave said: 'The algorithms of the law must keep pace with new and emerging technologies.

'The central issue is whether the current legal regime in the United Kingdom is adequate to ensure the appropriate and non-arbitrary use of AFR in a free and civilised society.

'At the heart of this case lies a dispute about the privacy and data protection implications of AFR.

'Counsel inform us that this is the first time that any court in the world has considered AFR.'

Silkie Carlo, director of Big Brother Watch, said the ruling showed facial recognition technology 'interferes with the public's privacy rights'.

Six steps behind facial recognition technology

The Metropolitan Police uses facial recognition technology called NeoFace, developed by Japanese IT firm NEC, which matches faces against a so-called watch list of offenders wanted by the police and courts for existing offences.

Cameras scan the faces in view, measuring the structure of each face to create a digital template, which is then searched against the watch list.

If a match is detected, an officer on the scene is alerted and can compare the camera image with the watch-list image before deciding whether to stop the individual.
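The flow described above can be sketched in Python. Everything here is an illustrative assumption for the sketch, not a detail of NEC's NeoFace system: the similarity function, the 0.9 threshold and the three-number feature vectors are all invented for demonstration.

```python
def similarity(a, b):
    """Toy similarity score: 1 minus the mean absolute difference
    between two equal-length facial feature vectors (an assumption
    for this sketch, not the real matching algorithm)."""
    return 1 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def process_frame(faces, watch_list, threshold=0.9):
    """Search each scanned face template against the watch list and
    collect alerts. As in the process described above, the alerts go
    to an officer on the scene, who makes the final decision about
    whether to stop anyone."""
    alerts = []
    for face in faces:
        for name, template in watch_list.items():
            score = similarity(face, template)
            if score >= threshold:
                alerts.append((name, round(score, 3)))
    return alerts

# One scanned face that closely resembles a watch-list entry
watch_list = {"wanted_1": [0.2, 0.4, 0.6]}
print(process_frame([[0.21, 0.39, 0.61]], watch_list))
```

A face far from every watch-list template produces no alert at all, which is why campaigners focus on the accuracy of the threshold and the templates rather than the alerting step itself.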

She said: 'Today's judgement acknowledges that live facial recognition surveillance by South Wales Police interferes with the public's privacy rights but contends that its use, even to monitor peaceful protesters, is lawful.

'We feel that people in Wales have been let down and are pleased that Mr Bridges intends to appeal this profoundly disappointing judgement, which failed to grasp the intrusive nature of this technology.'

The decision was relayed over video link from the High Court in London to the High Court in Cardiff, where the case was heard.

Mr Bridges, from Cardiff, crowdfunded his legal action against the force and was represented by civil rights campaign group Liberty.

Liberty lawyer Megan Goulding said after the ruling that the 'highly intrusive' technology still poses a threat to the public's rights and freedoms.

She added: 'This disappointing judgement does not reflect the very serious threat that facial recognition poses to our rights and freedoms.

'Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all.

'It is time that the Government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.'

Now that the High Court has ruled the use of the technology is lawful, Liberty is calling for an outright ban.

Facial recognition technology maps faces in a crowd by measuring the distance between features then compares results with a 'watch list' of images - which can include suspects, missing people and persons of interest.
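That distance-based comparison can be illustrated with a short sketch. The Euclidean metric, the 0.6 cut-off and the tiny feature vectors below are assumptions chosen for the example, not parameters of any deployed system.

```python
import math

def euclidean_distance(a, b):
    """Distance between two equal-length facial feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_watchlist_match(probe, watch_list, threshold=0.6):
    """Return the name of the closest watch-list entry within the
    threshold, or None if nothing is close enough. The 0.6 cut-off
    is an arbitrary illustrative value."""
    best_name, best_dist = None, float("inf")
    for name, template in watch_list.items():
        dist = euclidean_distance(probe, template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# The watch list can mix suspects, missing people and persons of interest
watch_list = {
    "suspect": [0.10, 0.90, 0.40],
    "missing_person": [0.80, 0.20, 0.50],
}
print(best_watchlist_match([0.12, 0.88, 0.41], watch_list))
```

The choice of threshold is the crux of the accuracy debate: set it too loose and innocent passers-by trigger alerts; set it too tight and genuine matches are missed.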

South Wales Police has been conducting a trial of the technology since 2017, with a view to it being rolled out nationally, and is considered the national lead force on its use.

The trial comprises two pilot projects, AFR Locate and AFR Identify, and the force has used the technology 50 times to date.

Campaigners say the use of the technology (file image) is a step too far towards a police state

South Wales Police Chief Constable Matt Jukes said: 'This is innovative work that has put South Wales Police at the front of the development of this technology and the debate that surrounds it.

'We have always wrapped some good, common-sense decision-making by experienced police officers around the systems.

'But we've also underpinned this with careful legal work and the involvement of a range of bodies, including our own Police and Crime Commissioner and Ethics Committee.

'I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern.

'So, I'm pleased that the court has recognised the responsibility that South Wales Police has shown in our programme.

'With the benefit of this judgment, we will continue to explore how to ensure the ongoing fairness and transparency of our approach.

'There is, and should be, a political and public debate about wider questions of privacy and security.

'It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.

'So, this decision is welcome but, of course, not the end of the wider debate.

'I hope policing will be supported by that continuing in an informed way with bodies such as Liberty who brought this action and Government each playing their valuable role.'