The Metropolitan police is to start using live facial recognition (LFR) cameras linked to powerful computers on London’s streets despite scepticism from experts over how effective the system is and widespread concerns over civil liberties.

The Met rejected claims the scheme was “a breathtaking assault on rights”, saying 80% of people surveyed backed the move. It said the system would launch next month and would be aimed at catching serious criminals and tracking down missing persons.

However, some of its central claims came under fire from the expert it hired to scrutinise two years’ worth of trials. The Met said the system was 70% effective at spotting wanted suspects and falsely identified someone as wanted in one in a thousand cases. But Prof Pete Fussey – an expert on surveillance from Essex University who conducted the only independent review of the Met’s public trials on behalf of the force – found it was verifiably accurate in just 19% of cases.
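Part of the gap between the two figures may come down to which metric is being quoted: the share of wanted people the cameras spot, versus the share of alerts that turn out to be correct. The sketch below illustrates the distinction with invented counts (these are not the Met’s trial data; the numbers are chosen only so the two metrics land near 70% and 19%).

```python
# Hypothetical illustration of how one trial can yield two very different
# "accuracy" figures. All counts here are invented for illustration only.

wanted_people_seen = 10   # watchlisted people who actually passed the camera
true_alerts = 7           # of those, how many the system flagged
false_alerts = 30         # alerts raised for people not on the watchlist

# Detection rate (a Met-style figure): share of wanted people spotted.
detection_rate = true_alerts / wanted_people_seen            # 7/10 = 70%

# Precision (a Fussey-style figure): share of alerts that were correct.
precision = true_alerts / (true_alerts + false_alerts)       # 7/37 ≈ 19%

print(f"detection rate: {detection_rate:.0%}")   # -> detection rate: 70%
print(f"alert precision: {precision:.0%}")       # -> alert precision: 19%
```

Both numbers describe the same system; they simply answer different questions, which is why the choice of headline metric matters.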

Fussey told The Guardian: “I stand by our findings. I don’t know how they get to 70%.”

The Met said it would deploy the technology overtly and only after consulting communities in which it is to be used.

It said the cameras would be linked to a database of suspects uploaded using the latest intelligence. If the system detects someone who is not on the database, their information will be deleted in seconds. But if it generates an alert because the person is wanted, an officer will speak to them.

Using facial recognition linked to databases of suspects is potentially the next big leap for law enforcement, as big as the introduction of fingerprints, and police have been working on it for years. The security services are also hugely interested.

Nick Ephgrave, an assistant commissioner at the Met, said: “As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. Independent research has shown that the public support us in this regard.”

The Guardian understands the system is less effective at night-time and works best with good daylight. The Met said its system was less effective at scanning dense crowds.

The Met has made promises to the mayor of London, Sadiq Khan, after an independent ethics review raised concerns over its earlier trials of facial recognition software. The system will not be linked to other official databases. It is not designed to let the authorities scan every corner of London, or to track anyone down at will.

However, civil liberties groups immediately vowed to challenge the rollout in the courts, possibly before the Met can even deploy the system.

Silkie Carlo, the director of Big Brother Watch, called the move “an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK”.

“This is a breathtaking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the home secretary.”

Quick guide: What is facial recognition – and how do police in the UK use it?

What is facial recognition?
This is a catch-all term for any technology that involves cataloguing and recognising human faces, typically by recording the unique ratios between an individual’s facial features, such as eyes, nose and mouth.

Why is it in the news?
After a trial of the technology, London’s Metropolitan police have said they will start to use it in London within a month. On Friday, the force said it would be used to find suspects on “watchlists” for serious and violent crime, as well as to help find children and vulnerable people. Scotland Yard said the public would be aware of the surveillance, with the cameras being placed in open locations and officers handing out explanatory leaflets.

How is it used in policing?
The technology greatly improves the power of surveillance. At the simple end, a facial recognition system connected to a network of cameras can automatically track an individual as they move in and out of coverage, even if no other information is known about them. At the more complex end, a facial recognition system fuelled by a large database of labelled data can enable police to pinpoint a person of interest across a city of networked cameras.

Why is it controversial?
Facial recognition frequently sparks two distinct fears: that it will not work well enough, or that it will work too well. The first concern highlights the fact that the technology, still in its infancy, is prone to false positives and false negatives, particularly when used with noisy imagery, such as that harvested from CCTV cameras installed years or decades ago. When that technology is used to arrest, convict or imprison people, on a possibly faulty basis, it can cause real harm. Worse, the errors are not evenly distributed; facial recognition systems have regularly been found to be inaccurate at identifying people with darker skin.

But the technology will improve, meaning the second concern is harder to shake. This is the fear that facial recognition inherently undermines freedom by enabling perfect surveillance of everyone, all the time. The fear is not hypothetical; already, Chinese cities have proudly used the technology to publicly shame citizens for jaywalking, or leaving the house in their pyjamas.

Alex Hern, technology editor
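The “unique ratios” idea described in the guide can be sketched as a toy matcher: reduce each face to a small vector of scale-invariant ratios and flag a match when two vectors are close. Everything below is invented for illustration; real systems use learned embeddings rather than hand-picked ratios, but the principle of comparing compact numeric signatures is similar.

```python
import math

def signature(eye_distance, nose_length, mouth_width, face_width):
    """Reduce raw facial measurements to scale-invariant ratios,
    so the signature does not depend on how close the face is to the camera."""
    return (eye_distance / face_width,
            nose_length / face_width,
            mouth_width / face_width)

def is_match(sig_a, sig_b, threshold=0.05):
    """Flag a match when two signatures lie within `threshold`
    Euclidean distance of each other."""
    return math.dist(sig_a, sig_b) < threshold

probe = signature(6.2, 5.1, 5.0, 14.0)             # face seen by the camera
watchlist_entry = signature(6.3, 5.0, 5.1, 14.1)   # stored suspect image

print(is_match(probe, watchlist_entry))            # True for these toy numbers
```

The `threshold` value is the knob behind the accuracy debate above: loosen it and the system spots more wanted people but raises more false alerts; tighten it and the reverse happens.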

Allan Hogarth, from Amnesty International UK, said: “The Met’s decision to introduce facial recognition technology poses a huge threat to human rights.

“This technology puts many human rights at risk, including the rights to privacy, non-discrimination, freedom of expression, association and peaceful assembly.

“This is no time to experiment with this powerful technology that is being used without adequate transparency, oversight and accountability.”

Parliament has yet to bring in guidance balancing the potential security benefits of live facial recognition against safeguards of the kind already in place for police use of fingerprints and DNA.

A spokesperson for the campaign group Liberty said: “This is a dangerous, oppressive and completely unjustified move by the Met. Facial recognition technology gives the state unprecedented power to track and monitor any one of us, destroying our privacy and our free expression.”

South Wales police already use live facial recognition. Last year, after a series of court cases, judges ruled in favour of the technology. The Met believes the ruling paved the way for Friday’s announcement, but a warning came from the office of the biometrics commissioner, Prof Paul Wiles: “This is a step-change in the use of LFR by the UK police, given that the technology will be deployed fully operationally rather than on a trial basis.

“Although the court found South Wales’ use of LFR to be consistent with the requirements of the Human Rights Act and data protection legislation, that judgment was specific to the particular circumstances in which South Wales police used their LFR system.”

Ephgrave said the technology would not be used indiscriminately and that its initial use would be limited. “The Met will begin operationally deploying LFR at locations where intelligence suggests we are most likely to locate serious offenders,” he said. “Each deployment will have a bespoke ‘watch list’, made up of images of wanted individuals, predominantly those wanted for serious and violent offences.”

Khan endorsed the decision, but said it would be under constant review: “New technology has a role in keeping Londoners safe, but it’s equally important that the Met are proportionate in the way it is deployed and are transparent about where and when it is used in order to retain the trust of all Londoners.

“City Hall and the Ethics Panel will continue to monitor the use of facial recognition technology as part of their role in holding the Met to account.”

The Labour mayor is up for re-election this year and is facing a challenge from the Liberal Democrats, who are trying to outflank him from the left and have branded the scheme Khan’s “mass surveillance roll out”.