Facial-recognition technology used by the UK’s Metropolitan Police to automatically identify people on surveillance cameras is reportedly wrong 98 percent of the time.

Civil liberties group Big Brother Watch branded the automated facial-recognition software “dangerous and inaccurate” due to its inability to identify people correctly.

The organization added that its use is “lawless” and could breach the right to privacy protected by the Human Rights Act.

According to the watchdog’s new report entitled “Face Off: the lawless growth of facial recognition in UK policing,” the Metropolitan Police’s facial-recognition tech misidentified 95 people at last year’s Notting Hill Carnival as criminals.

Yet the force is still planning to go ahead with seven more deployments of the technology later this year.

“Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified – or misidentified – everywhere they go,” said director of Big Brother Watch, Silkie Carlo.

“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.

“It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for and that poses a major risk to our freedoms.

“This has wasted millions in public money and the cost to our civil liberties is too high. It must be dropped.”

The report also found that South Wales Police stores photos of all innocent people who are incorrectly matched by the facial-recognition system for a full year, without their knowledge, resulting in a biometric database of over 2,400 innocent people.

The Home Office is said to have spent £2.6 million ($3.5 million) funding South Wales Police’s use of this technology.

Alongside the release of its report, Big Brother Watch said it’s also taking its findings to Parliament today to launch a campaign calling for police to stop using the controversial technology.

The campaign is already backed by MP David Lammy and 15 rights and race equality groups, including Article 19, Football Supporters Federation, Index on Censorship, Liberty, Netpol, Police Action Lawyers Group, the Race Equality Foundation and Runnymede Trust.