British police forces uploaded 18 million pictures of the public to a facial recognition database without telling the Home Office or independent watchdogs.

Thousands of photos in the database are of innocent people who have never been charged with or convicted of a crime.

Biometrics Commissioner Alastair MacGregor QC said he was concerned about the civil liberties implications of the database, as well as the risk of false matches.

Britain’s High Court ruled in 2012 that retaining photographs of innocent people is illegal, with Lord Justice Richards ordering police forces to revise their policy within “months, not years.”

Despite this, almost every police force in England and Wales now has access to a facial recognition database of mugshots created without the knowledge of the Home Office, MacGregor said.

Britain’s first biometrics commissioner told BBC Newsnight that new technology like facial recognition should not take over without wider issues being addressed.

As biometrics commissioner, MacGregor is also responsible for monitoring the use of DNA and fingerprint profiles by police.

While MacGregor recognizes the technology could be “tremendously useful” in catching criminals, he warned its use must be controlled.

“Its value will be very significantly undermined if the public cannot have confidence in it and cannot feel there are proper controls,” he told the BBC.

“I think there is always a danger that if you can do something then you will do it, the technology takes over ... without giving the attention to the other issues that arise in relation to it as one should.”

MacGregor also expressed concern over the technology’s reliability.

“If the facial recognition software throws up a false match, one of the consequences of that could easily send an investigation off into the completely wrong direction,” he said.

Facial recognition technology is currently employed by the UK Border Agency in its ePassport gates, where officials say it is more reliable than humans.

Social media giant Facebook announced last year it had developed software capable of correctly identifying human faces 97.25 percent of the time; humans score 97.53 percent on the same test.

Called DeepFace, the software first corrects the angle of a photographed face so that it looks straight ahead, then computes a numerical description of the reoriented face. That numerical description is compared against other images until a sufficiently close match is found.
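The matching step described above can be illustrated with a short sketch. This is not Facebook's or any police force's actual code: the alignment and the numerical description (the "embedding") come from a trained neural network in real systems, and the names, dimensions, and threshold below are all hypothetical. The sketch only shows the final comparison: measure the distance between a probe embedding and each stored embedding, and accept the nearest one if it falls under a threshold.

```python
import math

def euclidean_distance(a, b):
    """Distance between two face embeddings (lists of floats)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, gallery, threshold=0.6):
    """Return (name, distance) of the closest gallery embedding within
    the threshold, or (None, None) if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, embedding in gallery.items():
        d = euclidean_distance(probe, embedding)
        if d < best_dist:
            best_name, best_dist = name, d
    if best_dist <= threshold:
        return best_name, best_dist
    return None, None

# Toy 3-dimensional embeddings; real systems use far more dimensions.
gallery = {
    "suspect_a": [0.10, 0.90, 0.30],
    "suspect_b": [0.80, 0.20, 0.50],
}
probe = [0.12, 0.88, 0.31]          # closest to suspect_a
print(best_match(probe, gallery))
```

The threshold is the crux of MacGregor's reliability worry: set it too loosely and the system returns confident-looking false matches; set it too tightly and genuine matches are missed.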


Andy Ramsay, identification manager at Leicestershire Police, told the BBC facial recognition could become more valuable than DNA or fingerprints.

“All three have a place. This is developing. This is going to be, I think, the most cost-effective way of finding criminals,” he said.

He also said Leicestershire Police have a database with 100,000 custody photos.

Former Conservative shadow Home Secretary David Davis said, “It's quite understandable, police always want more powers, but I'm afraid the courts and parliament say there are limits.

“You cannot treat innocent people the same way you treat guilty people. You should not misuse the data in this way. No facial recognition software is 100 percent reliable,” he added.

Norman Baker, the Liberal Democrat former Home Office minister, said police forces “ought to have stopped and asked themselves what they were doing and if it had public support.”


Chief Constable Mike Barton, of the Association of Chief Police Officers, defended the move, saying police were “ahead of the game.”

“I hear much criticism of policing that we're not up to speed and it does come as a surprise to me that we're now being admonished for being ahead of the game,” he told the BBC.

However, he conceded there was no legal framework governing the system.

“If Parliament chooses to ... regulate our use of photographs over and above that which we already have, then I'm more than happy,” he said.