Police fined a pedestrian £90 for disorderly behaviour after he tried to cover his face when he saw a controversial facial recognition camera on a street in London.

Officers set up the camera on a van in Romford, East London, which then cross-checked photos of faces of passers-by against a database of wanted criminals.

But one man was unimpressed about being filmed and covered his face with his hat and jacket, before being stopped by officers who took his picture anyway.

After being pulled aside, the man told police: 'If I want to cover me face, I'll cover me face. Don't push me over when I'm walking down the street.'

It comes just weeks after it was claimed the new technology incorrectly identified members of the public in 96 per cent of matches made between 2016 and 2018.

The cameras have been rolled out in a trial in parts of Britain, with the Met making its first arrest last December when shoppers in London's West End were scanned.

But their use has sparked a privacy debate, with civil liberties group Big Brother Watch branding the move a 'breach of fundamental rights to privacy and freedom of assembly'. Police argue they are necessary to crack down on spiralling crime.

Officers previously insisted people could decline to be scanned, before later clarifying that anyone trying to avoid scanners may be stopped and searched.

The technology was first deployed by South Wales Police ahead of the Champions League final in Cardiff in 2017, but wrongly matched more than 2,000 people to possible criminals.

Police and security services worldwide are keen to use facial recognition technology to bolster their efforts to fight crime and identify suspects.

But they have been hampered by the unreliability of the software, with some trials failing to correctly identify a single person.

Silkie Carlo (left, in the yellow hat), director of Big Brother Watch, observed the technology in use - and asked an officer about the man they had taken aside: 'What's your suspicion?'

The technology made incorrect matches in every case during two deployments at Westfield shopping centre in Stratford last year, according to Big Brother Watch. It was also reportedly wrong in 96 per cent of matches across eight deployments by the Met from 2016 to 2018.

Six steps behind facial recognition technology

The Metropolitan Police uses facial recognition technology called NeoFace, developed by Japanese IT firm NEC, which matches faces against a so-called watch list of offenders wanted by the police and courts for existing offences. Cameras scan the faces in their view, measuring the structure of each face to create a digital version that is searched against the watch list. If a match is detected, an officer on the scene is alerted and can compare the camera image with the watch list image before deciding whether to stop the individual.
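The watch-list matching described in the boxout can be sketched in a few lines of code. This is a minimal illustrative sketch only, assuming face images have already been reduced to small numeric 'templates': the function names, template format and threshold are invented for the example and bear no relation to NEC's actual NeoFace implementation.

```python
import math

def distance(a, b):
    """Euclidean distance between two face templates (lists of floats)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def check_against_watchlist(probe, watchlist, threshold=0.6):
    """Return (name, distance) for the closest watch-list entry within
    the threshold, or None - the case where no officer is alerted."""
    best = None
    for name, template in watchlist.items():
        d = distance(probe, template)
        if d <= threshold and (best is None or d < best[1]):
            best = (name, d)
    return best

# Hypothetical watch list of two templates.
watchlist = {'suspect_a': [0.1, 0.9, 0.4], 'suspect_b': [0.8, 0.2, 0.5]}

print(check_against_watchlist([0.12, 0.88, 0.41], watchlist))  # close to suspect_a
print(check_against_watchlist([0.9, 0.9, 0.9], watchlist))     # no match: None
```

A real system would use templates produced by a trained neural network and a learned similarity measure, but the control flow - scan, compare against the list, alert on a sufficiently close match - is the same.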

In Romford, the man was fined £90 at the scene by officers, who also arrested three other people during the day thanks to the technology, according to BBC Click.

After being stopped he asked an officer: 'How would you like it if you walked down the street and someone grabbed your shoulder? You wouldn't like it, would you?'

The officer told him: 'Calm yourself down or you're going in handcuffs. It's up to you. Wind your neck in.' But the man replied: 'You wind your neck in.'

After being fined, the man told a reporter: 'The chap told me down the road - he said they've got facial recognition. So I walked past like that (covering my face).

'It's a cold day as well. As soon as I've done that, the police officer's asked me to come to him. So I've got me back up. I said to him 'f*** off', basically.

'I said 'I don't want me face shown on anything. If I want to cover me face, I'll cover me face, it's not for them to tell me not to cover me face.

'I've now got a £90 fine, here you go, look at that. Thanks lads, £90. Well done.'

Silkie Carlo, the director of civil liberties group Big Brother Watch, was at the scene holding a placard saying 'stop facial recognition' - before she asked an officer about the man they had taken aside: 'What's your suspicion?'

The officer replied: 'The fact that he's walked past clearly masking his face from recognition and covered his face. It gives us grounds to stop him and verify.'

Ivan Balhatchet, the Metropolitan Police's covert and intelligence lead, said: 'We ought to explore all technology to see how it can make people safer, how it can make policing more effective.

One man was unimpressed about being filmed by the camera in East London and covered his face with his hat and jacket, before being stopped by officers who took his picture anyway

'However, we are completely aware of some of the concerns that are raised, and what we're doing with these trials is actually trying to understand those better so we protect human rights but also keep people safe at the same time.'

Q&A: How police are using facial recognition technology in London

Why are the police using facial recognition technology?
The Metropolitan Police hopes live facial recognition technology will help reduce crime, especially violent incidents, and could be used as a tactic to deter people from offending. The force says trialling the system in real-life conditions will enable it to gather accurate data and learn as much as possible.

Are the public being made aware of the trial?
The Metropolitan Police said it is making people aware the trial is under way with posters displayed around the affected areas. There will be a 'clear uniformed presence' around the technology, with information leaflets handed out to members of the public.

Are faces stored in a database?
The Metropolitan Police said it will only keep faces matching the watch list, for up to 30 days - all other data is deleted immediately.

Can you refuse to be scanned?
People can refuse to be scanned without being viewed as suspicious, although the Metropolitan Police said 'there must be additional information available to support such a view'.

Is this the first facial recognition trial?
The Metropolitan Police has previously tested the system at major events, first at the Notting Hill Carnival in August 2016 and then at Remembrance Day, as well as at the Stratford transport hub. South Wales Police has also tried the technology at various events including the 2017 Champions League final in Cardiff, rugby matches and Liam Gallagher and Kasabian concerts.

How accurate is the technology?
Trials in London and Wales have had mixed results so far. Last May, the Metropolitan Police released figures showing it had identified 102 false positives - cases where someone was incorrectly matched to a photo - with only two correct matches. South Wales Police said its trial results improved after changes to the algorithm used to identify people.
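The accuracy figures quoted in the Q&A can be turned into a single rate. A quick, purely arithmetical check (variable names invented for illustration): 102 false positives against two correct matches implies roughly 98 per cent of the matches in those figures were wrong.

```python
# Figures the Metropolitan Police released last May:
false_positives = 102   # people incorrectly matched to a photo
correct_matches = 2

rate = false_positives / (false_positives + correct_matches)
print(f'{rate:.1%}')  # prints 98.1%
```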

Eight people were arrested during the eight-hour trial on January 31 this year, but just three of those arrests were a direct result of the technology.

A 15-year-old boy identified by the recognition cameras was arrested on suspicion of robbery but released with no further action.

A 28-year-old man was arrested on suspicion of false imprisonment and a 35-year-old man was arrested on suspicion of breach of a molestation order.

The five other arrests were two teenage boys accused of robbery, a 17-year-old boy accused of firing a gun and two men aged 25 and 46 for drug possession.

Speaking about the man who was fined, a Scotland Yard spokesman said: 'On Thursday, January 31 officers stopped a man who was seen acting suspiciously in the area of Romford Town Centre during the deployment of the live facial recognition technology.

'After being stopped the man became aggressive and made threats towards officers. He was issued with a penalty notice for disorder as a result.

'While anyone who declined to be scanned would not necessarily be viewed as suspicious, officers used their judgement to identify any potentially suspicious behaviour.'

Last December, a suspect was arrested by the Metropolitan Police during a trial of the facial recognition technology among Christmas shoppers at Leicester Square in London's West End.

Another man was stopped due to the technology, but found not to be the man the computer thought he was - although he was arrested over another offence.

Big Brother Watch has previously said the technology is a 'breach of fundamental rights to privacy and freedom of assembly'.

They have monitored the officers and say police treat those who avoid the cameras with suspicion.

But the police insist people can decline to be scanned without arousing suspicion, and that the move is necessary to crack down on spiralling violent crime.

A mandate they have produced to guide officers states: 'It is right and appropriate to bring people who are unlawfully at large to justice, as they may otherwise pose a threat to the safety of the public through the commission of crime.

The man was then fined £90 at the scene in Romford by officers for disorderly behaviour

'This approach is less intrusive than other methods of tracing wanted persons.

'It is less resource intensive which will save police time and money and allow police to concentrate resources on other priorities.'

The Home Office has said the system can be an 'invaluable tool' in fighting crime, while the National Police Chiefs' Council said it could disrupt criminals but insisted any rollout must be shown to be effective and come with 'sufficient safeguards'.

The technology was first deployed by South Wales Police ahead of the 2017 Champions League final in Cardiff.

That trial led to the technology wrongly matching more than 2,000 people to possible criminals.

Police insist people can decline to be scanned without arousing suspicion and say the move is necessary to crack down on spiralling violent crime

The Metropolitan Police have already used the cameras at the Notting Hill carnival and other forces have used them at football matches.

And pop star Taylor Swift used the software at a concert in the US to identify stalkers in the crowds.

Ms Carlo told MailOnline: 'It is important to note that police are now days away from making a decision about the future of facial recognition in the UK.

'We believe it has no place in a democracy and we will continue with our legal challenge against the Met if they do go ahead with it.

'We believe we have a huge amount of public support for our campaign and have crowdfunded £10,000 to bring the legal challenge.

Campaigners say the use of the technology (file image) is a step too far towards a police state

'This is a turning point for civil liberties in the UK. If police push ahead with facial recognition surveillance, members of the public could be tracked across Britain's colossal CCTV networks.

'For a nation that opposed ID cards and rejected the national DNA database, the notion of live facial recognition turning citizens into walking ID cards is chilling.

'This China-style mass surveillance tool is the very antithesis of British democratic freedom and police using it on our streets sets a dangerous example to countries around the world.

'It would be disastrous for policing and the future of civil liberties and we urge police to drop it for good.'

As for whether police would stop people who are wearing facial coverings for religious reasons, Ms Carlo said it was one of 'many questions police will have to answer if they keep using this'.

She added: 'We've never seen police make anyone remove religious clothing around facial recognition but we have seen them stopping people wearing scarves during winter and hooded coats.'