Members of the House of Lords have slammed plans for the use of Automated Facial Recognition (AFR) technology as a step too far for an already heavily monitored nation, with a number of peers calling for tighter regulation during a debate on March 1st.

The use of such technology may frighten many across the UK and signal a move toward an even more Orwellian, Big Brother-style society. That dystopian future, however, would require a great deal of coordination and – critically – legislation.

According to a number of peers, the government has neither the policy ideas nor the legislation in place to implement AFR in a manner which respects the data rights of the British public.

Liberal Democrat peer Paul Scriven went so far as to describe the government’s proposals as ‘a few scattered papers’ outlining a terrifyingly unregulated system and a ‘make it up as you go’ policy toward collecting citizens’ images.

Baroness Jenny Jones of the Green Party, who brought the debate to the Lords, called for an ‘immediate ban’ on the use of AFR until the necessary legislative infrastructure is put in place.

With AFR primarily deployed in police operations, data collection raises questions over the government’s attitude toward your personal information. With the deadline for implementation of the General Data Protection Regulation (GDPR) looming, data rights are at the heart of discussions on how Britain’s digital future will develop.

AFR is but another intrusive system to add to an already heavily monitored society; your data will be collected online, in your home and on the streets you walk down.

What is Automated Facial Recognition?

Facial recognition technology is a form of biometric surveillance which observes and matches unique facial characteristics in order to identify people. It has been around for some time – that new iPhone you purchased? It uses facial recognition to unlock the screen.

Automated Facial Recognition is a newer technology, however – and one that could prove highly intrusive. It uses images captured on live camera feeds to identify you as an individual, whether on public transport, at a football match or walking down the street.

AFR then matches the captured images against databases, such as the Custody Images Database, to identify you.

A Nation Under Observation

The use of CCTV monitoring across the UK is widespread. We are one of the most heavily watched nations on earth. CCTV use has proven its worth in policing, from tackling anti-social behaviour to traffic enforcement.

What you may not know, however, is that facial recognition has already been deployed in the UK on a number of occasions.

Our police and security services have been using CCTV to monitor the population for some time, and South Wales Police are currently leading the deployment of facial recognition technology in the UK, benefiting from £2 million of funding from the Home Office.

SWP piloted its use of the technology extensively during the Champions League Final in Cardiff in 2017.

Intrusive policing is not the only issue the British public must contend with in 2018. Our web activity is continually tracked, and our personal data is regularly shared between private companies; the products you searched for on Google now appear as ads in your Facebook timeline.

With your sensitive data at the mercy of both government and private business, AFR represents another Orwellian-style invasive glimpse into your private life.

How Does AFR Change Things?

Data collection and CCTV surveillance are already a fact of life in modern Britain. So what difference does it make to introduce more technology to the fold? Does AFR truly differ from the use of DNA to solve crimes, a system that has proven its worth time and time again?

It absolutely does. Blanket monitoring of a crowd with cameras differs greatly from taking fingerprints or DNA samples from targeted suspects; you wouldn’t tolerate having your fingerprints taken indiscriminately, so why tolerate your image being captured that way?

The most concerning aspect of AFR lies in data retention and the government’s attitude toward hoarding your data. The Edward Snowden leaks revealed that security services such as GCHQ have been utilising facial recognition for nefarious ends, accessing private information and webcam images from over 1.8 million Yahoo users in a six-month period in 2008.

Security services using AFR in conjunction with the Investigatory Powers Act – which allows them to access a mountain of British web users’ data – bears harrowing similarities to the telescreens of George Orwell’s Nineteen Eighty-Four: complete observation of your daily life and home.

Gerry Grant, Chief Ethical Hacker at the Scottish Business Resource Centre, believes a serious discussion is needed over the government’s use of data gathered through AFR, saying: “It’s clear that there needs to be a big conversation around automated facial recognition.”

He added: “There are clear and obvious issues with facial recognition and security, not least who is holding information obtained from the tech.”

Efforts to Curb Data Hoarding

In 2012, the High Court ruled that the police’s practice of storing your images on the Custody Images Database was unlawful. However, the government has been criticised for its lax compliance with this ruling, and the practice has continued for over five years.

The Custody Images Database holds some 19 million images and videos of individuals, with many of those neither convicted nor aware that their information may still be held on the Police National Database.

The storage of sensitive data, such as images captured through AFR, lacks the necessary safeguards to protect your privacy. In Scotland specifically, a 2016 audit revealed that Police Scotland lacked any independent oversight of AFR.

More recently, the Independent Advisory Group on the Use of Biometric Data, led by John Scott QC, has been working to provide a legislative framework for police use of biometric data and its associated technologies, specifically AFR.

DIGIT spoke to Professor Bill Buchanan, Head of the Cyber Academy at Edinburgh Napier University. He believes that we are all too familiar with abuses of privacy, and that AFR represents another form of intrusive monitoring the public may not be aware of. “What happens if this data is leaked?” asked Professor Buchanan. “How embarrassing would it be for individuals, where their traces of activity could be tracked down to the street they were in, and who they came into contact with.”

Professor Buchanan acknowledges that tracking is accepted in some contexts: “While in high risk areas, such as in airports, we accept that we may be tracked, but in other areas it is perhaps too much of a liberty to track our footsteps.”

GDPR Compatible?

GDPR will radically change the data privacy rights of millions of EU citizens, and with Britain complying with the legislation, Professor Buchanan believes the use of intrusive data tracking technology could prove a stumbling block: “This approach to gathering data is now coming into question, as it does not comply with GDPR, and if the UK wants to create a frictionless border for data, it must comply with GDPR.

“A non-compliance would bring so many problems, and could see data centres and data-related industries moving from the UK to mainland Europe.”

GDPR will primarily affect businesses in the EU, but the implementation of this legislation raises the question of how the government can abide by one rule for data protection while flagrantly disregarding rights in the name of security.

This legislation specifically highlights biometric data as a ‘sensitive’ form of personal information, and one that requires a great deal of protection. With the apparent lack of clear-cut information on the issue of AFR and data collection, Lord Paddick (Liberal Democrats) emphasised the need for robust legislation on March 1st, saying: “Without regulation and oversight there is the potential for Nineteen Eighty-Four to become a reality, albeit 34 years later than originally envisaged.”
