London’s King’s Cross is using facial recognition to track tens of thousands of people and Canary Wharf is considering following suit, across a total area that covers more than 160 acres of the city.

The 67-acre King’s Cross area, which has been recently redeveloped and houses several office buildings including Google’s UK headquarters, Central Saint Martins college, schools and a range of retailers, has multiple cameras set up to observe visitors.

Argent, the property developer for the King’s Cross estate, said: “These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public.”

Argent did not confirm how many cameras were in use, or how long facial recognition had been active in the area.

Meanwhile, Canary Wharf is in talks to install facial recognition across its 97-acre estate, which counts many major financial services companies, including Barclays, Credit Suisse and HSBC, as tenants.

Canary Wharf Group, the company that owns both private offices and public spaces in the area, is actively speaking to facial recognition suppliers about piloting the technology, as part of its security systems, in a district traversed by 140,000 people daily.

“What’s really worrying is for any worker who doesn’t want to participate. This is essentially a geofenced experiment, so I don’t see how anybody could opt out of it,” said Stephanie Hare, an independent researcher of facial recognition technologies in the UK. “You can’t opt out of walking around London, or working there. How do they defend it when this technology is the subject of legal action and MPs are calling for a moratorium on it?”

Under current data protection law, collecting sensitive personal data, including facial images, requires explicit consent from the people being observed.

If the technology were to be adopted in Canary Wharf, it would not operate continuously on pedestrians and office workers, but be limited to specific purposes or threats, according to sources close to the company.

Canary Wharf currently operates at least 1,750 CCTV cameras, as well as an automatic licence plate recognition system to track vehicles in the area, according to Genetec, a Canadian company that supplies the district with its security software. The systems then automatically notify police of any hits from a vehicle watchlist.

As facial recognition technology has become consumerised in recent years, via companies such as Apple and Facebook, it has been adopted enthusiastically in the UK, where at least two police forces, London’s Metropolitan Police and South Wales Police, have trialled facial recognition systems on innocent citizens.

Convenience stores such as Budgens and supermarkets — including Tesco, Sainsbury’s and Marks and Spencer — all have cameras that are already, or soon will be, capable of facial recognition, used for applications ranging from crime prevention to estimating the age of those buying alcohol or cigarettes.

London already has an estimated 420,000 CCTV cameras operating in and around the city, although many were installed as analogue video systems that are low-quality and difficult to scale. Increasingly, these are being upgraded to internet protocol (IP) cameras, which have far better image resolution and can be accessed remotely over a network. Such cameras can also be upgraded to run facial recognition software.

“The private sector uses of facial recognition need a lot of attention because there is less regulation and governance here,” said Pete Fussey, a criminologist at the University of Essex who specialises in digital surveillance. “The privatisation of public spaces in London raises interesting legal questions [for surveillance].”

The Information Commissioner’s Office, which is the UK regulator for data protection, said it was looking into the use of facial recognition technology by police and private companies. “Since new data protection laws came into effect on 25 May 2018, there are extra protections for people. These require organisations to assess and reduce the privacy risks of using new and intrusive surveillance technologies like automatic facial recognition,” said a spokesperson.

“Organisations wishing to automatically capture and use images of individuals going about their business in public spaces need to provide clear evidence to demonstrate it is strictly necessary and proportionate for the circumstances and that there is a legal basis for that use.”
