Microsoft deletes massive face recognition database

Published 7 June 2019

Image caption (South Wales Police): South Wales Police is one of three UK police forces to use automatic facial recognition technology

Microsoft has deleted a massive database of 10 million images that was being used to train facial recognition systems, the Financial Times reports.

The database was released in 2016 and was built from online images of 100,000 well-known people.

The database is believed to have been used to train a system operated by police forces and the military.

The deletion comes after Microsoft called on US politicians to do a better job of regulating recognition systems.

Active use

Microsoft told the FT the database was no longer available because the person who curated it had since left the company.

Last year Microsoft President Brad Smith asked the US Congress to take on the task of regulating the use of facial recognition systems because they had "broad societal ramifications and potential for abuse".

More recently, Microsoft rejected a request from police in California to use its face-spotting systems in body cameras and cars.

The massive set of images, called the MSCeleb database, was compiled from images of celebrities found online.

The Megapixels project, which tracks face databases, said the "majority" of images were of American and British actors, but added that the set also included many people who "must maintain an online presence for their professional lives".

This meant that it included journalists, artists, musicians, activists, policy makers, writers and researchers.

Even though the data is no longer available from Microsoft, it is probably still being used by people who downloaded a copy.

"You can't make a data set disappear," Adam Harvey from the Megapixels site told Engadget. "Once you post it, and people download it, it exists on hard drives all over the world."

In the UK, police forces have been criticised for trialling home-grown facial recognition systems that have proved to be bad at recognising people. One trial was wrong in 92% of the cases it flagged.