Flickr sparked some controversy back in May after it was discovered that the service’s new autotagging feature was prone to mislabeling black people as “apes.” It looks like Google Photos developers didn’t learn from Flickr’s embarrassing misstep: a Google developer is apologizing after it was found that Google’s Photos app misidentifies photos of black people as “gorillas.”

The problem was first reported by programmer Jacky Alciné, who shared his surprise in a series of tweets:

“This is how you determine someone’s target market,” writes Alciné, suggesting that Google didn’t test the app with a large number of black people before releasing it to the public.

To Google’s credit, the response was swift and strongly worded. Yonatan Zunger, Google’s Chief Architect of Social, responded to Alciné directly on Twitter, apologizing for the app’s mistake and working with him to get it fixed:

Zunger put a development team on the issue, and just hours later, he reported that a fix for the problem was being rolled out to Google Photos users. He also said that Google is working on long-term fixes to ensure that certain words are never used when tagging photos of people.

The error was just “ordinary machine learning trouble,” Zunger tweeted, but it was “particularly bad” due to the “history of racism.”

“Mistagging white people as lemurs or babies as seals wouldn’t have been a big deal.”

(via Jacky Alciné via BBC News)

P.S. Back in 2010, Nikon was accused of making a “racist” camera. When it was used to photograph Asian subjects, the camera’s blink detection feature would ask “Did someone blink?”