The image recognition software built into Google Photos is impressive. Show it a photo of the Eiffel Tower, and it will know the picture was taken in Paris. Snap an image of a dog or a tree, and it will automatically group it with all your other pictures of dogs or trees.


But the software is far from foolproof. And when it fails, it can do so in a spectacular way, as when it recently processed a photo of two black friends and labeled them “Gorillas.”

(Photo: Jacky Alcine — @jackyalcine)

Jacky Alcine, a 21-year-old programmer who lives in Brooklyn, N.Y., was checking out his Google Photos account last night when he saw that the service had automatically generated a folder titled “Gorillas.” It contained nothing but pictures of him and a friend that he had taken in 2013.

When alerted to the error, Google fixed the problem within hours and issued an immediate mea culpa.


“We’re appalled and genuinely sorry that this happened,” a Google representative told Yahoo Tech.

“We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future,” the spokesperson added.

Alcine first noticed the problem when he looked at his photo collection and found the “Gorillas” folder containing images of himself and his friend, who is also black. He then took to Twitter to call out Google for the issue.

Shortly thereafter, Google’s chief architect of social, Yonatan Zunger, tweeted at Alcine, asking whether Google could access his account to see where things had gone wrong. A few hours later, Google alerted him that the problem had been fixed.

Alcine said that as of Monday evening, the issue had largely been addressed, though he noted, “there’s still complications with the hands obscuring the face causing it to still match to the gorilla tag. Chimp gives results as well (but not chimpanzee).”

Alcine believes the gaffe was caused by a faulty Google algorithm. But he added, “This could have been avoided with accurate and more complete classifying of black people, especially darker-toned people of color like myself and my friend.”
