Google has been forced to apologise after its image recognition software mislabelled photographs of black people as gorillas.

The internet giant's new Google Photos application uses an auto-tagging feature to help organise images uploaded to the service and make searching easier.

However, the software has outraged users after it mislabelled images of a computer programmer and his friend as the great apes.


Google has issued an apology after computer programmer Jacky Alcine, from New York, spotted that photographs of him and a female friend had been labelled as gorillas by Google Photos' image recognition software. He sent a series of tweets to Google highlighting the problem (like the one above), leading Google to issue a fix

Google said it was 'appalled' and 'genuinely sorry' for the mistake.

The fault comes just over a month after Flickr's autotagging system placed potentially offensive tags on images including mislabelling concentration camps as 'jungle gyms' and people as apes.

Google launched its standalone Photos app in May, announcing a number of features such as automatically creating collections of people and objects like food or landscapes.


Tapping on a person's face was also intended to search for other pictures of that person in your collection.

However, on Monday Jacky Alcine, from Brooklyn, New York, spotted that photos of him and a female friend posing for the camera had been grouped into a collection tagged 'gorilla'.

In a series of tweets to Google he said: 'Google Photos, y'all f***** up. My friend's not a gorilla.

'The only thing under this tag is my friend and I being tagged as a gorilla.

'What kind of sample image data you collected that would result in this son?

'And it's only photos I have with her it's doing this with.

'I understand how this happens, the problem is more so on the why. This is how you determine someone's target market.'

His tweets triggered a response from Yonatan Zunger, chief architect of social at Google, who said programmers were working on a fix to the problem.

He said: 'Thank you for telling us so quickly. Sheesh. High on my list of bugs you *never* want to see happen. Shudder.'

However, even after a fix had been issued, Mr Alcine reported that two photos were still showing up under the terms 'gorilla' and 'gorillas'.

Mr Zunger later said that Google had turned off the ability for photographs to be grouped under that label to stop the problem.


However, he said the error may still occur in photographs where the image recognition software fails to detect a face at all.

He said a fix for that was being worked on.

He added: 'We're also working on longer-term fixes around both linguistics – words to be careful about in photos of people – and image recognition itself, e.g. better recognition of dark-skinned faces.

Jacky Alcine's tweet about the problem triggered a horrified response from Google's chief architect of social Yonatan Zunger, who said engineers were working on a variety of fixes to prevent similar issues in the future

'Lots of work being done, and lots still to be done. But we're very much on it.

Google launched its Photo app in May this year

'We should have a patch around searches turning up pics of partially obscured faces out very soon.'

Google has now issued an official apology for the mistake and said its image labelling technology was still in its infancy and so not yet perfect.

Previously, some users had noticed photos of horses being labelled as dogs, for example.

The company said Google Photos also includes a feature that allows users to remove incorrectly labelled results, which can help train its image recognition software to be more accurate.

A spokesman said: 'We're appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing.'