Flickr’s auto-recognition tool for sorting photos appears to be identifying people as animals and the gates of Dachau as a jungle gym.

The tool — which Flickr admits is still getting things wrong and is teaching itself as it goes — was built to add tags to photos automatically, so that users could sort through them more easily. But just days after launch, it is already applying offensive tags to some photos.

It has also mis-tagged other sensitive images, including a photo of a gate at the Dachau concentration camp, which it labelled a "jungle gym".

Clicking on any of the tags shows that most of the auto-tags are correct. But the offensive ones drew anger on Twitter, with comments on the "ape" picture saying "OMG! APE?" and "this auto tagging is really bonkers... shame on flickr".

Flickr told the Guardian that it was "aware of issues with inaccurate auto-tags on Flickr and are working on a fix".

"While we are very proud of this advanced image-recognition technology, we're the first to admit there will be mistakes and we are constantly working to improve the experience," a spokesperson said. "If you delete an incorrect tag, our algorithm learns from that mistake and will perform better in the future. The tagging process is completely automated – no human will ever view your photos to tag them."

The auto-tags are also visually distinguished from those generated by Flickr's users: tags added by people have a grey background, while automatically generated tags have a white one.