Don’t freak out, but your iPhone knows all about your underwear selfies. On Monday, a viral tweet led to thousands of users discovering that the Photos app on Apple’s iOS and macOS operating systems knows what a bra looks like – and lets you search for it.

Apple being Apple, it’s vaguely classy, of course: the app will only give responses for “brassiere”. But type that into the search bar and there, in all their glory, are likely to be a fair few pics of people – maybe you – in various states of undress.

Users were perturbed by the discovery (“why are Apple saving these and made it a folder!!?!!?”, asked the original tweeter), but the feature has in fact been quietly built into iPhones for a year now. And, no, it doesn’t involve anyone at Apple scanning through your images looking for salacious ones.

ATTENTION ALL GIRLS ALL GIRLS!!! Go to your photos and type in the ‘Brassiere’ why are apple saving these and made it a folder!!?!!?😱😱😱😱 — ell (@ellieeewbu) October 30, 2017

Since the launch of iOS 10, iPhones have been capable of classifying more than 4,000 objects and scenes based on the imagery alone. Everything from abacus to zucchini can be searched for, even if you have never labelled a single picture.

The AI that recognises objects was trained on a library of hundreds of thousands of labelled images, and is almost uncannily accurate (not only can it distinguish a dog from a cat, it can tell a dachshund from a corgi). But the actual recognition is carried out entirely on the iPhone itself, with a unique version of the AI running on each device, meaning your brassiere pictures remain entirely private – a secret between you and Siri.

That is different to how some of Apple’s competitors do it. Google, for instance, uploads all images stored on Google Photos to its own cloud, and carries out the bulk of the object recognition there. It too can tell a brassiere from a brasserie, and is generally better than Apple at the same task – but with a trade-off: the images are (anonymously, and without human involvement) used to further train its own AI.

It may be that Apple’s privacy-focused approach is actually the cause of the furore. Unlike Google, which went to great lengths to explain what it was doing and to ask permission to upload pictures, Apple took its trademark approach of playing down the tech in favour of promoting the fact that “it just works”. Sometimes that leads to a moment of magic; here, it seems, the result is simply an awkward surprise.