New York (CNN Business) On Bumble, lewd pictures will soon come with a warning.

The company, which launched as a female-focused dating app but has since expanded its service to networking for friends and jobs, announced Wednesday plans to introduce a feature in June that uses artificial intelligence to flag inappropriate images sent through direct messages. Recipients of the images will be able to choose how to respond: View the image, block it, or report it to Bumble.

The image in question will be blurred, just as all images are when sent through private messages on Bumble. The company enabled photo sharing four years ago and requires recipients to hold down the image in order to view it. That decision was intended as a guardrail, giving individuals an additional layer of choice before any photo appears with clarity in a chat. Bumble also watermarks all photos with the sender's image — a bid to hold people accountable for the images they share.

But existing safeguards apparently aren't enough to crack down on the spread of lewd images, even though nudity and pornography are banned from Bumble's platform.

Bumble CEO and founder Whitney Wolfe Herd told CNN Business that the new feature, called Private Detector, is a "gesture" to show that it is "desperately trying to build safety products to engineer a more accountable internet, not just talk about it."
