The AI-powered DeepNude app made a splash earlier this year by letting users upload photos of clothed women in order to see them "undressed". The app was taken down by its developers following online outrage, but it has seemingly inspired copycats.

Two new "X-ray vision" apps, DeepNude.to and NudifierApp, have emerged online following the shutdown of the original DeepNude service, whose algorithm digitally altered images of clothed women, adding eerily realistic-looking breasts and genitalia, SexTechGuide reported.

The apps, both run entirely anonymously, are based on a neural network trained on thousands of photos to generate fake nude images of the women pictured.

Both services charge users for access to their software via cryptocurrencies. While NudifierApp says it accepts digital coins to protect user privacy, the DeepNude.to team says it plans to introduce credit card payments as well.


The anonymous founders of DeepNude.to claim that they are not breaking any existing laws and are "embedding visible and invisible watermarks to highlight that this content is fake".

While admitting that there are people who want to use the app's technology "maliciously", for instance for "revenge porn", a developer behind NudifierApp insists that it is typically easy to "identify whether the nude is created by [a neural] net".

He went so far as to suggest that if the technology continues to develop, it could solve the problem of sexually explicit images being distributed without an individual's permission.

"The quality mostly depends on thoroughness of [the person] using advanced mode. So you can even reach the same result as a professional fake artist in just one minute or less. But look, if deepnudes become indistinguishable from real nudes (which will never happen, of course), it will solve any issue related to revenge porn forever. So we don't see any ethical problems regardless of deepnude quality. It is just a mass-media panic", NudifierApp's founder told SexTechGuide.

Both "copycats" appeared after the originally released DeepNude app was pulled in late June by its own creators following online uproar.

Horrifying deepfake app called DeepNude realistically removes the clothing of any woman with a single click. It costs $50 and only works on women. (The black bars were added after using the app.) pic.twitter.com/5KS36FPTqZ — Mike Sington (@MikeSington) 27 June 2019

The app had a free version that put a large watermark on the deepfake images it created, and a paid premium version that produced uncensored photos. Users nonetheless managed to bypass the mechanism and gained free access to the premium version.

At the time, the creators explained in a tweet that they were unable to control the traffic after their brainchild went viral, and that they did not want to make money that way.