GitHub is banning code from DeepNude, the app that used AI to create fake nude pictures of women. Motherboard, which first reported on DeepNude last month, confirmed that the Microsoft-owned software development platform won’t allow DeepNude projects. GitHub told Motherboard that the code violated its rules against “sexually obscene content,” and it’s removed multiple repositories, including one that was officially run by DeepNude’s creator.

DeepNude was originally a paid app that created nonconsensual nude pictures of women using technology similar to AI “deepfakes.” The development team shut it down after Motherboard’s report, saying that “the probability that people will misuse it is too high.” However, as we noted last week, copies of the app were still accessible online — including on GitHub.

Later that week, the DeepNude team itself uploaded the core algorithm (but not the actual app interface) to the platform. “The reverse engineering of the app is already on GitHub. It no longer makes sense to hide the source code,” wrote the team on a now-deleted page. “DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects.”

GitHub bans ‘sexually obscene content’

GitHub’s guidelines say that “non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes.” But the platform bans “pornographic” or “obscene” content.

DeepNude didn’t invent the concept of fake nude photos — they’ve been possible through Photoshop, among other methods, for decades. And its results were inconsistent, working best with photos where the subject was already wearing something like a bikini. But Motherboard called them “passably realistic” under those circumstances, and unlike Photoshop fakes, they could be produced by anyone with no technical or artistic skill.

Politicians and commentators have raised alarm about deepfakes’ potential political impact. But the technology began as a way to create fake, nonconsensual porn of women, and like those deepfakes, DeepNude pictures primarily threaten women who could be harassed with fake nudes. At least one state, Virginia, has grouped using deepfakes for harassment alongside other forms of nonconsensual “revenge porn.”

None of this can stop copies of DeepNude from appearing online — but GitHub’s decision could make the app harder to find and its algorithm harder to tinker with.