Reddit has banned the r/deepfakes subreddit that’s devoted to making AI-powered porn using celebrities’ faces, classifying it as a form of “involuntary pornography.” In a post today, Reddit announced an update to its rules on posting sexual imagery of a person without their consent. The new rule extends a ban on posting photos or video of people who are nude or engaged in sexual acts without the subject’s permission, saying that this includes “depictions that have been faked” — including the sophisticated face-swapped videos that have become especially popular on Reddit over the past month. The rule states: “Do not post images or video of another person for the specific purpose of faking explicit content or soliciting ‘lookalike’ pornography.”

This doesn’t affect all AI-based face-swapping communities on Reddit. The subreddit for FakeApp, a program that allows anyone to swap faces in videos, is still online. So is r/SFWdeepfakes, which is devoted to non-pornographic uses of the technology — like putting Nicolas Cage in every movie. At least one small, specific subreddit devoted to simulated porn featuring an individual actor also seems to have slipped under the radar.

But alongside the central r/deepfakes hub, the main subreddit for posting not-safe-for-work deepfake images has been shut down, and so has the community r/YouTubefakes. The subreddit r/CelebFakes, which focused on photoshopped pornographic images rather than AI-generated ones, was initially left online but was removed shortly after the announcement.

Reddit issued the following statement to reporters, noting that the change is part of a broader split of its rules around sexual imagery into separate policies.

Reddit strives to be a welcoming, open platform for all by trusting our users to maintain an environment that cultivates genuine conversation. As of February 7, 2018, we have made two updates to our site-wide policy regarding involuntary pornography and sexual or suggestive content involving minors. These policies were previously combined in a single rule; they will now be broken out into two distinct ones. Communities focused on this content and users who post such content will be banned from the site.

In response to questions from users, Reddit suggested that it would rely on reports to shut down future deepfakes material: “First-party reports are always the best way for us to tell. If you see involuntary content of yourself, please report it. For other situations, we take them on a case-by-case basis and take context into account.” It’s theoretically possible for deepfakes to be totally consensual, involving performers who know their faces will be swapped — but not particularly likely, at least right now.

The r/deepfakes subreddit was created after Motherboard reported on the phenomenon of AI-generated porn late last year. Around the time of its banning, the community had around 90,000 subscribers. Its ancillary subreddits were significantly smaller — the one for collecting NSFW deepfake images had around 23,000 subscribers, for instance.

Reddit follows several other platforms that have already banned deepfakes pornography. That includes Gfycat, Discord, and recently Pornhub, which said that deepfakes imagery counted as nonconsensual pornography.

Reddit has balked in the past at removing content that may be ethically distasteful but not illegal. And so far, there’s no legal consensus on deepfakes, although the videos could hypothetically run afoul of copyright law or statutes against so-called “revenge porn,” which were written with unsimulated imagery in mind. But Reddit, often under public pressure, has banned subreddits featuring things like sexually suggestive images of minors, leaked celebrity nude photos, and “creepshots” of unsuspecting women. In 2015, the platform said it would remove non-consensually posted explicit material (or links to that material) if the subject complained. Now, it’s taken a somewhat unexpected, but unambiguous, stance on simulated involuntary pornography as well.

Update 1:30PM: Added statement from Reddit and more detail about banned communities.

Update 3:00PM: Updated with news that r/CelebFakes has been banned.