Pornhub will be deleting “deepfakes” — AI-generated videos that realistically swap new faces onto the bodies of pornographic performers — under its rules against nonconsensual porn, following in the footsteps of platforms like Discord and Gfycat. “We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it,” the company told Motherboard, which first reported on the deepfakes porn phenomenon last year. Pornhub says that nonconsensual content includes “revenge porn, deepfakes, or anything published without a person’s consent or permission.”

As Motherboard points out, you can still find dozens of fake celebrity porn videos simply by searching “deepfakes” on Pornhub. A statement from company VP Corey Price, given to The Verge, says that this content will be removed as Pornhub is made aware of it, pointing us to Pornhub’s content flagging form. “Users have started to flag content like this and we are taking it down as soon as we encounter the flags,” he says. “We encourage anyone who encounters this issue to visit our content removal page so they can officially make a request.”

Deepfakes AI porn — which generally consists of nonconsensual face-swapped videos of female celebrities, although it can involve any combination of face and body — exists in fuzzy legal territory. It’s essentially a vastly more sophisticated version of pornographic photoshops, or a vastly more democratized version of Hollywood’s digital recreations of actors. Although the videos might edge into copyright infringement or defamation, the most obvious way to shut them down is to go after big platforms like Pornhub or Gfycat. Despite these recent bans, though, the deepfakes community’s central hub on Reddit remains untouched — and, especially since Reddit isn’t hosting the porn in question, it seems likely to remain so.

Update 8AM ET: Added statement from Pornhub.