The site had removed some of the videos after BuzzFeed got in touch, but others were still listed (again, with hundreds of thousands of views) under other search keywords.

In a statement, VP Corey Price reiterated that Pornhub would remove deepfakes and other non-consensual content "as soon as we are made aware of it" through methods like content flags and a submission form. This material is "a form of sexual assault," Price said.

The problem, as you might imagine, is that this is a purely reactive approach that doesn't actually deal with the issue. What's the likelihood that someone looking for this content is going to report it? While it's difficult for any large video upload site to completely eliminate unwanted videos (just ask YouTube), it shouldn't be possible to find an abundance of offending clips with basic search terms. Ideally, services would make at least some effort to proactively remove and filter content that violates their terms of service.

We'd add that Pornhub has a tendency toward inaction in other categories as well. It's still well-known as a haven for (clearly unauthorized) game-themed CG porn, often based directly on character models lifted from the games' resource files. Studios have tried to crack down on this material in the past, but it remains abundant. There may not be a purge of this content until there's a fundamental shift in Pornhub's stance toward all material that violates its terms, not just the most offensive.