The video spread like wildfire. Jane Doe 11—one of 22 women who sued porn production company Girls Do Porn in 2016 for coercing them to have sex on video and lying to them about how the videos would be distributed—learned from the student council president that "everyone was watching it in the library, so much so that the internet essentially crashed." The student council president also informed her that "the administration actually had to watch the video. They all congregated in a room to watch it, to I guess make sure I didn't say the university name in the video."

On May 1, 2016, in the middle of final exams, a young woman got a text message that would change her life forever. It included a screenshot of a pornographic video posted online, featuring her. Panicking, she quickly tried to justify what she had done. "They said it would only be in Australia," she told her friend, according to court documents. "I only did it for money."

Pornhub claims that victims of non-consensual porn, a category that includes many of the Girls Do Porn videos, can easily request that videos be removed from the site, and that those videos can be "fingerprinted." Broadly speaking, video fingerprinting is a method by which software identifies, extracts, and summarizes characteristic components or metadata of a video, allowing that video to be uniquely identified by its "fingerprint." According to Pornhub, this automatically blocks future attempts to upload a video that was flagged.
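As a loose illustration of the concept only, here is a minimal sketch of a perceptual fingerprint in Python, assuming an aHash-style scheme; Vobile's actual technology is proprietary and far more sophisticated. Each frame is reduced to a bit pattern that survives small changes such as re-encoding, and two clips "match" if any of their frame hashes are close.

```python
import random

def frame_hash(frame):
    """aHash-style perceptual hash: one bit per pixel, set if the
    pixel is brighter than the frame's mean brightness."""
    flat = [p for row in frame for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Number of bits on which two hashes disagree."""
    return sum(x != y for x, y in zip(a, b))

def video_fingerprint(frames):
    """A video's fingerprint is simply the list of its frame hashes."""
    return [frame_hash(f) for f in frames]

def matches(fp_a, fp_b, threshold=5):
    """Two clips match if any pair of frame hashes is nearly identical."""
    return any(hamming(a, b) <= threshold for a in fp_a for b in fp_b)

random.seed(1)
# Three 8x8 grayscale frames stand in for a video.
original = [[[random.randint(0, 255) for _ in range(8)] for _ in range(8)]
            for _ in range(3)]
# A "re-encode" that brightens every pixel slightly: the hashes barely move.
reencoded = [[[min(255, p + 2) for p in row] for row in f] for f in original]
# Entirely different footage.
unrelated = [[[random.randint(0, 255) for _ in range(8)] for _ in range(8)]
             for _ in range(3)]

fp = video_fingerprint(original)
print(matches(fp, video_fingerprint(reencoded)))
print(matches(fp, video_fingerprint(unrelated)))
```

A robust system of this kind recognizes the re-encoded clip while rejecting unrelated footage; the question Motherboard's test raises is how far that robustness actually extends in practice.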

But even with the official site shut down and its owners in jail or on the run, the ruling has done little to stop the spread of the videos online. Even today, hundreds of Girls Do Porn videos are easy to find, especially on Pornhub, which claims to get 100 billion video views a year and more than 100 million daily visits. Searching Google for Girls Do Porn videos leads users to Pornhub, where these videos are hosted against pre-roll and banner ads that Pornhub's parent company Mindgeek profits from.

The following morning, Jane Doe 11 sent Michael Pratt, the owner of Girls Do Porn and currently a wanted fugitive, an email: "This is not spam but a life or death situation. Please contact me...Please. I beg of you."

In July 2019, Motherboard reported on Pornhub's central role in the Girls Do Porn case. The platform enabled rampant doxing and harassment of the women suing the porn production company, of other women not part of the lawsuit whom we contacted independently, and of "over 100" more who are not part of the lawsuit but whom the women's lawyers interviewed and said had the same experience. Whenever we published a story about the case, more women who appeared in Girls Do Porn videos and didn't know about the lawsuit reached out to us with identical stories.

Jane Doe 11 testified that she was devastated to see her video on many websites, including Pornhub, which described her in derogatory language and had more than 9 million views. She found out that her name and personal information were connected to her video and posted on numerous forums, along with links to her Facebook profile. These anonymous users also posted her brother's, sister's, and church's Facebook profiles to harass her. "Such comments made her question her desire to live," the ruling said.

But a Motherboard investigation found that this system can be easily and quickly circumvented with minor editing. Pornhub's current method for removing Girls Do Porn videos and other forms of non-consensual porn not only puts the onus of finding and flagging videos almost entirely on potentially traumatized victims; those victims also can't rely on the system to work.

During our reporting, we repeatedly asked Pornhub what it was doing to contain and remove the Girls Do Porn videos on its website. On October 22, a Pornhub spokesperson sent us a statement attributed to Pornhub VP Blake White:

"We strongly condemn non-consensual content, including revenge porn. Content that is uploaded to Pornhub that directly violates our Terms of Service is removed as soon as we are made aware of it and this includes non-consensual content. With regards to other unauthorized Girls Do Porn videos, Pornhub takes a number of actions to protect our community, and to keep content that violates our policies off of our platform. We use a state-of-the-art third party digital fingerprinting software, which scans any new uploads for potential matches to unauthorized material and makes sure the original video doesn’t go back up on the platform. Anyone who digitally fingerprints their content is then protected from having their video uploaded by unauthorized parties. It is completely free and strongly encouraged to fingerprint content."

With the consent and cooperation of several women who were featured in Girls Do Porn videos, Motherboard tested Pornhub's content removal and video fingerprinting systems using editing techniques common in the porn that regular users upload to Pornhub. We found that while Pornhub removed videos when provided with specific links, the video fingerprinting method that Pornhub calls "state-of-the-art" and relies on to automatically moderate its platform can be easily circumvented, allowing anyone to upload Girls Do Porn videos or other non-consensual videos.

"It’s not really 'doing the right thing' when you only act when it is in your self-interest."

To fingerprint a video, victims have to email Vobile, the company Pornhub contracts to provide the service. After Vobile responds to a request confirming it has fingerprinted the video, the victim then has to flag the video to Pornhub by filling out a short form on the site. Once the video is removed, it should then be theoretically impossible to post the same video to Pornhub again.

To test Pornhub's system, Motherboard first downloaded a Girls Do Porn video from Pornhub. Then, we asked the woman in that video to email Vobile for fingerprinting, and then flag the video to Pornhub. Vobile confirmed the video was fingerprinted and Pornhub removed the specific video we flagged. After that video was removed, we tried to upload the exact same file to Pornhub. That identical video was seemingly automatically removed within an hour. A shorter portion of the same video file we uploaded was also seemingly automatically removed within an hour.

However, a slight edit of the same Girls Do Porn video circumvented Pornhub's fingerprinting method. The specific video we flagged for Pornhub was a five-minute portion cut from the original Girls Do Porn episode. Our edit took a 30-second portion from the same episode, but sourced it from a different video hosted on another Pornhub site. This 30-second portion featured shots similar to those in the fingerprinted video. We successfully uploaded it to Pornhub, where it was available for anyone to find and view for 24 hours before we removed it ourselves. During that time, it would also have been easy to find via Pornhub's search or tagging system had we clearly labeled it as a Girls Do Porn video.

Overall, we successfully uploaded eight videos that used footage from the same fingerprinted Girls Do Porn episode. In addition to the 30-second video described above, we successfully uploaded videos by cutting them together with stock footage, removing the sound, and using video of the same episode from different sources of varying quality and with different watermarks. As far as we can tell, Pornhub only stopped us from uploading videos sourced from the specific video file we fingerprinted.
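The pattern we observed, where identical files and trimmed cuts were blocked but footage of the same scene from other sources sailed through, is what you would expect if matching were tied to segments of the specific fingerprinted file rather than to the underlying footage. A purely illustrative toy model of that behavior (Vobile's real matching is proprietary and certainly more elaborate) using per-segment hashes:

```python
import hashlib

CHUNK = 1024  # bytes per "segment"; an arbitrary size for this sketch

def segment_hashes(data):
    """Hash fixed-size segments so a trimmed clip of the same file
    still shares hashes with the original."""
    return {hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data) - CHUNK + 1, CHUNK)}

def shares_segments(a, b):
    """Flag two files as the same content if any segment hash overlaps."""
    return bool(segment_hashes(a) & segment_hashes(b))

source = bytes(range(256)) * 64           # stand-in for the fingerprinted file
trimmed = source[CHUNK * 2:CHUNK * 6]     # a shorter cut of the same file
reencoded = bytes(b ^ 1 for b in source)  # a re-encode: every byte differs

print(shares_segments(source, trimmed))    # True: trims of the same file are caught
print(shares_segments(source, reencoded))  # False: other sources slip through
```

A scheme keyed this tightly to one file catches exact copies and cuts of that file, but any differently encoded copy of the very same footage produces no matching segments, which is consistent with what our uploads showed.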