Facial Recognition for Porn Is Still a Terrible Idea

And whether or not this one guy actually built the thing he claims to have built doesn’t matter.

What happened? A few weeks ago, a Chinese developer based in Germany claimed he had built an application that uses facial recognition technology to identify women in online porn by cross-checking their faces with images on social media. His claim initially went viral on the Chinese micro-blogging platform Weibo, to much excitement in the comment thread. Yiqin Fu, a Research Associate at Yale Law School’s Paul Tsai China Center, brought the story over to Twitter (read the full thread) and translated it for an international audience:

“A Germany-based Chinese programmer said he and some friends have identified 100k porn actresses from around the world, cross-referencing faces in porn videos with social media profile pictures. The goal is to help others check whether their girlfriends ever acted in those films.”

She further reports that, according to the original Weibo thread, over 100 terabytes of video data from porn sites such as PornHub and xVideos were matched against profile pictures from Facebook, Instagram, TikTok, Weibo, and other social media platforms to identify 100k porn performers.

While no one has actually seen the app, and the identity of the developer, who calls himself Li Xu, is unverified, the story made the rounds. It’s about porn, after all.

(Screenshot: Yiqin Fu on Twitter)

On Weibo, Li Xu confirmed to Motherboard that he was planning to release a “database schema” and further “technical details” the following week. That never happened: he cancelled the live-stream where he was going to answer media questions and instead deleted the whole project along with its data. This take-down more likely stems from the outrage the story garnered online than from introspection and better judgement.

Many discussions of this story on social media revolved around whether or not this app ever existed and worked, how much porn the team may have watched while creating it, and whether or not they have girlfriends. Needless to say, I hope, this angle distracts from what’s at stake in applying facial recognition software to porn.

This story is just one instance of a wider trend

That someone would build such an app is neither impossible nor unlikely. Asking whether or not they really built it, and how accurately it may or may not have worked, betrays misplaced incredulity. We shouldn’t be surprised at all.

Motherboard warned against precisely this use case two years ago, when both Pornhub and xHamster began using facial recognition technology to automatically tag and categorise their videos. They called it a “privacy nightmare waiting to happen.”

And remember when FindFace was used to dox sex workers and porn performers by matching their faces to profiles on Russian social media platform VK?

The idea is neither new nor original.

Not everyone lives under strong data protection legislation

Li Xu was initially under the impression that there were no legal issues with his project because he had not made the data public, and because sex work is legal in Germany. The latter is correct, but has little to do with the non-consensual collection and processing of personal (worse, biometric) data.

The GDPR defines a number of permissible grounds for processing personal data. Consent is clearly absent in this case, as is any contractual obligation or the public interest. And making a case on the grounds of legitimate interest seems difficult when the only interest is to dox women so men can verify whether their wives and girlfriends ever appeared in porn.

Under the GDPR, photos are considered particularly sensitive biometric data when they are processed

“through a specific technical means allowing the unique identification or authentication of a natural person.”

That’s good news for those living in the EU, and in other countries that protect citizens from such exploitative data use. But the porn and social media platforms in question attract users globally. Far from every jurisdiction is covered by strong data protection, and enforcing existing regulations is often time-consuming and expensive.

The implications for performers are particularly problematic in countries where the legality of pornography is restricted or uncertain, performing in porn is highly stigmatised, and data protection legislation is either absent or not strong enough to protect against this particular use case.

Gendered surveillance is an increasing threat

Ultimately, this project was about identifying, exposing, shaming or harassing women who have performed in porn. The stated original purpose of the app, according to the translation on Twitter, was to give men a way to verify whether their wives and girlfriends ever appeared in porn.

“The goal is to help others check whether their girlfriends ever acted in those films.”

Many performers use stage names, which is particularly crucial for amateur performers and others who need to keep their forays into the industry separate from their personal lives, families or other careers. Doxxing them by means of facial recognition is not only incredibly intrusive and a severe privacy violation, it’s also gendered surveillance at its finest.

And it’s not an isolated incident either. Targeted online harassment neither began nor ended with Gamergate, and it restricts the freedom of women and marginalised communities to express themselves online. So-called stalkerware, software marketed directly or indirectly to abusive partners for the covert surveillance of a spouse or ex-partner, is a growing trend too. For anyone who wants to know more, Motherboard has an excellent series on this topic.

And the current debates about deepfakes serve as yet another case in point. Most research and journalism on deepfakes focuses on the methods used and on the political implications. Granted, those are important questions. But it’s worth remembering that the first mainstream use of deepfake video technology was inserting women’s faces into porn videos to harass and humiliate them.

Women’s bodies, participation, movements, and autonomy have been subject to men’s and society’s watchful and controlling eye long before contemporary surveillance tech came along. But with ever-increasing amounts of data and advances in machine learning come additional ways in which gendered surveillance plays out and is amplified, as scholars of surveillance have been pointing out for years.

And finally, using similar algorithms “for good” is no solution either

In the midst of all the backlash against his project, the developer later added that the app could also be used by women to identify revenge porn. While I doubt that this afterthought came from a place of genuine concern for the victims of revenge porn, it is technically true. An algorithm capable of identifying porn performers via social media will also find the profiles of women who never wanted to be in porn in the first place.

In practice, however, the risks outweigh the potential benefit. Finding out about revenge porn is rarely the main issue. What victims struggle with much more is getting images removed from the respective platforms, personal safety after their home address and phone number were posted, trauma from the wave of harassment that typically follows, not being taken seriously by law enforcement, and handling the fallout from such an incident with family, friends and workplace. And facial recognition won’t help with any of that.

Protecting data, protecting privacy, and sharing nudes safely are the best defence against revenge porn. Teaching and learning how to do any of these things are a much better use of anyone’s time than trying to fix revenge porn with facial recognition software.