Technology companies are finally starting to offer help to people whose nude photos wind up being published online without their consent. Reddit and Twitter both banned the sharing of "nonconsensual nude images" on their platforms earlier this year. And on Friday, Google announced that it will let people remove "revenge porn" from Google search results. Given how resistant Google usually is to altering search results, this is a huge deal.

"This is great news and a welcome change in Google's policy," said Colette Vogele, a lawyer who co-founded Without My Consent, an advocacy group that helps victims whose intimate images are shared online. "We need to see how it will play out, and details like how verification would be accomplished, but looks like a great step for helping victims and survivors of nonconsensual pornography, digital domestic violence, and online harassment."

Amit Singhal, Google's senior vice-president of search, wrote in a blog post that the search giant had heard "troubling stories of 'revenge porn:' an ex-partner seeking to publicly humiliate a person by posting private images of them, or hackers stealing and distributing images from victims’ accounts." Yes, indeed, these stories have been told for at least the last four years, but the fact that it happened en masse last fall to iCloud-using celebrities, such as Jennifer Lawrence, in what Reddit and 4Chan users deemed "The Fappening" seems to have galvanized tech companies to do something about it.

Google will start treating nude photos the way it already treats Social Security numbers and signatures: categories of information it removes from search results because they enable identity theft. Nude photos don't enable identity theft, but given the impact they can have on a person's reputation, making it difficult to get a job, for example, their appearance in search results can be just as damaging.

"Google's actions provide victims with immediate relief," said lawyer and other Without My Consent co-founder Erica Johnstone, who works with people whose nude images have been released. "Victims can exhale knowing that when an prospective employer or colleague or family member googles their name, perhaps the first thing that will pop up on Search will be their accomplishments, their life's work, their ideas about the world, and not a fly-by-night porn site exploiting their naked form."

"Our philosophy has always been that Search should reflect the whole web," wrote Singhal. "But revenge porn images are intensely personal and emotionally damaging, and serve only to degrade the victims—predominantly women. So going forward, we’ll honor requests from people to remove nude or sexually explicit images shared without their consent from Google Search results."

Google says it will have a form available in the coming weeks to let people request the removal of nude photos. It will likely resemble Google's "Right to be Forgotten" form, which the company set up for Europeans after a court ruled that it had to give them the right to have irrelevant and damaging results removed from searches of their names. As ProPublica privacy journalist Julia Angwin tweeted, "This is a big step. Google extends the right to be forgotten to revenge porn."

Unfortunately, you can only get nude images removed. Terrible, clothed images of you will remain in your Google search results.