Google is finally taking steps to remove "revenge porn" from its search results. The company announced a new policy on Friday that will allow anyone to fill out a form requesting the removal of nude and sexual images of themselves that have been posted without consent.

Google (GOOG) is the latest and most significant tech company to add a reporting system for taking down this type of content. Reddit updated its policy in February, and Twitter and Facebook (FB) banned it in March. The images are typically taken and shared privately, but then spread online by jilted exes or hackers who obtained them illegally.

"Revenge porn images are intensely personal and emotionally damaging, and serve only to degrade the victims -- predominantly women," Amit Singhal, senior vice president of Google Search said in a blog post. "So going forward, we'll honor requests from people to remove nude or sexually explicit images shared without their consent from Google Search results."

The tech industry has been under increasing pressure from activists and lawmakers to address the issue. By acting proactively, companies hope to manage the problem on their own terms rather than under new federal and state laws.

A few state laws exist or are in the works, though many have glaring loopholes. For example, California's revenge porn law only applies when the person spreading the images was also the photographer. It doesn't cover images that were originally taken by the victim or stolen.

Jackie Speier, a member of Congress from California, has been working on a federal law that would ban non-consensual sexual images.

Until now, the only content Google would remove from search results without a legal request was personal information, such as Social Security numbers and financial details, or child pornography.

"I applaud Google's new policy, but there is still a gaping hole in the law that leaves victims with little or no legal recourse," Speier said Friday. "We already punish the unauthorized disclosure of private information like medical records and financial identifiers. Why should personal images of one's naked body, given in confidence, be any different?"

The new policy will require victims to fill out the online form, but it's not yet known what kind of verification process will be required to have an image removed. Well-known revenge porn sites will not be blocked from Google's search results; only individual images will be removed.

A recent CNNMoney investigative series highlighted several victims of revenge porn.

"I describe it [as] similar to maybe the feeling of getting raped -- you feel like you're that exposed," one victim said. "You feel like a million people are watching ... the most intimate moment of your life."

"I would stay up almost all night, every night, just in a little cave, just searching more and finding more and more," another victim recalled.

Both women's most intimate moments were plastered on revenge porn sites by ex-boyfriends.

In an op-ed for CNNMoney, Reddit head of community Jessica Moreno explained why the company took a stand against revenge porn.

"We did not want Reddit to be a part of that nightmare," she said. (In August 2014, Reddit saw record traffic when it was used to disseminate nude images of celebrities obtained by hackers.)

Google's new reporting feature, which will be available in the coming weeks, has been in the works for at least two years. The company collaborated closely on the policy with experts such as the Cyber Civil Rights Initiative, an anti-revenge-porn advocacy group.

"I'm thrilled with this announcement. Google search results are the resume of the digital age, and this step will do so much to prevent the destruction of careers, relationships and lives," said Mary Anne Franks, vice president of the CCRI and a law professor at the University of Miami School of Law.