Facebook is turning to a suite of tech tools — and humans — to help stem the spread of non-consensual intimate photos, commonly referred to as revenge porn, across its network.

The tools build on the vision Mark Zuckerberg outlined in a nearly 6,000-word letter in February, in which he described his desire to make Facebook the safest place it can be.

Facebook CEO Mark Zuckerberg at an event at the Facebook headquarters in Menlo Park, California.

"Our success isn't just based on whether we can capture videos and share them with friends. It's about whether we're building a community that helps keep us safe — that prevents harm, helps during crises, and rebuilds afterwards," he wrote.

Facebook hasn't said how many images or reports it has received relating to possible revenge porn.

A study from the Cyber Civil Rights Initiative found that 93 percent of revenge porn victims suffered major emotional distress, 51 percent had suicidal thoughts, and 49 percent reported being stalked or harassed by others who saw their non-consensual images online.


Antigone Davis, head of global safety at Facebook, told NBC News the focus is on the "unique harm" endured by victims who have their privacy violated by the spread of these images.

"That really played into why we approached this with such enthusiasm," Davis said.

Here's how the new revenge porn reporting process works: If you see an intimate image on the site that looks like it may have been shared without consent, you can use the "Report" link by tapping the downward arrow or "..." next to a post.


The reported photo then goes to specially trained members of Facebook's community team for review.

Facebook has already carefully defined parameters about nudity on the site, removing photos "of people displaying genitals or focusing in on fully exposed buttocks," according to its community standards.

"We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring," it said.

If a flagged photo appears to violate community standards, it will be deleted, and in some cases the account of the person who shared it will be deactivated.


Unfortunately, that's not always enough to stop revenge porn, so Facebook is now turning to its photo-matching technology to keep a removed image from being shared again on Facebook, Messenger, and Instagram. If someone tries to re-share an image that has been reported and removed, the attempt will be blocked automatically and they'll be notified that the photo violates Facebook's terms of service.
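Facebook hasn't published the details of its photo-matching system, but the general approach it describes — fingerprint a removed image, then check future uploads against the stored fingerprints — can be sketched as follows. Everything in this sketch is illustrative rather than Facebook's actual implementation: production systems typically use perceptual hashes that survive resizing and re-encoding, whereas the exact cryptographic hash below only catches byte-identical copies.

```python
import hashlib

# Illustrative stand-in for a photo-matching blocklist.
# A real system would use a perceptual hash (robust to resizing,
# re-encoding, and minor edits); sha256 only matches exact copies.
_blocked_fingerprints: set = set()

def _fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (exact-match hash here)."""
    return hashlib.sha256(image_bytes).hexdigest()

def report_and_block(image_bytes: bytes) -> None:
    """After human review confirms a violation, store the fingerprint."""
    _blocked_fingerprints.add(_fingerprint(image_bytes))

def upload_allowed(image_bytes: bytes) -> bool:
    """Reject any upload whose fingerprint matches a blocked image."""
    return _fingerprint(image_bytes) not in _blocked_fingerprints
```

In the flow the article describes, a check like `upload_allowed` would run before any post is published across the three services, and a rejected uploader would be shown a notice that the photo violates the terms of service.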

"We look at this as a first step," Davis said.

She added that Facebook plans to continue working with its partners around the world, including the Cyber Civil Rights Initiative and the UK's Revenge Porn Helpline, in hopes of ending the practice.