
Facebook is taking a decidedly low-tech approach to tackling the spate of harmful videos posted recently on the social network.

Hours before the company is set to announce its earnings for the first quarter, Facebook CEO Mark Zuckerberg revealed he is hiring 3,000 humans — not bots! — to help tackle the problem of harmful content on the site.

(Photo caption: Facebook CEO Mark Zuckerberg announced Wednesday that the social network would add 3,000 workers to the 4,500 people worldwide who currently monitor posts. Getty Images)

Related: Cleveland Shooting Highlights Facebook’s Responsibility in Policing Depraved Videos

The additional employees will join Facebook sometime in the next year, Zuckerberg wrote in a Facebook post, adding to a team that is already 4,500-strong.


"These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation. And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they're about to harm themselves, or because they're in danger from someone else," Zuckerberg wrote.

The move comes as Facebook faces intense criticism following a rash of harmful videos posted to the platform, including live-streamed teen suicides, the seemingly random murder of a Cleveland man, and a Thai father's live-streamed killing of his young daughter.

Related: When Seeing the Most Depraved Side of the Internet is Your Job

And that's just a small sample of the horrors that have been broadcast since the live video tool became available to Facebook's 1.86 billion users over the past year.

Zuckerberg said Facebook is also investing in more ways of fostering a safe community. In February, the social network announced the integration of suicide prevention tools into Facebook Live, allowing viewers the option to report a video and to get resources to help them reach out to a friend in need.

Facebook's algorithm is also getting smarter and is working to "identify posts as very likely to include thoughts of suicide." After those posts have been flagged, a human member of Facebook's team will review them and, if appropriate, reach out to the person with resources.

"This is important," Zuckerberg said of Facebook's investment in safety tools. "Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself."

He added, "In other cases, we weren't so fortunate."

The recent addition of enhanced safety features builds on a vision Zuckerberg outlined in a 5,700-word letter in February about creating a safer and more inclusive community.

Related: Mark Zuckerberg Pens a 5,700-Word Letter on the Dangers of Isolationism