Facebook is rolling out several new features across its apps to combat content violations and improve safety as part of an effort to rebuild trust with users.

Starting Wednesday, Facebook is updating its community standards to include a section that lets users track the changes made each month, so people can see what is and isn't allowed on the site. It will also include explanations of specific policy changes and why content is removed.

The company is increasingly relying on third-party sources to combat fake news and is looking to journalists, academics and fact-checking experts to promote reliable stories.

"We need to find solutions that support original reporting, promote trusted information, complement our existing fact-checking programs and allow for people to express themselves freely — without having Facebook be the judge of what is true," Guy Rosen, Facebook's vice president of integrity, and Tessa Lyons, the head of news feed integrity, said in a statement.

Since last year's Cambridge Analytica scandal and the continuing fallout from the 2016 U.S. election, in which the company was blamed for allowing voters to be improperly influenced, Facebook has regularly hosted media calls to discuss its role in reducing the spread of misinformation and harmful content.

To get a handle on misinformation that's being distributed by video, Facebook is expanding its fact-checking program with the Associated Press. That effort will also cover Spanish-language content.

The updates come about a month after Facebook was criticized for its handling of a livestream showing a massacre at a mosque in New Zealand. The company added a feature in 2017 to filter harmful content, but Facebook founder and CEO Mark Zuckerberg said the technology failed to flag the livestream.

Facebook is also making changes to its other properties. Instagram is updating its algorithm to crack down on posts deemed inappropriate even if they don't technically violate its guidelines. For example, posts with sexually suggestive content won't appear in the Explore tab.

For the Messenger app, Facebook is introducing a verified badge to help people spot scammers and users pretending to be someone else. Users will also get more blocking capabilities and additional control over who can message them.
