Facebook says it will start prioritizing news from outlets that its users think are "trustworthy."

The change comes amid criticism that the social network helps to spread misinformation.

Facebook's CEO said the company wasn't "comfortable" deciding for itself whether a news outlet is reliable.
Facebook says it will start sorting news sources by how "trustworthy" its users think they are — a major change as the social media giant continues to come under fire over the spread of misinformation on its platform.

On Friday, the company announced in a blog post that it would alter the algorithm for picking news to show in its News Feed based on whether the news is considered "trustworthy," whether it is "informative," and whether it is "relevant to people's local community."

Facebook won't be assessing the trustworthiness of news outlets itself. Instead, users are being polled on which outlets they believe to be trustworthy, and that data will be used to rank outlets, said Adam Mosseri, Facebook's head of News Feed.

He wrote in the blog post: "We surveyed a diverse and representative sample of people using Facebook across the US to gauge their familiarity with, and trust in, various different sources of news. This data will help to inform ranking in News Feed."

But there are concerns that this approach could prioritize partisan sources of information. For example, a right-leaning user polled by Facebook might rate CNN extremely untrustworthy but rank a right-wing blog far higher — even if CNN is, in reality, a more accurate source of information about current affairs.

In short: "Trustworthy" is not the same as "accurate."

Earlier this month, Facebook announced major changes to the News Feed to prioritize updates from friends and family while de-emphasizing news and brands, a move aimed at fostering what CEO Mark Zuckerberg called "meaningful interaction."

Meanwhile, Facebook — and the broader tech industry — has come under a barrage of criticism over its impact on society, from its role in spreading Russian propaganda and misinformation during the 2016 US presidential election to its effects on the mental health of children.

"We feel a responsibility to make sure our services aren't just fun to use, but also good for people's well-being," Zuckerberg wrote in a blog post last week.

In a post on Friday outlining the latest change and the rationale for it, the CEO said Facebook wasn't "comfortable" assessing the trustworthiness of news outlets itself and that asking outside experts wouldn't be "objective." Instead, the company settled on community feedback as the most objective approach.

He wrote: "The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking.

"We decided that having the community determine which sources are broadly trusted would be most objective."