Facebook is turning to its users to help fight misinformation and fake news.

Starting next week, the company will begin allowing its users to rate news sources they see as the most credible and trustworthy so it can better rank and prioritize those sources in its News Feed.

In a post on Friday afternoon, Facebook CEO Mark Zuckerberg described the new strategy as a "big change" and a "major update" that will ultimately lower the percentage of news inside News Feed from 5% to 4%.

"The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division," Zuckerberg wrote in a company blog post. "We decided that having the community determine which sources are broadly trusted would be most objective."

The move comes roughly a week after Facebook announced it would dramatically overhaul its core News Feed product to focus more on engaging content from friends and family rather than "passive" content like text-laden videos or low-quality news.

Friday's change is notable, however, given Facebook's fear of being perceived as an arbiter of news quality. The company has shied away from ranking publications on credibility and instead attempted to combat misinformation by employing stopgap measures, including independent fact-checkers. Indeed, Facebook is going to great lengths to assure skeptics that its users and not its employees will be the ones making the value judgments. In an interview with the Wall Street Journal, Facebook's head of News Feed, Adam Mosseri, noted that the company was entering "interesting and tricky" territory but that "the important distinction is that we're not actually deciding what is trusted and what is not — we're asking our community to decide."

The move could disproportionately benefit publications that publish less contentious news and shy away from covering subjects like politics or race. Neutral publishers — the USA Todays of the world — might hold an advantage over publications with credible reputations that frequently report on controversial issues.

Questions abound. Will Facebook rank publications with a score or along a spectrum — X publication is Facebook's most trusted — or will the surveys be used to divide publications into buckets to screen out likely peddlers of misinformation?

Facebook says the surveys will come from a cross section of its users, but it's unknown just how many users it plans to survey or how it will determine that cross section. It's unclear if Facebook will build in controls to counteract ideological bias — given that legitimate news outlets on both sides of the political spectrum are often attacked or dismissed by ideological enemies as "fake news." Similarly, it's unclear if the platform will be able to safeguard survey results from bad actors and trolls (or even paid activists) looking to down-rank specific publications despite their credibility.

For critics of the platform, the move will likely be received as yet another abdication of responsibility for a problem that it helped to create. By surveying users, Facebook is hoping to avoid claims of censorship and bias by punting one of the existential questions facing the future of the platform to its users. It's also betting a whole lot on the notoriously fickle and tribal news judgment of the masses. But 2 billion people can't be wrong — can they?
