Facebook to let users rank what news sources are trustworthy


SAN FRANCISCO — What is a credible source of news? Facebook plans to let its users decide.

Responding to charges that it did not do enough to curb the viral spread of disinformation and divisive messages from Russian operatives, CEO Mark Zuckerberg says Facebook will boost the sharing of high-quality news by allowing its U.S. users to rank which news organizations are most trustworthy to deliver it to their news feeds.

Publications trusted by a broad cross-section of Facebook users will get priority over those with low trust ratings. The change, which will rely on surveys of Facebook users, was scheduled to roll out Monday.

For Facebook, which has long resisted becoming an arbiter of what's fact and what's fiction, the introduction of a trustworthiness score for news outlets represents a major philosophical shift.

"There’s too much sensationalism, misinformation and polarization in the world today," Zuckerberg wrote in a Facebook post late Friday. "We decided that having the community determine which sources are broadly trusted would be most objective."

Pressure has been building on Facebook and its CEO as the toxic content flowing through Facebook — violent live videos, false news articles and Russian operatives trying to stir political unrest and influence the 2016 U.S. presidential campaign — has been blamed for ripping holes in the social fabric.

Zuckerberg recently declared that his personal challenge for the year, which in the past has run the gamut from learning Mandarin to slaughtering his own meat, would be to fix what ails Facebook. A big part of that effort, he said, would be "making sure that time spent on Facebook is time well spent."

The role Facebook plays in what news people consume has dramatically expanded in recent years. About 45% of U.S. adults get their news there, according to a Pew Research Center survey.

News will get a slightly less prominent role, making up about 4% of the news feed, down from 5%, under sweeping changes announced last week that Facebook says will promote meaningful social interactions over aimless scrolling. Soon, Facebook says, you will see more status updates from friends and family and fewer articles and videos.

Experimenting with what news gets shared on Facebook can have unexpected and unintended consequences. And polling users on what news sources are trustworthy could favor partisan outlets that reflect the personal beliefs of users over the more objective efforts of well-known publications.

Some reacted to the news with skepticism.

"Policing this is going to be a nightmare for Facebook and publishers are going to go batty trying to game it," Jessica Lessin, founder and editor-in-chief of the technology news service The Information, said in a tweet.

Nicco Mele, director of the Shorenstein Center on Media, Politics and Public Policy at Harvard University, told the Wall Street Journal that even if Facebook isn’t taking sides, relying on users to judge the quality of news could backfire.

"You may end up with reality television," Mele said.


Facebook was sharply criticized in the aftermath of the 2016 presidential election for helping creators of false news quickly and cheaply expand their reach, confusing voters. Previous attempts to crack down on "fake news," such as partnering with fact-checkers and using labels to highlight "disputed" news sources, stumbled.

Executives have regularly backed away from claims that Facebook, as one of the world's most influential information portals, acts as a media organization — with the responsibility to make editorial decisions that goes along with that mantle.

Instead, Zuckerberg has insisted that Facebook is a technology company, if a new breed of one, and has leaned on other avenues to crack down on misleading information.

"The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem," wrote Zuckerberg. "Or we could ask you — the community — and have your feedback determine the ranking."