One week after it emerged that Facebook had finally thrown in the towel on free speech and would henceforth mute accounts it deems "untrustworthy", leaving open the question of just how Zuckerberg would decide who is or isn't trustworthy, on Friday we got the answer.

In a blog post, the company revealed its plans to start ranking news sources in its feed based on user evaluations of credibility, in what Facebook said will be a step in its effort to fight "false and sensationalist information", and which will also push the company even deeper into a role it has long sought to avoid: that of what the WSJ called a "content referee", and which we would define even more simply: the internet's biggest censor.

This is how the company explained the biggest change in its content aggregation and distribution in years:

Last year, we worked hard to reduce fake news and clickbait, and to destroy the economic incentives for spammers to generate these articles in the first place. But there is more we can do. In 2018, we will prioritize:

News from publications that the community rates as trustworthy

News that people find informative

News that is relevant to people’s local community

To be clear, you can still decide which stories appear at the top of News Feed with our See First feature.

Last week, we announced major changes to News Feed that are designed to help bring people closer together by encouraging more meaningful connections on Facebook. As a result, people will see less public content, including news, video and posts from brands. Today we’re sharing the second major update we’re making this year: to make sure the news people see, while less overall, is high quality. Starting next week, we will begin tests in the first area: to prioritize news from publications that the community rates as trustworthy.

How? We surveyed a diverse and representative sample of people using Facebook across the US to gauge their familiarity with, and trust in, various different sources of news. This data will help to inform ranking in News Feed. We’ll start with the US and plan to roll this out internationally in the future.

How will Facebook decide which is an "informative" and "trustworthy" source? It will open it up to its audience, which simply means that the echo chamber in which any one user finds themselves will only get bigger.

When we rank and make improvements to News Feed, we rely on a set of core values. These values — which we’ve been using for years — guide our thinking and help us keep the central experience of News Feed intact as it evolves. One of our News Feed values is that the stories in your feed should be informative. For informative sources, we will continue to improve on the work we first announced in August 2016, where we began asking people to rank the informativeness of updates in their feed on a scale of one to five. We’re evaluating ways to expand this work to more areas this year.

As the WSJ adds, "the social-media giant will begin testing the effort next week by prioritizing news reports in its news feed from publications that users have rated in Facebook surveys as trustworthy, executives said Friday. The most “broadly trusted” publications—those trusted and recognized by a large cross-section of Facebook users—would get a boost in the news feed, while those that users rate low on trust would be penalized. The change only applies to U.S. users, though Facebook plans to roll it out internationally later."

The transition comes at a time when Facebook also intends to reduce the presence of news in favor of what it calls “meaningful” interactions on the platform.

This shift will result in news accounting for about 4% of the posts that appear in users’ feeds world-wide, down from the current 5%, Facebook Chief Executive Mark Zuckerberg said in a post Friday.

Naturally, those most impacted by the change will be publishers, and especially those who are found to be "uninformative" by Facebook. Here's what the company said its new policy will mean for publishers:

We’ll be working on these efforts for the rest of the year. For the first change in the US next week, publications deemed trustworthy by people using Facebook may see an increase in their distribution. Publications that do not score highly as trusted by the community may see a decrease.

In a statement, CEO Mark Zuckerberg said the change is necessary to address the role of social media in "amplifying sensationalism, misinformation and polarization."

“That’s why it’s important that News Feed promotes high quality news that helps build a sense of common ground.”

Translation: Facebook will now only amplify and disseminate content that the majority of its users want to hear, while actively drowning out any minority or variant perceptions, and certainly any sources of content that Facebook - and its users - disagree with. And so the echo chamber of passive-aggressive censorship is about to become huge.

Adam Mosseri, the Facebook executive who oversees its news feed, acknowledged that the company was wading into “tricky” territory by weighting publishers based on user trust.

“This is an interesting and tricky thing for us to pursue because I don’t think we can decide what sources of news are trusted and what are not trusted, the same way I don’t think we can decide what is true and what is not,” Mr. Mosseri said in an interview.

He added, however, that Facebook engineers themselves weren’t taking a stance on credibility because the company relied on its users to provide a value judgment. He compared the approach with Facebook’s reliance on third-party fact-checkers to determine whether or not an article is completely fabricated.

“The important distinction is that we’re not actually deciding what is trusted and what is not—we’re asking our community to decide,” Mr. Mosseri said in the interview. “We are asking people what they trust and what they don’t trust and acting on that data—as opposed to us deciding.”

Because asking the clearly biased Politifact and Snopes "fact-checkers" to provide an objective view somehow washes Facebook's hands of censorship. As for asking a deeply polarized audience to rate the media world as "informative" and "objective"... well, good luck with that.

That said, it could always be worse: Facebook could just pull a Twitter and shadow ban or censor any non-liberal users.

Finally, here is Mark Zuckerberg's blog post explaining the radical change in Facebook's model.

Continuing our focus for 2018 to make sure the time we all spend on Facebook is time well spent...

Last week I announced a major change to encourage meaningful social interactions with family and friends over passive consumption. As a result, you'll see less public content, including news, video, and posts from brands.

After this change, we expect news to make up roughly 4% of News Feed -- down from roughly 5% today. This is a big change, but news will always be a critical way for people to start conversations on important topics.

Today I'm sharing our second major update this year: to make sure the news you see, while less overall, is high quality. I've asked our product teams to make sure we prioritize news that is trustworthy, informative, and local. And we're starting next week with trusted sources.

There's too much sensationalism, misinformation and polarization in the world today. Social media enables people to spread information faster than ever before, and if we don't specifically tackle these problems, then we end up amplifying them. That's why it's important that News Feed promotes high quality news that helps build a sense of common ground.

The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking. We decided that having the community determine which sources are broadly trusted would be most objective.

Here's how this will work. As part of our ongoing quality surveys, we will now ask people whether they're familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don't follow them directly. (We eliminate from the sample those who aren't familiar with a source, so the output is a ratio of those who trust the source to those who are familiar with it.)

This update will not change the amount of news you see on Facebook. It will only shift the balance of news you see towards sources that are determined to be trusted by the community.

My hope is that this update about trusted news and last week's update about meaningful interactions will help make time on Facebook time well spent: where we're strengthening our relationships, engaging in active conversations rather than passive consumption, and, when we read news, making sure it's from high quality and trusted sources.
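The survey arithmetic Zuckerberg describes can be sketched in a few lines. This is purely a hypothetical illustration of the stated ratio (trusters divided by those familiar with a source); Facebook has not published its actual methodology, and the function and field names below are invented:

```python
def trust_score(responses):
    """Compute a 'broad trust' score from survey responses.

    Each response is a dict like {"familiar": bool, "trusts": bool}.
    Per the blog post's description, respondents unfamiliar with the
    source are dropped from the sample, so the score is the ratio of
    those who trust the source to those who are familiar with it.
    """
    familiar = [r for r in responses if r["familiar"]]
    if not familiar:
        return None  # nobody in the sample has heard of the source
    return sum(r["trusts"] for r in familiar) / len(familiar)


# Example: 10 respondents; 6 are familiar with the source, 3 of whom
# trust it. The 4 unfamiliar respondents are excluded, giving 3/6.
responses = (
    [{"familiar": True, "trusts": True}] * 3
    + [{"familiar": True, "trusts": False}] * 3
    + [{"familiar": False, "trusts": False}] * 4
)
print(trust_score(responses))  # 0.5
```

Note that under this scheme a niche outlet trusted only by its own readers scores the same as a household name, as long as its trust ratio among those who know it is equal, which is presumably why the post stresses "broadly trusted across society."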

Translation: Zuck is setting the stage if not for 2020 then certainly 2024.