You've probably noticed a lot less of what matters to you in your newsfeed and a lot more of what matters to Facebook.

When it comes to the news, the media’s agenda isn’t revealed simply by “how” a story is covered but by “what” stories are covered.

When it comes to social media, the same standard applies.

Take Facebook. The company’s stated mission is “to give people the power to share and make the world more open and connected. People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.”

But if you’ve paid attention to your newsfeed lately, you’ve probably noticed a lot less of what matters to you and a lot more of what matters to Facebook.

What gives?

Facebook algorithms are nothing new. As Hootsuite explains, Facebook’s original criteria for determining what showed up in users’ newsfeeds included three things: “Affinity: how close is the relationship between the user and the content or its source? Weight: what type of action was taken on the content? Time Decay: how current is the content — how recently was it posted?”
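Those three criteria can be sketched as a simple scoring formula. (This is a hypothetical illustration of the idea, not Facebook’s actual code, which was never made public; the function name and numbers here are invented for demonstration.)

```python
import math

def edgerank_score(affinity, weight, age_hours, decay_rate=0.1):
    """Hypothetical EdgeRank-style score: closer relationships,
    heavier interaction types, and fresher posts all rank higher."""
    time_decay = math.exp(-decay_rate * age_hours)  # newer posts decay less
    return affinity * weight * time_decay

# A close friend's fresh photo outranks a distant page's two-day-old link.
fresh_photo = edgerank_score(affinity=0.9, weight=2.0, age_hours=1)
stale_link = edgerank_score(affinity=0.2, weight=1.0, age_hours=48)
```

Under a scheme like this, what you saw was driven mostly by your own relationships and activity, which is precisely what the later changes moved away from.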

Since those good old days, Facebook has vastly expanded its control of your newsfeed.

This is purportedly to deliver you the best experience for your interests (because, you know, Facebook knows better than you what you want to see), but if the past is any indication, it’s also so Facebook can tilt the ideological leaning of your newsfeed.

We’ve written before about Facebook’s suppression of conservative news, but now the company has unveiled a new plan to limit “misleading” posts.

Just this week the company announced: “[W]e’re always working to understand which posts people consider misleading, sensational and spammy so we can show fewer of those and show more informative posts instead.”

(Pause here: It’s worth recalling that Facebook’s fact-checkers include leftist organizations such as The Washington Post, the Associated Press, and Climate Feedback, among others.)

Facebook’s announcement continued: “We hear from our community that they’re disappointed when they click on a link that leads to a web page containing little substantive content and that is covered in disruptive, shocking or malicious ads. … Starting today, we’re rolling out an update so people see fewer posts and ads in News Feed that link to these low-quality web page experiences.”

If Facebook’s judgment of a story’s validity rests on the quality of its web-page experience and the absence of disruptive ads, the company is losing legitimacy pretty quickly.

Read between the lines, though, and Facebook can use this update to censor web pages that aren’t ideologically aligned with the social media behemoth. Not toeing the leftist line? Well, you have a poor web-page experience.

Still, even this explanation can’t account for some of the changes. The Chicago Tribune’s Deputy Editor for Digital News, Kurt Gessler, recently explained how Facebook’s changes have sent the Tribune’s Facebook reach plunging. Regardless of one’s viewpoint, the Tribune’s online presence is by any reasonable standard a legitimate web page.

Gessler writes that beginning in January of this year, the paper began noticing a “fairly significant change” in its post reach. After studying the data and possible contributing factors, Gessler concluded the change was due to Facebook itself: The social media giant’s algorithm wasn’t “surfacing” one-third of the Tribune’s posts.

Anecdotally, we in our humble shop have noticed the same thing with our own posts over the last few months.

Of course, as a private company, Facebook is free to censor or promote anything it wants. However, Facebook became the giant it is because we, the users, made it so. The only reason Facebook is worth billions of dollars is that so many of us put our own ideas and interests on the platform. And some of us want to see our favorite pages — not Facebook’s favorites — when we scroll through our newsfeeds.

Yet although a private company, Facebook is behaving curiously like the government — claiming it knows best what we want and insisting on controlling the information we receive on the platform. While this may be an attempt to drive up its stock prices, Facebook doesn’t enjoy the government’s “benefit” of being able to simply force revenue from us. Eventually, if Facebook doesn’t resume serving its users, the same people who built the giant will walk away, and the giant will fall.