Facebook tweaked the algorithm to its all-important News Feed again this morning. Granted, the company tinkers with its engine more often than a neurotic Nascar mechanic. But this adjustment is worth paying attention to, because it's going to fill your News Feed with familiar faces—and that's not necessarily a good thing.

"We are updating News Feed over the coming weeks so that the things posted by the friends you care about are higher up in your News Feed," Facebook engineering director Lars Backstrom wrote. That sounds simple enough, but what it really means is the feed will promote content from your friends over content from publishers.

"This is effectively a vote against making too big of a deal of news versus everything else," says Rick Edmunds, a media business analyst at the Poynter Institute.

This is big, for two reasons. First, many media organizations rely on Facebook for a significant amount of traffic and, by extension, advertising revenue. Second, the News Feed is finite. Making more room for one type of post necessarily squeezes out another. That combination could spell trouble for publishers. More importantly, it means you'll almost certainly see an even less diverse range of opinions and news than you already do.

The News Feed has long been an echo chamber, but at least the door to that chamber was open a crack. This change all but closes it.

On the Media

The impact of this change could be devastating to digital media. Facebook provided 39 percent of inbound traffic referrals last year, according to Parse.ly, placing it well ahead of Google. But you don’t need statistics to see how Facebook charts the course others must follow. No one really bothered with live video until Facebook Live. Now everyone's creating videos—until Facebook loses interest or stops paying publishers big bucks to participate. It was the same story with Instant Articles. Few outlets expressed interest in publishing directly to the platform, until Facebook all but forced them to.

So it's natural that Backstrom's comment that "this update may cause reach and referral traffic to decline for some Pages" stirred up more than a little despair among publishers. No one's asking you to shed tears for the media, but it's important to understand the impact Facebook's decision has on it, because you'll feel it too.

Simply put, you're going to see stories from the publishers you like far less often, unless those publishers pay Facebook to place them in your feed. Instead, you’ll see stories that your friends and family link to. Which means, in turn, that your News Feed will almost certainly become a place of even greater comfort and conformity.

Echo (Echo Echo)

If you don’t rely on Facebook for your news, you’re in the minority. Two in three Facebook users get their news from the platform, according to Pew Research. That's nearly half of the US population.

News outlets often have inherent political leanings, and no one would confuse Mother Jones with Breitbart. But many of them at least aspire to something approaching objectivity, or offer dissenting views. And they cover topics and write stories that aren't necessarily fun to share—make way for nuanced geopolitical analysis!—but are vital to understanding the world outside our own small slice of it.

Facebook’s problem with presenting news stories, though, is that users sometimes assume that any perceived bias in the content reflects a parallel bias on Facebook’s part. Look no further than Gizmodo’s report that the political leanings of the editors in charge of Facebook’s “Trending” module shaped which stories were shared and which news sources were considered reliable. The backlash was swift, fierce, and ongoing. This change to News Feed reads like a direct response to it.

“We are not in the business of picking which issues the world should read about,” Facebook VP of product management Adam Mosseri wrote today in another announcement. “We don’t favor specific kinds of sources—or ideas. Our aim is to deliver the types of stories we’ve gotten feedback that an individual person most wants to see.”

It's little surprise that the stories people most want to see tend to be the stories that reinforce their beliefs. A review of 10.1 million Facebook accounts in the US confirmed that, and found that Facebook's algorithms suppress opposing content at a lower rate than a user's own choices. It's inevitable, then, that changing the algorithm to favor posts by friends and family will further limit your exposure to opposing viewpoints and perspectives.

The Internet has long struggled with finding objectivity. It's so easy to find ideas that reinforce yours, and ignore the ones that don't. Today, Facebook abdicated any responsibility for striking a balance. “We view our work as only 1 percent finished,” Mosseri wrote. That’s bad news for the media, but even worse news for you.