In our offline lives, people tend to surround themselves with others who share their beliefs, thoughts and feelings. Sure, your family or coworkers may not share your exact ideology, but we tend to gravitate toward people who think like us. Online, ideologically insulating yourself is even easier—on social networks, you may very rarely bump into ideas and thoughts you disagree with.

This political polarization could be partly Facebook's doing. After all, its algorithms are fine-tuned to show us content we're going to love. But it could also be our own doing because we don't seek out or click on stories that conflict with our beliefs. So who's to blame, the social network or the social networker?

A new study out today, conducted by information scientists at Facebook, tries to answer the question of how much we build these ideological bubbles ourselves, and how much of the bubble is Facebook's creation. By parsing the data of 10 million Facebook users in the U.S., the scientists found that Facebook's News Feed algorithm modestly decreases the number of politically discordant links and articles you'll see and click on: by about 5 to 8 percent, depending on your political ideology. But we isolate ourselves to a much larger degree, the scientists found.

"You have to occasionally engage with people with different views to understand them."

"There's a growing concern that social media platforms like Facebook and Twitter allow us to more precisely engineer our informational environments than ever before, so we only get info that's consistent with our prior beliefs," says David Lazer, a political and computer scientist at Northeastern University who authored a commentary on the paper in the journal Science.

"The issue is that, in a democracy like ours, occasionally bumping into opposing points of view is a good thing. You have to occasionally engage with people with different views to understand them… rather than demonize them."

In the study, a trio of information scientists at Facebook analyzed the linking, clicking, and viewing data of 10 million Facebook users in the U.S. who publicly share their political ideology. Dissecting this data, the scientists immediately identified an ideological bias across friendships. On average, only 20 percent of a self-described conservative's friends were liberal-leaning, and the reverse held true as well. (Not a big surprise to many Facebook users.)

The researchers' second step was to see which news websites were shared by left- or right-leaning users, and to rank those websites accordingly. They found that websites like Fox News were shared largely by conservative users, and sites like The Huffington Post by liberal ones. Again, no surprise there. (Lazer notes a possible issue with this website-wide ranking system: "A conservative Op-Ed on The New York Times, which was found to be modestly left-leaning, would be considered liberal," he says.)
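
To get a feel for how such a website-level alignment score might work, here's a minimal sketch in Python. The scoring scheme (liberal = -1, conservative = +1, averaged over everyone who shared a link to the domain) and the sample data are illustrative assumptions, not the paper's exact method or its numbers.

```python
from collections import defaultdict

# Hypothetical share records: (domain, sharer_ideology),
# where ideology is -1 (self-described liberal) or +1 (conservative).
shares = [
    ("foxnews.com", +1), ("foxnews.com", +1), ("foxnews.com", -1),
    ("huffpost.com", -1), ("huffpost.com", -1), ("huffpost.com", +1),
    ("nytimes.com", -1), ("nytimes.com", -1), ("nytimes.com", +1), ("nytimes.com", -1),
]

by_domain = defaultdict(list)
for domain, ideology in shares:
    by_domain[domain].append(ideology)

# A domain's alignment is the mean ideology of its sharers:
# near +1 means shared mostly by conservatives, near -1 mostly by liberals.
for domain, ideologies in by_domain.items():
    alignment = sum(ideologies) / len(ideologies)
    print(f"{domain}: {alignment:+.2f}")
```

Note that the score attaches to the whole domain, which is exactly the limitation Lazer flags: an individual conservative op-ed on a modestly left-leaning site would inherit the site's liberal score.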

Having ranked each website, the Facebook researchers could see how often users clicked on links to sites with political views different from their own. They also tracked how the site's News Feed algorithm affected users' ability to see or click on politically challenging links. "Many people may not realize it, but Facebook sorts what you'll see, to keep people engaged," Lazer says. "If you saw everything all your friends posted on Facebook, I guarantee you it would be deadly boring."
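
As a rough mental model of how engagement-based sorting can shrink cross-cutting exposure without ever inspecting politics, consider this toy feed ranker. Everything here, the scoring, the data, the two-slot "feed," is a hypothetical illustration, not Facebook's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    crosscutting: bool           # conflicts with this user's ideology?
    predicted_engagement: float  # e.g., inferred from past click behavior

# What the user's friends shared (the candidate pool).
candidates = [
    Story("Agreeable take A",  False, 0.9),
    Story("Agreeable take B",  False, 0.7),
    Story("Discordant take C", True,  0.4),
    Story("Discordant take D", True,  0.3),
]

# Rank purely by predicted engagement; if agreeable stories get more
# engagement, discordant ones sink below what the user actually sees.
feed = sorted(candidates, key=lambda s: s.predicted_engagement, reverse=True)
top_of_feed = feed[:2]

pool_share = sum(s.crosscutting for s in candidates) / len(candidates)
feed_share = sum(s.crosscutting for s in top_of_feed) / len(top_of_feed)
print(f"Cross-cutting: {pool_share:.0%} of what friends shared, "
      f"{feed_share:.0%} of what the feed surfaces")
```

Comparing those two fractions, what friends shared versus what the ranked feed surfaced, is the basic shape of the measurement the researchers made.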

"In a democracy like ours, occasionally bumping into opposing points of view is a good thing."

The Facebook researchers found that people largely created their ideological bubbles themselves: without any help from the algorithm, they ignored or scrolled past stories from sites they'd tend to disagree with. But Facebook's behind-the-scenes curating still had an effect, causing people to see roughly 5 to 8 percent fewer ideologically discordant stories.
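
To make that 5-to-8-percent figure concrete, here's the arithmetic with made-up numbers (the inputs are hypothetical, not the study's data): compare the share of discordant stories among everything friends posted with the share among what the ranked feed actually displayed.

```python
# Hypothetical inputs for illustration only.
shared_crosscutting = 0.220  # fraction of discordant links among all friend posts
fed_crosscutting    = 0.203  # fraction of discordant links in the ranked feed

# Relative reduction attributable to the algorithm's sorting.
reduction = 1 - fed_crosscutting / shared_crosscutting
print(f"Algorithmic reduction in cross-cutting exposure: {reduction:.1%}")  # ~7.7%
```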

The algorithm's added effect might be small, Lazer says, but it shouldn't be ignored. These personalized algorithms "have largely invisibly permeated our virtual world," he says. And because Facebook's social algorithms are constantly changing (including a big update last month), such changes could drastically increase or decrease this ideological bubble effect.
