If the results of the 2015 General Election did not reflect the conversation on your News Feed, Facebook wants you to know it's mostly your fault -- not the company's.

The social network reports in a study published in Science that the choices a user makes about who they follow have a greater impact on the political tone of their News Feed than its own content algorithms do.

The inner workings of Facebook's News Feed have been controversial for years; rather than showing you a basic timeline of every post from your friends, the site generates a selection of posts algorithmically, based on what it thinks you will find most engaging. There are many factors that determine what you see, including what you have clicked on in the past and what you tend to ignore.

One downside of this system, however, is that it can have the effect of creating a classic "echo chamber" -- showing you only those posts which reflect your own attitudes and opinions. A 2011 TED talk by activist Eli Pariser summed up the potential dangers of this system: "if algorithms are going to curate the world for us," Pariser said, "then... we need to make sure that they also show us things that are uncomfortable or challenging or important".


In an attempt to gauge the bias effect caused by its algorithm, Facebook conducted a study of 10.1 million users and how they encountered news on the site. The study, by Facebook data scientists Eytan Bakshy, Solomon Messing and Lada Adamic, examined only profiles which stated a political opinion -- something fewer than one in ten people do -- and quantified the rate at which news from the opposite side of the liberal-conservative spectrum appeared in their feeds, compared to how often it was actually shared. To do this the researchers used keywords to distinguish "hard" from "soft" news, and assigned each story a numerical alignment score based on the political affiliations of the people sharing it.

The study said that an average of 23 percent of a politically partisan user's friends identify with the opposite end of the spectrum. (Liberals' conservative friends share less news, but only slightly.) It also found that Facebook does curate that news to show up less often in a politicised user's feed -- liberals see conservative articles 8 percent less often than they would if the selection were purely random. Conservatives are exposed to liberal news 5 percent less often.

In practice, however, the effect is slight; news from the opposite side shows up only 1 percent less often than it would were Facebook's algorithm not adjusting the figures. Crucially, the study also found people click on politically challenging news less often than they see it -- about 23 percent of the news in liberals' feeds is conservative, but such stories account for only around 20 percent of their clicks. For conservatives, 34 percent of the posts they see are liberal, but they click on only a 29 percent share.

Their conclusion? People do most of their news and bias curation themselves. "Individuals are exposed to more cross-cutting discourse in social media [than] they would be under the digital reality envisioned by some," the study says. Compared to algorithmic ranking, "individuals' choices about what to consume had a stronger effect limiting exposure to cross-cutting content".

"Of course, we do not pass judgment on the normative value of cross-cutting exposure," the authors add. "Though normative scholars often argue that exposure to a diverse 'marketplace of ideas' is key to a healthy democracy, a number of studies find that exposure to cross-cutting viewpoints is associated with lower levels of political participation. Regardless, our work suggests that the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals."

Reacting to the study, which took aim at the "filter bubble" argument of his 2011 TED talk, Eli Pariser wrote on Medium that Facebook had overstated the results in its press outreach. "The fact that the algorithm's narrowing effect is nearly as strong as our own avoidance of views we disagree with suggests that it's actually a pretty big deal," he said. "Each algorithm contains a point of view on the world. Arguably, that's what an algorithm is: a theory of how part of the world should work, expressed in math or code. So while it'd be great to be able to understand them better from the outside, it's important to see Facebook stepping into that conversation. The more we're able to interrogate how these algorithms work and what effects they have, the more we're able to shape our own information destinies."