3 Reasons Why Filter Bubbles Are Distorting Your Reality

And how Media Sifter can help tackle them

The web is filled with enormous amounts of information. Googling ‘Beyonce’ returns about 123,000,000 results, so unless you have time to sift through all of them while looking for the artist’s tour dates in your city, a mechanism that filters out unnecessary pages is genuinely useful.

Likewise, based on your browsing history and likes, if you search for ‘apple’, Google knows whether you are interested in Apple products or the fruit, helping you find relevant articles and advertisements instantly. Based on your clicks and likes, news organisations can recommend articles that may interest you and deepen your understanding of an issue.

While personalized and filtered results help us navigate the jungle of information, they have also created a growing problem, especially on feed-based platforms: filter bubbles. Here are some of the reasons why filter bubbles are problematic for the health of democracy and for personal well-being — concerns that have even worried Bill Gates:

1. Lack of transparency: Nearly all media outlets and social media sites use algorithmic curation, but a significant proportion of people are unaware of this, let alone of how it works. For instance, according to a 2015 study conducted at the University of Michigan, over 60% of Facebook users are unaware of algorithmic curation on their newsfeed, presuming that missing a news story is a result of their own actions, such as a lack of activity on the site. Furthermore, to stay competitive, the logic behind these algorithms is among the best-kept secrets at new media giants such as Google, Facebook and Twitter — even field experts are unable to figure out why certain content is ultimately fed to us. Filter bubbles existed in the era of traditional media too, but they were easier to recognise; most media consumers have a rough idea of the political stance of Fox News, for example. By contrast, personalized newsfeeds on digital platforms are such a mixture of advertisements, promotions and human-curated content that it has become increasingly difficult to tell one from another.

2. They reinforce stereotypes: Even when you log out of Google, the company can track up to 57 signals about you — from the device you are using to your age, income and gender. From these it can infer your stance on vaccinations and other significant personal and health-related issues. Likewise, your Facebook newsfeed looks completely different during election time depending on whether the algorithm classifies you as a liberal or a conservative, and feeds you news accordingly. So rather than expanding your access to information, filter bubbles may reinforce the categorization and stereotyping of personalities.

Complicated, debated issues such as abortion are portrayed drastically differently on a liberal versus a conservative person’s Facebook newsfeed.

3. They can influence political opinions and increase misinformation in the public sphere: As Eli Pariser has pointed out, personalized articles are like information junk food. We may be served articles with entertainment value while missing reportage on unglamorous, structural issues that build up over time, such as deforestation or poverty. Even worse, Facebook has recently become inundated with ‘fake news’ — poorly researched stories built on rumour and hoaxes, fueled by advertising and the attention economy. Fake news distributed on social media channels has been accused of spreading misleading information about the 2016 US presidential election and polarizing voters, which may have contributed to Donald Trump’s victory. Social media channels such as Facebook have tried to tackle the issue, but so far their self-regulatory attempts have remained debatable and insufficient. Often, a story gets flagged only after it has already gone viral and the damage is done.

While we may continue to debate the responsibility of new media companies for algorithmic transparency and accountability, we humans still have skills that algorithms don’t: intuition, critical thinking, and the ability to pair and compare different views. A decentralized platform that pairs news from different sources, and where facts and opinions are checked, can therefore help maintain the quality of information and promote healthy dialogue in the media and the public sphere.

Recent technical innovations, such as blockchain, can further enhance transparency, which also brings significant opportunities to journalism. These are the core concepts Media Sifter is working to address. If you’re interested in joining the debate or discussing these issues, jump on our social channels and talk directly with me and the team.

Join the Community Changing the Rules of the Media Game:

Join the conversation on our Slack and Telegram channels

Follow us for a weekly SIFT of the media

Subscribe to our newsletter

Follow Media Sifter on Reddit and Twitter

For open positions, see angel.co/media-sifter

For more information, visit MediaSifter.co