With the explosive growth of fake news websites, clickbait ‘journalism’, and hoaxes being shared millions of times on social media every day, we are now entering an extremely aggressive period in the ongoing misinformation wars.

At the forefront of this battle lie technological and social media filter bubbles: algorithms and user decisions that change which material appears in your feed or search results, and in what order, to cater to your preferences. This might seem harmless at first, but over time it creates ideological isolation: you are fed material that confirms your own biases from sources you enjoy, while material that counters those biases ends up further down the feed or is not shown at all. It is a hidden form of confirmation bias and contributes to radical polarization in our society.

In the past, understanding ideological filter bubbles was a lot easier. Some people read reputable newspapers like The New York Times, while others read gossip or paranormal magazines. It was easy to tell people that they should think twice about believing that some politician got help from alien invaders to win a local election, because their source was laughably incompetent, generally low-quality and frankly absurd. Although The New York Times has never been a perfect newspaper, it has higher credibility and reliability than a random paranormal magazine about Bigfoot or the Loch Ness Monster.

With the advent of the Internet, things changed. Things changed drastically.

How personalized search skews your world

Google is not just a search engine that passively delivers results to your browser based on how well they match your search query. It also uses a system called Google Personalized Search that customizes and prioritizes search results to fit your previous Internet activity. Google will serve you material that matches things you have searched for and clicked on before. Thus, Google in some sense “learns” your Internet habits and your ideological bent and provides you with material that fits them. Over time, this helps create a filter bubble where material that fits your biases is prioritized over material that goes against them.

Most people have no idea that this is happening. They think they are simply searching for material out there and getting it delivered in a perfectly objective way. In reality, the algorithms work as a hidden technological form of confirmation bias. Some of this personalization is based on browser cookies, some on IP address, and some on criteria we know nothing about, because much of the Google search algorithm is not public.
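The core mechanism can be sketched as a toy re-ranking function. This is a minimal illustration only: the actual Google algorithm is proprietary and vastly more complex, and the titles, topics and scores below are invented for the example.

```python
# Toy illustration of personalized search re-ranking (NOT Google's
# actual algorithm, which is proprietary). The principle: results on
# topics you have clicked before get a hidden boost, others sink.

def personalized_rank(results, click_history):
    """Re-rank (title, topic, base_relevance) tuples, boosting any
    result whose topic appears in the user's click history."""
    def score(result):
        _title, topic, base_relevance = result
        boost = 0.5 if topic in click_history else 0.0  # the hidden bias
        return base_relevance + boost
    return sorted(results, key=score, reverse=True)

results = [
    ("Vaccine myths debunked", "science", 0.8),
    ("Vaccines cause X (hoax)", "conspiracy", 0.6),
]

# With no history, plain relevance wins; with a history of clicking
# conspiracy material, the lower-quality result jumps to the top.
neutral = personalized_rank(results, set())
bubbled = personalized_rank(results, {"conspiracy"})
print(neutral[0][0])  # the science result ranks first
print(bubbled[0][0])  # the hoax ranks first
```

Two users typing the same query can thus receive differently ordered results, and neither sees the re-ranking happen.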

How social media skews your world

Social media websites like Facebook and Twitter work in a similar fashion, but substantially aggravate the problem. Like Google, Facebook uses your past behavior as a guide to what to show you. If you have viewed, liked, reacted to or clicked on certain material, Facebook will prioritize it in the future. Merely viewing something is probably enough, so you cannot escape this filter bubble simply by not liking or clicking on things. Twitter does not personalize results as aggressively as Facebook, but both suffer from an additional kind of filter bubble: you decide who you follow and whose material you see.

Thus, social media websites like Facebook and Twitter combine the worst features of the newspaper bias of the past with the technological filter bubbles of search engines like Google. If you follow accounts of a certain ideological persuasion and view or interact with their material, your social media feed will fill up with such material and crowd out other material over time.
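The two filters compound, which can be sketched like this (again an invented toy model, not any platform's actual feed algorithm): only posts from accounts you follow are eligible at all, and among those, posts on topics you have engaged with are ranked first.

```python
# Toy model of a social media feed: the follow graph filters WHICH
# accounts can appear, and engagement history decides the ORDER.
# Illustrative only; real feed-ranking systems are proprietary.

def build_feed(posts, following, engaged_topics):
    """posts: list of (author, topic) pairs; following: set of
    authors; engaged_topics: set of topics the user interacts with."""
    eligible = [p for p in posts if p[0] in following]  # filter 1: follows
    return sorted(eligible,
                  key=lambda p: p[1] in engaged_topics,  # filter 2: engagement
                  reverse=True)

posts = [
    ("mainstream_news", "politics"),
    ("fringe_account", "conspiracy"),
    ("science_blog", "science"),
]

feed = build_feed(posts,
                  following={"fringe_account", "science_blog"},
                  engaged_topics={"conspiracy"})
# "mainstream_news" is never shown at all, and the conspiracy post
# outranks the science post for this user.
```

Note that the first filter is entirely self-imposed: the unfollowed account is invisible no matter how relevant its posts are, which is why the combination is worse than either mechanism alone.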


How to defeat technological filter bubbles

So you have now realized that you are trapped in an ideological filter bubble. What can you do about it? You start by understanding that they exist. You cannot solve a problem if you do not know (or even deny) that it exists. You also need to accept that it is not just something other people have, but that you might be just as affected by it as your ideological opponents. Finally, you must take many active steps to prevent, lessen or eliminate the technological filter bubbles that are being applied to you.

Know that they exist: the first step towards defeating the hold of technological filter bubbles on your daily experience is knowing that they are there. Knowing that you are intentionally being fed information and sources that conform to and confirm your own ideological biases. A technological filter bubble is, to put it simply, a hidden confirmation bias built into the technological tools you are using. You are not viewing the world as it is, but the world as it is filtered through algorithms. This is especially important when dealing with developing stories of a controversial nature. As the Nobel Prize-winning physicist Richard Feynman once remarked, “[t]he first principle is that you must not fool yourself — and you are the easiest person to fool. So you have to be very careful about that.”

Know that you have them too: the second step is to understand that technological filter bubbles are not something that just other people have. It is something we can know with virtual certainty that you also have. If you use Google search, Facebook, Twitter or other social media platforms, you are in a filter bubble. It is just a fact. That does not mean you are “as bad” as the worst of the worst people on the Internet, but it means that you are facing many of the same challenges and obstacles to objectivity.

Use another search engine: try to find and use search engines that do not personalize results as aggressively based on your past behavior. One such search engine is DuckDuckGo, which claims to avoid filter bubbles. To play it safe, use several different search engines and spend less time on those that aggressively personalize your results.

Make other social media accounts: create an additional social media account that only follows people and pages from outside your filter bubble. Although this does not completely eliminate your own filter bubbles, it shows you a great deal about how filter bubbles can affect other people and ultimately yourself. Another possibility is to start liking and following accounts outside your own filter bubble from your main account, but with the current outrage culture of “you follow the wrong people”, and even social media bots that block you for doing so, this is not really recommended. You might also have a lot of explaining to do to relatives and friends who notice that you suddenly like radical extremist people and accounts. Just make a separate anonymous account, follow people and pages outside your filter bubble from there, and check it regularly (once per day or a few times per week).

Expand diversity of sources: start reading material from a broad spectrum of websites from all kinds of ideological angles and be skeptical of everything you find. Do not read any material with the willingness to believe, but with the willingness to understand and critically evaluate.

Actively seek out disconfirming information: since there is a hidden technological confirmation bias for most of the things you find online, lessen its impact by actively seeking out sources and claims that go against your own beliefs and stuff you find inside your own filter bubble.

Counter misinformation: try to actively counter misinformation regardless of source, even when it comes from sources you like, people you respect or positions you hold.

Consistent skepticism: apply the same amount of skepticism to things that support your beliefs as you do to things that contradict them.

Filter bubbles have been with us for many years, but they came to the forefront of public discussion during the 2016 U.S. presidential election. As we head into an even more aggressive misinformation war, it is more important than ever to be skeptical and think critically about the information you find online.