There are cognitive biases and thinking errors. Our brains have evolved tendencies that, unfortunately, sometimes lead to errors: our perception is compromised, and our decisions and reactions are based on wrong information, far more often than we'd like.

These can be errors in how the brain absorbs information: interpreting conversations, reading articles, understanding behaviour, understanding the motivations of others, judging rights and wrongs.

The brain has evolved quick tendencies for a reason, but many of them are out of context today. For example, jumping to conclusions such as ‘this food is toxic based on its colour’ may have been useful back in the day, say a million years ago, when our species was in its infancy.

These ‘heuristics’ can highlight day-to-day information in unproductive ways, especially when combined with ingrained beliefs about oneself or others. For example, if you have a tendency to be self-critical, you might interpret social cues unfavourably: a lack of party invitations becomes proof that you aren’t cool enough to hang out with the group.

A cognitive bias occurs because of limited attention to information in the environment, poor memory for events and details, heuristics that simplify information, and a narrow range of experiences. The advantage of such jump-to-conclusion tendencies is SPEED, but speed compromises accuracy. Biology is a product of evolution, not design, so it makes trade-offs; we didn’t get both speed and accuracy like a computer.





A short recap of cognitive biases

Some conclusions are more likely to be wrong than right because they confirm beliefs we already hold: we select bits of information and give those bits undue importance. This is the confirmation bias, the mother of all thinking biases. We favour information that bolsters existing notions.

There are other biases, such as the gambler’s fallacy: we somehow believe that the world likes to balance itself out. If you toss a coin 5 times in a row and get heads every time, what do you think the next toss will yield? Heads? Tails? Most people believe it will be tails. This is wrong. Previous coin tosses have no causal relationship with the next toss; they are independent events. People make the error of thinking that when something happens a lot, the opposite will happen in subsequent events. These errors lead to heavy monetary losses in gambling.
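The independence of coin tosses is easy to check for yourself with a quick simulation. This is a minimal sketch in plain Python (the function name and trial counts are mine, purely illustrative): it simulates many runs of six fair tosses and measures how often the sixth toss is heads after a streak of five heads.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def next_after_streak(num_trials=200_000, streak_len=5):
    """Among sequences starting with `streak_len` heads in a row,
    return the fraction whose next toss is ALSO heads."""
    heads_after_streak = 0
    streaks_seen = 0
    for _ in range(num_trials):
        tosses = [random.random() < 0.5 for _ in range(streak_len + 1)]
        if all(tosses[:streak_len]):        # first five tosses were all heads
            streaks_seen += 1
            heads_after_streak += tosses[streak_len]
    return heads_after_streak / streaks_seen

# Hovers around 0.5: the streak does not make tails "due".
print(round(next_after_streak(), 2))
```

If the gambler’s fallacy were true, the printed fraction would sit well below 0.5; instead it stays near 0.5 no matter how long the preceding streak is.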

Another pervasive bias is the anchoring effect. Nobel laureate Daniel Kahneman and his colleague Amos Tversky conducted an experiment (Kahneman, Thinking Fast and Slow, 2011) in which they asked people the following question: what percentage of African countries are members of the United Nations? Two equal groups were created. One group was first asked, ‘Is it greater or less than 10%?’ The other was first asked, ‘Is it greater or less than 65%?’ The first group answered an average of 25% and the second, 45%. The two questions included anchors, 10% and 65%: numbers that gave people a starting point to think around, build assumptions on, and then answer.

This post isn’t about the biases themselves; it is about thinking clearly. Regardless of what the biases and errors are called or where they manifest, there are ways to counter them. Over 100 biases have been described and observed; they are pervasive. However, you can use the following 8 strategies to think clearly and objectively in spite of these tendencies to jump to wrong conclusions.





8 strategies to think clearly and objectively: How to overcome thinking mistakes that we make

We have a powerful multi-purpose instrument called the brain, and it can be trained with just a little practice. Make a habit of the following:

1. Focus on the data: In any situation that demands decision making, focus on the evidence, even the unflattering kind. Data can be hard to spot, but it can be found with a little effort. Once this becomes a habit, it’s nearly effortless.

2. Seek out contrary data and conclusions: Keep an eye on bad reviews and see if they matter to you. One hundred good reviews are great, but a hundred good reviews and a few bad ones are better. This is your best weapon against the confirmation bias, and perhaps the most important technique on this list: if there is data that supports a notion, find data that doesn’t, or at least try to think in that direction. You’ll have a much clearer picture of everything, and I mean EVERYTHING. In fact, this is at the core of scientific investigation; it is how accurate knowledge is built.

3. Understand the noise: Focus on the important aspects of a problem, not every single aspect. It is hard to filter out the noise, but let me show how recognizing it helps. Suppose you want to find the best influencer on Instagram and you use follower count as your test. You find an influencer with 200K followers and think, ‘I can leverage this.’ But what if that number is noise? There are two levels of noise here. Level one: if most influencers have 150-250K followers, then 200K is merely average; if you want the best, look for numbers outside that typical range. Level two: those followers could be casually interacting around a hype and have no value to you as a fanbase: no sales, no hitting the like button. Noise is background information that is of no use to you.
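The first level of noise, values that sit inside the typical range, can be made concrete with a simple outlier check. This sketch uses made-up follower counts and an arbitrary two-standard-deviation threshold (both are my illustrative assumptions, not real data): anything inside the typical band is treated as noise, and only genuine standouts are flagged.

```python
import statistics

# Hypothetical follower counts (in thousands) for a pool of influencers.
followers_k = [160, 175, 180, 190, 200, 205, 210, 220, 230, 240, 480]

mean = statistics.mean(followers_k)
stdev = statistics.stdev(followers_k)

# Flag accounts more than 2 standard deviations above the mean;
# everything inside that band is just the normal range, i.e. noise.
standouts = [f for f in followers_k if f > mean + 2 * stdev]
print(standouts)  # the 200K account is NOT in this list
```

Note that the 200K account, impressive in isolation, never gets flagged: against the background range it is ordinary, which is exactly the point.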

4. Test and re-test: Consider the following example. You are talking with a friend and they are not friendly. You wonder why: perhaps your friendship is changing, or they had a bad day, or you said something unpleasant. Instead of settling on such conclusions, test and re-test. Try having a similar conversation again, or ask how their day was. Or perhaps you are concluding that your boss is cranky on Wednesdays. Don’t just test this hypothesis on Wednesdays; test it on all days. Maybe your boss is always cranky, or the crankiness was random, say, work stress. This is tricky, because if done wrong, you walk right into the confirmation bias.
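The cranky-boss example can be sketched as a tiny tally. The mood log below is entirely invented for illustration: checking only the Wednesday rows would seem to ‘confirm’ the Wednesday theory, while tallying every day shows the crankiness isn’t Wednesday-specific at all.

```python
from collections import defaultdict

# Hypothetical observations of the boss's mood by weekday (invented data).
observations = [
    ("Mon", "cranky"), ("Tue", "fine"),   ("Wed", "cranky"),
    ("Thu", "cranky"), ("Fri", "fine"),   ("Mon", "cranky"),
    ("Tue", "cranky"), ("Wed", "cranky"), ("Thu", "fine"),
    ("Fri", "cranky"),
]

counts = defaultdict(lambda: [0, 0])  # day -> [cranky count, total observations]
for day, mood in observations:
    counts[day][0] += mood == "cranky"
    counts[day][1] += 1

for day, (cranky, total) in counts.items():
    print(day, f"{cranky}/{total} cranky")
```

In this invented log, Wednesdays are indeed 2/2 cranky, but so are Mondays, and every other day is 1/2: the honest conclusion is ‘often cranky in general’, not ‘cranky on Wednesdays’.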

5. Make educated guesses: Look for anchors. People ask leading questions that contain information priming others to think a certain way. For example, ‘He isn’t that bad a guy.’ Someone is more likely to respond, ‘Yeah, he isn’t that bad’, but the answer could very well have been ‘He is an awesome guy’ if the prompt were ‘He is a pretty good guy’. When trying to make a genuinely educated guess, rethink assumptions, spot the anchors, use data, and work out an answer.

6. Avoid misattributions: Sometimes we are drawn to advertisements by things unrelated to the product itself: images that evoke emotional responses. Try to isolate that emotion; its purpose might be to compensate for a lack of useful content or to amplify a desirable feature. This is closely related to misattribution. Consider mobile applications. Sometimes great restaurants make terrible apps, and reviewers rate the app for what it is: an app. When you see a low rating, ask whether it is for the food or the app. The food can be brilliant AND the app can suck; a low app rating shouldn’t change the perceived quality of the food, yet we misattribute it and assume the food is bad.

7. Have multiple perspectives: You can look at a situation from another person’s point of view (empathy) or even literally look at something from a different angle. In both cases, you get new information, and your opinions could change. It’s easier to think from a real person’s perspective than from an imaginary one. For example, flying can be thought of from a pilot’s, a passenger’s, or a technical assistant’s point of view, but there are many more vantage points: customer care’s, a flying bird’s, an alien’s. You don’t have to know exactly how someone else thinks; your brain will conjure approximations and assumptions that shift your perspective, and that shift is what matters.

8. Assume you don’t know what you don’t know: In many situations, it is not possible to understand the clockwork that leads to a phenomenon. Let go of assumptions and accept that there are factors at play beyond your comprehension: the unknown unknowns. For example, there is a common debate among audiophiles, lay people, musicians, and music technicians about the file size of a song and its quality. A person who knows what a song’s waveform looks like can argue that a larger file has more content in it. That is a valid premise, but it hides an assumption: that the extra content actually translates into better quality. This assumption is wrong. The unknown unknown for many is that it is OK to remove some content, because the brain doesn’t register certain frequency changes; it masks quieter sounds that closely follow louder ones. For all practical purposes, those masked sounds are useless to humans.





The benefits of overcoming cognitive biases and thinking errors

Short answer – better thinking, decision making, and perception.

Longer answer – Humans have advanced, technologically and socially, largely thanks to the prefrontal cortex and the wider frontal lobe, which are implicated in executive functions (Siddiqui, et al. 2008): decision making, planning, problem-solving, complex analysis of situations, and so on. Cognitive biases interfere with these functions. By bringing these errors into awareness and mitigating them, you will process and understand the information around you better. You will know how to make better decisions in stressful and relaxed situations alike. You will shop better, manage your resources better, and have healthier conversations with less misunderstanding. Personal, workplace, and social interactions will improve significantly as you learn to make better judgments.





Did you like this article? If yes, you will love the book The Art of Thinking Clearly by Rolf Dobelli.



Now, I suppose, you’ll have a few strategies in your quiver to make good decisions by overcoming cognitive biases. Have fun thinking objectively!





Read more: 4 cognitive biases that you should be aware of

Resources:

Dobelli, Rolf. 2013. The Art of Thinking Clearly. New York: Farrar, Straus and Giroux.

Kahneman, Daniel. 2011. Thinking Fast and Slow. New York: Farrar, Straus and Giroux.

Tversky, Amos, and Daniel Kahneman. 1974. “Judgment under Uncertainty: Heuristics and Biases.” Science 185: 1124-1131.


Hey! Thank you for reading; I hope you enjoyed the article. I run Cognition Today to paint a holistic picture of psychology, and each article is frequently updated with new research findings. I’m an applied psychologist from Pune, India. I love sci-fi and horror media; I love rock, metal, synthwave, and pop music; I can’t whistle; I can play the guitar.