Great, how am I supposed to remember all of this?

You don’t have to. But you can start by remembering these four giant problems our brains have evolved to deal with over the last few million years (and maybe bookmark this page if you want to occasionally reference it for the exact bias you’re looking for):

1. Information overload sucks, so we aggressively filter. Noise becomes signal.
2. Lack of meaning is confusing, so we fill in the gaps. Signal becomes a story.
3. We need to act fast lest we lose our chance, so we jump to conclusions. Stories become decisions.
4. This isn’t getting easier, so we try to remember the important bits. Decisions inform our mental models of the world.

In order to avoid drowning in information overload, our brains need to skim and filter insane amounts of information and quickly, almost effortlessly, decide which few things in that firehose are actually important and call those out.

In order to construct meaning out of the bits and pieces of information that come to our attention, we need to fill in the gaps, and map it all to our existing mental models. In the meantime we also need to make sure that it all stays relatively stable and as accurate as possible.

In order to act fast, our brains need to make split-second decisions that could impact our chances for survival, security, or success, and feel confident that we can make things happen.

And in order to keep doing all of this as efficiently as possible, our brains need to remember the most important and useful bits of new information and inform the other systems so they can adapt and improve over time, but no more than that.

Sounds pretty useful! So what’s the downside?

In addition to the four problems, it would be useful to remember these four truths about how our solutions to these problems have problems of their own:

1. We don’t see everything. Some of the information we filter out is actually useful and important.
2. Our search for meaning can conjure illusions. We sometimes imagine details that were filled in by our assumptions, and construct meaning and stories that aren’t really there.
3. Quick decisions can be seriously flawed. Some of the quick reactions and decisions we jump to are unfair, self-serving, and counter-productive.
4. Our memory reinforces errors. Some of the stuff we remember for later just makes all of the above systems more biased, and more damaging to our thought processes.

If we keep in mind the four problems with the world and the four consequences of our brain’s strategies for solving them, the availability heuristic (and, specifically, the Baader-Meinhof phenomenon) will ensure that we notice our own biases more often. If you visit this page to refresh your memory every once in a while, the spacing effect will help underline some of these thought patterns, so that our bias blind spot and naïve realism are kept in check.

Nothing we do can make the four problems go away (until we have a way to expand our minds’ computational power and memory storage to match that of the universe), but if we accept that we are permanently biased, and that there’s room for improvement, confirmation bias will continue to help us find evidence that supports this, ultimately leading us to a better understanding of ourselves.

“Since learning about confirmation bias, I keep seeing it everywhere!”

Cognitive biases are just tools, useful in the right contexts, harmful in others. They’re the only tools we’ve got, and they’re even pretty good at what they’re meant to do. We might as well get familiar with them and even appreciate that we at least have some ability to process the universe with our mysterious brains.

A couple of days after posting this, John Manoogian III asked if it would be okay to do a “diagrammatic poster remix” of it, to which I of course said YES. Here’s what he came up with: