Building on the recent SSC post Why Doctors Think They’re The Best ...

| What it feels like for me | How I see others who feel the same |
|---|---|
| There is controversy on the subject, but there shouldn't be, because the side I am on is obviously right. | They have taken one side in a debate that is unresolved for good reasons they are struggling to understand. |
| I have been studying this carefully. | They preferentially seek out confirming evidence. |
| The arguments for my side make obvious sense; they're almost boring. | They're very ready to accept any and all arguments for their side. |
| The arguments for the opposing side are contradictory, superficial, illogical or debunked. | They dismiss arguments for the opposing side at the earliest opportunity. |
| The people on the opposing side believe these arguments mostly because they are uninformed, have not thought about it enough or are being actively misled by people with bad motives. | The flawed way they perceive the opposing side makes them confused about how anyone could be on that side. They resolve that confusion by making strong assumptions that can approach conspiracy theories. |

The scientific term for this mismatch is: confirmation bias

| What it feels like for me | How I see others who feel the same |
|---|---|
| My customers/friends/relationships love me, so I am good for them, so I am probably just generally good. | They neglect the customers/friends/relationships that did not love them and have left, so they overestimate how good they are. |
| When customers/friends/relationships switch to me, they tell horror stories about who I'm replacing for them, so I'm better than those. | They don't see the people who are happy with who they have and therefore never become their customers/friends/relationships. |

The scientific term for this mismatch is: selection bias

| What it feels like for me | How I see others who feel the same |
|---|---|
| Although I am smart and friendly, people don't listen to me. | Although they are smart and friendly, they are hard to understand. |
| I have a deep understanding of the issue that people are too stupid or too uninterested to come to share. | They are failing to communicate their understanding, or to give unambiguous evidence that they even have it. |
| This lack of being listened to affects several areas of my life, but it is particularly jarring on topics that are very important to me. | This bad communication affects all areas of their life, but on the unimportant ones they don't even notice that others don't understand them. |

The scientific term for this mismatch is: illusion of transparency

| What it feels like for me | How I see others who feel the same |
|---|---|
| I knew at the time this would not go as planned. | They did not predict what was going to happen. |
| The plan was bad and we should have known it was bad. | They fail to appreciate how hard prediction is, so the mistake seems more obvious to them than it was. |
| I knew it was bad; I just didn't say it, for good reasons (e.g. out of politeness or too much trust in those who made the bad plan), or because it is not my responsibility, or because nobody listens to me anyway. | In order to avoid blame for the seemingly obvious mistake, they are making up excuses. |

The scientific term for this mismatch is: hindsight bias

| What it feels like for me | How I see others who feel the same |
|---|---|
| I have good intuition; even decisions I make based on insufficient information tend to turn out to be right. | They tend to recall their own successes and forget their own failures, leading to an inflated sense of past success. |
| I know early on how well certain projects are going to go or how well I will get along with certain people. | They make self-fulfilling prophecies that directly influence how much effort they put into a project or relationship. |
| Compared to others, I am unusually successful in my decisions. | They evaluate the decisions of others more level-headedly than their own. |
| I am therefore comfortable relying on my quick decisions. | They therefore overestimate the quality of their decisions. |
| This is especially true for life decisions that are very important to me. | Yes, this is especially true for life decisions that are very important to them. |

The scientific term for this mismatch is: optimism bias

Why this is better than how we usually talk about biases

Communication in abstracts is very hard. (See: Illusion of Transparency: Why No One Understands You) Therefore, it often fails. (See: Explainers Shoot High. Aim Low!) It is hard to even notice that communication has failed. (See: Double Illusion of Transparency) Therefore it is hard to appreciate how rarely communication in abstracts actually succeeds.

Rationalists have noticed this. (Example) Scott Alexander uses a lot of concrete examples, and that is probably a major reason why he's our best communicator. Eliezer's Sequences work partly because he uses examples and even fiction to illustrate his points. But when the rest of us talk about rationality, we still mostly talk in abstracts.

For example, this recent video was praised by many for being comparatively approachable. And it does do many things right, such as emphasizing and repeating that evidence alone should not generate probabilities, but should only ever update prior probabilities. But it still spends more than half of its runtime displaying mathematical notation that no more than 3% of the population can even read. For the vast majority of people, only the example it uses can possibly "stick". Yet the video uses its single example as no more than a means of getting to the abstract explanation.

This is a mistake. I believe a video with three to five vivid examples of how to apply Bayes’ Theorem, preferably funny or sexy ones, would leave a much more lasting impression on most people.
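To make "applying Bayes' Theorem" itself concrete, here is a minimal sketch of one such worked example: the classic mammography problem. The function name `posterior` and the specific figures (1% base rate, 80% true-positive rate, 9.6% false-positive rate) are the standard illustrative numbers for this problem, not taken from the video:

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Update a prior probability with one piece of evidence via Bayes' Theorem."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Classic mammography example: 1% of women screened have cancer (prior),
# the test catches 80% of cancers, and falsely alarms on 9.6% of healthy women.
p = posterior(0.01, 0.80, 0.096)
print(round(p, 3))  # 0.078 - a positive test means only ~8% chance of cancer
```

The punchline is exactly the kind of vivid, counterintuitive result that sticks: even after a positive test, the probability of cancer is under 10%, because the evidence updates a small prior rather than replacing it.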

Our highly demanding style of communication correctly predicts that LessWrongians are, on average, much smarter, much more STEM-educated and much younger than the general population. You have to be that way to even be able to drink the Kool-Aid! This makes us homogeneous, which is probably a big part of what makes LW feel tribal, which is emotionally satisfying. But it leaves most of the world stuck with their bad decisions. We need to be Raising the Sanity Waterline, and we can't do that by continuing to communicate largely in abstracts.

The tables above show one way to do better that does the following.

- It aims low - merely to help people notice the flaws in their thinking. It will not, and does not need to, enable readers to write scientific papers on the subject.
- It reduces biases into mismatches between Inside View and Outside View. It lists concrete observations from both views and juxtaposes them. These observations are written in a way that is hopefully general enough for most people to find they match their own experiences.
- It trusts readers to infer from these juxtaposed observations their own understanding of the phenomena. After all, generalizing over particulars is much easier than integrating generalizations and applying them to particulars. The understanding gained this way will be imprecise, but it has the advantage of actually arriving inside the reader's mind.
- It is nearly jargon-free; it only names the biases for the benefit of the small minority who might want to learn more.

What do you think about this? Should we communicate more concretely? If so, should we do it in this way, or what would you do differently?

Would you like to correct these tables? Would you like to propose more analogous observations or other biases?

Thanks to Simon, miniBill and others for helping with the draft of this post.