A conspiracy theory is an invitation to an exciting alternative reality where nothing is quite as it seems. There is fun to be had defying conventional wisdom, sifting through signs, uncovering lost knowledge and secret plots. But we don't generally believe stuff just for the fun of it. For us to really believe something it has to seem plausible. How can we be so sure that our journey off the intellectual beaten path and down twisting trails of conspiracy theory has led us to the truth, while the scientific mainstream is deluded or deceptive? Sometimes all it takes is our own overly optimistic brain telling us we understand the world in far greater depth than we actually do.

The Bike Test

Contemplate the humble bicycle. A bike is a fairly simple device: two wheels, a frame, handlebars, pedals, and a chain (for our purposes we can ignore all the complicated gears and stuff). There’s a good chance you’ve owned one at some point in your life, or at least ridden one. And if you live in a city, you probably see them every day, adorning lampposts or hurtling toward you as you cross the street. You know a thing or two about bicycles, right? So this should be a piece of cake. Below is a doodle of a bike, but it’s missing a few key parts. I just want you to fill in the rest. Sketch in the missing bits of frame, the chain, and the pedals. It doesn’t matter if your lines are a bit wobbly. This isn’t a test of artistic ability. Do it in your head if you don’t have a pen handy. Just indicate roughly where the different bits ought to go to make the bike work.

How did you do? When psychologist Rebecca Lawson set people this challenge, around half of them got something wrong — despite the fact that virtually all of them knew how to ride a bike, and many had one sitting at home. And these weren’t trivial mistakes. They were design flaws that would render the bike a useless hunk of junk. Some people drew the bicycle’s frame connected to both the front and back wheels, which would make it difficult to turn the handlebars. Some drew the pedals attached to the center of one of the wheels, which would make it tough to reach them with your feet. The most common mistake was putting the chain in the wrong place; some people drew it looped between the front and back wheels, which would also make steering tricky.

The right answer is that the frame juts down from below the seat to a spot between the wheels, and connects from there to the back wheel and to the frame up by the handlebars, making two triangles. The pedals go in between the wheels, and the chain loops around the pedals and the back wheel (see bottom image). This setup allows the pedals to turn the chain and drive the back wheel, while leaving the front wheel free to turn.

Did you make any mistakes? Was it harder than you thought it would be? This deceptively simple task reveals that a lot of people lack a basic understanding of how bicycles work. By itself, that’s not all that revealing. Clearly we can get by in life without understanding the finer points of bicycle design. What’s interesting is that we don’t realize our lack of understanding — until we’ve been forced to demonstrate it by completing the doodle, and we find ourselves faltering.

Before confronting people with the doodle, Lawson asked them how well they reckoned they understood the basic mechanics of a bicycle. On a scale from one (meaning “I know little or nothing about how bicycles work”) to seven (meaning “I have a thorough knowledge of how bicycles work”), people rated themselves, on average, around a four or a five — reasonably knowledgeable. But for a lot of them, it turned out their understanding was an illusion. When their mistakes on the diagram were pointed out — when they were directly confronted with their own ignorance, in other words — the illusion suddenly faded. It was a surprising experience. One earnest test subject summed up the sorry state of affairs: “I think I know less than I thought.”

Overconfident Brain

It’s not just the mechanics of bicycles that we think we understand better than we really do. We’re overly confident about how well we understand all sorts of things. In an extensive series of experiments, Yale grad student Leon Rozenblit and his adviser Frank Keil asked people how well they thought they understood devices ranging in complexity and familiarity from can openers to zippers to helicopters. In each case, most people initially reckoned they had a reasonably detailed idea of how the thing works. When the researchers asked them to write a step-by-step explanation of how exactly a can opener opens cans, or how a helicopter flies, however, many people came up short. Like Lawson’s deflated cyclists, many of Rozenblit and Keil’s test subjects expressed “genuine surprise and new humility at how much less they knew than they originally thought.” And it doesn’t stop there. People overrate their understanding of simple physics problems, such as what trajectory a falling object will follow, and of more complex natural phenomena, such as how earthquakes occur, why comets have tails, or how rainbows are formed. People think they understand the law and political policies better than they really do. As Dan Simons and Chris Chabris note in The Invisible Gorilla, the tendency for projects like Boston’s Big Dig to go staggeringly over budget and past deadline shows that even experts sometimes overestimate how much they know when they’re planning a project. We might think we understand something in depth, but when it comes time to put our money where our mouth is, it often turns out our understanding leaves a lot to be desired.

Credit: sondem/Shutterstock

The Unknown Unknowns

Why do we so often misjudge the depth of our understanding? It doesn’t seem to be that we’re just telling ourselves a flattering lie, or trying to sound impressive. Offering people cold hard cash to assess their understanding or abilities honestly and accurately — even the princely sum of one hundred dollars — doesn’t reduce their overconfidence. Neither does forcing them to justify their assessment of their own abilities to their peers, and thus face the prospect of appearing arrogant or foolish for an overconfident assessment. The real reason for our overconfidence comes down to a metacognitive glitch. Metacognition is just a fancy way of saying “thinking about thinking.” When you say something like “I’m good at math” or “I’m easily distracted,” you’re making a metacognitive judgment. But, like licking your own elbow, it turns out thinking about our own thinking isn’t as easy as it seems like it should be. There are limits to our ability to accurately assess what we know, and particularly to realize how much we don’t know. Former U.S. secretary of defense Donald Rumsfeld famously summed up the problem:

As we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns — the ones we don’t know we don’t know... It is the latter category that tend to be the difficult ones.

Rumsfeld’s phraseology might get your brain in a twist (and he took some flak from persnickety linguists for it), but the point he’s making is an important one. Let’s go through each category in turn. The known knowns are easy. What’s the capital city of England? I know the answer: It’s London. When I read the question, it’s as if my brain types it into a mental search engine, and up pops the answer. The known unknowns don’t present much of a problem either. What’s the capital city of Namibia? I don’t know. But I know that I don’t know. Clicking the search button brings up a blank page with an error message: “Your search did not match any documents.” Leon Rozenblit found that when it comes to simple pieces of trivia, such as capital cities, we don’t tend to overestimate our knowledge. We either know the answer or we don’t; we have no problem telling the difference. We experience a known unknown as a neat little gap in our knowledge waiting to be filled. (Thanks to a vastly superior search engine, I have now filled my blank: The capital city of Namibia is Windhoek.)

But then there are the unknown unknowns. The unknown unknowns are sneakier. They’re an intellectual blind spot, and our brain loves to fill in blind spots. As Rebecca Lawson, Leon Rozenblit, and others have discovered, we are especially prone to blind spots when it comes to physical phenomena and devices, like earthquakes and bicycles, and complex systems, like law and politics. The problem, Rozenblit explained, is that most of us are not bicycle mechanics or political scientists, but we all have a passing familiarity with some of the surface features of bikes and politics. This smattering of vague knowledge can get us into trouble, because it takes a bit of expertise just to know how much you don’t know about something. Without it, it can be hard to tell the difference between a deep pool of understanding and a shallow puddle.
Psychologist David Dunning explains it more bluntly: “An ignorant mind is precisely not a spotless, empty vessel.” It’s filled with information — all the “life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches” we’ve accumulated — and our brain indiscriminately uses whatever is at hand to plaster over the intellectual blind spot. For Rebecca Lawson’s test subjects who rated their understanding of bikes as pretty strong and then proceeded to mess up the diagram, how a bicycle works was an unknown unknown. When Lawson asked them to rate their understanding, they submitted a mental search, and it didn’t turn up an empty page with an error message. Instead, they saw a page that appeared, at first glance, to be filled with a healthy amount of information — the names of the various parts, fuzzy memories of what their childhood bike looked like, experience of riding bikes, maybe even knowledge of how to fix a puncture. They mistook all this fairly shallow knowledge for a deep understanding of how the parts actually function together to make the wheels turn. It wasn’t until they were forced to inspect the page more closely — when the pesky psychologist confronted them with their mistakes on a little doodle — that their illusory understanding faded like a mirage. As Rozenblit and Keil put it, we sometimes mistake a skeletal, incomplete sketch for a vivid, “blueprint-like” sense of how things work. Or, to paraphrase Chris Chabris and Dan Simons, you think you understand how a bicycle works, when all you really understand is how to work a bicycle.

This might be hard to get your head around. After all, we have a blind spot when it comes to our intellectual blind spots. Maybe, though, you’ve experienced the uneasy feeling of having a blind spot of your own unexpectedly revealed. Perhaps while chatting with friends over a drink you’ve found yourself launching into a lecture on some topic you’ve recently become opinionated about.
Mid-sermon, someone interjects with an innocent question: How exactly do cap-and-trade policies influence global carbon emissions? Suddenly you find yourself at a loss. You thought you were on firm intellectual ground, but you discover that your understanding has departed terra firma and is stranded, like Wile E. Coyote, in midair. Sometimes we can’t see where our understanding ends until long after we’ve merrily skipped over the cliff edge. If you’ve never had this experience, either you’re a genius, or you only ever talk about things in which you have genuine expertise, or you’ve got a particularly bad case of the unknown unknowns.

Excerpted from Suspicious Minds: Why We Believe Conspiracy Theories, by Rob Brotherton. Used with permission.

Top image by ArpornSeemaroj/Shutterstock