
Once, when I asked my five-year-old nephew who ate the last piece of cake, he said it was his invisible friend. I replied that Mr. Invisible must not be a very good friend, because he ate the last piece of cake and left my nephew to take the blame. My nephew agreed, as that was exactly what Mr. Invisible wanted me to think, having pulled off the perfect crime.

I doubt my nephew really believed his invisible friend existed. But he may have believed he convinced me that Mr. Invisible was real. How is it that someone can cling to a belief even after objective facts show it to be inaccurate?

We all naturally strive to reduce uncomfortable thoughts and emotions that sit poorly with our dearly held beliefs. This is what happens when we deny evidence of an unfaithful partner or of the abysmal performance of a beloved sporting team. Our beliefs become impervious to the facts in a process psychologists call cognitive immunization.

Cognitive immunization helps to explain why some beliefs become even stronger when challenged. It also helps to explain why we cannot let go of some beliefs in the face of overwhelming contradictory evidence.

Immune beliefs are almost impossible to challenge with reasoning and structured argument. Try the following experiment: Google "greatest ever superhero" or something similar, and glance through the stunningly vigorous mass of blog, forum, and web article debate. Our mental firewalls defend immune beliefs well, so to the keyboard warrior already committed to the opposing position, it really doesn't matter whether Superman should logically be able to defeat Batman.

Classic psychology studies show that we have trouble remembering the times when our personal beliefs have failed the test of outside evidence. This is because our minds automatically neutralize clashing information—such as that awkward moment when members of a doomsday cult realize that the world did not come to an end when predicted. They just need to reset the date to accommodate a variable that went uncalculated in the initial forecast.

In fact, one characteristic of strong and enduring beliefs is their internal logic and structure, even when they defy logical verification as a whole. As a result, believers come to arguments well-prepared, having become adept at using their confirmation bias—the natural inclination to avoid any information that contradicts a strongly held belief, while seeking out information that strengthens it.

What sticks in the mind does not necessarily have much to do with how we reflect upon its legitimacy. And, if the ideas stick and help us get by, we find ways of working through any hiccups.

What matters, once we accept a belief, is whether it continues to be useful. When it does, what matters is a rigorous defense of those advantages. Research suggests that we employ five major belief-enforcing techniques:

We isolate ourselves from people who hold outside beliefs in order to shield our ideas from even the possibility of contrary voices and arguments. Forms of isolation play a role in most group memberships, ranging from strong examples such as military basic training, to subtle examples such as a spouse who tries to exclude one of his or her partner’s unappreciated friends.

We try to reduce our direct exposure to other beliefs and ideas that might challenge our own. We can see stronger examples in hardline nation states with totalitarian regimes that ban media and free speech. At the same time, all forms of education use similar principles, whether in selecting appropriate texts for the classroom or in prescribing the best nutritional advice.

We connect our beliefs to powerful emotions. One approach involves attaching negative emotions to belief failures. The obvious example is the threat of an unpalatable afterlife as a result of non-compliance with a religious doctrine. On the other hand, we also scare our kids deliberately in order to shape their behaviors and steer them away from risk, whether in the form of electricity or pools, or both at the same time.

We associate with like-minded groups in which we work together to undermine rival beliefs and the groups proposing them. Targeting competing beliefs is common in politics, especially along party and ideological lines. Academics have also made this into a fine art under the rubric of the scientific method by highlighting the weaknesses in theoretical adversaries' arguments while ignoring their strengths.

A final technique for immunizing our beliefs relies on repetition. Repetition is, of course, the backbone of all learning (for better and worse), including the essentials, such as grammar; the extraneous, such as sporting allegiances; and the repugnant, such as prejudice.

These five natural techniques for protecting our beliefs suggest that our minds did not evolve to evaluate what is or is not the truth. Evolution equipped our minds with an impulse to create, transmit, and defend beliefs that are useful, whether true or not. Although accurate beliefs can of course be useful, useful beliefs are not necessarily accurate.