At the GLP, we write a lot about controversial science and health issues about which people hold very strong opinions: GMOs, prenatal genetic testing, ‘franken-farmed’ salmon and babies with mitochondrial donors. Often, it seems people aren’t even having the same discussion: one ‘opponent’ glosses over the other’s point, then makes their own, which is subsequently ignored. If the point of conversation is to gain knowledge or convince others, these discussions are decidedly unfruitful.

Why do people hold so firmly to ideas, so firmly that they become beliefs, about issues in which they don’t have a true ‘dog’ in the fight? The answer is that we are the dog. Our brains evolved in tribal societies where an us-against-them mentality was not only advantageous but essential to survival. We still have that capacity in our brains, but now, in affluent Western cultures, we’ve taken that tribal stance and applied it to issues like vaccination, gluten consumption and eating organic, says Harriet Hall at Science-Based Medicine:

Religions and ideologies play into the hero plot since they match up well with the individual’s moral hunches and provide external justification. They validate emotional instincts, provide purpose and a common enemy. They can be useful but can also be dangerous; people have died for false beliefs…Some people accept a belief only if it can be shown to correspond to reality; others accept beliefs just because they are part of a coherent system.

Compound that with an affinity for like-minded people, and we build our own ideological tribes: echo chambers for our own thoughts, thoughts that perpetuate themselves every time we hear them reverberated back to us. Neurochemically, we are confirmation-bias addicts.

Furthermore, cognitive science is showing that the simpler an argument is, the more readily we believe it, a kind of ideological branding. Stephen Colbert, host of Comedy Central’s Colbert Report, called this phenomenon “truthiness: truth that comes from the gut, not books.” As Katy Waldman at Slate reports, Colbert was on to something:

Cognitive psychologist Eryn Newman, who works out of the University of California–Irvine, recently uncovered an unsettling precondition for truthiness: The less effort it takes to process a factual claim, the more accurate it seems. When we fluidly and frictionlessly absorb a piece of information, one that perhaps snaps neatly onto our existing belief structures, we are filled with a sense of comfort, familiarity, and trust. The information strikes us as credible, and we are more likely to affirm it—whether or not we should.

And once our brains have bought into one story, it becomes increasingly difficult to change, especially when we are presented with contradictory facts. Take the example of parents who choose not to vaccinate their children and the pediatricians who try to change their minds. When parents were presented with information showing that autism diagnoses and vaccinations are not linked, the strategy backfired:

Surveying 1,759 parents, researchers found that while they were able to teach parents that the vaccine and autism were not linked, parents who were surveyed who had initial reservations about vaccines said they were actually less likely to vaccinate their children after hearing the researchers’ messages.

In other words, the facts these researchers used to try to reeducate the parents only served to turn up the volume in their echo chambers. Are our minds really that unchangeable? It’s unclear. Obviously, some individuals do change their minds over the course of their lives. Sometimes this happens when personal circumstances change; increasing political conservatism as people age is a good example. Other times it just happens. Perhaps these individuals would be worth studying to see what arguments or methods work best.

Meredith Knight is a blogger for Genetic Literacy Project and a freelance science and health writer in Austin, Texas. Follow her @meremereknight.

Additional Resources: