In the early days of my skeptical career I spent time investigating and deconstructing classic pseudosciences, like belief in Bigfoot, astrology, UFOs, and ghosts. I was often challenged as to why I even bothered – these are all silly but harmless beliefs. Is it really worth the time to dissect exactly why they are nonsense?

But my fellow skeptics and I knew the answer. We were interested not so much in the beliefs themselves but in the believers. How does someone get to the point that they believe that the relative position of the stars at the moment of their birth could influence the wiring in their brain and even their destiny? At the time I think the answer most activist skeptics, including myself, would give was scientific illiteracy. People simply lack knowledge of science and fill the gaps with entertaining fantasy.

Lack of scientific knowledge definitely plays a role, and is an important problem to address, but it was naive to think it was the main cause. Such explanations do not long survive contact with actual believers. It becomes rapidly clear that the primary malfunction of true believers is not a lack of information or scientific savvy. It’s something else entirely.

My explanations for why people believe nonsense then evolved into stage 2 – a lack of critical thinking skills. Scientific knowledge needs to be coupled with an understanding of epistemology (how we know what we know), logic, cognitive biases and heuristics. This view, that belief in nonsense is mainly a failure of critical thinking, is a lot closer to the truth. Our strategy for fighting against belief in pseudoscience and magic evolved into promoting not only scientific literacy but critical thinking skills.

The topics on which we focused also shifted, I think partly reflecting this changing view of what, exactly, we were combating. At this time we were experiencing the rise of alternative medicine, anti-vaccine views, 9/11 and other conspiracy theories. These were topics with more meat, and more cultural significance. We still got frequent questions about why we bother, but mostly from true believers who were annoyed that we were calling them out on their bad arguments and false facts.

Then social media happened. I don’t think this fundamentally changed people, the nature of belief, or even the topics we dealt with. What it did do, combined with other social forces I will also discuss, is dramatically increase the echo-chamber effect. It became increasingly easy for people to hunker down in ideologically pure corners of the internet to cultivate their narrative and insulate themselves from any dissent.

At the same time we saw the rise of ideological media. David Roberts wrote a great summary of this a few months ago, and what he called “tribal epistemology.” He was writing from a political point of view, but the phenomenon extends also to conspiracy thinking, alternative medicine, food fanatics, natural-is-best warriors, vaccine deniers, and even flat-earthers. This, I think, is stage 3 – the primary cause of belief in nonsense is narrative or tribal thinking, which is only facilitated by scientific illiteracy and lack of critical thinking skills.

People tend to make sense of the world through stories, or narrative thinking. There is nothing inherently wrong with this, and it seems to be a core way in which our brains work. We are storytelling animals. But the narrative should be only a tool, subservient to facts and logic, and flexible and adaptable to change. What happens too often, however, is that the narrative takes control. It is no longer determined by facts – it determines what facts we believe.

I have written about narrative thinking in terms of alternative medicine, organic farming, and other issues. I do think understanding the role of narratives is critical to understanding why we believe what we believe and in combating harmful nonsense.

What Roberts is talking about is what happens when the narrative becomes a part of someone’s tribal identity. That is tribal epistemology – something is true if it supports my tribe, and it is fake if it is inconvenient or antagonistic to my tribe.

I agree with his basic premise, because not only do people rely upon this simple rule to determine what to believe, they may even do so explicitly to replace traditional institutions of knowledge and rules of evidence. Proponents of alternative medicine literally want to change or even abolish the methods by which we determine what is safe and effective in medicine. They want to change the rules of science or abandon science entirely, disconnect regulations from scientific evidence, and even eliminate the standard of care. The changes they are attempting to impose on the institutions of medicine are more important than any one snake oil treatment they happen to be promoting.

We are now seeing the same thing in the political arena. The institutions of information – academia, the media, experts – have been attacked as illegitimate in order to fend off their pesky standards of facts, transparency, and fairness because their conclusions were ideologically inconvenient. That, primarily, is what is meant by a “post-truth” world. We no longer share common institutions or standards when it comes to information. Every ideological group can have their own media, their own experts, and their own information. Everything else is fake, or part of some conspiracy, or hopelessly compromised by conflicts of interest.

Of course, the world is messy, and there is often a kernel of truth to the excuses people use to dismiss information they don’t like. Media outlets are biased. Scientists do make mistakes and disagree with each other at times. There is uncertainty about everything. There are serious quality control issues in every institution.

However, ditching common institutions and standards was not motivated by these flaws. The flaws were simply used to justify opposing them, which was primarily for ideological reasons. The end result is an historic level of polarization, with different tribes lacking any common ground. We can’t even agree on basic facts, or how to determine what a fact is.

Many of us have probably experienced this firsthand with people we know well. I have. If not, then you only need to spend a little time on social media to see its effects.

The narrative has won. Institutions of knowledge and legitimacy are failing. No one knows the path forward, but we better figure it out. Meanwhile, I will continue to be a stage 3 skeptic – fighting for science, and critical thinking, and against tribal epistemology.