When Oxford Dictionaries named post-truth 2016’s word of the year, it was hard to disagree with the choice. Political debate was occurring in the complete absence of fact. Measles outbreaks and record temperatures were roundly ignored by anti-vaxxers and climate sceptics. Opinions were being formed in the gut or heart instead of the brain. And with 2017’s word of the year shaping up as a fight between alternative facts and fake news, the trend doesn’t appear to be abating.

So why is it that objective facts appear to be losing ground to emotional appeals and outright deception? Part of the answer seems to be that the way we have evolved to acquire knowledge is now confronted by a historically unprecedented informational behemoth: the internet.

It’s who you know, not what you know

Humanity’s social nature may well be our defining evolutionary trait. It has conferred many advantages, including the way we accumulate knowledge. As part of a socially cohesive community, individuals can focus on developing expertise in a select few tasks, while outsourcing the rest to other community members. A plumber can outsource medical knowledge to a doctor, who can outsource mechanical knowledge to an engineer and so on, with each benefiting as if the knowledge were their own. This pooled expertise is known as the community of knowledge.

While we are reliant on others for this pooled knowledge, we treat it as if it were our own, forming what researchers refer to as transactive memory. This was demonstrated by researchers from Brown University who presented participants with a made-up press clipping detailing a novel scientific discovery. The more confidently the expert in the story expressed their understanding of the discovery, the more confident the reader was of their own understanding.

Of course, this misappropriation of communal knowledge doesn’t confer any actual expertise. This can be exposed by asking someone to rate their understanding of how everyday items like flush toilets or zippers work, both before and after they are asked to explain in detail exactly how they function. It is only after we attempt to explain the mechanism in depth that the shallowness of our knowledge becomes apparent, a phenomenon known as the illusion of explanatory depth.

Ignorance on a grander scale

We can begin to see how mass misconceptions arise by observing how our social reliance on others for knowledge interacts with our social reliance on others for decision making. In much the same way as we outsource knowledge via transactive memory, it is also mentally efficient for us to adopt a herd mentality and follow others’ decisions.

This concept was eloquently explained by University of California economists Bikhchandani, Hirshleifer, and Welch, who described how, when making a decision, a rational individual will be heavily influenced by the consensus of those who decided before them, even if it runs counter to their personal evidence. The more people within the consensus, the less likely an individual is to trust their own evidence and run against the herd.

This means an individual may end up choosing the opposite of what their personal evidence suggested. And once they side with the consensus provided by those before them, their own choice is added to that consensus, strengthening the signal for everyone who decides after them.

It’s not difficult to see how this process, labelled an informational cascade, describes how an incorrect opinion can rapidly spread. If the initial group of individuals base their decisions on incorrect evidence, the seeming consensus can make those who follow ignore counter-evidence, and much like a single spooked cow can start a stampede, one alternative fact can start a movement.
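The sequential model the economists describe can be sketched in a few lines of Python. This is a minimal, hypothetical simulation (the function and parameter names are my own, not from the original paper): each agent privately receives a signal that matches the true state with probability p_correct, observes every previous choice, and follows a simple Bayesian count; once the public history outweighs any single private signal (a lead of two), agents ignore their own evidence entirely.

```python
import random

def run_cascade(n_agents=100, p_correct=0.7, true_state=1, seed=None):
    """Sketch of a Bikhchandani-Hirshleifer-Welch style informational cascade.

    Each agent privately sees a signal that equals true_state with
    probability p_correct, then chooses option 0 or 1 given the choices
    made before them. Once the net count of inferred signals ('lead')
    reaches +/-2, no single private signal can outweigh the public
    history, so every later agent copies the herd: a cascade.
    """
    rng = random.Random(seed)
    lead = 0          # inferred signals favouring option 1 minus those favouring 0
    choices = []
    for _ in range(n_agents):
        signal = true_state if rng.random() < p_correct else 1 - true_state
        if lead >= 2:
            choice = 1            # up-cascade: private signal ignored
        elif lead <= -2:
            choice = 0            # down-cascade: private signal ignored
        else:
            choice = signal       # history can't outweigh the agent's own signal
            lead += 1 if choice == 1 else -1  # this choice reveals the signal
        choices.append(choice)
    return choices

if __name__ == "__main__":
    # How often does the herd settle on the wrong option despite mostly
    # correct private signals?
    wrong = sum(run_cascade(seed=s)[-1] == 0 for s in range(1000))
    print(f"runs ending on the wrong choice: {wrong}/1000")
```

The sketch assumes an indifferent agent (one inferred prior signal against one contrary private signal) follows their own signal, a common tie-breaking convention for this model. The key behaviour is that a couple of wrong early signals can lock the entire sequence into the wrong choice, even though most agents hold correct private evidence.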

I may not know science, but I know who I like.

While informational cascades can quickly lead a community to accept a false idea, many individuals within that group will have disregarded their personal evidence in deference to the decisions of others. This means that, should compelling contradictory evidence be publicly released, the consensus can reverse quickly and dramatically, much like a stock market bubble bursting.

However, when we analyse evidence, not every source is treated equally. We give much greater weight to sources we like (even if one source is a friend and the other an expert), and we actively disregard information from sources we dislike. Our brain’s confirmation bias also preferentially seeks out sources that conform to our worldview, and by selectively associating with like-minded individuals, we form opinion echo chambers, where informational cascades are built only on the opinions of people who already agree with us.

These echo chambers not only shape our opinions, but also protect and reinforce them against new information. Furthermore, on an individual level, we can be extremely stubborn when defending our worldview. This was somewhat comically illustrated by an experiment which showed that we are far more critical of others than of ourselves, to such an extent that simply presenting our own arguments as somebody else’s will cause us to reject them almost 60% of the time.

The internet just broke our brains.

These cognitive limitations have always been present, but with the advent of the internet, the resulting mistruths have left the dark corners and entered the mainstream. The internet has been described as a stimulus on steroids, one that permits exploitation of our cognitive blind spots to a far greater degree.

Firstly, it has allowed previously isolated individuals with extreme or unvalidated ideas to find one another and form congregations. Their larger membership lends their niche views a plausibility that was previously unobtainable, paving the way for an informational cascade as more members join and the group’s credibility continues to grow.

The sheer volume of news sources and non-factual opinions then plays on our dependency on the community of knowledge by creating enough noise that well-researched facts lose their power to correct misconceptions. The multitude of news sources also lets our confirmation bias pick and choose information to match the beliefs we already hold.

These beliefs are then strengthened within the echo chamber of our social groups, where other ideas are also shared and learned, with studies showing that belief in one conspiracy theory generally predicts belief in others.

Changing Hearts and Minds

This may sound dire, and with the President of the USA providing a daily Twitter stream of blatant mistruths and the leader of Russia overseeing an unprecedented misinformation campaign, you’d be forgiven for being sceptical about our brains’ ability to adapt.

But it may be helpful to view this era of misinformation in the same vein as obesity. Evolutionarily it made sense to store fat and favour high-calorie foods, but technological and nutritional advancements made it necessary to change how we look at diet.

Similarly, our brains are confronted with an evolutionarily unprecedented informational stimulus, and learning how to effectively counter the exploitation of our evolutionary knowledge pathway will require a similar shift. As this pathway shows, belief in conspiracy theories isn’t sinister; it’s merely people rationally processing their version of reality.

But like obesity, the problem needs to be addressed. And while endeavours such as monitoring news credibility on Facebook and Google will undeniably help, our understanding of the brain suggests that the best way to change people’s understanding is to expose their illusion of explanatory depth.

This is supported by recent studies showing that extreme views on topics like global warming were tempered either by educating people about the basic mechanisms involved, or by asking them to explain in depth the mechanism behind their alternative theories.

Obviously, educating sections of society that have an inherent distrust of scientists will be more complicated than simply having people in lab coats hand out textbooks. But understanding that fringe beliefs originate from a place of rationality should give us some hope that science’s overwhelming advantage in explaining the mechanisms behind complicated issues can eventually reverse the informational cascades behind many widely held misconceptions.

Title Image: xkcd (CC BY-NC 2.5)