From 9/11 to the Paris attacks, Ebola and Islamic State, the Internet enables dubious explanations of significant events to spread faster than ever. And often, simply dismissing them has the unwanted effect of emboldening believers. Is there a more compassionate and constructive way to respond?

Guardian contributor David Shariatmadari writes:

[…] Rob Brotherton, whose new book, Suspicious Minds, explores the traits that predispose us to belief in conspiracies. He cautions against sitting in judgment, since all of us have suspicious minds – and for good reason. Identifying patterns and being sensitive to possible threats is what has helped us survive in a world where nature is often out to get you.

“Conspiracy theory books tend to come at it from the point of view of debunking them. I wanted to take a different approach, to sidestep the whole issue of whether the theories are true or false and come at it from the perspective of psychology,” he says. “The intentionality bias, the proportionality bias, confirmation bias. We have these quirks built into our minds that can lead us to believe weird things without realising that’s why we believe them.”

“Whenever anything ambiguous happens, we have this bias towards assuming that it was intended – that somebody planned it, that there was some kind of purpose or agency behind it, rather than thinking it was just an accident, or chaos, or an unintended consequence of something.”

This intentionality bias, Brotherton says, can be detected from early childhood. “If you ask a young kid why somebody sneezed, the kid thinks that they did it on purpose, that the person must really enjoy sneezing. It’s only after about the age of four or five that we begin to learn that not everything that everybody does is intended. We’re able to override that automatic judgment. But research shows that it still stays with us even into adulthood.”

[…] Like most personality traits, proneness to intentionality bias varies across the population. “Some people are more susceptible to it than others.” And, Brotherton explains, there is a small but reliable correlation between that susceptibility and belief in conspiracy theories.

Social psychologist Karen Douglas is “wary of rubbishing all conspiracy theorizing as dangerous,” Shariatmadari writes:

“Thinking in that way, it must have some positive consequences. If everybody went around just accepting what they were told by governments, officials, pharmaceutical companies, whoever, then we would be a bunch of sheep, really” [Douglas says].

On the other hand, the effects of certain theories on behaviour can be damaging. Douglas’s own research […] has shown that exposure to the idea that the British government was involved in the death of Princess Diana reduced people’s intention to engage in politics. Similarly, subjects who read a text stating that climate change was a hoax by scientists seeking funding were less likely to want to take action to reduce their carbon footprint. And anti-vaccine conspiracy narratives make people less likely to vaccinate their children, a clear public health risk.

Should we try to stamp conspiracy theories out, then? Part of Brotherton’s argument is that they’re a natural consequence of the way our brains have evolved. Not only that, but trying to disprove them can backfire. “Any time you start trying to debunk conspiracy theories, for the people who really believe, that’s exactly what they would expect if the conspiracy were real,” he says.

[Viren Swami, professor of social psychology at Anglia Ruskin University] sees things differently. “Experimental work that we’ve done shows that it’s possible to reduce conspiracist ideation.” How? Swami found that people who had been encouraged to think analytically during a verbal task were less likely to accept conspiracy theories afterwards. For him, this hints at an important potential role for education. “The best way is, at a societal level, to promote analytical thinking, to teach critical thinking skills.”

But that’s not all. When people have faith in their representatives, understand what they are doing and trust that they are not corrupt, they are less likely to believe in coverups. That’s why political transparency ought to be bolstered wherever possible – and corporate transparency, too. “A lot of people have trouble accepting a big organisation’s or government’s narratives of an event, because they’re seen as untrustworthy, they’re seen as liars,” argues Swami.

Continue reading here.

— Posted by Alexander Reed Kelly.