The usual caveats apply – this is one study in a limited context showing only correlation and using a psychological construct. I also have to be careful because the study confirms what I already believe. Having said all that, it is interesting and is probably telling us something about people with extreme political views, especially when other research is considered.

The study involves individuals with radical political beliefs, as measured by a standard questionnaire. It has already been established that those with more extreme beliefs espouse greater confidence in their knowledge and beliefs. However, it is not clear how much this is due to an overconfidence bias vs. a failure of metacognition. In other words – do people who are overconfident about their political beliefs like to portray themselves to others as being confident, or do they simply lack insight into the correctness of their own beliefs (a metacognitive failure)? The current study tests the latter factor.

The researchers, led by Stephen Fleming at University College London, looked at "two independent general population samples (n = 381 and n = 417)." They gave participants a challenge in which they had to estimate the number of dots in two images and decide which one had more. Participants also had to say how confident they were in their judgement. Further, if they got the answer wrong, they were given additional information in the form of another dot image, which should have helped them improve their estimate. They were then asked to restate their confidence.
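To make the structure of the task concrete, here is a toy simulation of my own devising – this is purely illustrative, not the researchers' actual model, and the `sensitivity` and `openness` parameters are invented stand-ins (the latter loosely representing metacognitive sensitivity, the thing the study probes):

```python
import random

def trial(n_left, n_right, sensitivity=0.1, rng=random):
    """One dot-comparison trial: noisy estimates of each image,
    a forced choice, and a confidence rating derived from the
    perceived difference. All parameters are illustrative."""
    est_left = n_left + rng.gauss(0, n_left * sensitivity)
    est_right = n_right + rng.gauss(0, n_right * sensitivity)
    choice = "left" if est_left > est_right else "right"
    confidence = min(1.0, abs(est_left - est_right) / max(est_left, est_right))
    return choice, confidence

def update_confidence(confidence, new_evidence_agrees, openness=0.5):
    """Revise confidence after post-decision evidence. openness=0 means
    the new evidence is simply ignored (the pattern the study found in
    those with radical views); openness=1 means full revision."""
    target = 1.0 if new_evidence_agrees else 0.0
    return confidence + openness * (target - confidence)
```

In this toy model, a participant with `openness=0` who answers with confidence 0.8 keeps that confidence unchanged even when the follow-up evidence contradicts them, while a participant with `openness=0.5` drops to 0.4 – a crude picture of the failure to update that the study measured.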

The study found that those with more radical political views indicated higher confidence in their choices, even when they were wrong, and less of a tendency to update their confidence with new information. In other words – you might say they are opinionated and stubborn. This comes as absolutely no surprise if you have ever interacted with someone with extreme political views.

What this study cannot tell us is the arrow of cause and effect. One possibility is that those who lack the metacognitive ability to properly assess and correct their own confidence levels will tend to fall into more extreme views. Their confidence allows them to more easily brush off dissenting opinions and information, more nuanced and moderate narratives, and the consensus of opinion.

At the same time I find it plausible that those who become radicalized into extreme political views may adopt overconfidence and stubbornness as a motivated reasoning strategy, in order to maintain their views, which they hold for emotional and identity reasons. This may become more of a general cognitive style that they employ, rather than being limited to just their radical views.

I do think the former hypothesis is stronger, however – that the metacognitive failure comes first and leads to extreme views. This is more plausible because other studies have shown that people engage in motivated reasoning only for views that have emotional significance for them; they revert to more flexible thinking styles for beliefs that are emotionally neutral. The researchers in this study deliberately chose an emotionally neutral task – estimating dots in an image – for this very reason. Since those with radical views still showed poor confidence calibration on a neutral task, the result looks more like a general metacognitive trait than motivated reasoning.

It’s also possible that both arrows of causation are simultaneously correct: some people have a tendency to fall into extreme views because of below-average metacognition, but then those extreme views reinforce this cognitive style in order to maintain themselves.

Also – prior research suggests this is not the only factor involved in extreme political views. Culture, upbringing, and exposure also matter, as do other cognitive styles, such as intuitive vs. analytical thinking (with intuitive thinkers being more susceptible to radical views).

Anything as complex and multifaceted as our political views is going to have complex and multifaceted causes. There is likely a host of factors relating to cognitive style, emotional outlook, metacognitive ability, and other personality traits that add together to make someone more or less susceptible to accepting extreme views, conspiracy theories, and paranormal and superstitious beliefs. These traits then interact with education, culture, exposure, trauma, and other environmental factors to determine if a person comes to accept radical ideas, which specific ideas, and to what extent.

The one factor that we have the most control over in all of this is education. People can be taught metacognition as a skill, like anything else. It may be easier or harder for each individual, but it is ultimately teachable.

We can also counter extreme views by addressing them directly. It is unclear, however, how much of an impact this has. Based mostly on my personal experience and that of others engaged in skeptical outreach, it seems that direct debunking is mainly useful for those in the middle – the majority of average people who have typical metacognitive skills and no particular motivated reasoning, for whom factual information is useful to counter misinformation from extremists.

However, factual information is much more effective when it is combined with addressing the underlying metacognitive issues – critical thinking. Overall, I think teaching critical thinking itself is the most important thing. Otherwise you are just playing whack-a-mole with an endless series of specific dubious claims and false narratives.

We also have some degree of influence over exposure, but this is very limited in a free society that values free speech. To be absolutely clear – I wouldn’t have it any other way. Free speech is just too important. But what we can do is make sure that our social media algorithms are not rigged to maximize radicalization through automatic exposure to increasingly extreme views. Further, there are contexts in which editorial quality control is appropriate, such as academic journals and quality journalism. And finally, we can make efforts to get more moderate and nuanced views greater exposure, to crowd out the extreme views as much as possible.

And there is one last thing we can do – study the phenomena of belief and cognition to better understand them, as in the current study. This is a vitally important area of research.