Sometimes, the things we believe and the things we do just don’t match up – for example, office workers who think of themselves as (and generally are) honest people, yet steal all of their personal stationery from work.

This situation can create cognitive dissonance. Cognition is the mental process we use to form our knowledge and beliefs about the world. Cognitive dissonance happens when our beliefs (I am honest) contradict our actions (I’m stealing supplies), creating intense psychological discomfort.

There are two ways the office worker could resolve his cognitive dissonance, and relieve his discomfort. The first is to simply stop stealing – then there would be no conflict between his beliefs and his behaviour. The second option is more interesting – and is very relevant to what I am calling the Scientology mindset.

He could modify his beliefs to match his behaviour. This typically comes in the form of rationalisations such as:

“They don’t pay me enough – I’m just making it fair again”

“It’s a big company, they won’t notice a few office supplies”

“Everybody does it, so it isn’t wrong”

Cognitive Dissonance in the Church of Scientology

How does this apply to Scientology? Members’ belief that Scientology is a good thing comes into conflict with facts which suggest the opposite – for example:

They are not getting the benefits they were promised

They are told that the Church is expanding – but know that it cannot pay the rent

They observe that advanced members, who are supposed to be superior beings, are just as fallible as anyone else

The materials they are told to study do not make any practical sense

These contradictions cause cognitive dissonance. Most Scientologists resolve the conflict by acknowledging to themselves that Scientology does not work, and leaving the Church (around 90% of people who join leave within two years).

A minority of Scientologists remain loyal to the Church for years. They also experience cognitive dissonance – but they deal with it in a completely different way. Their belief is so strong that they rationalise, ignore and deny Scientology’s failures. Perhaps they need the sense of belonging, the feelings of self-worth, or the ‘mission’ that comes with Scientology membership more than they need the truth. Whatever the reason, their faith in Scientology becomes unshakable.

The first three rationalisations in the list below are built into Scientology doctrine. The last two are typical.

“Scientology always works – so this project must have been sabotaged by Suppressive Persons, or Psychs”

“Scientology always works – so there must be something wrong with me”

“People only attack Scientology because they are afraid we will reveal their crimes”

“At least I’m doing something”

“I could demonstrate my OT powers, but I don’t want to frighten anyone”

Why Does This Idea Matter?

Cognitive dissonance theory helps to explain why people’s faith often becomes stronger in the face of facts that clearly show it to be false – and how they can gradually come to believe the most extraordinary and bizarre things despite any and all evidence.

Leon Festinger, who developed this theory, believed that people who avoid dissonance in this way are immune to evidence and rational argument – and this is why a long-held conviction is very difficult to change.

He wrote of the difficulty of persuading a ‘true believer’ that his beliefs are misguided:

Tell him you disagree, and he turns away.

Show him facts and figures and he questions your sources.

Appeal to logic, and he fails to see your point.

This means that confronting Scientologists, and arguing with them, is usually counter-productive. If you drive people into a corner, they will fight back and defend their beliefs – and may emerge from the experience with even greater faith.

Further Reading

When Prophecy Fails

This book is a study of a small and short-lived UFO cult by a social psychologist who ‘infiltrated’ it with observers (his students). The strange behaviour observed there led to the development of cognitive dissonance theory.

Cult members believed that ‘the end of the world was nigh’, but wise aliens would save the faithful (i.e. them) by taking them up in spaceships before the apocalypse. What’s more, they set a definite date. Some members disposed of everything they owned in the expectation that they would soon be starting a new life on another planet.

The world did not end. Although most observers would expect the believers to lose their faith in the face of this undeniable fact, the social psychologists suspected that the exact opposite would happen. They predicted that believers would invent an explanation for the delay, and recruit with renewed fervour – that the failure of the prophecy would actually reinforce the cultists’ faith. The social psychologists were right.

Mistakes Were Made (But Not by Me)

When people are placed in impossible situations, there are many ways in which they can rationalise their plight. These forms of psychological self-defence are well understood. If you understand how these processes work, you have a better chance to avoid being manipulated.

This excellent popular account of the psychology of self-deception opens with an accessible discussion of cognitive dissonance theory, then applies it to the everyday rationalisations and self-justifications we all rely on.

And, of course, the incomparable Dilbert