The flawed mind

When we think about human nature, we generally accept the idea that we are rational beings. Every day we are bombarded with a sea of situations that we must face by making decisions. However, research suggests that we are not the authors of our choices and actions that we think we are: our cognition is susceptible to biases and manipulation, much of which we are not even aware of. The cause of these mental flaws lies in the design of the machinery of the mind.

The various parts of the brain that function when we think can be thought of as belonging to two systems: System 1 and System 2. System 1 is the automatic system of the mind; it operates quickly, with little or no sense of control. Mental activities like orienting to a sudden sound are attributed to System 1 because the mind completes them involuntarily. Evolution has shaped the parts of the brain that make up System 1 to be sense-making: they provide a continuous assessment of our environment and of the main problems in it that we must solve to survive. While our need to monitor threats in the environment is less urgent these days, System 1 is always turned on and continually performs basic assessments of everything around us, such as distinguishing a friend from a foe at a glance.

The automatic operations of System 1 also enable it to effortlessly generate impulses, intuitions and impressions about the world around us. These are fed to System 2 and, if endorsed, are turned into beliefs and voluntary actions. Unlike System 1, System 2 does not run automatically; it requires effort and attention. For this reason, System 2 takes over when an activity is too difficult for the automatic System 1. Mental tasks that cannot be performed involuntarily, such as counting the number of A's in this sentence, are carried out by the regions and activities of the brain that make up System 2.

The division of labour between the two systems is highly efficient: the brain works at near-optimal performance with minimal effort. This arrangement works well most of the time, as System 1 perceives the outside world accurately and reacts to challenges appropriately, and System 2 benefits from this. However, System 1 can sometimes generate intuitions that are biased or that have been unknowingly influenced, and on both occasions the brain is unaware that this has happened. These incorrect intuitions are suggested to System 2 which, if it is not working hard enough to question them, turns them into judgements. Thus, these errors arise from the ignorance of System 1 and the laziness of System 2. We identify ourselves with System 2, as it is this system that holds beliefs and makes choices, but when those beliefs and choices are the result of unknowing laziness and ignorance, our identity must be scrutinised.

The unknown self

Some psychologists believe that the ideas we form about the world are held in nodes that make up a vast network in the brain, known as 'associative memory'. The nodes are linked to each other by their effects (virus = cold), properties (lime = green) and categories (banana = fruit). Because of this network-like structure, thinking of one idea activates other nodes in the network. In a given situation the whole network of ideas is not needed, so only a few of the activated ideas are registered by our consciousness. This means that not only do we have limited access to the workings of associative memory, but some of its work is hidden from our conscious selves altogether.
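The network described above can be pictured as a graph in which activation spreads outward from a triggered idea. The sketch below is only a toy illustration of that structure, not a cognitive model; the node names and link choices are invented for the example.

```python
# Toy sketch of associative memory: ideas are nodes, and links encode
# relations such as effect (virus -> cold), property (lime -> green)
# or category (banana -> fruit). Activating one idea spreads
# activation to its neighbours; in the essay's terms, only some of
# these activated ideas would ever reach consciousness.
# All names here are illustrative assumptions, not real data.

links = {
    "virus": ["cold"],              # effect
    "lime": ["green"],              # property
    "banana": ["fruit", "yellow"],  # category, property
    "fruit": ["apple", "banana"],   # category members
}

def activate(start, depth=2):
    """Return the set of ideas reached by spreading activation
    from `start`, following links for `depth` steps."""
    seen = {start}
    frontier = [start]
    for _ in range(depth):
        nxt = []
        for node in frontier:
            for neighbour in links.get(node, []):
                if neighbour not in seen:
                    seen.add(neighbour)
                    nxt.append(neighbour)
        frontier = nxt
    return seen

print(sorted(activate("banana")))
# thinking of "banana" also activates fruit and yellow, then apple
```

The point of the sketch is the second-order effect: activating "banana" reaches "apple" even though the two were never directly linked, which mirrors how one idea can quietly bring unrelated-seeming ideas into play.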

System 1 excels at performing basic assessments of the world and uses the intuitions it creates to build a coherent story about what is happening around us. Sometimes, however, the information available about a situation is scarce, and that limited information is all we have. Because associative memory works only with activated ideas, information that is not retrieved from memory might as well not exist. With the information it receives, System 1 builds the best story it can, and if the story is good enough, System 2 endorses it. In other words, System 2 jumps to conclusions.

Paradoxically, the less we know, the more coherent the world seems, as there are fewer pieces to fit into the story. In one study, participants were exposed to a legal scenario involving a dispute between two parties. In addition to some background information, participants heard from the lawyers of both sides. Some participants were given evidence from both sides, while others were given evidence biased to one side and were made aware of this fact; those presented with one side could therefore have generated the argument for the other side themselves. The results showed that participants who were shown one-sided evidence were more likely to judge in favour of that side, and were more confident in their judgement than those who saw both sides. This suggests that decision making is influenced not by the completeness of information, but by its consistency. System 1's drive for coherence gives rise to cognitive biases and ensures that knowing little is easier than knowing a lot.

In the 1980s, psychologists discovered that exposure to one word influences how we evoke another. For example, people who are shown the word 'EAT' and then asked to complete the fragment 'SO_P' are more likely to complete it as 'SOUP', while people shown the word 'WASH' are more likely to complete it as 'SOAP'. This is known as the 'priming effect': a stimulus primes us to think a certain way.

Priming effects take many other forms. In one study, participants were asked to assemble five random words into a coherent sentence. For one group of participants, the list of five words was seeded with words associated with old age, such as wrinkle, bald or forgetful. When they had completed the task, the participants were asked to walk down the hallway to take part in another experiment, but their walk was what was actually being studied. The researchers timed how long it took the participants to walk the length of the hallway and found that those primed with words associated with old age walked more slowly. When questioned afterwards, these participants reported noticing a theme in the words but insisted that nothing they did after the first experiment was influenced by them. The idea of old age had not been consciously registered but had changed their actions nonetheless. Priming phenomena arise in System 1, to which we have no conscious access. Further priming studies show that the way we vote and the way we act in difficult situations are unknowingly influenced by the world around us.

Research into associative memory and priming effects suggests that we are not the autonomous authors of our choices and actions. We know far less about ourselves than we think we do, and our conviction that the world around us makes sense rests on our ability to ignore our ignorance. Relying on System 1 is therefore comforting: it reduces the anxiety we would feel if we allowed ourselves to fully acknowledge the uncertainties of existence, but in doing so we take comfort in an ignorant, unknowing self. When we engage System 2 with effort, however, we are less likely to believe certain things and act in certain ways out of bias, and more likely to practise self-criticism.

Rationality is earned

Using System 2 when we think requires effort, attention and determination. Indeed, in studies of self-control, System 2 has been shown to regulate behaviours and thoughts while a task is being performed. Biases and manipulation take over when System 2 is too lazy or too preoccupied to question the intuitions of System 1. Although a key function of System 2 is to cautiously examine what it receives from System 1, studies show that more often than not we find cognitive effort unpleasant and avoid it as much as possible.

Thus, to be rational individuals we must think with effort. When making a decision or forming a belief, we must recognise that we are entering a cognitive minefield. We must then slow down, step back and call on System 2 for reinforcement. To be rational is to be alert, more intellectually active, more willing to question and more sceptical of our intuitions. Only then do we think with intention and not with automation.