In a 2008 paper on neuroeconomics, Carnegie Mellon University economist George Loewenstein said: “Whereas psychologists tend to view humans as fallible and sometimes even self-destructive, economists tend to view people as efficient maximisers of self-interest who make mistakes only when imperfectly informed about the consequences of their actions.”

This view of humans as completely rational – and the market as eminently efficient – is relatively recent. In 1922, in the Journal of Political Economy, Rexford G. Tugwell said (to paraphrase) that a mind evolved to function best in “the exhilarations and the fatigues of the hunt, the primitive warfare and in the precarious life of nomadism” had been strangely and quickly transported into a different milieu, without much time to modify the equipment of the old life.

The field of economics has since rejected this more pragmatic (and, I would argue, realistic) view of human behaviour in favour of the simpler and neater “rational choice” perspective, which views the power of reflection as the only force driving human behaviour.


But to paraphrase sociologist Zygmunt Bauman, our currently held views of what is reasonable, sensible and good sense tend to take shape in response to the realities “out there” as seen through the prism of human practice – what humans currently do, know how to do, are trained, groomed and inclined to do.

We compare ourselves to people we know and come into contact with – either through social groups or, lately, with the advent of mass and even fragmented media, people we think are like us.

Regardless of what is happening in Greece, Spain or Yemen, we think about our own situation first and foremost.

And if we are being told consistently that our life is bad, and is going to get worse, then we start to believe that we live in desperate times, regardless of what we might be told through statistics and economic models.

Risk as feelings: how our brain makes decisions

Under pressure or stress, it is our amygdala, the emotional centre of the brain, that takes control, even as the thinking brain, the neocortex, is still analysing and coming to a decision.

George Loewenstein and his colleagues have suggested that people react to risks at two levels – by evaluating them in a dispassionate way, but also at an emotional level.


He called this the Risk as Feelings thesis. He argued we overreact emotionally to new risks (which are often low-probability events) and underreact to risks that are familiar (although these events are more likely to occur). So, as Loewenstein explains, “this is why people seemed to initially overreact to the risk of terrorism in the years immediately following 9/11 [and the Bali bombings], but tend to underreact to the much more familiar and more likely risks of talking on the mobile phone while driving, and not wearing seatbelts”.

More and more, psychological and neurological science is discovering that much of our decision-making is made at an unconscious and emotional level. What we are now finding is that when we are thinking about mundane and simple issues, such as small calculations, the brain areas associated with rational planning (such as the pre-frontal cortex) tend to be more active.

But when thinking about difficult, exciting, interesting activities, such as investing in a new business, or perhaps buying a $10 million lottery ticket, the brain areas associated with emotion – such as the midbrain dopamine system – become more active.

Images, colours, music, even social discussion mean that the midbrain emotional area becomes dominant, and the rational part of the brain finds it hard to resist the temptation. The emotional centres of the brain simply tell the rational part to shape up or ship out.

And then, a very funny thing happens. The rational part of the brain agrees, and starts to look for evidence that supports the emotional brain – it becomes an ally in the search for reasons why the emotional choice is a good one. (All of this is going on very quickly and we are not conscious of it.)

The “interpreter” function

And probably one of the most important discoveries arising from this research is that the human brain contains an “interpreter” function. It generates a conscious explanation for any unconsciously motivated action or unconsciously generated feeling, and makes us believe that the conscious explanation actually was the reason for the action or feeling.

So, if we are confronted with information that does not connect with our self-image, knowledge, or conceptual framework, the interpreter creates a belief to enable all incoming information to make sense and mesh with our ongoing idea of our self. As Michael Gazzaniga says in his book, The Ethical Brain: “The interpreter seeks patterns, order, and causal relationships.”

So when it comes to buying something that is based purely on chance, or something we don’t completely understand, for most of us it’s our emotions that make the decision – there is nothing rational about it.

Optimism bias and the effort of rejection

And once we’ve made the decision, the optimism bias, amongst other things, kicks in to protect our ego. To some degree, the optimism bias causes many of us to overestimate our degree of control as well as our odds of success.

But optimism isn’t a bad thing. If we didn’t make decisions based on emotion and optimism, we would never get out of bed in the morning. Optimism also makes us feel we are in control, which is good.

Research in human decision-making suggests humans are “hard-wired” to believe, predominantly because testing an assumption requires significant cognitive resources: it is simply more efficient to believe a claim than to reject it. This is why we mostly trust big institutions, well-known brands and figures of authority – we don’t have the resources to test every assumption we make.

One way to think about this approach is to consider how we might assess the arguments presented to us when we make an investment decision. Distraction, for example, is a very useful way to convince a person to accept an idea before they have had time to comprehend it. According to Harvard University psychologist Daniel Gilbert, once people have accepted an idea, they then have to “unaccept” it.

In other words, the acceptance of an idea is automatic, whereas the subsequent rejection of that idea requires more effort than its acceptance.

And this flaw in our thinking can have serious consequences. What ends up happening is that we want to believe: partly because of the way we process information, and partly because once we have accepted an idea, it is our ego’s role to do everything it can to convince us we have made a good decision.

In the world of business, wanting to believe can mean the difference between pursuing a particular business decision, having convinced ourselves that it was the right one, and looking at the information that tells us it is not. Wanting to believe also means that it becomes difficult for us to reverse our decisions, even when the evidence tells us that we should.

So can we be trained out of this?

At face value, financial literacy training seems to be a very sensible and rational response to a tricky area. But there is also evidence to suggest that the more familiar and more comfortable we become with numbers, the more likely we are to make mistakes.

Indeed, there is some evidence to suggest that experts are more prone to making poor forecasts in their field of expertise, simply because of their overinflated view of their own superiority and their tendency to disregard evidence when making predictions. It’s called the overconfidence effect.

It might even be better not just to teach financial literacy, but to focus on a range of critical thinking methods that would make it easier for us to engage with those numbers.

Research suggests most of us simply don’t think about numbers when we are confronted with them – we just accept them at face value. Both Gerd Gigerenzer and Eric Sowey have argued that people are willing to accept, on trust, a statistic presented by a perceived authority, rather than argue back.

A more effective approach, then, would be to teach critical thinking alongside financial literacy – including critical thinking about numbers and authority.

The other issue to consider is when we teach financial literacy and critical thinking. I would argue we should be encouraging children to be critical thinkers as early as possible.

Encouraging children to (respectfully) ask their teachers and parents why – and encouraging parents and teachers to give a respectful answer – is not going to lead to the downfall of society. If anything, it will produce adults who think more reflectively about their choices.

If children have positive experiences of learning early on, they are more likely to stop and reflect on their behaviour later in life. They are also more likely to be better students of life in general.

This article is an edited extract of Paul’s presentation, ‘How people make complex decisions in turbulent times’ to the ASIC Summer School 2012: Building resilience in turbulent times, February 20-21, Sydney.

The original version of this article incorrectly identified George Loewenstein as from Stanford University. He is from Carnegie Mellon University.