Behavioural development economics: A new approach to policy interventions

Allison Demeritt, Karla Hoff, James Walsh

Economists typically assume people behave in a rational and self-interested way, which limits the explanatory power of standard models. This column argues that psychological and sociological factors – though usually ignored in economic models – affect decision-making. The findings, drawn from the World Development Report, further suggest that a better understanding of behaviour could aid development efforts.

Economists have long known that people are rarely as coherent, unbiased, foresighted, selfish, or fixed in their preferences as standard economic models make them out to be. Until recently, however, individual behaviour that could not be explained as self-interested and rational was deemed not to matter at the aggregate level (Fehr and Tyran 2005). Policies could thus be designed as if people behaved in a rational and self-interested way, and most economists focused only on policies that changed incentives and improved the availability of information. Economists largely ignored the findings of psychologists and sociologists as a basis for policies to change behaviour.

Three main principles of the World Development Report

The World Bank’s latest World Development Report shows that incorporating behavioural insights can benefit development initiatives. In the Development Report, psychological and sociological approaches to decision-making take centre stage.

The Report draws on cutting-edge work to create a framework of human thinking centred on three principles that are left out of standard economic models.

The first principle is that all people think automatically.

Automatic thinking is “intuitive, associative, and impressionistic” (Kahneman 2011). While deliberative thinking sits at the core of most economic models, the automatic system is “the secret author of many of the choices and judgments you make” (Kahneman 2011). It enables “fast and frugal” decisions (Gigerenzer 2000), but can also create systematic mistakes in judgment.

Consider saving for health care. Sixty-three percent of deaths of children under five arise from illnesses that could be prevented with readily available health products (Jones et al. 2003). An experiment run in a malaria-endemic region of Kenya tested interventions to address the problem (Dupas and Robinson 2013). One group received information on the benefits of saving for health expenses. Another received three additional things: a lockable box, a key, and a passbook for recording their savings objective. Six months later, the individuals in the second group had saved much more and were much more likely to be able to pay for a medical emergency than those who had been given only information. When the researchers asked the subjects why the boxes had helped them save, subjects reported that the boxes provided an easy way to save small change (33%) and raised a barrier to spending because the money was not immediately at hand (32%). When asked why they had not used a lockable box before, most (88%) answered that they had “never thought of it”. The results suggest that the lockable box and passbook facilitated ‘mental accounting’ – a cognitive process in which categorising funds changes financial behaviour (Thaler 1990).

The second principle of thinking in the Development Report is that humans think socially.

People often value conformance to social norms and act in ways prescribed by their social environments. In contrast to the purely self-interested actor assumed in most economic models, real people are interested in others’ welfare and are often willing to cooperate as long as others are doing their share. And many people will punish others who are not doing their share, even at a cost to themselves (Fehr and Gachter 2000). Many groups can thus sustain very high levels of cooperation even if some people in the group are purely self-interested.
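The logic of costly punishment can be illustrated with a stylised linear public goods game in the spirit of Fehr and Gachter (2000). The parameter values below (endowment, multiplier, punishment points and their costs) are illustrative assumptions, not the parameters of the original experiment:

```python
ENDOWMENT = 20
MULTIPLIER = 1.6      # total contributions are multiplied, then shared equally
POINTS = 3            # punishment points each cooperator assigns
COST_PER_POINT = 1    # cost borne by the punisher per point
IMPACT_PER_POINT = 3  # payoff reduction for the punished player per point

def payoffs(contributions):
    """Each player keeps what she does not contribute, plus an equal
    share of the multiplied common pot."""
    share = sum(contributions) * MULTIPLIER / len(contributions)
    return [ENDOWMENT - c + share for c in contributions]

contributions = [20, 20, 20, 0]   # three full cooperators, one free-rider
base = payoffs(contributions)
print(base)                        # [24.0, 24.0, 24.0, 44.0]: free-riding pays

after = base[:]
for punisher in (0, 1, 2):         # each cooperator punishes player 3
    after[punisher] -= POINTS * COST_PER_POINT
    after[3] -= POINTS * IMPACT_PER_POINT
print(after)                       # [21.0, 21.0, 21.0, 17.0]: it no longer does
```

Before punishment the free-rider earns the most; after costly punishment, free-riding leaves her worse off than the cooperators, which is what allows groups to sustain cooperation.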

Interventions can harness social thinking to achieve development objectives, as an experiment on group lending in India showed (Feigenberg et al. 2013). Borrowing groups were randomly assigned to meet either once a week or once a month. Meeting frequency had economic impacts. Compared with those who met monthly, individuals who met weekly were only a third as likely to default on subsequent loans and were more willing, nearly two years after the experiment ended, to pool risk with their former group members.

The third principle of thinking that the Report introduces is the least familiar to economists. It states that people think with mental models.

In order to make sense of the vast array of information in their environment, people draw on conceptual tools such as categories, social identities, schemas, and taken-for-granted worldviews to derive meaning from situations. The institutions in the environment shape how people think and the alternatives they can imagine (Douglas 1986). Mental models often become naturalised, however: the categories, social identities, and patterns come to be seen as the natural or inevitable state of affairs, even though other perspectives are perfectly possible, perhaps preferable, and often already in use in other communities.

One example of a powerful mental model is the ‘culture of honour’, which holds that a slighted person must stave off humiliation by punishing his insulter. A famous experiment in the US examined the culture of honour by having the experimenters’ accomplice bump into subjects in a hallway and call them names. The insult caused a surge in hormones among subjects from the US south but not among those from the US north (Nisbett and Cohen 1996; see Figure 1). Southerners’ heightened physiological responses and comparatively aggressive behaviour after being insulted provide some evidence that culture shapes how people think and how they interact in society.

Figure 1. Physiological responses to insult by US Northerners and Southerners

Source: Nisbett and Cohen (1996)

The impact of honour on development

Deeply held cultural mindsets can have an impact on development. An experiment in India suggests that a culture of honour impedes the creation of efficient conventions that could help sustain economic development (Brooks et al. 2015).

A simple example of a coordination problem in village life is the draining of waste water to keep the lanes dry. A villager prefers dry lanes and will drain his water as long as others do. But if others do not, he experiences a loss from wasting his time draining, only to suffer from lanes that are muddy and slippery from others’ water. Researchers sought to mimic this problem in an experiment. They invited high-caste and low-caste men to play a two-person game in which there were high pay-offs to coordination on the good equilibrium (where everybody drains his waste water), low but positive pay-offs to coordination on the bad equilibrium (where nobody does), and losses to a player if he is the only one to drain his waste water (coordination failure).

The study found that high-caste pairs were much less likely than low-caste pairs to adopt the efficient convention (Figure 2, panel A). Why? Because the high-caste men responded adversely to coordination failures (panel B of Figure 2). When a high-caste man chose to drain his water in a round of decision-making but his partner did not, in the next round the high-caste man was unlikely to try again to cooperate. From then on, the pair was unlikely to settle on the efficient and cooperative convention. As panel A shows, even if only one of the partners is high caste and frames his experiences through the culture of honour, his behaviour can make it much less likely that the pair adopts the efficient convention.

A follow-up survey asked men about appropriate responses to various slights they might encounter in village life. The results indicated that the caste-based behaviour observed in the game is representative of how subjects respond to real slights. Thus, the culture of honour in north India may be inhibiting the development of better conventions.
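The structure of the drainage game can be sketched as a two-player coordination game with two pure-strategy equilibria. The payoff numbers below are illustrative assumptions, not those used by Brooks et al. (2015):

```python
# Actions: each villager either drains his waste water or keeps it.
DRAIN, KEEP = 0, 1

# Illustrative payoffs (player 1, player 2):
payoffs = {
    (DRAIN, DRAIN): (5, 5),   # good equilibrium: dry lanes for everyone
    (KEEP, KEEP):   (1, 1),   # bad equilibrium: muddy lanes, no wasted effort
    (DRAIN, KEEP):  (-2, 2),  # coordination failure: lone drainer takes a loss
    (KEEP, DRAIN):  (2, -2),
}

def is_nash(a1, a2):
    """A profile is a pure-strategy Nash equilibrium if neither player
    gains by deviating unilaterally."""
    u1, u2 = payoffs[(a1, a2)]
    dev1 = payoffs[(1 - a1, a2)][0]   # player 1's payoff after deviating
    dev2 = payoffs[(a1, 1 - a2)][1]   # player 2's payoff after deviating
    return u1 >= dev1 and u2 >= dev2

equilibria = [(a1, a2) for a1 in (DRAIN, KEEP) for a2 in (DRAIN, KEEP)
              if is_nash(a1, a2)]
print(equilibria)  # [(0, 0), (1, 1)]: both conventions are self-sustaining
```

Because both conventions are equilibria, which one a pair settles on depends on history and expectations; a partner who stops trying after one coordination failure can lock the pair into the bad equilibrium.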

Figure 2. How caste culture affected a game of coordination in India

Framing and decision-making

People do not always evaluate information consistently, for they respond not just to the objective facts but to the way in which the facts are formulated. Tversky and Kahneman (1981) conducted a seminal experiment on the framing effect. They asked students to indicate which of two projects they would favour in response to “an unusual Asian disease that is expected to kill 600 people”. Half the students were given one frame of the decision problem, and the other half was given another frame.

In frame 1 (the gain frame), the students chose between (a) an option that would save one-third of the population for certain or (b) a gamble with a one-third chance of saving everybody and a two-thirds chance of saving nobody.

In frame 2 (the loss frame), the students chose between (a) an option in which two-thirds of the population would die for certain or (b) a gamble with a one-third chance that nobody would die and a two-thirds chance that everybody would die.

Although options (a) and (b) were identical across the two frames, preferences were very different. When presented with the gain frame, students were more likely to choose the option with the certain outcome; only 28% chose the gamble. When presented with the loss frame, they were more likely to take a risk to save more lives, and 78% chose the gamble. The shift from the gain frame to the loss frame was thus accompanied by a large shift from risk aversion to risk taking. People hold contradictory attitudes, and a change in frame can shift preferences. A similar study of World Bank staff found suggestive evidence that the framing of a policy affects its desirability among development professionals, too. Development practitioners, just like everyone else, are subject to biases.
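The equivalence of the two frames can be checked with a little arithmetic over expected lives saved; exact fractions avoid any rounding ambiguity in the one-third probabilities:

```python
from fractions import Fraction

TOTAL = 600
P = Fraction(1, 3)  # probability of the good outcome in each gamble

# Gain frame: (a) save 200 for sure; (b) 1/3 chance save all 600, else save 0.
gain = {"a": Fraction(200),
        "b": P * TOTAL + (1 - P) * 0}

# Loss frame: (a) 400 die for sure; (b) 1/3 chance nobody dies, else all die.
# Expressed in the same units as the gain frame: lives saved = TOTAL - deaths.
loss = {"a": TOTAL - Fraction(400),
        "b": P * (TOTAL - 0) + (1 - P) * (TOTAL - TOTAL)}

for opt in ("a", "b"):
    print(opt, gain[opt], loss[opt])  # each option saves 200 lives in expectation
```

Every option, in every frame, saves 200 lives in expectation; only the description changes, yet choices shift dramatically.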

Concluding remarks

Since framing and context influence attitudes and decision-making, in many situations it is not possible to deduce the best policy from theory. The World Development Report emphasises that development policy can be improved by experimenting with different variants of a proposed intervention. To illustrate this point, consider again the experiment on health savings. The lockable box intervention was one of five interventions tested in a randomised controlled trial. Researchers did not know ahead of time which would work and which would fail, because they did not know exactly why people were not saving. Testing several interventions gave researchers insight into the barriers to saving and a better understanding of the specific supports that would shift behaviour. The results of the study were not obvious or inevitable.

The parsimonious explanatory power of standard economic models has made them powerful in spite of their limitations. The Report shows that expanding our understanding of the rich set of factors that influence decision-making can aid development efforts. Incorporating behavioural insights from psychology, sociology, and other sciences can help policymakers develop innovative and sometimes low-cost interventions that help people advance their goals and increase their well-being.

References

Brooks, B, K Hoff, and P Pandey (2015), “‘Cross me and I’ll punish you’: Can culture block efficient conventions?”, University of Chicago, manuscript.

Douglas, M (1986), How Institutions Think, Syracuse, NY: Syracuse University Press.

Dupas, P and J Robinson (2013), “Why Don’t the Poor Save More? Evidence from Health Savings Experiments”, The American Economic Review 103(4): 1138–71.

Fehr, E and S Gachter (2000), “Cooperation and Punishment in Public Goods Experiments”, The American Economic Review 90(4): 980-94.

Fehr, E and J-R Tyran (2005), “Individual Irrationality and Aggregate Outcomes”, Journal of Economic Perspectives 19(4): 43–66.

Feigenberg, B, E Field, and R Pande (2013), “The economic returns to social interaction: Experimental evidence from microfinance”, The Review of Economic Studies 80(4): 1459–83.

Jones, G, R W Steketee, R E Black, Z A Bhutta, S S Morris, and the Bellagio Child Survival Study Group (2003), “How Many Child Deaths Can We Prevent This Year?”, Lancet 362(9377): 65–71.

Nisbett, R E and D Cohen (1996), Culture of honor: The psychology of violence in the South, Colorado: Westview Press.

Thaler, R H (1990), “Anomalies: Saving, Fungibility, and Mental Accounts”, The Journal of Economic Perspectives 4(1), 193–205.

Tversky, A and D Kahneman (1981), “The framing of decisions and the psychology of choice”, Science 211(4481): 453–58.

World Bank Group (2015), World Development Report 2015: Mind, Society, and Behavior, Washington, DC: World Bank.