Antony Davies: I'm Antony Davies.



Erika Davies: I'm Erika Davies.



Antony Davies: And this is Behavioral Economics.



Erika Davies: At some point in our distant past, a human who had food met another who had a spear. The two exchanged and departed better off than when they met.



Antony Davies: Early in their lives, children learn the importance of exchange. We call it sharing, but we teach it as a quid pro quo. You share your toys with your friend and your friend shares his toys with you. That's exchange.



Erika Davies: The discovery of exchange led to the discovery of specialization. Humans learned that they had more to exchange when each person concentrated his time and energy doing what he did best. Those with manual dexterity spent more time making spears, those with strength spent more time hunting. They traded spears for meat, and both were better off.



Antony Davies: This dance of specialization and exchange was the birth of the economy. Fast forward tens of thousands of years. The economy has grown in size and complexity, humans are now so specialized that many don't know how to hunt or to grow food. They rely on others to obtain food for them. In exchange, they spend years of their lives developing complex skills that make other people's lives better in ways that were impossible before.



Erika Davies: As these exchanges grew, humans formed questions about the economy itself. What is the nature of property and prices? How is it that the economy appears to work without anyone guiding it?



Antony Davies: From these questions, economists developed a body of theory to explain how humans behave when their unlimited desires collide with their limited abilities. This body of theory is economics.



Erika Davies: One of the principles underlying economics is that humans behave rationally, that they make choices they believe will bring them happiness now or in the future.



Antony Davies: But if we look around, it appears that humans aren't that rational. People smoke and then suffer ill health, students put off studying and then regret it. All people behave irrationally sometimes. And that raises the question of whether economics itself is built on a false premise. Is it possible that human irrationality nullifies economic theory?



Erika Davies: To understand what economists mean when they say that humans are rational, let's examine molecules. Fluid mechanics explains how fluids behave. Our understanding of fluid mechanics enables us to design ship hulls with minimal drag, and airplane wings with maximum lift.



Antony Davies: While individual molecules behave unpredictably, the behavior of a group of molecules is so predictable that we bet our lives on that predictability every time we step on a plane or a boat.



Erika Davies: And this is where our irrational humans come in. Unlike individual molecules, individual humans are at least sometimes predictable because they're at least sometimes rational. And that means that when they are in groups, they become highly predictable.



Antony Davies: People do make irrational decisions, but we tend to learn from our mistakes and from the mistakes of people around us. Meanwhile, many apparently irrational choices people make can in fact be quite rational. Take, for example, a student who chooses to party now and study later. This choice appears irrational to the professor.



Erika Davies: But is the student really making an irrational choice? Possibly not. When the student chooses to party instead of studying, she's making a rational choice to take leisure time from her future self and enjoy it now. In exchange, her future self will have to study.



Antony Davies: The professor might consider this folly, but he has different preferences and abilities and so might be less willing to make that exchange for himself.



Erika Davies: This difference in preferences and abilities is what makes our modern economy possible. People who are more willing to put off consuming tend to save their money, which is then borrowed by people who are less willing to put off consuming. Among them are entrepreneurs who want to try out new ideas.



Antony Davies: And when those entrepreneurs' ideas succeed, we get the automobile, the plane, the computer, the internet, and all sorts of other things we wouldn't have but for the fact that a couple of cavemen once traded spears for food, setting in motion the wondrous cooperative venture we call the economy.



Erika Davies: One of the wonders of the modern economy is all the variety available. Think about shoes. Humans invented shoes tens of thousands of years ago. Over those thousands of years, the shoe hasn't changed much. Yet, despite how simple they are, the number of varieties of shoes is almost uncountable. There are so many that you could sit in an auditorium with hundreds of people and still be wearing a unique style of shoe.



Antony Davies: Humans like variety, and shoes aren't the only things that come in many varieties. So, too, does everything from pencils and cookware to computers and cars. But an abundance of variety presents a paradox of choice. How can a person make a rational choice when the number of options is so large that the cost of searching for the best option exceeds the benefit of finding it?



Erika Davies: Notice that a person actually makes two decisions. The first is the search decision, to keep looking for better shoes or to stop. The second is the purchase decision. When done searching, which pair to buy? At some point, he will have looked at enough shoes that even though he knows there must be better ones out there, it just isn't rational to keep searching because he's already found a pair that is good enough. Humans do this all the time. We choose colleges without having researched all possible colleges, we choose life partners without having dated all available people, we buy houses and choose jobs without having looked at all possible houses, or all possible job openings. We know that beyond some point, it is actually irrational to keep searching.
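The search decision described above is often called satisficing: stop at the first option that is good enough, rather than scoring every option to find the global best. A minimal sketch, where the quality scores and the "good enough" threshold are hypothetical stand-ins for shoe quality:

```python
import random

def satisfice(options, good_enough, score):
    """Return the first option whose score clears the threshold,
    along with how many options were examined before stopping."""
    examined = 0
    for option in options:
        examined += 1
        if score(option) >= good_enough:
            return option, examined
    # No option cleared the bar: report that nothing was chosen.
    return None, examined

random.seed(1)
# Hypothetical "shoes": each pair has a quality score between 0 and 1.
shoes = [random.random() for _ in range(10_000)]
choice, looked_at = satisfice(shoes, good_enough=0.95, score=lambda q: q)
# The shopper stops long before examining all 10,000 pairs.
print(looked_at, round(choice, 2))
```

The tradeoff is exactly the one in the transcript: the chosen pair is almost certainly not the best of the 10,000, but the savings in search cost outweigh the forgone improvement.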



Antony Davies: How is it possible to make a rational purchase decision without seeing all the possible options? Humans have evolved low cost shortcuts for increasing the odds of finding the better options. These shortcuts are called heuristics. When faced with more options than can be explored, heuristics help people distinguish the better options from the worse ones.



Erika Davies: For example, suppose you want to pick a college. There are thousands from which to choose, and each is unique in some important way. To really research all the important attributes of all the colleges would require years of work. Even if you bought a book that contained every detail of every college, these details are likely to have changed by the time you're finished reading. How is it possible to make a rational choice?



Antony Davies: One thing you can do is to ask people you know where they went to college. This is a heuristic. By asking other people what colleges they chose, you're setting these people up as proxies for yourself. You're relying on their experiences to inform your decision, hoping that your experience will be similar to theirs.



Erika Davies: Another useful heuristic is the one we use when we judge alternative options relative to one we're familiar with. People use this heuristic often in selecting a life partner. Rather than attempt to compare and contrast the attributes of all eligible partners, we select one, sometimes at random, to date. We date this person until someone better comes along. Then we date that person until someone even better comes along. We keep doing this until we make the rational decision that it isn't worth looking any further and we marry the person we're dating. This heuristic is useful because it is easier to compare two things and judge which one is better than it is to select the best from among a large set of things.



Antony Davies: Faced with the problem of choosing an option when the act of choosing is costly, humans have evolved heuristics, shortcuts that employ their own past experience and the experiences of people around them to help them make more rational choices. This is just one example of how humans rely on each other to make decisions in this complex web we call the economy.



Erika Davies: Making rational choices is expensive. It takes time and energy to collect the information necessary for rational thought, and it takes mental effort to apply rational thinking to the information we gather. To defray these costs, humans rely on mental shortcuts to make decisions. We call these heuristics. For example, we might choose options that our friends choose, choose the same thing we've chosen in the past, or choose the first thing we encounter. Employing heuristics doesn't require the time, energy, or mental effort that rational thought requires. But, the probability of making a bad choice is higher when employing heuristics.



Antony Davies: Compared to rational thought, heuristics present a trade off. They're easier to employ, but more likely to yield poor choices. And it turns out that humans tend to employ heuristics, well, rationally. We're more likely to employ heuristics rather than rational thought when the effort we save from employing them outweighs the cost of making a poor decision.



Erika Davies: In practice, we usually employ a combination of heuristics and rational thought. For unimportant decisions, like choosing a disposable pen, we rely more heavily on heuristics. We just pick the one that attracts our attention first, or the one we recognize from past use. For more important decisions, such as buying a house, we rely more heavily on rational thought.



Antony Davies: But occasionally, humans rely too heavily on heuristics when we should be relying more on rational thought. When this happens, we say that the humans have succumbed to cognitive biases. Cognitive biases are errors in decision making caused by an inappropriate reliance on heuristics.



Erika Davies: In 2002, a teenager committed suicide by flying a private plane into a building in Florida. An investigation revealed that the teenager was taking a prescription drug for severe acne. Further investigations revealed other instances throughout the country in which teenagers who were on the same drug also committed suicide.



In response to this news, many people wanted to ban the drug. This is an example of a heuristic at work. Instead of collecting more information, people chose to call for a ban based on the first information they encountered. But the choice to ban a prescription drug has grave implications, and when a choice involves grave implications, over-reliance on heuristics can lead to cognitive bias.



Antony Davies: The cognitive bias in this case was that people's emotional reactions prevented them from asking an important question: how many teenagers taking the medication did not commit suicide? Subsequent research showed that teenagers who took the prescription drug were not at greater risk of suicide. In fact, it appeared more likely that the cause of the suicides was despair over severe acne rather than the drug used to treat the acne. Yet, because of a public outcry driven by a cognitive bias, we came dangerously close to banning this drug.



Erika Davies: When used inappropriately, heuristics result in cognitive biases that lead us to make poor decisions. The decisions are poor not because humans are irrational, but because we've misapplied a tool that evolved to help us make complex choices. Because poor choices have consequences, we have an incentive to learn from our mistakes. Where the consequences of making a mistake are great and fall on those making the mistakes, they learn faster. Where the consequences are smaller or fall on others, the actors learn more slowly.



This process of making choices and enduring the consequences encourages us to make better choices and provides the dynamic force that is the engine of this wondrous enterprise we call the economy.



Free market economists argue that people should be left alone to make decisions for themselves. A person knows what is best for himself because only the individual has knowledge of his own circumstances and desires.



Antony Davies: But what if there were a decision that both required specialized knowledge to make and which most people, if they had the specialized knowledge, would tend to make in the same direction? For example, suppose your employer gave you the option of keeping your entire paycheck, or putting five percent into a retirement fund and having the employer contribute an additional five percent. For most people, the better option is the retirement fund. If all workers had the specialized knowledge necessary to make that choice well, that's what most would choose.



Erika Davies: But few workers have the specialized knowledge required to make that choice well. Is there a way to encourage people to make what is, for most of them, the right choice, without forcing all of them to make that choice? The answer is what economists call nudging. We can nudge the worker in the right direction by adjusting his default option.



Antony Davies: For his retirement plan, the worker faces two options. He can choose to participate or he can choose not to participate. But there's a hidden third option. He can choose not to make a choice at all. Doing nothing is itself a choice.



What happens when a worker chooses to do nothing? Under current law, if the worker chooses to ignore the choice, he is not enrolled in a retirement plan. In other words, the don't-participate option is the default option. Don't make a choice is the same as don't participate. By definition, there's always a default option. It's what happens when the person does nothing. The idea behind nudging is that since there is a default option anyway, why not make the default option the option that is better for most people under most circumstances?



Erika Davies: By changing the default option to participate, the worker would automatically be enrolled in the retirement plan unless he specifically chose to opt out. Nudging is setting the default option to that which most people would select if they had the knowledge required to make a rational choice. Nudging sounds like a great way to help people make rational choices, but it isn't foolproof. Those who do the nudging can end up nudging people in the wrong direction.
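The logic of a default option can be sketched in a few lines. The function and its string labels are hypothetical, but they capture the mechanism described above: inaction resolves to whatever the default is, while an explicit choice always overrides it.

```python
def enrollment(default, explicit_choice=None):
    """Resolve a worker's retirement-plan status.

    default: what happens when the worker does nothing.
    explicit_choice: "participate", "opt_out", or None (no choice made).
    """
    if explicit_choice is None:
        return default  # doing nothing is itself a choice
    return "enrolled" if explicit_choice == "participate" else "not enrolled"

# Under an opt-in default, inaction means no retirement plan.
print(enrollment(default="not enrolled"))  # not enrolled
# The nudge flips the default: inaction now means enrollment.
print(enrollment(default="enrolled"))  # enrolled
# Either way, an explicit choice overrides the default.
print(enrollment(default="enrolled", explicit_choice="opt_out"))  # not enrolled
```

The nudge changes only the first argument, never the worker's freedom to choose, which is why it is a nudge rather than a mandate.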



Antony Davies: For example, in the United States, the default option is that medical drugs are prohibited unless the government can verify that they are both safe and effective. For someone dying of a disease for which there is no cure, experimental drugs can mean the difference between possible life and certain death. But experimental drugs have not been shown to be either safe or effective, so their use is largely prohibited. For the terminally ill, this nudge is wrong. It isn't rational to prohibit someone who is dying from taking an experimental drug because the drug might kill them. The default, the nudge, should be the opposite. At least for terminally ill people, experimental drugs should be permitted unless the government can demonstrate that they are harmful.



Erika Davies: Every choice a human faces comes with a default option. When we can correctly set the default option to be that which most people would pick in most circumstances, we can help people to make rational choices that will be most beneficial to them. Done incorrectly, nudging encourages people to make irrational decisions, but done correctly, nudging is one way humans can help each other negotiate the sometimes complex decisions we have to make as members of the wondrous venture we call the economy.



Economics gives us insight as to how humans behave when our unlimited desires collide with our limited abilities. These insights enable us to predict how car buyers will alter their behavior when gas prices rise, how students will alter their behavior when the government subsidizes college loans, and how cigarette manufacturers will alter their behavior when the government regulates vaping.



Antony Davies: Public choice is a field of economics that takes what we understand about human behavior and applies that knowledge to humans who behave in the public sector: politicians, bureaucrats, lobbyists, and voters. But because the humans who occupy the public sector are no different from those who occupy the private sector, we can use economic principles to predict how the [inaudible 00:15:17] will behave.



Erika Davies: For example, voters have less incentive to engage in the voting process when the benefits of getting their way in the voting booth are small but the cost of casting a vote is large. This can lead to people voting for policies that are bad for society.



Antony Davies: Suppose 100 people are asked to vote on a proposed law. The law says that the government will tax 90 of these people $10 each, burn half of what's collected, and give what's left to the remaining 10 people. If we allow these people to vote on the law, what will be the outcome? The 90 who would be taxed don't like the law and will vote against it. The 10, however, stand to gain $45 each. They like the law and will vote for it. The proposal will be defeated by a vote of 90 to 10.



Erika Davies: But now, suppose that it is costly to vote. It is costly just to stay aware of the laws that the government is proposing. It is costly to read the laws, it is costly to understand how the laws will affect you, and it is costly to physically get up, go to a polling station, vote, and come back home. In our thought experiment, we can simulate this voting cost by charging each person $15 to vote. It doesn't matter how you vote, yes or no, but to vote at all, you have to pay $15. Who would want to vote? The 10 people will definitely want to vote. If this law is passed, each will gain $45. That more than covers the $15 cost to vote. What about the 90? They won't vote. The law is clearly bad for them, but the cost of living with the bad law is less than the cost of voting against the law. So, the 90 will all stay home. The law will pass by a vote of 10 to zero, and society will be worse off.
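The arithmetic of this thought experiment can be checked with a short sketch. Every number below comes from the transcript: 90 taxpayers each lose $10 if the law passes, the 10 recipients each gain $45 (half the $900 collected is burned, leaving $450 split ten ways), and a person votes only when their personal stake exceeds the cost of voting.

```python
def vote_outcome(cost_to_vote):
    """Tally the 100-person vote on the tax-and-transfer law.

    Each person votes only if the money at stake for them personally
    exceeds the cost of casting a vote."""
    stakes = [-10] * 90 + [45] * 10  # taxpayers lose $10; recipients gain $45
    yes = sum(1 for s in stakes if s > 0 and s > cost_to_vote)
    no = sum(1 for s in stakes if s < 0 and -s > cost_to_vote)
    return yes, no

print(vote_outcome(cost_to_vote=0))   # (10, 90): the law is defeated
print(vote_outcome(cost_to_vote=15))  # (10, 0): the law passes
```

Raising the cost of voting from $0 to $15 flips the outcome from a 90-to-10 defeat to a 10-to-0 passage, exactly as the transcript describes.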



Antony Davies: This is called the principle of concentrated benefits and dispersed costs. The law represents a cost to society: one group of people will pay $900 in taxes. The law also represents a benefit to society: another group of people will receive $450. The cost to society is greater than the benefit to society, and so society would be better off if the law were defeated. But it isn't defeated. Why? Because the $450 benefit is shared by a small group of people, so each person in the small group has a strong incentive to vote for the law. But the $900 cost is spread over a large number of people, so each person in the large group has a lesser incentive to vote against the law. And if the incentive to vote against the law is less than the cost of voting against the law, those against the law won't bother to vote.



Erika Davies: The principle of concentrated benefits and dispersed costs can create a strong incentive for people to vote for laws that actually are bad for society. This is just one example of how public choice economics can help us to understand better the behavior of people in the public sector. In the same way that economics helps us to understand and predict the behaviors of consumers and producers in the private sector, it also helps us to understand and predict the behaviors of voters, politicians, and bureaucrats in the public sector.



Antony Davies: And the better we understand how humans behave, the better able we are to appreciate the appropriate role of government in a society of individuals made interdependent through the relationships we call the economy.