0:33 Intro. [Recording date: February 13, 2018.] Russ Roberts: I'm both excited and slightly embarrassed to say that our topic for today is skin in the game. It's in a way our third episode on the topic. We did one episode on the paper you wrote with Constantine Sandis that had the title "The Skin In The Game Heuristic for Protection Against Tail Events." Then, last August of 2017, we did an EconTalk episode on the book that's coming out shortly. Nassim Nicholas Taleb: You mean on some aspects of the book. Russ Roberts: Some aspects of the book, when it was in draft form. And today we're going to talk about a number of topics from the book that we didn't get to. Nassim Nicholas Taleb: Which are in fact central. Russ Roberts: Which in fact are central. I mean, I don't know how we had that other episode, but we managed it somehow; and I'm sure we're going to get into some other things as well. But our topics for today are rationality, broadly defined; decision-making under uncertainty. And I think we're going to get to religion as well. Nassim Nicholas Taleb: And the notion of survival. Russ Roberts: And the notion of survival. Which is actually, the more I think about it, and the more I read your work, the more I think of it as being central to the lessons that you have to teach in terms of decision-making under uncertainty and skin in the game. So, I'm going to start with two kinds of probability that you talk about. One is ensemble probability and the second one is time probability. Set this up with the example from the casino that you use in the book. Nassim Nicholas Taleb: So, most people have the illusion that you can compute probabilities--what we call the state space in finance--by looking, say, at the returns on the market, what people are making, returns on businesses. And then we would apply that to you. In fact, if you have the smallest probability of an absorbing barrier, then you're never going to be able to capture that market return, or that ensemble return.
Russ Roberts: But explain what an absorbing barrier is, first. Nassim Nicholas Taleb: Yeah. An absorbing barrier is a point that you reach beyond which you can't continue. You stop. So, for example, if you die, that's an absorbing barrier. So, most people don't realize what Warren Buffett keeps saying: in order to make money, you must first survive. It's not like an option. It's a condition. So, once you hit that point, you are done. You are finished. And that applies in the financial world of course to what we call ruin, financial ruin. But it can be any form of ruin. It can be ecological ruin; it can be personal ruin. It could be the death of a community. Whatever it is. So, let's isolate the point with the following thought experiment. You send a hundred people to a casino; and you don't know the return from the casino. It's just set up by some weird person, and you don't know if the person who set it up is giving you the edge or not. Okay? So, you send them--you give each of the people, him or her, an allowance, and ask them to gamble for an entire day. And then you compute the expected return of the casino per day by what comes back. So, if Person Number 27 goes bust, loses everything, would it affect Number 28? Russ Roberts: Not at all. Nassim Nicholas Taleb: Not at all. Okay, so you probably will have a certain number of people go bust in your sample, but you don't mind; you count that as zero and you compute the expected return; and you can figure out if it's a lunatic or a very smart person running the casino. You get the exact return per day of the casino for the [?] strategy. Now if, on the other hand, you take one person, say yourself: you go 100 days to the casino, with the same strategy, and on day number 27 you are bust, will there be a day 28? Russ Roberts: There will not. Nassim Nicholas Taleb: Exactly. So, that's the absorbing barrier.
So, if you have an absorbing barrier, the question is not, you know, whether you are going to survive or not. The question is: when are you going to go bust? Because eventually you are going to go bust. So your expected return, if you have a strategy that entails ruin--depending on how you calculate it, you are going to lose everything. The expected ruin you can count at negative infinity if you are using logs, or negative 100%, or whatever it is. So, any strategy that has ruin will eventually, if you extend time to infinity, have a -100% return. And that's not very well understood, because a lot of people engage in strategies that entail ruin, not realizing that eventually it's going to catch up to them. But, one thing that I learned when I was a trader, the very first lesson from all traders, was, 'Listen. Take all the risks you want. But make sure you're going to be here tomorrow.' The game is about being in the office tomorrow at 7 a.m., because you can always start early. And that was the game. You can take all the risks you want. And effectively, every single surviving person, they take [?]--all these people, all they worry about, is ruin. They don't worry about return, all this complicated stuff. In finance, two paradigms emerged. One is Markowitz, which is entirely academic, not even used by Markowitz himself--computing complicated probabilities of what may happen with returns, [?], very complicated. And then the other one is a very simple one that focuses on two things: what you expect to make, adjusted every day, and survival. Make sure you don't go bust. So, almost all traders that survive use the latter. Okay? And every single academic who went to trade--we counted, I think in 1998, how many academics went bust after LTCM [Long-Term Capital Management]; academics in finance, I mean, not in mathematics--and we noticed that close to 100% did.
There's only one person who may have survived the 1998 collapse, when Long-Term Capital Management, effectively [?] a firm making bets on small probabilities, went bust.
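Taleb's casino thought experiment can be sketched in a short simulation (a hypothetical illustration; the edge, stake, and allowance are made-up parameters, not from the episode): the ensemble probability is 100 independent gamblers playing one day each, while the time probability is one gambler playing 100 successive days, with going bust as an absorbing barrier.

```python
import random

def one_day(bankroll, bets=100, edge=0.52, stake=1.0):
    """Play one day of even-money bets with a small edge; 0.0 means ruin."""
    for _ in range(bets):
        if bankroll < stake:
            return 0.0          # absorbing barrier: bust, and you stop forever
        bankroll += stake if random.random() < edge else -stake
    return bankroll

random.seed(42)

# Ensemble probability: 100 independent gamblers, one day each.
# Gambler 27 going bust does not affect gambler 28.
ensemble = [one_day(20.0) for _ in range(100)]
ensemble_avg = sum(ensemble) / len(ensemble)

# Time probability: one gambler, 100 successive days.
# Once bust, there is no next day -- the run is over.
bankroll, days_survived = 20.0, 0
for day in range(100):
    bankroll = one_day(bankroll)
    if bankroll == 0.0:
        break
    days_survived += 1

print(ensemble_avg)     # reflects the casino's edge across the group
print(days_survived)    # can be well short of 100: ruin ends the run early
```

The point of the sketch: busted gamblers enter the ensemble average as zeros and the average still reveals the edge, but the single gambler's run simply stops at the barrier, so the time average cannot match the ensemble average.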

7:50 Russ Roberts: So, let me restate this a little bit. I think--in thinking about the casino, there's a presumption that the odds are in favor of the casino. You started out by saying we don't know how the casino owner is setting things up; but if you have a long-running casino like in Las Vegas today, the odds are slightly in favor of management. And so, one way to say what you just said is: You can't have a lifetime strategy of earning money by going to the casino. Nassim Nicholas Taleb: No, that's not what I'm saying. Actually, what I'm saying is even stronger. I am saying that even if you have the edge, in the presence of the probability of ruin, you will be ruined. Even if you had the edge. Russ Roberts: If you play long enough. Nassim Nicholas Taleb: If you play long enough. Unless you engage in strategies designed by traders and rediscovered by every single surviving trader--very similar to something called the Kelly Criterion--which is to play with the house money. In other words, when you start betting in a casino, the strategy is as follows: You go in with $100, or whatever you want; and you bet $1. If you lose, your next bet is less than a dollar--you bet, say, 90 cents, or whatever; and if you make money, you start betting with the house money. And this is called playing with the market's money, or playing with the house money. And so you increase your bet as you are making money, and you reduce your bet as you are losing money. And that strategy is practically the only one that allows you to gamble or engage in a risky strategy without ruin. Russ Roberts: It challenges the--in other words, think about it as an asymmetry there between wins and losses: something that one might think of as--I don't, but many people think of as--irrational. But you are saying it's not irrational; and more than that, often we as economists make fun of people who say, 'Well, I was way ahead and I took a big gamble because I wasn't using my own money. I was using the house money.'
And economists look at that and laugh and we say, 'But it's your money. You could have walked away. You could have kept it.' And you are saying that it's actually rational to treat the money you win differently from the money you lose. Nassim Nicholas Taleb: Exactly. Behavioral economists have something called mental accounting, and they state exactly what you just said: that treating money according to its source is irrational. But these are one-period models. That's how they view the world, as a one-shot experiment. They don't view the world as repetition. A repetition of bets. So, if you look at the world as a repetition of bets, under the condition of survival, then mental accounting is not only not irrational but necessary. Any other strategy would be effectively irrational.
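The 'play with the house money' rule--scale bets up as you win and down as you lose--can be sketched as betting a fixed fraction of the current bankroll, in the spirit of the Kelly criterion Taleb mentions. A minimal illustration (the fraction, stakes, and round counts are assumptions for the sketch, not parameters from the episode), contrasted with a constant-stake strategy that has an absorbing barrier:

```python
import random

def fractional_betting(initial=100.0, fraction=0.01, rounds=10_000,
                       win_prob=0.5, seed=0):
    """Bet a fixed fraction of current wealth each round (Kelly-style).

    Because the stake shrinks as you lose, wealth can drift toward zero
    but never hits the absorbing barrier in finite time.
    """
    rng = random.Random(seed)
    wealth = initial
    for _ in range(rounds):
        stake = fraction * wealth          # grows with wins, shrinks with losses
        wealth += stake if rng.random() < win_prob else -stake
    return wealth

def fixed_betting(initial=100.0, stake=1.0, rounds=10_000,
                  win_prob=0.5, seed=0):
    """Bet a constant absolute amount; ruin is an absorbing barrier."""
    rng = random.Random(seed)
    wealth = initial
    for _ in range(rounds):
        if wealth < stake:
            return 0.0                     # ruined: the game is over
        wealth += stake if rng.random() < win_prob else -stake
    return wealth

print(fractional_betting())  # always strictly positive
print(fixed_betting())       # may be 0.0: ruin can happen along the way
```

The asymmetry Taleb defends falls out of the first function: after a win the next stake is larger, after a loss it is smaller, which is exactly treating house money differently from your own.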

10:46 Russ Roberts: So, I'm going to read a long quote from the paper, which I think sums this up really well; and it's shockingly provocative. Especially when we think about what is going through people's heads when they are sitting in an experiment that we are trying to generalize from. This is what you say: The flaw in psychology papers is to believe that the subject doesn't take any other tail risks anywhere outside the experiment, and, crucially, will never take any risk at all. The idea in social science of "loss aversion" has not been thought through properly--it is not measurable the way it has been measured (if it is at all measurable). Say you ask a subject how much he would pay to insure a 1 percent probability of losing $100. You are trying to figure out how much he is "overpaying" for "risk aversion" or something even more foolish, "loss aversion." But you cannot possibly ignore all the other financial risks he is taking: if he has a car parked outside that can be scratched, if he has a financial portfolio that can lose money, if he has a bakery that may risk a fine, if he has a child in college that may cost unexpectedly more, if he can be laid off, if he may be unexpectedly ill in the future. All these risks add up, and the attitude of the subject reflects them all. Ruin is indivisible and invariant to the source of randomness that may cause it. So that's, I think, a very deep insight into how careful we have to be in interpreting what seems to be a very clean experiment: your willingness to pay for insurance, say, of a particular event. Nassim Nicholas Taleb: Okay. Let me sample it[?] by my methods. The way you approach a problem, say an economic theory: you wonder if it changes if you make things dynamic, not static--in other words, it's not a one-shot experiment but many, many experiments. Or many repetitions of the same risk. And the second trick is what I call perturbation: you perturbate.
In other words, you just assume that you may have the wrong model here and there. And these two tricks effectively undo much of the results of the hero[?] economics. Not the psychology. The psychological experiments are fine. But the hero[?] economics are trying to derive the rules from simplified sets. And let me also add another dimension that people miss. Say I ask an economist, or a person who studied economics but not well enough, 'What's the riskiest scenario?' He or she would answer, 'Well, my death[?]' And then I would say, 'Well, do you have family? Can something be worse than just your death?' And effectively they say, 'Oh, yeah, yeah: my death plus the deaths of my parents and children, and cousins, and pets, and so on.' And you continue, 'How about the ruin of your tribe?' They say, 'That's worse--that's a set that's worse than the previous one.' And so on, until you hit the environment, and the earth. And then, what you notice is that effectively, intuitively, when they don't, you know, repeat what they've learned at school, they will consider a risk based on both repetition and the life expectancy that is reduced by taking that risk. So, for example, if I cross a street, I am of course reducing my life expectancy--maybe by a second, or not even, by a nanosecond. Russ Roberts: The expected value. Nassim Nicholas Taleb: Exactly. The life expectancy--I am reducing it. But if I am taking a risk for something higher than me, namely a tribe, the tribe is supposed to survive longer than me. And of course humanity is supposed to have an extra few billion years, so you are reducing from that--from the value of that. And of course, when you talk about the ecosystem, you'd like it to be permanent, or whatever you can call permanent--billions of years. And I am reducing that by some actions. So, the ranking of risks based on the lifetime, the life expectancy, that you are reducing is something that has not been in the literature[?].
So, when we did our precautionary principle and I had a talk with you about that, our point was that humanity is supposed to survive forever; so if you take these small pieces of risk that threaten, okay, humanity--that threaten something we call total human[?] extinction, or extinction risk--then you are gambling with something much more dangerous. And there is a pyramid of ruin risks. My ruin is not a big deal. I would just [?]--I think listening to your podcasts extends my life. So maybe I live another 50 years. And 50 years--yeah, I reduced my life expectancy by a little bit. It's not a big deal. But if I reduce the life expectancy of something that should survive an extra billion years, that's a big, big, big cost. And effectively, you can phrase that in terms of cost/benefit along these lines and obtain results that are vastly different from what is believed by the so-called risk community.

16:00 Russ Roberts: You criticize your critics. When you talk about the precautionary principle, they respond, 'But you do cross the street.' So, even though the expected loss is very small because the odds of being struck by a car are very small, you do cross the street. You do take some risk of ruin. You don't just stay home in your bed. And, what's your response to that? Nassim Nicholas Taleb: My response is: the way to treat these risks is to ask how many times over my life I will cross the street. Okay, several thousand times. Crossing the street reduces your life expectancy by 1 in 47,000 years. It's not a big deal. So crossing the street basically is close to zero risk for me, because my life expectancy is not infinite. But if you made humanity cross the street, that would be a problem, because it would reduce life expectancy commensurately. So, the problem with these analyses that people throw around is that they ignore the life expectancy of whatever you are threatening. Russ Roberts: So, give the-- Nassim Nicholas Taleb: And repetition. But let me give you one simple example of how they miss repetition. The way they treat it--and I say it in a chapter on rationality and on risk--survival is what matters first. Okay? So, people have developed the instinct of paranoia. Basically, for us to have survived as a species--humans, however you define it, whatever species we were--we have to have had some paranoia; otherwise we wouldn't be here after millions of years. So, people developed good reasoning. So, if you ask a psychologist--if you narrow the experiment the way they do it, and you say, 'Okay, why shouldn't I smoke a cigarette?'--in a one-shot[?] experiment, it makes a lot of sense. The risk is tiny and the pleasure is good. So I should smoke a cigarette. But your grandmother would say, 'I've never seen someone smoke a cigarette and enjoy it and not smoke another one.'
So your grandmother will think in dynamic terms. Because that's how we think. We think in dynamic terms. You see? Paranoia, locally, for example: If someone points out the risk of some, whatever, terrorism or something, apparently we are overestimating. But people don't understand that if you eliminate paranoia, you've eventually eliminated the human race. You have to have that paranoia for anything that entails, you know, massive tail risk. And that's the only way to do it. But I see it, you know, in reduced form in trading. You see traders that basically are paranoid about anything that will bankrupt them. But they don't care about variation. They bet, they speculate, so long as they know they are not going to be, you know, wiped out[?]. Yeah. It worked out. So that's the idea of separating these risks: the risk of being wiped out, and what are you wiping out? Are you wiping out a community, or are you wiping out something more? And, in the process of talking to my co-author Sandis--who, you know, does philosophy, philosophy of action, and he does ethics--we saw a paradox that had remained unsolved, as follows. Aristotle, in his Nicomachean Ethics, has various statements that present courage [?] as the highest virtue, and at the same time prudence as the highest virtue. Now, also, there is a belief among the ancients that you should have all virtues or you have none. So, in other words, if you have one virtue, you should have all the others. Okay? And also there's another belief that one virtue equates to all the others. There's an equivalence. So, whatever. So, it looks like a paradox: How could you be both valuing courage, you know--i.e., risk-taking--and prudence, which is the avoidance of some classes of risk? Well, it turns out courage is prudence. Because if I save a collection of children from drowning, effectively I have reduced my life expectancy. But I have increased theirs. Which is longer.
And the children are more numerous. So we understand that if you take risk for the collective, you are courageous for yourself, but prudent for the collective. So, that is how we solved that paradox, Constantine and I. And we are probably going to publish something if we get to it; but for now, let's say we are confident that we have solved that paradox, which wasn't seen that way before. But if you start doing the things we are talking about--dynamics, in other words things are repeated; and layering, in other words some things have higher life expectancy than others--then you can solve a lot of paradoxes. And a lot of the things that appear to be biases in the literature--not the economics, the behavioral economics literature--are not really--well, maybe they are biases. But they are not bad biases. They are necessary biases. You necessarily have to have that paranoia about survival, particularly the survival of something much higher than you.
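The grandmother's objection to 'just one cigarette' is the arithmetic of repetition: if a single exposure carries ruin probability p, surviving n independent exposures has probability (1-p)^n, which decays toward zero no matter how small p is. A toy calculation (the numbers are illustrative, not from the episode):

```python
def survival_probability(p_ruin: float, exposures: int) -> float:
    """Chance of never being ruined across independent repeated exposures."""
    return (1.0 - p_ruin) ** exposures

# A 1-in-10,000 chance of ruin is negligible taken once...
once = survival_probability(1e-4, 1)            # ~0.9999

# ...but repeated 10,000 times (roughly daily for 27 years),
# survival drops to about 37% (roughly 1/e).
repeated = survival_probability(1e-4, 10_000)

print(once, repeated)
```

This is why the one-shot framing of a psychology experiment and the dynamic framing of a life disagree: the same per-exposure risk is rational to accept once and irrational to accept as a policy.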

21:41 Russ Roberts: So, one example you use in the book which I think brings this home is the smoking example that you just mentioned. Let's just structure it a slightly different way. A hundred people smoking one cigarette a day might be relatively harmless. One person smoking a hundred cigarettes a day is not so good. And you can't--as you point out often in the book, scaling is tricky. You can't just say, 'Well, if 100 people smoking one cigarette is not so bad, then one person smoking a hundred is the same thing.' But they are not the same thing. Nassim Nicholas Taleb: They are not the same. They are not the same. This is in Antifragile, and people are starting to get it now, 5, 6 years later. And, as I say, if a hundred people jump 1 meter, it's not the same risk as one person jumping 100 meters. Because you have acceleration. And you have accumulation. And these things are, in fact, well understood by us psychologically[?]. We are excellent risk managers when we are left on our own. And it's not some psychologist who just read a few books and maybe knows mathematics who is going to make us look irrational and try to nudge us into some different behavior. The point is, we have survived so much. We have a huge track record. And any statistician would say that something with such a track record has to have some evidence of skill in surviving. Russ Roberts: I have to confess that when I worked at a racetrack in Monmouth, New Jersey, for a summer, my grandmother did tell me not to place any bets. She was a wise woman. I, of course--I thought I was a wiser 18-year-old. And since I had promised I wouldn't make any bets, I did keep that promise--sort of. I would occasionally--well, actually once a day--split a bet with a woman who worked in the kitchen. I was the ice man. And it turned out we did okay. We didn't go on to that second bet. But I think that's what she was worried about. And correctly so.
She was worried about me losing my summer money, my summer's earnings, through addictive behavior. And I think it's a very interesting challenge to think about life as one-shot deals versus longer-term dynamics. You know, one more cookie is always harmless. But when it's 10 because you had 9 before, it's not so harmless. So it's hard to keep that in mind. It's a good thing to think about. Nassim Nicholas Taleb: Yeah. Thanks. Let me make a confession about gambling. I've traded for so many years, and I have such an allergy to gambling. I've never gambled. Every year I go to Las Vegas for a seminar or a conference where you drink, you eat, you do a lot of things. But the gambling table--I can't even concentrate on the table. I mean, I try to watch a game and I can't--there is something about it that is so contrived that you really have to have a certain mindset to gamble, and it's not that of a trader. A trader doesn't like constrained rules. You see? And I know very few traders who gamble. Some of them play bridge. Some play poker--slightly a different dynamic. But gambling is not something that attracts a trader. Plus, there is something horrifying, somewhere, about entering a trade knowing you are losing. You see? Russ Roberts: Well, I think there's an opportunity here. Someone out there listening should fund or create the documentary: Nassim Taleb at the Mirage. They would follow you through the casino; we would allow you to expound on the things you are talking about in the first few minutes of this conversation. I see it as sort of a stop-action, Claymation kind of thing. I think it would be awesome.
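The scaling asymmetry above--100 people smoking one cigarette each versus one person smoking 100, or 100 people jumping 1 meter versus one person jumping 100 meters--is a statement about convexity: if harm grows faster than linearly with the dose, concentrating a dose is far worse than spreading it. A sketch with a hypothetical quadratic harm curve (the exponent is purely illustrative, not a claim about actual dose-response):

```python
def harm(dose: float, exponent: float = 2.0) -> float:
    """Hypothetical convex dose-response: damage grows as dose**exponent."""
    return dose ** exponent

# 100 people each exposed to a dose of 1: total harm is 100 * 1**2 = 100.
spread_out = 100 * harm(1)

# One person exposed to a dose of 100: harm is 100**2 = 10,000.
concentrated = harm(100)

print(spread_out, concentrated)  # same total dose, 100x the harm when concentrated
```

Any exponent above 1 produces the same qualitative conclusion; linear harm (exponent 1) is exactly the case where the naive scaling argument would hold.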

25:28 Russ Roberts: I want to ask you a question about--well, first, let's talk about religion. Now, a lot of people--it's very fashionable--that's a disrespectful word. I'm going to rephrase that. A lot of smart people are very critical of religion these days. And, one of the things that you hear is that religion is irrational: There's no evidence for it; it's a superstition that was comforting to people before we had the Enlightenment. And you argue in the book that religion--that's not the right way to think about the rationality of religion. And the fact that certain religions have survived for a long time shows that they are "rational." And your definition of rationality in that context is the same as you've been using in this gambling context, which is: It leads to survival. It promotes survival. So, talk about religion. Nassim Nicholas Taleb: Yeah. The comment I would make is that it's not the religion that survives. It's the people who have it that survive. So, whatever beliefs these people have that allow them to survive cannot be discounted by looking at their cosmetic expression. So, let me--so, religion. A few things that I talk about: Let's make sure that we don't equate all religions, because some religions are religious[?] and other religions are not. Some are more literal; others are more, let's say, semi-literal, or definitely metaphorical. But, one thing about belief, okay? And [?] support of rationale. This came to me from meeting, finally, Ken Binmore, who probably did more fundamental work, foundational work, on rationality than anyone else. And Ken Binmore effectively says that all these attacks on economics, on economic decision-making--you know, by arguing about irrationality--are at risk of defining[?] irrationality. You see? For example, conventional economics doesn't define the economic gains as accounting. That's just a vocabulary. There are other things.
So, if you for example give your money to the poor, there's nothing irrational about it. You see? So, the only restriction is coherence. So, I thought about what he was saying and how people define rationality, and that went back to how people express what they call rational. And what I noticed is that it is usually an ex ante, hence non-empirical, definition of rationality. Ex ante means that I define an action as being irrational beforehand--it means that you claim to know everything that is going to go on around that action. In other words, that your model represents the world. And we've known since Simon [Herbert Simon] and bounded rationality that effectively you will never be able to build a model that can understand the world. So, when I say an action is irrational, ex ante, beforehand, I'd better have a track record of that action, because maybe there are things that are not included in that model. So, for example, if I say that it is irrational to prefer A to B, B to C, and yet C to A, I'd better have a good model showing that this holds in the real world. That's called the transitivity condition. And I have argued in Antifragile that if you expand the model--for an individual it may make sense to be coherent, but collectively we cannot all operate as coherent individuals, because you deplete resources. For example, if you always prefer tuna to steak, you would deplete the tuna supply. And so, therefore, you need to cycle. And nature makes you randomly change preferences. And that's a great way for things to survive. So, for example, these are the modifications to the narrowly defined--what I call baby--models that you encounter in behavioral economics, and then in decision-making--all these so-called decision sciences, what I call decision pseudo-sciences--is really [?] and I find[?] that they are irrational[?].
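The tuna-versus-steak point--individually 'coherent' fixed preferences can be collectively ruinous--can be sketched with a toy renewable-resource model. All the numbers and the regrowth rule here are hypothetical, and Taleb says nature randomizes preferences; simple alternation stands in below as the most easily checked form of cycling:

```python
def simulate(cycle: bool, years: int = 50) -> list[float]:
    """Two renewable stocks (say, tuna and cattle), each starting at 200 units.
    Each year the population consumes 30 units from one stock, then every
    stock regrows by 10%, capped at its starting level of 200."""
    stocks = [200.0, 200.0]
    for year in range(years):
        # A fixed taste always eats stock 0; cycling alternates between stocks.
        choice = year % 2 if cycle else 0
        stocks[choice] = max(stocks[choice] - 30.0, 0.0)
        stocks = [min(s * 1.10, 200.0) for s in stocks]
    return stocks

fixed = simulate(cycle=False)   # the favored stock collapses to zero
cycled = simulate(cycle=True)   # both stocks hover near capacity
print(fixed)
print(cycled)
```

With a fixed preference, consumption outruns the 10% regrowth and the favored stock hits the absorbing barrier within about a decade; with cycling, each stock gets rest years to regrow and both remain viable indefinitely.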
Or, for example, intertemporal preferences: if someone offers you an apple today versus two apples tomorrow--well, in an ecological framework, you may say, 'Well, what if he's a person who is full of baloney? Okay, I'll take the apple now. I'm not taking it now because I prefer to eat an apple now. I'm taking it now because he may disappear[?]'-- Russ Roberts: He may not come back tomorrow. Nassim Nicholas Taleb: He may not come back. He may die tomorrow. If you include these in the models, then a lot of these hyperbolic-discounting models--well, all of these models--become much more coherent. So, let me say something now about religion. If I judge religion without its track record, I'm going to make a lot of theoretical--I'm not just saying empirical--theoretical mistakes, because if you think of what would have happened if we didn't have these religions, I think a lot of people wouldn't have made the right decisions. And so religion allows you, sort of intergenerationally, to convey some kind of behavior. Okay? Now, if you have to give a story with a religion to justify that behavior--well, that's it. Who cares? And the example I use in the beginning is, even in science we don't have a perception--when I look at Greek columns, you see, there is a distortion, for instance [?]. Religion may be a distorted view, or way for us to view the world, that has allowed us to survive. So, I give a lot of examples of how to judge religion--you should judge it ex post, not ex ante. And I take, for example, something that seems, for non-religious Jews, you know, not rational, which is to have 500-and-some dietary laws and two sinks in your kitchen. Now, when you think about it, it's the wrong way to judge that just on the basis of rationality. The way you've got to see it is as follows. What if Jews didn't have these dietary laws? What would have happened? Well, you know that those who eat together diet[?] together.
So, they would have been more dispersed, and therefore much more vulnerable. And so they owe their survival to their dietary laws. So-- Russ Roberts: If you take that view--a variant on that is that eating pork or shellfish is bad for your health in times when there's not good refrigeration, etc.--if you take that anthropological perspective, rather than, say, a holy or divine one, then there's no case for people to keep kosher today--if that's your view, right? Nassim Nicholas Taleb: No, I really don't--we don't quite--[?] we don't fully understand the world. And a rule[?] that has survived a long time may have [?] that we haven't detected yet. You see? The idea that, you'd say, not eating shrimp is because they are impure--maybe it's because they are impure, or maybe it's because it's good to have dietary laws. Maybe it disciplines you elsewhere. I don't buy the idea that pork being insalubrious is why the Semitic religions, except for Christianity, refuse pork. The idea, to me, is probably deeper, because the Greeks, also living in the same environment, and the Cypriots and the Egyptians initially[?], didn't have these dietary laws. The North Africans also didn't[?] have these dietary laws, and came later. So, I don't believe that we should go back and give a lot of reasons for these--say that some [?] thing was necessarily the reason. It's a possible one, but you can never test it. We know that these religions have helped in survival, and whatever is related to survival is essential, because there's a path dependence. To do science, you must first survive.

34:10 Russ Roberts: So, I want to--do you want to say anything else about religion? Because I want to switch gears in a minute. But, do you want to say anything else? Listeners know that I keep Jewish law; there are certainly parts of Jewish law that are not easy to accept or to view as rational. But, as you say, I take the whole picture. I do not choose one by one. And the outcome for me has been very good. I don't mean 'very good' in the sense that I'm rich or I'm healthy or whatever. I find the practice of my religion deeply satisfying, and I'm Talebian enough to say I can't then take one plank out of the boat and say, 'Well, this one doesn't make sense.' I accept the whole thing, with all of its flaws, and the outcome, ex post, is good for me. Nassim Nicholas Taleb: Yeah; you'll notice one thing: that religions come as a package and you can't pick and select. It's not like political parties, where you can be on the Left with respect to abortion but on the Right with respect to economics. It doesn't come that way. Religion comes as a single block. You take it all or leave it all. Russ Roberts: Yeah--you're either in the club or--there are different clubs with different rules, so you can choose to that extent. Nassim Nicholas Taleb: Yeah, of course, of course. Even then--but, one thing that's misunderstood: as I started looking--I've been looking for 20-some years, I've been looking at religion mostly because I'm interested in Semitic languages and Semitic beliefs, not so much initially in theology. And I've noticed that, you know, people are very confused about what they call religion. And the following will try to explain the main difference between atheism and secularism--why we should focus on secularism, not atheism. So, religion--you notice that for the Jews it was initially law. So, it was a legal system. But it was tribal initially; and then later on, of course, it expanded. For the Arabs, for the Moslems, religion was law.
And actually, the word 'din' in Arabic is 'law.' In Aramaic, you use 'nomus[?]' for 'law,' not 'din'--religion. But 'din' in Arabic means law. And 'medina' is the place where law prevails. And actually the Hebrew name of the state of Israel is Medinat Yisrael, and the Arabic word for city is medina. So you realize that 'din' means law. And the 'beit din' is the courthouse, where you are taking the law. And that's basically it. It's a unified body of law that you subscribe to. That's religion. That was religion. Then came Christianity. Christianity, fundamentally, is a secular religion. Because Christ himself, you know, didn't really want to have writing[?] enrollments[?]. It was his 'Give Caesar what belongs to Caesar.' So, it's not about taking over from Caesar, okay? So, of course, Christianity evolved here and there into theocracies. But it could not fundamentally accommodate the notion of being a [?]. Why? Very simply: look at how it developed. It was absorbed by the Roman Empire. It developed within a system in which law was Roman law. And I know the subject quite well, as I was interested in the school of law in Beirut, which was effectively where Roman law was made. And then you can see how the documents of the recent[?] pagan scholars became--when Theodosius made his code, the main code for the Byzantine, for the Roman Empire later on, the Theodosian Code--all he took was pagan Roman law, and he added a blessing, a couple of pages of blessing, at the beginning. So, it was not Sharia. And, you know, Christianity had separation between church and state from the beginning. And that separation is what a lot of the modern world, and the secular approach, developed from. And, the second thing I mention in the book is that when you look at the behavior of people, you should not look at what people say, but at how they behave. You notice that--and the chapter is called "Is the Pope Atheist?"
If you look at the behavior of anyone within the main branches of Christianity--of course you've got to exclude the fringe ones like Scientology and all of that--you will notice that when facing a big decision, they act the same way as an atheist. For example, the Pope and Richard Dawkins would go to the same hospital to get the same treatment. The difference is they would wrap it up differently. And then you notice, also, how atheists go to a concert, where they are silent and meditative, and Christians go to Mass, where they are doing the same thing. As a matter of fact, they are sometimes listening to the same music. So the entire concept of Skin in the Game is: Look at what people do, not what people say. So, that's what I have to say about religion.

39:33 Russ Roberts: But, I didn't understand that parallel precisely. I didn't understand your takeaway. I understand my takeaway. My takeaway comes from David Foster Wallace, who says everyone worships. We all have an urge to be part of something bigger than ourselves, and some people express that through their religion; some people express that through a concert; some people express it through a sports team; some people express it through a political party, a political movement. And that sense of belonging, that tribal sense of belonging, is a very powerful part of who we are. So, when I think about your point--I'm going to expand on what you said in the book rather than what you said just now--when the Pope goes to the hospital, there are a lot of well-wishers and prayers and believers who hope to get some kind of divine response. But he also goes to the hospital. He doesn't just rely on prayer. And similarly, when Richard Dawkins goes to the hospital, he also goes to the hospital, first. He also has well-wishers. They don't think they are bringing divine intervention, but they are hoping that he turns out all right. And there's some sort of a community response among people who like his work, just like there are people who like the Pope. What's the point of those parallels, and what does it have to do with skin in the game? Nassim Nicholas Taleb: Okay. It has to do with the following: the whole idea of skin in the game, as I outline in the Prologue, is: I don't really care what people think. I care about what they do. It's about action, not what comes behind as ornament--thought as ornament. I consider thought just your background furniture. It may lead you to certain actions. And that's skin in the game.
Skin in the game establishes that difference. A lot of the pathologies that we have in the modern world come from the fact that we forget that almost everything that was developed came from skin in the game, not from thinking. Sometimes thinking comes afterward, as justification. We didn't develop the steam engine by looking at previous work--the Greeks had a model--no, it came from developing it with our own hands. In other words, we live in a world that is very easy to capture by doing but not easy to capture by thinking. And thinking, to me--of course, I put it in its proper context. Russ Roberts: But are you saying that the Pope talks like a religious person but he acts like an atheist, because he doesn't just rely on prayer: he actually goes to a doctor? Nassim Nicholas Taleb: Exactly. Look at people doing things, how they act in circumstances; and I notice that a secular Christian--someone who is a Catholic or an Orthodox person--and an atheist are the same when facing some action. So, therefore, I don't see the point in atheism, because of that. Russ Roberts: I don't see that. Explain. Nassim Nicholas Taleb: Okay. In other words, let's not focus on what people think. Focus on what they do. If a Martian observed the behavior of atheists and secular Christians, they would observe the same behavior on all the things that matter. Russ Roberts: And so you are saying--are you suggesting that the Pope is a hypocrite for going to the hospital? Nassim Nicholas Taleb: No, not at all. That we have-- Russ Roberts: And are the atheists hypocrites for going to the concert, because they are also religious?
Nassim Nicholas Taleb: No. The difference is that atheism assumes that religion is literal--the criticism of religion by atheists, and the promotion of atheism, assumes that it's the thought that matters, not the behavior. And the behavior of Christians is pretty much the one that atheists like. Russ Roberts: Oh, I see. Okay. Nassim Nicholas Taleb: That was my point. Russ Roberts: I get it. Nassim Nicholas Taleb: But there's another thing about religion that I'm going to say here: that religion historically was about skin in the game. The gods do not like cheap talk. They like you to do something. So, you had to offer sacrifices. And it was a great model in the past, because it forced you into sacrifices. And something of that stays with us in our behavior: that talk is cheap. Another thing I'm going to say about religion: I thought for a long time about why the Christian religion insisted on Christ being both man and God. And the fact is, he had skin in the game by being man. And people respect those with skin in the game. Had he been only God, he wouldn't have suffered. And I noticed that a lot of people who have scars are effectively exhibiting their skin in the game. They are not empty bureaucrats, or what I call in the book an empty suit. You would be an empty suit if you had never been harmed by anything. And I observed how Trump owed most of his appeal, all during the Republican Primary when he was standing next to the others, to the fact that he looked real. Because he had lost money. His adversaries were saying, 'He lost so much money.' It made him real. It's much better than someone who lives in cyberspace just writing memos, you see. And the American public understood something that the intellectuals didn't get: that America is not about talking. It is about doing. And losing money is evidence that you are in a doing business, not in a talking business.
Russ Roberts: Yeah, well, the claim in Silicon Valley--I don't know if it's still true, but it used to be the claim--is that if you'd gone bankrupt, if you'd had a startup that failed--or even better, a couple--maybe it was easier to raise money, because then you'd at least shown that you had those scars. Carrying those around with you, you'd learned something. And in theory now, you could go off and become successful. Of course, it doesn't necessarily follow. Nassim Nicholas Taleb: But that's why warriors show off their scars. These scars, visibly, are a sign of competence: 'Oh, look. He has a scar. He's a good warrior.' Scars mean that you are in the business. And that creates an appeal. So, the sufferings of Christ are part of that. So, I have these ideas on theology that are sort of counterintuitive. But they have allowed me to engage in a few discussions with people who are into these things, into theology. Russ Roberts: I like the line you have in the book from the Spartan mother, who says to her son, 'Come back with your shield or on it.' I thought that's an incredibly powerful way to think about skin in the game. Right? If you run away, you can run faster without your shield. Your mom doesn't want you showing up at home without your shield. Nassim Nicholas Taleb: Yeah, and society has put a huge premium on individual courage. Not fake courage--not gambling in a casino or throwing yourself off a cliff--but courage in order to help others: in battle, or protecting something larger than you.

46:57 Russ Roberts: I want to talk about a modern challenge that skin in the game faces, which I don't think we've talked about before and which came to me as I was reading the book, now, for the second time. You give the example of Hammurabi's Code, where, if the house a builder built collapses, the builder is put to death, I think. Is that right? Nassim Nicholas Taleb: Yes. It prevents a builder from hiding risks in the foundations. Russ Roberts: Right. Because the builder knows more than the buyer. There's an asymmetry of knowledge. And so, to prevent the builder from taking advantage of that, cutting corners and making a flawed building--and, if I remember correctly--maybe I'm wrong--it's the building collapsing and killing somebody. Nassim Nicholas Taleb: Yeah, exactly. And also, there's a symmetry: if it kills the first-born son of the owner, the first-born son of the builder is put to death. Russ Roberts: So, in our modern world, I would argue, we've moved increasingly away from skin in the game. The welfare state is an example of it. The corporate bailouts that we have are examples of it. A lot of us are uncomfortable with this idea of skin in the game. And, given how appealing it is to you, and somewhat to me, I'm thinking, 'Well, why is that?' One answer, of course, is that buildings don't collapse just because you cut corners. They collapse because of bad luck, a hurricane--a lot of things happen outside the control of the architect or the builder. The idea of executing him for something that isn't his fault doesn't sit so well with us. So, we love do-overs. We love giving people a second chance. We love extra-credit homework to bring your grade up. All these things. And of course, this encourages people to act imprudently. It has all kinds of costs. But the other side is also somewhat unpleasant to people.
Nassim Nicholas Taleb: No, but take medicine--in the second chapter, of course, I discuss the case of medicine. It's not that if a doctor amputates the wrong leg, you have to amputate one of the doctor's legs--because we now look at things statistically. We look at medical performance by doctors, at risk caused by doctors, statistically. If you do it once, that's fine; if you do it twice, maybe; a third time, you're going to be in trouble. You see? So, a builder killing one person definitely needs to be penalized. But let's go back to the central idea that you have detected, and that very few economists have detected--in the literature of economics, we can only find two or three papers on the subject. And it is as follows. Most of economics is perceived to be about incentives and disincentives. So, skin in the game would be to incentivize people if they do well, and also disincentivize them. That's not it. No. Skin in the game, for me, is about filtering. It's evolution. You cannot have evolution if you don't have skin in the game. In other words, you are filtering people out of the system. And I give the example of bad drivers. Why is it that when I drive on a highway, I don't really encounter people who lose it and drive crazily and kill 30 people? Why doesn't it happen? Well, partly because bad drivers kill themselves, and partly because we catch them--we filter them out of the system by taking away their driver's license. And we're good at doing that, for those who have survived. So, this is filtering. Filtering is necessary for the functioning of nature--necessary for the functioning of anything. And that's called evolution.
Now, restaurants: if you allow bad restaurants to survive, soon you'd be eating cafeteria food--because university cafeterias are basically immortal, subsidized institutions. Whereas restaurants have that pressure, and you get great food. I get my squid-ink pasta in places that are mortal. So, that's filtering. Now, that point about skin in the game--you pointed out a paper to me, and I found another couple in economics--is that if you put in evolutionary filters, you get the same aggregate behavior from agents you would otherwise call irrational. Would you like to comment on that? Russ Roberts: Yeah. So, I want to back up a little bit. Because, when we did this interview last August, a related topic came up. And you said, 'Skin in the game is a disincentive.' And I said, 'Yeah, it's not just that you get rewarded if you do well, but you get punished if you do badly.' And I totally misunderstood your point. Your point is that you don't have to be "rational" as an individual. So let me try to re-state it the way I think of it as an economist. The normal idea of skin in the game is what economists call incentives. If I know I can get rich, I'm going to try really hard. If I know I can lose all my money, I'm going to be cautious. And your point is, even if you're not aware of those incentives, even if you ignore the incentives, people who are wise and make good investments are going to be around, because they don't hit that absorbing barrier. And people who make bad investments are going to be wiped out--taken out of the pool. And that is a very different level of rationality. You might call it meta-rationality, or systemic, or-- Nassim Nicholas Taleb: --or, as people call it, collective-- Russ Roberts: Collective rationality.
Or systemic rationality-- Nassim Nicholas Taleb: One comment here, one footnote on rationality, before you continue. The other point I make, in my chapter on the Minority Rule--on collective behavior versus individual behavior--is that you can easily have what you define as irrational people-- Russ Roberts: Rational or irrational? Nassim Nicholas Taleb: Irrational people--you can define irrationality however you want--and the collective may behave in a way you would define as rational. Collective behavior doesn't flow from a naive arithmetic sum of individual behavior, because of the renormalization built into it. Russ Roberts: That's Vernon Smith's point, right? Vernon Smith, who got the-- Nassim Nicholas Taleb: Yeah, yeah, of course, of course. Vernon Smith, yes-- Russ Roberts: Vernon Smith, who got the Nobel Prize at the same time as [Daniel] Kahneman. Kahneman was saying, people do all these irrational things. And Vernon Smith's point was: Sure they do. But the market, partly through this filter of what we might call profit and loss, or survival and thriving, is going to be rational, because it's going to punish people--even if they are not paying attention. If they are not paying attention, they are going to be punished. They don't have to notice it. Nassim Nicholas Taleb: Actually, yeah. I have another argument in the book, which is that the market is not driven by the arithmetic sum of participants but by the most motivated buyers--the minority rule, which we discussed last time. And if you look at it based on the minority rule, then you realize that you can't really study the behavior of an individual and gain any inference about the behavior of the market. So, that's one thing about rationality.
And I've seen this even beyond the market. Take humanity: if humans individually each make a mistake--say, having intransitive preferences: you prefer apples to oranges, oranges to pears, but pears to apples--it doesn't mean that the collective will exhibit the same intransitivity. These things can wash out beautifully in aggregation. And that's quite central beyond markets, because of how we look at society versus the individual--individual preferences versus more collective forms of behavior. Mathematically you can see it very clearly, if you do the mathematics. That's the problem with all these notions of rationality used by economists: they make no sense when you look at the collective versus the individual.
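The survival-filter argument made here can be illustrated with a toy simulation (every number in it is invented for illustration, not taken from the conversation): give each agent the same favorable bet, assign each a betting fraction at random--no optimization, no learning, no awareness of incentives--and make ruin absorbing. The surviving population ends up betting prudently even though no individual ever chose prudence.

```python
import random

random.seed(42)

# Toy model: every agent faces the same favorable bet (win or lose the stake,
# 55% win probability) but bets a randomly assigned fraction of wealth.
# Nobody optimizes anything; ruin (wealth below a minimum stake) is absorbing.
def simulate(n_agents=5000, n_rounds=200, ruin_level=0.01):
    agents = [{"wealth": 1.0, "fraction": random.random()} for _ in range(n_agents)]
    for _ in range(n_rounds):
        for a in agents:
            if a["wealth"] < ruin_level:
                continue  # absorbing barrier: the ruined stay out of the game
            stake = a["fraction"] * a["wealth"]
            a["wealth"] += stake if random.random() < 0.55 else -stake
    return agents

agents = simulate()
survivors = [a for a in agents if a["wealth"] >= 0.01]
avg_all = sum(a["fraction"] for a in agents) / len(agents)
avg_surv = sum(a["fraction"] for a in survivors) / len(survivors)
print(f"average bet fraction, all agents: {avg_all:.2f}")
print(f"average bet fraction, survivors:  {avg_surv:.2f}")
```

With these parameters the average assigned fraction is about 0.5, while the survivors' average is far lower: the "rationality" lives in the filter, not in the agents.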

55:41 Russ Roberts: And the paper I sent you that you referenced is a paper Gary Becker wrote a long time ago--I think in 1962? I can't remember the year. And Gary's gone, alas. But he wrote a paper that I thought was kind of a silly paper as a graduate student--not that I think anything of Gary Becker's is silly; I just never understood it. Which was: Even if people aren't utility-maximizers, when prices go up, they are more likely to buy less of something, simply because the domain from which they can choose has gotten smaller. And the example he gives--let's just assume people choose randomly. There's no rationality. They are not maximizing anything. And he shows that if people just choose randomly, they are more likely to choose less of something when its price goes up and more of something when its price goes down. And he used that as a justification for demand curves, even if you don't find utility maximization very palatable. And that's part of what you are saying. You are saying individuals could be erratic, but the system is going to purge people who make bad decisions, and enhance the survival of people who happen--perhaps by random choice--to make good decisions. Nassim Nicholas Taleb: Yeah--and there is similar work on zero-intelligence agents, in the tradition of Vernon Smith. It was like a big revelation at the time, a wonderful idea. It's as follows: you can have zero-intelligence players, and a very intelligent market. Russ Roberts: Yeah. It's crazy.
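Becker's argument sketches easily as a simulation (a toy version; the income and prices are made up): let each consumer pick a bundle uniformly at random from the budget set, and average demand for a good still falls when its price rises, purely because the feasible set shrinks along that axis.

```python
import random

random.seed(0)

def average_demand(price, income=100.0, n_consumers=20000):
    """Average quantity of good x across consumers who each pick a bundle
    uniformly at random from the budget set {p*x + y <= income, x,y >= 0}."""
    total = 0.0
    for _ in range(n_consumers):
        # Rejection sampling: a uniform point inside the budget triangle.
        while True:
            x = random.uniform(0.0, income / price)
            y = random.uniform(0.0, income)
            if price * x + y <= income:
                break
        total += x
    return total / n_consumers

cheap = average_demand(price=1.0)  # expected value is income/(3*price) = 33.3
dear = average_demand(price=2.0)   # expected value is 16.7
print(f"average demand at p=1: {cheap:.1f}, at p=2: {dear:.1f}")
```

No consumer maximizes anything, yet the demand curve slopes down: doubling the price halves the reachable region, and random choice inherits that constraint.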

57:19 Russ Roberts: Let's talk for a minute about inequality. You make a point in the book which reminds me of what we talked about earlier--probability over time versus at a point in time. You argue that the way people look at inequality is wrong: that they should look at lifetime incomes. And, if they do that, they'll see that people move in and out of different classes of income over the course of their lifetime. And that therefore there's no such thing as "THE Rich" or "THE Middle Class." Is that an accurate way to describe what you are saying? Nassim Nicholas Taleb: Yes. Exactly. The measures of inequality we have are in effect ensemble inequality. In other words, you take all Americans and look at how much the winners control, and so on. So, you say the top 1% has 50% of the wealth, and things like that; let's have a revolution; let's tax them. But what people don't get--and I'm sure of the statistics--is that something like 10% of Americans will spend at least one year in the top 1%, and about half of Americans will spend at least one year in the top 10%. The way to analyze inequality is in fact the same as with the dynamic probability of ruin: you have to look at it over time--over your lifetime. Of course, you are going to spend years not making money; you are going to be at the bottom. And you are going to spend some years making a lot of money. So, the way you look at the health of a country isn't so much the opportunity to rise, or the number of people who are middle class; it's the probability of losing your status as top dog. You see? And very few people look at it that way. For example, take the Forbes 500 in 1985 versus 2015. You'd be shocked. Thirty years later, only a very small proportion--something like 10% of the families--appear on both lists. See? So you have an engine in America to destroy the very strong.
It creates inequality, but it also creates opportunity. And opportunity is not free: if someone rises, someone at the top has to fall. And it's easy to fall in America. Take France, and you get shockingly depressing results: some people stay in the same class all their lives--the upper middle class of civil servants, or friends of the state, or heads of companies connected to the state. Once you've studied at certain universities, you are set for life. You have some of that effect in America, but those who rise here usually come out of nowhere. And if you take Florence, you notice that the wealth in medieval times was largely in the same families as the wealth that's found there today. So, people discuss mobility naively. I propose a measure of inequality based on transition probabilities--a completely different approach. It gives you a much rosier image of America. Now, another interesting thing: the same applies to the health of companies, to corporations. In America today, a corporation tends to stay about 12 years on average in the S&P 500 [Standard and Poor's 500]. And that's very good news. Look at Europe: companies that become cozy with the state manage to stick around. You see? So, it's the same thing as with inequality. Plus there are other, very technical problems with inequality metrics--measurements of Ginis [Gini coefficients] and such that are not right. In other words, people give you the illusion that inequality has been growing over time, when it may be just the wrong computation. Russ Roberts: Yeah, well, I like to point out that if you go back to, say, 1985, some of the people in the top 1% today weren't even born. Certainly in 1970 or 1975. But, having said that-- Nassim Nicholas Taleb: But they are talking about families. Families, also.
Russ Roberts: Right, and their families were not wealthy. Nassim Nicholas Taleb: In France, it's families, yes, 60% is dominated by families.
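The transition-probability view of inequality that Taleb proposes can be illustrated with a toy Markov chain over income quintiles (the matrix below is invented for illustration, not estimated from any data): the yearly snapshot always shows exactly 20% of people in the top quintile, yet following individuals over a 40-year career shows a much larger share passing through the top at some point--the ensemble picture and the time picture disagree.

```python
import random

random.seed(1)

# Hypothetical one-year transition matrix over income quintiles 0 (bottom) .. 4 (top).
# It is doubly stochastic, so the yearly snapshot is always 20% per quintile.
P = [
    [0.85, 0.15, 0.00, 0.00, 0.00],
    [0.15, 0.70, 0.15, 0.00, 0.00],
    [0.00, 0.15, 0.70, 0.15, 0.00],
    [0.00, 0.00, 0.15, 0.70, 0.15],
    [0.00, 0.00, 0.00, 0.15, 0.85],
]

def lifetime_touches_top(start, years=40):
    """Does a career starting in quintile `start` spend any year in the top quintile?"""
    state = start
    for _ in range(years):
        state = random.choices(range(5), weights=P[state])[0]
        if state == 4:
            return True
    return start == 4  # count careers that began at the top

n = 20000
# Start each simulated career from the stationary (uniform) distribution.
hits = sum(lifetime_touches_top(random.randrange(5)) for _ in range(n))
print(f"share of lifetimes with at least one year in the top quintile: {hits / n:.2f}")
```

A snapshot-based measure would report "the top 20%" every single year; the lifetime measure shows how churned that group is, which is exactly the distinction between ensemble and time probability.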

1:01:58 Russ Roberts: So, I'm sympathetic to your point, as listeners will know. I think it's very important to remember that people can move in and out of different levels of income. But, being financially well off myself, I do think my children have a lot of advantages that other children don't have. And it's not just genetics. Some of it's genetic--my kids have pretty good genes, I think. But they also have pretty good opportunities--connections I've made, things I've been able to teach them--that are going to make it more likely that they do not fall into the bottom half of the income distribution. And people in the bottom half are going to struggle to get into the top half, because they don't have some of the advantages that my children have. Some people have gone so far with this as to say it's immoral--just to go to an extreme here--immoral to read to your kids before they go to bed, because it gives them a leg up on the competition. That repulses me. But I do accept the point that there is a much smaller chance for my children to fall into subsistence poverty, say, than somebody in the bottom half of the income distribution: they start far from it, and they have certain advantages that keep them from it. So I think there are some issues there. The most important thing to keep in mind, I think, is that people want to get ahead. They don't necessarily want to get ahead of others. And we should always discourage the natural human urge to get ahead of others rather than just ahead. But there are some challenges in the American system today that make it harder for people to be upwardly mobile, and I think those are bad. Nassim Nicholas Taleb: You'll encounter a problem that I didn't put in the book but may put in some future book or writing, which is: what do you consider as the unit? Because if I cannot transmit my wealth to my children, what's my motivation?
Why do I have to work? I am working to give them a better future. If, as modernity does, you consider the individual as the unit, then of course it's unfair that my children are going to get more money than others. But if I consider the unit to be my family and my bloodline, as I do, then depriving me of the possibility of transmitting my wealth is wrong, because your children are part of you. You see? If they are hurt, it's worse than if you are hurt. How can you not give them your money? So, you've got to think along these lines, and I've been doing a lot of thinking--which I didn't fully do in the book--about what the definition of a unit is. Is the unit you? Is the unit your tribe? Is the unit your descendants? Is the unit you and your club--you and the Stanford Club for Insightful Economic Discussions? What is your unit? And your inability to transmit some of what you have to your unit--because you feel that it is you--is a limitation that no government should be allowed to impose without deeper thinking about the problem. Russ Roberts: Well, the other point I want to make is that we shouldn't just care about how much stuff we have. Obviously, stuff's important. But what we really care about, I think, is flourishing, and using our skills. Nassim Nicholas Taleb: Yeah, but that's another thing I've noticed: all these discussions about inequality don't come from people who are at the bottom of the pyramid. They come from people--not you, of course, but left-wing professors of economics--who are making a lot of money but are envious of the richer. And I've cited lots of sources, going all the way from the ancients to the moderns: people are jealous--envious--of people around them. If you ask someone at the bottom, 'What would you like?' they'd like a better fridge, a new car, and that's it.
But if you ask a professor of, say, sociology at Harvard what they'd like, they would like their neighbor to be poorer. Russ Roberts: It's uncharitable-- Nassim Nicholas Taleb: It's like the medals--exactly--it's the silver medalist who hates the gold medalist. Russ Roberts: It's an uncharitable view of my fellow academics, although they're not in the Stanford Club for Insightful Economic Discussions. That's a club I'm definitely going to have to start. We're going to have t-shirts. I love that. SCIED.

1:07:18 Russ Roberts: Before we finish, I just want to add one challenge, Nassim, which just came to me. We were talking about the filtering power of skin in the game, and yet we do want that restaurateur to come back and make the second restaurant better. We want Bill Belichick, after his failure in Cleveland, not to be wiped out; we want him to come back and try again. And that entrepreneur in Silicon Valley who has the three failures--we don't say, 'Oop, you're out of the club.' So, it is a little more complicated. Nassim Nicholas Taleb: It is and it's not. The beauty of the idea of skin in the game is that, like when you drive, you bear the same risk as you inflict on others. And that was the symmetry of the archetype in Hammurabi's Code: what you inflict on others, you should also bear yourself. Russ Roberts: You should eat your own cooking. Nassim Nicholas Taleb: Exactly. For tail risks, this works effectively. For medium risks, of course you survive--but everybody survives, and you are not inflicting any big danger on others. In the previous discussion on Skin in the Game, I spoke about people being morally calibrated--most people are. The point is removing the tail risk: preventing people from coming back if they inflict a lot of risk on others. Like, for example, warriors. We are what we are today because warriors went into battle. If you were a complete, uncontrollable warmonger--like many people in Washington today, or some journalists and think-tank people--you would end up dying in battle. And these people don't die in battle. So that's what I meant. The restaurant owner, of course, is going to be filtered--or the theme is going to be filtered: he won't have a bad restaurant; it will be something else. But he is not inflicting undue risk on others.
He's only inflicting risks on his investors and on himself. And eventually, if he is very bad, he'll run out of money. Russ Roberts: Which is sufficient punishment. He's not executed--unless he kills people through food poisoning. Nassim Nicholas Taleb: Exactly. If he eats his own food, he'll be out of the game--exactly.