0:33 Intro. [Recording date: August 26, 2019.] Russ Roberts: My guest is physicist and author Sabine Hossenfelder.... She is the author of the book Lost in Math: How Beauty Leads Physics Astray.... I want listeners to know that this book is in many ways a tour of the state of our knowledge about the physical world. But, it's a lot more than that. Along the way you raise many philosophical and methodological questions that I think are deeply related to problems in economics, and science generally, and social science. So, at times we're going to be talking about physics; and we're also going to be talking about epistemology--that is, how do we know what we know, and the challenges of fads in science, the challenge of biases, which is a big theme of this program. And these are not just problems in physics or economics, obviously, but are the challenges of being a human being. Now, much of your book is an attack on the idea that a theory in physics needs to be beautiful. What's wrong with the idea of beauty? Aren't the best theories in physics indeed beautiful? Sabine Hossenfelder: I would say, yes, they are. There is nothing wrong a priori with finding beauty in theories that describe nature. The problem begins if scientists use specific notions of beauty to develop new theories and then they are unwilling to give up these ideals of beauty if the theories do not work. Russ Roberts: And, in particular, you make some bold and, I would say, damning condemnations of some of the most exciting and really current theories in physics. I would list among them supersymmetry; the idea that there is a multiverse--that there are multiple universes coexisting at the same time; string theory. You argue that these either have no empirical implications, or the empirical implications have failed to be seen. Is that a good summary of how you look at what are the most dramatic aspects of physics really over the last 50 years?
Sabine Hossenfelder: Yeah; it's basically what I think has happened. You named string theory, supersymmetry--one could maybe add to this list the idea of a grand unified force of the interactions. It's also a pretty idea. And, those were really good ideas in the 1970s, 1980s; and it was definitely something that people should have tried. And they tried it; and it didn't work. And then, what happened was that they didn't just stop pursuing this line of research, but instead they made these theories even more complicated and arcane, until basically they became entirely untestable. And, the multiverse that you just named is basically one step further--which is untestable from the original idea-- Russ Roberts: From the get-go. Yeah. Sabine Hossenfelder: Yeah. Russ Roberts: Although there's a Spider-Man movie--I don't know if you've seen it. I think it's called 'Spider-Man and the Multiverse' [Spider-Man: Into the Spider-Verse], something like that: it's got a slightly better title than that, maybe. Have you seen that movie? Sabine Hossenfelder: No. I watched one of the Spider-Man movies, but not this one. Russ Roberts: You've seen, then, a tiny fraction of the Spider-Man movies. There are so many of them. It's actually quite imaginative; it may even be accurate in terms of--there's portals and ways to move between the multiverses. It's quite aesthetically pleasing, that movie: I found it extremely beautiful, actually. It's an animation, and it allows--the idea of the multiverse allows there to be different versions of Spider-Man with different characteristics sort of coexisting, and they all get thrown together. But that's someone's imagination. And as you point out, it's not empirically testable. It reminds me of this claim, which I hope we can talk about a little bit, that we're living in a computer simulation. I guess we won't know for sure until the simulator sends us an email that lets us know.
But, other than that, it's kind of hard to understand how you would know such a thing. Sabine Hossenfelder: Yes. So, don't get me wrong: I very much like this idea of the multiverse. I find it very thought-stimulating. And, as you just said, it makes for great science fiction, or maybe you want to call it fantasy. My problems begin if people start to insist that it is science. So, yeah, this idea of living within a computer simulation--that's basically a form of the multiverse. Russ Roberts: Yeah; it's--I'm not going to name the person, and I don't know the name of the person--I was just so appalled at the statement that I just stopped reading. But, a person at, I think, MIT [Massachusetts Institute of Technology], which is a fine university--someone in a prestigious department at MIT--declared that there's at least a 50% chance that we're living--I can't say it without laughing--in a computer simulation. For a person who purports to be a scientist to put a probability on something like that strikes me as--as you say--unscientific. Sabine Hossenfelder: Yeah. I think I know who you are talking about. Russ Roberts: I don't know how you say that--whether you are a frequentist or a Bayesian, I don't know how you begin to make a claim like that. Well, let's say it this way: I think one of the things your book reminds me of is that a lot of these theories have a religious nature. There's a certain holding-onto-belief that, even though the empirical evidence hasn't come forward, it will, some day. And people want to believe that these theories are true the way religious people do. And I happen to be religious. I have nothing against religion. It's just not the same as science. It seems like it's a mistake to confuse the two. Sabine Hossenfelder: Yes; that's a very interesting remark. The issue, though, is that the people who work on this are not aware that those are beliefs.
They tend to think there are good scientific reasons for why this should be true. And now the problem is that if you poke them a little bit, they have no way to justify it. I would say, of course. But the issue is they never really think about what it really is that allows you to say that something exists--you know, an issue like this--or what we even mean by an explanation, what a theory is. All that kind of stuff. And so this is why my book is called Lost in Math: because they believe that just because they can write down some mathematics for it, it has to be true. Russ Roberts: It's a strange idea. It happens a lot in economics; we'll talk about that. And people of course fall in love with their theories. It's a human impulse. But, I have to say that I have a number of friends who are physicists. When I told them that I was interviewing you, their first question was, 'Why is that on EconTalk?' And, the answer to that is, 'Because it's interesting.' But it's also related to economics in a way that's maybe not obvious at first, but I hope will be by the end of the conversation. But, when I asked them about your claim that these theories are empirical dead ends or have failed empirical confirmation--in particular, predicted particles have not shown up in collider experiments--they, I have to tell you, looked at me kind of smugly, and there were two responses. One was, 'We're going to find them. It just hasn't happened yet.' Which--you can believe that for a very long time, of course. Right? Sabine Hossenfelder: Yes. So, that's of course what has been going on since the 1980s. They started looking for this in the 1980s with the first searches for dark matter particles. And then--you know, these supersymmetric particles, they were supposed to appear already at LEP [Large Electron-Positron Collider]--so that was the Large Electron-Positron Collider at CERN [Conseil européen pour la recherche nucléaire] at the time--and it hasn't happened.
And then it was supposed to be [?], and then it was supposed to be at the [?]LHC [Large Hadron Collider]. And again it hasn't happened; and now they want to build a yet larger collider to find the stupid particles. And, my problem with this is not so much that you want to keep pushing this energy frontier. My problem is that they're using techniques for theory development that obviously do not work. And, everything that I have learned about science tells me that you should learn the lesson and stop using these methods. And that's what's not happening. And that really, really worries me, because it means there's something really going wrong in this field.

10:02 Russ Roberts: So, they, of course, don't agree with you. Some of them are sympathetic. Your book is, as I said, a tour of what we know right now, what we think we know or what we might know; but it's also a set of conversations with leading physicists alive today and how they react to these kinds of concerns that you have. And they are not as concerned as you are, overall. They are much more optimistic. And, for me as a non-physicist--it reminds me of economics, as I said--but it also reminds me of chemistry. You know, when the periodic table was first started, first discovered, there were holes in it; and people predicted that these elements would be discovered. And, of course, they were. And that led people to believe that--I don't think you mention it, but I think that moment in science, and other related moments in physics, were cases where things that were predicted did show up--the Higgs boson being one example. They just think, 'Well, it's just a matter of time; we don't have powerful enough colliders. We'll eventually develop them. And all these things have to be true.' They're holding onto that. And I get that. Because the history seems to encourage that viewpoint. Sabine Hossenfelder: No, it doesn't. Russ Roberts: Okay. Why not? Sabine Hossenfelder: That's a very sloppy look at the history. So, this is one of the points that I try to get across in the book, but I know that for someone who doesn't really know the math of the theories, it's hard to understand. So, let's take this example of the Higgs boson. If you have the standard model of particle physics without the Higgs boson, this theory just does not work at LHC [Large Hadron Collider] energies. You cannot make predictions with it. You get probabilities that are larger than 1; so this is obviously mathematical nonsense. We knew that something had to happen to fix that problem. And the Higgs boson was the simplest idea that was on the market.
It could have been something else, but it really doesn't matter. We knew that there would have to be something at the LHC energies-- Russ Roberts: LHC being the collider of the--the sort of, the best-- Sabine Hossenfelder: Yes, the Large Hadron Collider-- Russ Roberts: Right now the best way we have to explore and find these things. Sabine Hossenfelder: Yes. Now, a lot of physicists, especially theoretical physicists, thought that the Large Hadron Collider should also see something else--for example, these supersymmetric particles. And now, the thing is that the prediction for those additional particles is often an entirely different type, because these particles are absolutely unnecessary. And, you see, it's really a different kind of prediction. And, the same thing goes for the periodic table: Once you have discovered this pattern, for some of the elements, and you can say, 'Well, you know, we'll have to fill it in here, here, and here. Otherwise it just doesn't work.' You know, 'It has to be there,' basically. The same is not the case with supersymmetry or with grand unification or something: We can do without it. You know, the theories work just fine without it. The standard model has no problem now that the Higgs is found. And that may very well just be the end of the story. Maybe that's it. Russ Roberts: So, you are saying we are spending a lot of intellectual energy, and then physical energy, in building--a lot of people are demanding, 'Oh, we just need to build a heavier collider with more energy that would identify potentially these unseen particles so far. If we had higher energy, then these heavier particles would be observable.' And you are saying that just--in a way you are saying it's a vanity project. Sabine Hossenfelder: Well, not entirely. You can do other things with a larger collider. I mean, if you are slamming protons into each other, at the very least you will learn something about the structure of the proton. Right? 
So, it's not that it's entirely useless. But if you want to know something about the unification of the forces or about supersymmetry, then I would just say, 'Well, that's not a promising thing to do, because we have no reason to think that these things actually happen in the real world.' You know, they certainly help mend[?] certain types of math worlds, so to say. And the tragedy is that we do have really serious problems in the foundations of physics that do require a solution; but those don't get the attention, because they are really hard to work on. So, people prefer to work on something that's simple. Russ Roberts: Yeah. Yeah. Well, I get that. I understand that.

14:57 Russ Roberts: So, I had a guest on this program a while back, Paul Pfleiderer from Stanford Graduate School of Business, who critiqued a model in economics that's sometimes called the 'as if' model. It comes from a methodological paper Milton Friedman wrote, I think in the 1950s, which is to say that economists assume people are rational, that they act in their own self-interest. Those are complicated statements by themselves: they sound straightforward to an economist, but they are actually quite complicated. But, the analogy would be that--this is, I think, Friedman's example--when a truck driver goes around a corner in the rain at night, the truck driver acts as if they understand the physics of friction and fluids as they decide how much to brake and how fast to go. And, similarly, the claim is that a baseball player going to catch a fly ball runs to a certain space as if they know the equation of the parabola that the ball is taking--the arc of the ball--as if they've solved it: they act as if they know the physics behind the baseball. And on a muggy night they might run to a different place, because they understand that the ball is going to have a different path. And, Pfleiderer's point, which I think is a deep insight, is that a lot of economists made the mistake of thinking that because a model predicts well--so, the model being, 'Oh, the baseball player knows physics--acts as if they know physics--and therefore I predict they will catch the ball,' or 'I will assume that the billiards player understands the n-body problem when they hit a billiard shot.' And, of course, that's not what they're doing. It's not what a baseball player does. It's not what a truck driver does. But Pfleiderer's point is that if you're not careful, you start to think that you've not just predicted well, but you've understood. And you start to believe that your model is capturing the underlying reality. As opposed to: it just predicts well.
And it may abstract from so much of the reality that it's actually a very poor descriptor. Which it is in the case of the truck driver or the baseball player. And, I feel like in physics there's a tension between these two ideas. So, tell me if I'm right. There's a famous quote--you'll tell me who said it; it's in your book--'Nature speaks a language of mathematics.' And I think the incredible success of physics in predicting and in understanding our underlying reality has gotten many physicists to believe the reality is mathematics, when in fact that may not be the case. So, how do you react to that? Sabine Hossenfelder: First, I have to admit I don't know from whom the quote is. It sounds like something that a lot of people could have said. Russ Roberts: It's in your book. I forget who it is. It's somebody famous. Sabine Hossenfelder: Yeah, probably. So, this is actually a very deep question that you should consult a philosopher for. So, that's the question, basically: What do we even mean by an explanation? And I have to admit that I'm very much an instrumentalist. This is why I find this idea that reality is math just, you know, not very useful. For me, theory is something that you use to describe nature. And, it explains something if it is more useful than just collecting the data. Basically. But, yeah, you are right: I think a lot of physicists actually do think that reality is math. Though, again, and I said something similar earlier, I don't think that they really consciously think about it. It's not like they write down their philosophical background and then work with that. It's just something that they implicitly assume. And it's particularly obvious when it comes to the multiverse that we were already talking about. So, you have all this math in the theory that is entirely superfluous to describe anything we actually observe. So, I would say, 'Well, you know, that's just math.' Okay? 
So, I have no problem with the math being there, but I don't think that it is real. And yet, a lot of physicists think that this also has to be real: just like, all the other universes also have to be real. Russ Roberts: Does anything--you said at some point--you used the word 'superfluous'; and you said supersymmetry is just--we don't need it once we have the Higgs boson. Is there any practical difference? I mean, for example, if there are multiverses, and if they can't communicate--which I think is part of the theory--one response would be, 'Well, who cares?' Intellectually, I'm interested in how reality works, so I wouldn't say literally 'who cares?' But in terms of practical application--string theory, multiverses, supersymmetry: do any of these at the current time have any practical application? Which is to say: if we were to build the more expensive, higher-energy colliders that supersymmetry might be tested on, is there any practical justification other than just 'We'll understand potentially how the world works'? If those particles that are unseen turn out to be out there, does it matter? Do any of the quantum-level particles matter to human existence other than for understanding? Sabine Hossenfelder: So, the brief answer is, no, they don't have practical applications. Though, I have to add a footnote here, which is that string theory, the way that we usually talk about it, is about unifying all the interactions and bringing in a quantization of gravity. But there may be other applications of string theory that people have been pursuing. And these may have applications in condensed matter physics or stuff like this. So, this is an entirely different story. It's not really in the foundations of physics. But there may be applications with that.
When it comes to these additional particles in, like, in supersymmetry, you know, I find it possible--I do have a lot of imagination--and maybe--I don't know, in a thousand years or two thousand years or, who knows, these may be useful for something; but definitely not in the near future. You know, there's nothing that I can think of that you would actually do with them except for writing down their properties in a little booklet. So, you know, when I say that I'm not in favor of building this larger collider, it's not that I want to say I'm not in favor of ever building it. It's just that right now it seems to me we have more important things to do. It's not the right time to further push on this. Russ Roberts: So, we're not going to have teleportation-- Sabine Hossenfelder: Hah, hah-- Russ Roberts: Don't laugh! Sabine Hossenfelder: Not any time soon. Don't worry. Russ Roberts: Don't laugh! That's terrible. But I assume people have talked about it, as a possible practical application, at some point. You know. Sabine Hossenfelder: Well, there is this thing that they call quantum teleportation, right? But it's not going to teleport you to the United States or back. Russ Roberts: No, it doesn't work at, what do you call it--at the large, or bigger level. The macro level. Sabine Hossenfelder: It's not a portal. Russ Roberts: Yeah. Exactly. It's not a portal.

23:05 Russ Roberts: Before we leave this idea of beauty and the lure of beauty, I just want to read a quote from Steven Weinberg that you give us. And I'm doing this partly because I ran into a physicist at a conference the other day and I mentioned I was going to interview you, and I said to this physicist--who happened to be trained at the University of Texas where Steven Weinberg is--I said, 'What do you think of this idea that beauty is overrated?' 'Oh, no, no,' he said. And he gave me the Steven Weinberg story about horses. I thought that was--it seems to have permeated the discipline, at least at Texas. He said the following--he's talking about horse breeders. He says: [The horse breeder] looks at a horse and says 'That's a beautiful horse.' While he or she may be expressing a purely aesthetic emotion, I think there's more to it than that. The horse breeder has seen lots of horses, and from experience with horses knows that that's the kind of horse that wins races. But you make the point that that's not really a very compelling story. Why? Sabine Hossenfelder: Well, there is something to it, of course, in the sense that you gain experience from your work. You learn something about the theories that you deal with every day. And of course you try to carry forward this knowledge to do something else. And that's all well and fine as long as you stay within the same type of theories. Now, the problem starts if what you are really looking for is an entirely new type of theory. And now the issue is, if you are using your ideals of beauty or elegance that you have derived from the past theories and try to use them as guides to new theories, you may not get anywhere. So, what you are doing is basically you are putting the carriage before the horse. Russ Roberts: Hah, hah, hah. Shame on you, Sabine. Sabine Hossenfelder: And so, let me maybe add that--I quote this a few times in my book. 
There is a book by a philosopher by the name of McAllister, called Beauty and Revolution in Science. And he goes through quite a number of examples, most of them in physics, actually, where what Kuhn called a revolution in science actually meant that the conceptions of beauty changed. One of the examples--I think that's what he starts with--is the step from the mechanistic world, where everything is made of gears and bolts, to everything really being made of fields and particles. And then there's the switch from classical mechanics to quantum mechanics. There are things like, well, we had this idea of the geocentric universe, where all the planets were going in circles, to the heliocentric solar system with elliptical orbits. And so, in each of these cases, the conceptions of beauty have really shifted. And, I think that that's what we need now. So, if you are holding on to these ideals of beauty from the past, you are doing exactly the wrong thing. Russ Roberts: It's a very deep insight. It reminds me of how--I've talked about this on the program before--in opera, for a long time, the heroine died at the end of the opera. And that was satisfying to people. They didn't go, like, 'Oh, that's horrible. I'm not going to watch that.' They thought, well, that was what art did. And in a lot of books--not just operas--one of the main characters dies. And then there was a period, or there is a period, where people didn't like that. They want a happy ending. And sometimes, now, I'd say today, we like ambiguous endings. We think ambiguity is beautiful. Which is the exact opposite of a neatly tied-up happy-ending story. Modern art, to a large extent, is about ambiguity. And I think we've come to, you know, respect and admire that as beautiful. And I think the elliptical orbit is a perfect example. Like, 'What's not beautiful about an ellipse? An ellipse is beautiful.' You start to convince yourself.
It's what you are used to; it's what everyone's doing; it's what the field is showing, and all the evidence seems to confirm it. And you convince yourself that it's more beautiful: 'Actually, it's more beautiful than a circle.' You didn't realize it. When it was circles, they thought that was the highest level. But now they've come to a different place. And I think that's a very deep insight--you could call it a cultural bias in how we think about models of the world. The other thing I think of all the time is that, when clocks were invented--and you mention this in passing--the universe was a clock, with the gears and bolts and so on. And now it's a computer. Shouldn't that tell people that their views of reality are, you know, conditioned by their technological advances, and maybe reality is actually none of these things? Sabine Hossenfelder: Yeah. You should think so, but that's not the case. You are certainly right that our sense of beauty definitely has a cultural bias. It also changes over time, as you point out. So these two things already tell you that it's a stupid idea to rely on your human sense of beauty to develop new theories. There are some ingredients to our perception of beauty that I think are probably hard-wired--in particular, this idea that symmetry is beautiful-- Russ Roberts: Yeah, we like symmetry-- Sabine Hossenfelder: Yeah. You know, this is not my area, but it may be that we find symmetry beautiful because healthy organisms tend to have more symmetry. So that makes sense from an evolutionary perspective. But also, in that case, there is no reason to think that this would tell us something about the fundamental laws of nature--about properties of elementary particles and stuff like that.

29:40 Russ Roberts: You say something very deep, I thought--and I apologize to our listeners for over-using the word 'deep' in this episode, but I can't help it. These are deep questions. You say: Our brains didn't develop to serve science; they developed to serve us. And what served us well during evolution does not always serve us well in science. Elaborate on that. Sabine Hossenfelder: Yes. That's kind of the key message that I was trying to get across with the book: that we have all these cultural biases and social biases and cognitive biases that affect what research we find interesting, and so also affect what we work on, what research we fund. But this doesn't necessarily have a good correlation with what is promising research, in the sense that it has the potential to correctly describe nature. And that's a big, big problem that is basically unaddressed in the way that we organize scientific research right now. Russ Roberts: The idea that scientists would suffer from groupthink is just so appalling to a scientist that it's very hard to look in the mirror and say that. Right? I'm a big fan of the Feynman quote: 'The first principle is that you must not fool yourself--and you are the easiest person to fool.' The idea that scientists have to remember that seems ludicrous. And I think, certainly in the public's eye, scientists are sitting in white lab coats, you know, looking for truth. The idea that they have tenure worries or ego involved, or that they care what their friends think of them--we don't like that idea. We have a very idealized picture of a scientist. Sabine Hossenfelder: Yes; actually, I've found that it's mostly scientists themselves who have a very idealized picture of scientists. I've also learned that groupthink is a really impolite word to use. The polite word is 'social reinforcement.' But it's the same thing, of course.
So, the easiest way to find out that social reinforcement is a problem in a community is that people in the community deny that something like this can possibly affect them. Russ Roberts: Right. Of course not. Only[?] other people. Sabine Hossenfelder: Yes. It's only other people. It's not ourselves. So, it's really sad. You can't even talk with them about it, because they are like, 'Oh, no. This will never affect us. We're so rational people.' Blah, blah, blah, blah, blah. Russ Roberts: 'I have a very high IQ [intelligence quotient], so I don't have to worry about this.' Yeah. Sabine Hossenfelder: Exactly. Exactly. They will not say this literally, but that's basically what they are thinking. Russ Roberts: I'm talking with Sabine Hossenfelder. Her book is Lost in Math. And I want to thank Plantronics for providing the headset she is using, which is the Blackwire 5220.

32:53 Russ Roberts: Sabine, you and I have something in common, which is--it's an interesting role, and I both love the role and I find it somewhat disturbing. And I'm going to challenge you to reflect on the role that we play in a psychological sense: I'm going to put you on the couch. And I'm on the couch on the other side of the room. Which is that we both have been going around saying that the emperor has no clothes: that there are aspects of our discipline that are either overrated or misunderstood. And when you play that role--and in economics I'm typically arguing that their[?] empirical findings are not reliable; they're not credible; they're not replicable; they are based typically on observational studies that don't hold up. And so, I go around saying that economics is a cross between epidemiology and history. Which--economists don't like that. They find that insulting. And they laugh and mock me often when I make the claim. And I feel a kinship with you because I think you're doing something similar. You are telling people who, many of whom are successful in your field, as they are in mine--you are telling them that what they are doing, they are deluding themselves to some extent. How does that feel? Do you think about that? Sabine Hossenfelder: No. It does sound very similar to what I'm doing. The reason I'm doing this is that I really, really think it's necessary. This community needs some criticism, because they are stuck with what they are doing, and they need to understand what's the problem to get out of this. So, I kind of feel like someone has to do it. And who will do it if not me? Russ Roberts: But, you don't have a Nobel Prize. Many of them do. Right? You are easily dismissed. As I am. And I--maybe I should be easily dismissed. I worry about it all the time. Maybe I'm just enjoying the fact that I like to think about how I'm not biased. Or, I don't have--you know, I like to preach humility on the program. 
Maybe I'm so un-humble about my humility, I've got the other problem--I've got the same problem, actually. So, do you worry about those things? Do you feel--like, when you're talking--I sensed it in your book when you are talking to Steven Weinberg and after half an hour he feels like he's talked enough and you should get out. It's kind of hard to stand your ground. Sabine Hossenfelder: Yeah. Of course. You know, they are trying to dismiss me. I know this. They will probably succeed with their attempt to ignore me. It's a little hard because, what I said in the book was basically that the [?] wouldn't see any of these particles, which at least so far turns out to be correct. Now, the particle physicists have a problem in trying to get funding for their new collider, which makes it harder to ignore me. But, yeah, you know how it goes: Public attention will move on; people will be talking about something else. That's very foreseeable. And there is a big risk that this community will just continue the same way that they have strung along for the last 40 years or something. But then, at least, I can say, 'Look, I told you that this wouldn't work.' Russ Roberts: Yeah; you took a bit of a risk in writing your book. It took a couple of years. You knew there was a chance that somewhere in that time period something might come up that would make your book obsolete. Sabine Hossenfelder: Oh, my God. You wouldn't believe how many sleepless nights I've spent over this. Especially when they had this anomaly, which is in the book--I was like, 'Oh, my God. What do I do if that's a real thing? I'll have to throw out the whole plan for the book, and then what?' Russ Roberts: So you found yourself rooting, cheering, and hoping that they didn't find anything, didn't you? 
Sabine Hossenfelder: Well, um--well, it's more difficult, you know, because, as someone who works in the field myself, of course I would like to see some new data, you know, some evidence that there is more than what we have. Because that would be super, super exciting. So, it would have been pretty bad for the book, okay. But it would have been exciting to be part of this breakthrough. You see what I mean? Russ Roberts: Yeah. Sure. No, I have it in economics, too. Being a skeptic and thinking that a lot of economic empirical work is not reliable, there's a temptation for me to enjoy it--there's schadenfreude [sounds like schau'den-freud]. Is that the correct pronunciation, by the way? Sabine Hossenfelder: Yeah, it's pretty good. Russ Roberts: How would you say it? Sabine Hossenfelder: Schadenfreude [sounds like schau'den-freud'-e]. Russ Roberts: Schadenfreude. I'm going to do better. You know, as an American, it's very pretentious to pronounce it more accurately. So, I'm going to try that. Schadenfreude. There's a certain schadenfreude when a result gets--when replications fail. There've been a bunch now in psychology; there've been a bunch in medicine; a lot of findings in observational trials do not hold up in randomized controlled trials. So, I have to confess I kind of enjoy that. But, at the same time, like you, I'd like to know more about the world. So, it would be great if we could find out some things that are reliable. Like if coffee really is good for you--or bad for you--I'd like to know. But there are plenty of studies on both sides, which makes me think we don't know at all, rather than thinking, 'Oh, these are the good ones and those are the bad ones.' And I think that's always the temptation in economics. The ones that confirm my ideology are the good ones; the ones that are challenging to it are obviously the ones that are flawed. And I've come to believe most of them are just flawed overall. And it does lead to some psychological challenges.

38:59 Russ Roberts: When you said the anomaly that showed up, which anomaly are you referring to? Sabine Hossenfelder: Oh, this is what [?] refers to as the diphoton anomaly. So, there were too many decays into a channel that consists of two photons. And, at the time--I think this was late 2015, early 2016--a lot of physicists thought that this might be the long-sought-for evidence for whatever new particle; and there were a lot of papers written about this. Which were very well cited. There were people who wrote papers that were cited 500, 600 times in a matter of months. So, that's pretty ridiculous. So, that's the story. It's actually--I think that's a big disease in the field. And it has a name: it's called ambulance chasing. I don't know if you ever heard of it. Russ Roberts: Yeah. Sure. Sabine Hossenfelder: It's a systemic problem in the way that we organize citations, basically. That could be fixed, you know. I'm not the only one who thinks that that's a problem, but no one's doing anything about it. Russ Roberts: Yeah--I wanted to ask you about the diphoton anomaly, actually, because you write about it in the book; and I found this so delicious. Not the failure--although there's a little--the schadenfreude there is, there is some. But, you say the following, talking about this: That both experiments independently saw it [Russ: they were looking at the diphoton] substantially decreases the risk that the signal was mere coincidence. Taken together [Russ: the two experiments], they arrive at a chance of 3 in 10,000 that the excess is a random fluctuation. Meaning, the differences that were discovered, maybe, it could have been just random; but it was a very low probability. Three in 10,000 is, like, a low number. You say--this is what I love--you say, That's still far off the standard of certainty that particle physicists require for a discovery, which is roughly 1 in 3.5 million. So, in economics it's 5%. In social science. 
Social science, statistical significance is 95%. And there's a debate in statistics--there's a big debate--about whether we should just make it 1% instead of 5% for statistical significance, or whether we should not use statistical significance at all. But, I think it's unbelievable that in physics it's 1 in 3.5 million. Slightly more demanding standard. But then as you point out, a lot of studies assumed it was not random, went out and published a lot of papers on it. In fact, you say, the next day there were 10 papers on it. Is that right--the next day? Sabine Hossenfelder: Yes. Russ Roberts: That's incredible. Sabine Hossenfelder: Those were almost certainly people who knew in advance [?], who had heard the rumors. Russ Roberts: Or who were just waiting. But that turned out to be a statistical anomaly. Not a real thing. Not a real thing to be explained. Sabine Hossenfelder: Yes. And, you know, these things--you look at the history of particle physics, these things happen basically all the time. You know--the reason that we have this 5-sigma limit is that it just seems to work fairly well in practice. But there is no deep reason behind this. Like, it's like with the 5% that you have and that I know a lot of other disciplines use, like psychology and so on--there's no deep reason for that. It's just something that seems to kind of work in practice. So, I'm following this discussion about the issue with the p-value to some extent. It's difficult. Russ Roberts: No, it's the same issue. I mean, it's the same--it's not literally the same--but it's the same challenge. Another way to think about it--I think this is deeply disturbing but I think it's important to say: I think there's an immense amount of religious faith in peer review. And all of what we've been saying so far in this conversation is that peer review is overrated. As Brian Nosek, a psychologist, and a co-author wrote, 'Peer review and truth are not synonyms.'
And yet, when a journalist reports on something as peer reviewed or statistically significant, that allows him to say the ever-common 'Studies show,' as if we've found truth. And it's just not so. Certainly in the social sciences or in epidemiology or in medicine. It can be true in--it's probably more likely to be true in physics, but maybe not. I don't know. I think it's--it's a big problem that the academic priesthood, again, maybe doesn't have so much clothing on. Sabine Hossenfelder: Yeah. I think you're right that these issues are very closely related. So, this reproducibility crisis that we see in psychology and parts of sociology and so on is very similar to the crisis that we have in the foundations of physics, in that this problem has been known for a long time. You know--this is something that even I learned--so, I studied mathematics originally, and I had to take some courses on statistics. This whole story with p-value hacking and post-hoc group selection, all of this--it was no secret that this is how you can artificially increase the significance of your findings. So, people in the field could very well have improved their methodology decades ago, but they didn't do it. Why? Because it would have been inconvenient. It would have made it much harder to produce papers. And, it is exactly the same thing that is going on in the foundations of physics: People have fallen in love with these models of supersymmetry. Why? Because they are easy to produce, and you know how to do the calculations. And you can put out a lot of papers in a short amount of time. And that keeps the wheels turning. And so that continues to proliferate, basically; and we're still doing this. Russ Roberts: Well, one thing you mention--I'm not sure you word it this way--but part of the problem is there are a lot more physicists and a lot more social scientists than there used to be. So, you've got to publish. It's not like Broadway.
It's not like other disciplines that are winner-take-all. 'There's room for everybody. We just need more journals if we have to. We're going to get more publications.' I don't think you say there are too many physicists. I think you just say there are a lot more. But you could argue there are too many. Right? Sabine Hossenfelder: Well, I would say everything is relative. Too many for what? You know? Obviously not enough to make progress. So, the thing with--the reason why I say this, that there are a lot of physicists, is that it tends to lead to fractionalization of the communities. They tend to fall apart into over-specialized niches. And that's not helpful to, you know, try to solve complicated problems that usually require that you connect very different regions. So, I think that's a big problem--that, basically, the way that we are organizing research right now, it tends to favor people who specialize in something. And it does not leave enough room for people who try to make these connections.
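The significance thresholds compared in this exchange can be made concrete. The following is a minimal sketch (my own illustration, not anything from the conversation) that converts a "sigma" level into a one-tailed p-value under a normal distribution, which is the convention particle physicists typically use when quoting significance; the function name `sigma_to_p` is invented here for the example.

```python
from math import erfc, sqrt

def sigma_to_p(sigma):
    """One-tailed tail probability of a standard normal distribution
    at `sigma` standard deviations -- the usual way particle physicists
    quote the significance of an excess."""
    return 0.5 * erfc(sigma / sqrt(2))

# The 5-sigma discovery threshold works out to roughly 1 in 3.5 million:
print(sigma_to_p(5.0))    # ~2.9e-7

# The diphoton excess, quoted as about 3 in 10,000, sits near 3.4 sigma:
print(sigma_to_p(3.4))    # ~3.4e-4

# The 5% threshold common in social science is only about 1.6 sigma:
print(sigma_to_p(1.645))  # ~0.05
```

The gap between these numbers is the point of the anecdote: an excess with a 3-in-10,000 chance of being a fluke would clear the social-science bar by orders of magnitude, yet still falls far short of what particle physicists call a discovery.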

47:19 Russ Roberts: You write, just to give people a feel for what we are talking about here, you say, Since 1973 there hasn't been any successful new prediction that would supersede the standard model. And, that's a long time. We're heading toward 50 years of, you could call it, stagnation. We've talked on the program before, with Patrick Collison, about the pace of innovation. And, some people have argued that we've figured out all the big stuff. All the low-hanging fruit has been picked. That's one argument: so, 'We've kind of figured out almost everything, and the fact that we haven't figured out these last few pieces is not so important.' It's demoralizing to the people in the field, perhaps. But other people would suggest, 'Oh, it's just a temporary slowdown. We're going to--we'll have a big breakthrough again like we always do.' What are your thoughts on that? And, in particular, do you feel like physics is less exciting today than it used to be, because of these issues? Sabine Hossenfelder: How would I know? Like, I mean, I wasn't around in the 1970s; but I assume it must have been very exciting when they found all these new particles and so on. But, I don't think that this is something that will just resolve by itself. You have probably read this book which is called The End of Science, by John Horgan. Russ Roberts: No. I've interviewed John, but I don't think I've read that book. Sabine Hossenfelder: Okay. He says that basically science is coming to an end--from the title of the book--in the sense not that there will be no more scientists or no more scientific research, but that we have discovered all the big insights, roughly speaking. And he tries to blame this on there maybe being a fundamental limit to knowledge which we have just reached, or a limit of human cognition, basically. I find this very, very implausible, because we're seeing the stagnation all over the place.
It's not just the foundations of physics, but other disciplines that he's talking about--for example, complexity research is one that he likes to go on about. So, we also have this problem in certain sectors of technology; maybe you could add quantum technologies there specifically, which is something that I've been looking into recently. And so, it looks very much like it is a systemic problem. And this is what makes me think it must be something about the way that we're organizing scientific research that just allows scientists to get stuck in these non-promising research fields--because they can't get out, basically; because they have to continue to produce papers. And that's easier to do on a topic that a lot of people already work on. Russ Roberts: It's true that a lot of potential--I don't know what 'a lot' means; as you say, it's relative--but a lot of breakthroughs, meaning more than you would expect, come from people outside the field, who get a fresh look, who aren't stuck in the box, who aren't stuck in a particular way of thinking. I am not impressed by how most physicists do economics. But maybe physicists will have a breakthrough in economics because they aren't constrained by the models and ways that economists are trained at the Ph.D. [Doctor of Philosophy] level in the United States and elsewhere. So it's possible that we are somewhat stuck, as you say. But it seems to me--tell me what you think of this--aren't we reaching many of the limits of data collection? I mean, it's extraordinary what we're able to see of the universe in terms of measuring particles. It's an extraordinary thing how much data we've been able to accumulate, both in terms of particle physics and also astrophysics. It's--if you'd told someone a hundred years ago or 200 years ago what we know now, they would have just said that's impossible.
But, it does make me wonder--and this is where I sympathize a little bit with Horgan--whether our ability to continue to add to that database, in some sense, is limited in so many ways. And certainly in economics, the kind of data that we want to have we're never going to have; and I think our ability to understand, say, the macroeconomy, is fundamentally constrained because of that--drawing on Hayek's Nobel Prize address; we'll put a link up to that. Do you think that's true in physics--that maybe our ability to discover new stuff is limited by the fact that we have picked the low-hanging data fruit? And there's just not much more we can gather about the origins of the Big Bang or what's happening in galaxies or in black holes, say? They're really far away. They're hard to get information about. Sabine Hossenfelder: So, it's ultimately becoming more difficult. But, if you look at the history of physics, what has happened so far is that we have gathered new insights about the way that nature works, and that has allowed us to develop new technologies, which have led to new insights, which have led to new technologies. And that cycle seems to be broken. And it's broken not on the data-collection side, because we have a lot of data. It's broken on the theory-development side, where we are just missing explanations for that data. And so I think that what you say is true--it becomes more difficult to collect new data--but that doesn't explain why we're stuck. Because it's not the data collection that's the problem. Now, when it comes to economics or, you know, society more generally, I think that's a much more complicated question--how much you can even learn from the data in principle. I don't think anyone really knows, you know, in a system that's so complex, how much can you even get out of the data in principle.

53:58 Russ Roberts: Yeah, no; I think it's incredibly primitive. And it's dangerous. Because, as you point out, we look where the light is--the drunk who lost the keys looks under the lamppost. And economists look where the data are, because that's what we have. And the idea that the data aren't good enough to answer the questions we want to answer is never--that's not acceptable. When I suggest that, people say things like, 'Well, this is the best data we have.' And I say, 'But it's not good for what the question is.' 'But it's the best data!'--meaning, 'Well, what else could you do?' And the answer is, 'Well, you'd be humble about what you can understand.' But people don't like that. That doesn't sell. It's an interesting idea that we'll get to--that we have a lot more data. I guess my challenge to you as an economist is: It's hard to understand why so many physicists are locked in these boxes or stuck in these theoretical straitjackets, because you are suggesting that there's a tremendous opportunity to get out of those jackets, to get out of those boxes, and look at the world in a fresh way. And usually there would be a bunch of people working on that. It is high risk. I get that. Sabine Hossenfelder: Yeah. Well, what can I say? I think they should listen to me. Russ Roberts: Is anyone, Sabine? What kind of reaction has your book gotten? Sabine Hossenfelder: Well, the people who I'm paying, they are listening to me. Russ Roberts: What do you mean? Sabine Hossenfelder: Well, I have some postdocs and they do what I say. Russ Roberts: [*guffaw and laughter*] Sabine Hossenfelder: But other than that--you know, money makes things possible. Well, um, to be fair, I think that the senior people in the field--by which I basically mean everyone who is older than me--basically don't care what I'm saying. But the younger people do.
I know this both from the emails that I get and also from people who I meet at conferences or when I give lectures or something like this, because they don't--they don't want to work on something that is unpromising and waste the time of their life. And so that's certainly a big part of the audience that I've written the book for. Russ Roberts: Yeah. Einstein didn't have tenure when he worked on relativity, right? And he wasn't trying to get tenure. He was just--I think he was a clerk in a patent office. Which was a--many great thinkers have been clerks in patent offices, I think. Interesting. There's definitely a study to be done there.