Is the statement “We are living in a post-truth world” true? If your answer is “yes,” then the answer is “no,” because you’ve just evaluated the statement in an evidentiary manner, so evidence still matters and facts still matter. Harvard psychologist Steven Pinker explains why we are not living in a post-truth world in this deeply insightful cover story from Skeptic magazine 24.3 (2019). This article is based on the keynote address delivered to the annual conference of the Heterodox Academy in June of 2019. (Photo above by Jeremy Danger)

Anyone who urges universities to live up to their mission of promoting knowledge, truth, and reason is bound to be confronted with the objection that these aspirations are just so 20th century. Aren’t we living in a post-truth era? Haven’t cognitive psychologists shown that humans are fundamentally irrational? Mustn’t we acknowledge that the pursuit of disinterested reason and objective truth are Enlightenment anachronisms?

The answer to all of these questions is “no.”

First, we are not living in a post-truth era. Why not? Consider the statement “We are living in a post-truth era.” Is it true? If so, it cannot be true.

Likewise, it is not the case that humans are irrational. Consider the statement, “Humans are irrational.” Is that statement rational? If it is, it cannot be true—at least, if it is uttered and understood by humans. (It would be another thing if it was an observation exchanged among an advanced race of space aliens.) If humans were truly irrational, who specified the benchmark of rationality against which humans don’t measure up? How did they conduct the comparison? Why should we believe them? Indeed, how could we understand them?

In his book The Last Word, the philosopher Thomas Nagel showed that truth, objectivity, and reason are not negotiable.2 As soon as you start making a case against them, you are making a case, which means you are implicitly committed to reason. Nagel calls this argument Cartesian, after Descartes’ famous argument that just as the very fact that one is pondering one’s existence shows that one must exist, the very fact that one is examining the validity of reason shows that one is committed to reason. A corollary is that we don’t defend or justify or believe in reason, and we certainly do not, as it is sometimes claimed, have faith in reason. As Nagel puts it, each of these is “one thought too many.” We don’t believe in reason; we use reason.

This may sound like logic-chopping, but it’s built into the way we make everyday arguments. As long as you’re not bribing or threatening your listeners to mouth agreement with you, but trying to persuade them that you’re right—that they should believe you, that you’re not lying, or full of crap—then you have conceded the primacy of reason. As soon as you try to argue that we should believe things by any route other than reason, you’ve lost the argument, because you’ve appealed to reason. That is why a defense of reason is unnecessary, perhaps even impossible.

As for the “post-truth era,” journalists should retire this cliché unless they can keep up a tone of scathing irony. It comes from the observation that some politicians—one in particular—lie a lot. But politicians have always lied. They say that in war, truth is the first casualty, and that can be true of political war as well. (The expression “credibility gap” had its heyday during the administration of Lyndon Johnson in the 1960s.) And the bending or inverting of truth by people in power has long been consequential, leading, for example, to the Spanish-American War, the First World War, the Vietnam War, and the Iraq War, right up to the near miss in the Persian Gulf in 2019.

Another inspiration for the post-truth cliché is the recent prominence of “fake news.” But this, too, is not a new development. The title of James Cortada and William Aspray’s forthcoming Fake News Nation: The Long History of Lies and Misinterpretations in America is self-explanatory, though the long history is by no means confined to America.3 The Protocols of the Elders of Zion, the hoaxed proceedings of a secret meeting of Jews plotting global domination, was advanced as fact by a number of prominent people in subsequent decades, including the industrialist Henry Ford. Countless pogroms, lynchings, and deadly ethnic riots have been sparked by rumors of the alleged perfidy of some minority group.

And the belief that fake news is displacing the truth itself needs to be examined for its truth. In their analysis of fake news in the 2016 American presidential election, Andrew Guess, Brendan Nyhan, and Jason Reifler found that it took up a minuscule proportion of online communications (far less than 1 percent) and was mainly directed at partisans who were impervious to persuasion.4 This is hardly surprising: unless you were already marinated in a right-wing fever swamp, if you came across a social media post claiming that Hillary Clinton was running a child sex ring out of a Washington DC pizzeria, you would treat it as exactly what it is.

But the main reason we should retire the post-truth cliché is that it’s corrosive, perhaps self-fulfilling. The implication is that we may as well give up on reason and truth and just fight the bad guys’ lies and intimidation with lies and intimidation of our own. We can aim higher.

Let’s return to the claim that Homo sapiens is irredeemably irrational, which is inspired both by research in cognitive psychology on illusions and biases and on a cartoon version of evolutionary psychology in which we are ruled by lizard brains rapidly detecting danger from simple cues. The implication is that humans can’t be expected to be cerebral if they have minds adapted to the Stone Age. As someone who knows a thing or two about both cognitive and evolutionary psychology, I’m here to tell you that this is not an accurate picture of how the human mind works.

In a wonderful paper by John Tooby and Irven DeVore (with Leda Cosmides as an unacknowledged coauthor),5 these evolutionary psychologists argue that Homo sapiens evolved to fill the “Cognitive Niche,” living by a combination of social cooperation, language, and technological know-how. The ethnographic record shows that foraging people, who provide a window into the lives of our evolutionary ancestors, build mental models of the world around them that allow them to explain, predict, and control things to their advantage. Here’s an example from Napoleon Chagnon, who spent 30 years with the Yąnomamö, hunter-horticulturalists of the Amazon rainforest. Chagnon describes one of the ways they obtained dinner:

Armadillos live several feet underground in burrows that can run for many yards and have several entries. When the Yąnomamö find an active burrow, as determined by the presence around the entry of a cloud of insects found nowhere else, they set about smoking out the armadillo. The best fuel for this purpose is a crusty material from old termite nests, which burns slowly and produces an intense heat and much heavy smoke. A pile of this material is ignited at the entry of the burrow, and the smoke is fanned inside. The other entries are soon detected by the smoke rising from them, and they are sealed with dirt. The men then spread out on hands and knees, holding their ears to the ground to listen for armadillo movements in the burrow. When they hear something, they dig there until they hit the burrow and, with luck, the animal. On one occasion, after the hunters had dug several holes, all unsuccessful, one of them ripped down a large vine, tied a knot in the end of it, and put the knotted end into the entrance. Twirling the vine between his hands, he slowly pushed it into the hole as far as it would go. As his companions put their ears to the ground, he twirled the vine, causing the knot to make a noise, and the spot was marked. He broke off the vine at the burrow entrance, pulled out the piece in the hole, and laid it on the ground along the axis of the burrow. The others dug down at the place where they had heard the knot and found the armadillo on their first attempt, asphyxiated from the smoke.6

An awful lot of rationality went into that achievement.

Halfway around the world, the citizen scientist Louis Liebenberg has studied the tracking techniques employed by the San of the Kalahari Desert.7 They are persistence hunters who track animals by their spoor. Though most ungulates are swifter than humans, they are poor at dumping heat, and if pursued long enough will keel over and can be dispatched with a rock or spear. The San’s hunting success depends on their powers of inference: they form hypotheses about an animal’s whereabouts from sparse data in tracks, bent twigs, and displaced pebbles—often inferring the animal’s species, age, sex, and condition, which in turn allows them to predict its movements. A deeply pointed hoofprint implies an agile Springbok who needs a good grip; a shallow flat-footed one bespeaks a heavy kudu who needs to support its weight. But they don’t just engage in inference; they also engage in reasoning.8 As they pause to figure out what to do next, they engage in debate, articulate their logic, and defend it against alternatives. Liebenberg also saw many displays of skepticism, in which a young hunter challenged the guess of an older one. Indeed their skepticism extended to their myths and legends. Liebenberg recounts:

Namka, …told me the myth of how the sun is like an eland, which crosses the sky and is then killed by people who live in the west. The red glow in the sky when the sun goes down is the blood of the eland. After they have eaten it, they throw the shoulder blade across the sky back to the east, where it falls into a pool and grows into a new sun. Sometimes, it is said, you can hear the swishing noise of the shoulder blade flying through the air. After telling me the story in great detail, he told me that he thinks that the “Old People” lied, because he has never seen… the shoulder blade fly through the sky or heard the swishing noise.9

So if anyone tries to excuse irrationality and dogma by pointing a finger at our evolutionary origins, I say: Don’t blame the hunter-gatherers. Rational inference, skepticism, and debate are in our nature every bit as much as freezing in response to a rustle in the grass.

Why were truth and rationality selected for? The answer is that reality is a powerful selection pressure. As the science fiction author Philip K. Dick put it, “Reality is that which, when you stop believing in it, doesn’t go away.”10 Either there is an armadillo in the burrow or there isn’t. Those who were so hidebound by stereotype or habit that they could not deduce where it was or how to kill it went hungry.

Closer to home, I’m often asked why I even bother to try to persuade people with data and graphs, because everyone knows that people never change their minds when faced with contradictory evidence. But this is an exaggeration. People indeed dig in and double down when evidence challenges a sacred belief that is close to their social identity. But Nyhan and Reifler have shown that evidence can change people’s minds, even on highly politicized issues, such as whether there has been a rise in global temperature (among people on the right) or whether George W. Bush’s military surge in Iraq in 2007 reduced terrorist attacks (among people on the left). When the facts were presented in clear graphs, even the partisans changed their minds.11

A third reason to stop saying that humans are irrational across the board is that many of the demonstrations of irrationality, such as in the classic experiments of Amos Tversky and Daniel Kahneman,12 depend on how the information is presented and how rationality is defined. The cognitive psychologist Gerd Gigerenzer has shown that many of the illusions and fallacies vanish when information is framed in ways that harmonize with human intuition.13

So if we do have the capacity to be rational, why are we so often irrational? There are several reasons. The most obvious was pointed out by Herbert Simon, one of the founders of both cognitive psychology and artificial intelligence: rationality must be bounded. A perfect reasoner would require all the time in the world, and unlimited memory. So we often satisfice, trading accuracy for efficiency.14

Also, though reality is always a powerful selection pressure, we did not evolve with the truth-augmenting technologies that have been invented in recent millennia and centuries, such as writing, quantitative datasets, scientific methodology, and specialized expertise.

And annoyingly, facts and logic can compromise our self-presentation as effective and benevolent, a powerful human motive. We all try to come across as infallible, omniscient, and saintly. Rationality can be a nuisance in this campaign, because inconvenient truths will inevitably come to light that suggest we are mere mortals. The dismissal of facts and logic is often damage control against threats to our self-presentation.

Beliefs also can be signals of loyalty to a coalition. As Tooby has pointed out, the more improbable the belief, the more credible the signal.15 It’s hard to affirm your solidarity with the tribe by declaring that rocks fall down instead of up, because anyone can say that rocks fall down instead of up. But if you say that God is three persons in one, or that Hillary Clinton ran a child sex ring out of a Washington pizzeria, you’ve shown that you’re willing to take risks for the team.

Group loyalty is an underestimated source of irrationality in the public sphere, especially when it comes to politicized scientific issues like evolution and climate change. Dan Kahan has shown that, contrary to what most scientists believe, a denial of the facts of human evolution or anthropogenic climate change is not a symptom of scientific illiteracy.16 The deniers know as much science as the accepters. They differ instead in political orientation: the farther to the right, the more denial.

Kahan notes that there is a perverse rationality to this “expressive cognition.” Unless you are one of a small number of deciders and influencers, your opinion on climate change will have no effect on the climate. But it could have an enormous effect on how you’re accepted in your social circle—whether you’re seen as someone who at best just doesn’t get it and who at worst is a traitor. For someone in a modern university to deny human-made climate change, or for someone in a rural Southern or Midwestern community to affirm it, would be social death. So it’s perversely rational for people to affirm the validating beliefs of their social circle. The problem is that what’s rational for the individual may not be rational for the nation or the planet. Kahan calls it the “Tragedy of the Belief Commons.”17

Another paradox of rationality is pluralistic ignorance, or the “spiral of silence,” in which everyone believes that everyone else believes something but no one actually believes it. A classic example is drinking in college fraternities: a 1998 Princeton study found that the male students mistakenly believed that their fellow students thought it was cool to drink a lot, and during their time on campus gravitated toward endorsing this false norm themselves.18 The same thing happens in college women’s attitudes toward casual sex.19

How can pluralistic ignorance happen? How does a false belief keep itself levitated in midair? Michael Macy and his colleagues show that a key factor is enforcement. Not only does the belief never get challenged, but group members believe they must punish or condemn those who don’t hold it—out of the equally mistaken belief that they themselves may be denounced for failing to denounce.20 Denunciation is a signal of solidarity with the group, which can lead to a cascade of pre-emptive, self-reinforcing denunciation, and sometimes to “extraordinary popular delusions and the madness of crowds” like witch hunts and other bubbles and manias.21 Sometimes the bubble can be punctured by a public exclamation that the emperor is naked, but it takes an innocent boy or a brave truth-teller.

The drags on reason—bounded rationality, the novelty of truth-enhancing institutions, self-presentation, costly signaling, pluralistic ignorance—are depressing in their number and weight. But there are also forces that can empower the rational angels of our nature. These rationality enhancers have been explored by psychologists such as Jonathan Baron, Dan Sperber, Hugo Mercier, Steven Sloman, and Jason Fernbach,22 and many of them draw their power from another principle articulated by Abraham Lincoln: “You can fool some of the people all of the time, and you can fool all of the people some of the time, but you can’t fool all of the people all of the time.” There are prods and nudges and norms and institutions that allow us to be more rational collectively than any of us is individually.23

One of them is unbelievably simple: have people articulate their position. It turns out that many stalwarts with a fervent opinion on, say, Obamacare, or NAFTA, are dumbstruck when they have to explain what exactly the policy is. When people are confronted with their own ignorance of the facts, they become more epistemically humble about their opinions. Related nudges include having people defend a position against alternatives in front of disinterested bystanders, or putting them in a small group that has to come to a consensus. Baron advocates that people explicitly endorse a virtue he calls “active open-mindedness”: always give second thoughts to your opinions, always seek out criticism.24 And then there’s the technique discovered long ago by rabbis: first have your yeshiva students make the strongest possible argument on one side of a Talmudic dispute, then force them to switch sides.

Knowledge of cognitive psychology itself can be helpful: people should understand and learn to avoid the common biases and fallacies that psychologists have identified, such as availability (reasoning from an anecdote), representativeness (reasoning from a stereotype), confirmation bias, and the gambler’s fallacy. Also helpful is having your feet held to the fire of empirical predictions: to put your money where your mouth is, to seek the proof in the pudding. Scientists themselves can engage in adversarial collaboration, in which theorists with opposing opinions on some hypothesis get together and come up with an empirical test they both agree will settle the question.

So humans can be collectively rational if they submit to norms that engage their rational faculties and sideline their irrationalities. Many of these norms have been implemented in institutions that are the framework of modern liberal democracies: a free press instead of government propaganda; an adversarial court system instead of trial by ordeal or justice by lynch mob; peer-reviewed science instead of authority and dogma; deliberative democracy with checks and balances instead of absolute autocracy. They work not by calling on individuals to muster superhuman rationality but by placing them in an arena in which intellectual diversity can undermine authority and conformity. As James Madison put it, “Ambition must be made to counteract ambition.”25

But has the day of rationality-promoting norms and institutions passed? As shocking as this may sound, they have never been more prominent. In domain after domain the world is more rational than it was just a few decades ago. Journalism is supplementing shoe-leather reporting with fact-checking organizations like PolitiFact, because readers will protest if a politician’s statements go unchallenged. And instead of citing the result of a single opinion poll, with all its sampling noise, we have seen the rise of data journalism, including Nate Silver’s fivethirtyeight.com. Forecasting is no longer the dark art of pundits, gurus, and soothsayers but is being advanced by Philip Tetlock’s superforecasters, who combine data, Bayesian reasoning, and active open-mindedness to make guarded predictions about well-specified events.26

Healthcare has seen the long-overdue rise of evidence-based medicine (fulfilling the promise in the wisecrack “What do you call alternative medicine with evidence? Medicine.”) Criminology and policing are leaving behind the various nostrums, gimmicks, and theories of “root causes,” and embracing number-crunching systems like CompStat, which capitalize on the fact that a large proportion of violence can be attributed to a small number of areas and perpetrators—if you can figure out where and who they are, you can bring the murder rate down by a lot quickly, as New York City did in the 1990s (a 75% reduction in less than a decade). The world of philanthropy is being reshaped by effective altruism, which tries to distinguish acts of do-gooding that kindle a warm glow in donors from those that measurably improve the lives of beneficiaries. Psychotherapy is moving beyond the couch and the notepad and using Feedback-Informed Treatment, in which the mental health of patients is tracked day by day to see which interventions are helping or hurting them.

Governments are starting to base policies less on ideology and bureaucratic inertia and more on evidence, actually measuring what makes the streets safer and the kids stay in school. Behavioral Insights, also called Nudge (after the book by Richard Thaler and Cass Sunstein), manipulates the user interface and choice architecture of government programs to get people to do what’s in their own interests without coercion or deception.27 Socioeconomic data is no longer entombed in academic archives and proprietary databases of governments and NGOs but is easily accessible to anyone with a Web browser, thanks to open-source datasets and interactive data graphics on sites like OurWorldInData.org, GapMinder.org, and HumanProgress.org. Sports has seen the rise of Moneyball,28 in which smarter teams can beat richer teams by processing data instead of speculating over the hot stove. The online “rationality community,” expressed on sites like LessWrong.com, SlateStarCodex.com, and Skeptic.com, seeks to glorify rationality, stigmatize cognitive biases, and improve the quality of reasoning, decision-making, and opinion formation. Even everyday fact-checking has been revolutionized by the urban legend tracking site snopes.com and by Wikipedia, which is now 80 times the size of the Encyclopedia Britannica and pretty much as accurate. (A recent cartoon captioned “Life before Google” shows a man on a barstool musing, “I wonder who played the skipper on Gilligan’s Island,” and his companion answering, “I guess we’ll never know.”)

Rationality, to be sure, is not increasing everywhere. In some arenas it appears to be sinking fast. The most conspicuous is electoral politics, which is almost perversely designed to inhibit our capacity for rationality. Voters act on issues that don’t affect them personally, and are under no pressure to inform themselves or defend their positions. Practical issues like energy and healthcare are bundled with symbolic hot buttons like euthanasia and the teaching of evolution. These bundles are then strapped to regional, ethnic, or religious coalitions, encouraging group-affirming expressive cognition. People vote as if rooting for sports teams, encouraged by the media, which treat politics as a horse race, encouraging zero-sum competition rather than clarification of character and policy.

And as a recent New York Times op-ed (in which I played a cameo) announced, “Social media is making us dumber.”29 Not long ago many intellectuals deplored the lack of democratic access to mass media. A few media corporations, in cahoots with the government, “manufactured consent” with their oligopoly over the means of production and dissemination of ideas. As we used to say, freedom of the press belongs to those who own one. Social media held out the promise of giving a voice to The People.

We should have been careful about what we wished for. The network dynamics of social media are still poorly understood, but they do not yet host the mechanisms of vetting and reviewing that are necessary for true beliefs to bubble up to prominence from the turbid pools of self-presentation, group solidarity, and pluralistic ignorance. And they have become launch pads for spirals of moralistic grandstanding and pre-emptive denunciation.

We are now living in an era of rationality inequality. At the high end we’ve never been more rational. But at the low end there are arenas that indulge the worst of human psychology. Much work remains to be done in refining the institutions that bring out the rational angels of our nature.

And this brings me to the role of universities. Universities ought to be the premier institutions of rationality promotion. They have been granted many privileges and perquisites in exchange for fulfilling the mission of adding to the stock of human knowledge and transmitting it to future generations. State universities and colleges are underwritten by the public purse, as is a great deal of tuition and research support in private ones, together with their tax-exempt status. The extraordinary institution of tenure is designed to allow nonconforming intellectuals to express heterodox opinions without fear of being silenced or fired. Tuition is exorbitant and hyper-inflating, distended by an engorged bureaucracy and abetted in part by government subsidies and lack of regulation. Universities have also been granted credentialing and gatekeeping privileges in business and the professions, where a degree is often an entry requirement despite the questionable value added to a student’s capabilities by four years at a university, according to exit audits. (Some economists have argued that a parchment is more a proxy for intelligence and discipline than a certification of actionable knowledge and skills: a quarter-million-dollar IQ and marshmallow test.)30 Yet despite these perquisites, universities have become notorious as monocultures of left-wing orthodoxy and the illiberal suppression of heterodox ideas (I won’t review the latest follies, but will mention just two words: Halloween costumes).31 As the civil libertarian Harvey Silverglate has put it, “You can say things in Harvard Square that you can’t say in Harvard Yard.”

Should we be cynical about the modern academy as a rationality-promoting institution? Let me put the question into historical perspective, chastened by my discovery from writing two books that “the best explanation for the good old days is a bad memory.”32 As a freshman in the 1970s, one of my first experiences was watching an argument at a table in the main campus building at which an activist for the Socialist Democratic Marxist Leninist Alliance (or was it the Leninist Marxist Democratic Socialist Alliance?) shouted down a dissenting student with the proclamation “Fascists don’t have the right to speak!” Throughout the 1970s and 1980s behavioral scientists like Arthur Jensen, Hans Eysenck, Richard Herrnstein, Thomas Bouchard, and Linda Gottfredson were disinvited, drowned out, and in some cases physically assaulted. On the right, for example, is a 1984 poster announcing a talk by the evolutionary biologist E. O. Wilson, which improbably called him “The Prophet of Right Wing Patriarchy” and invited students to “bring noisemakers.” So when it comes to intolerant repression of non-leftist ideas, don’t blame the Millennials or the iGens. Contra Billy Joel, we Baby Boomers did start the fire—which is not to deny that it is now blazing out of control.

Why do universities fall short of what one might think of as their essential mission, promoting open-minded rationality? There are several hypotheses. In The Coddling of the American Mind,33 Greg Lukianoff and Jonathan Haidt have suggested that (to oversimplify) helicopter Baby Boomer parents reared iGen snowflakes, who melt at the slightest uncomfortable thought. Another explanation points to an increase in homophily—people gravitating to people who are like them, especially liberals and their children in cities and dense suburbs—which bred a uniformity of opinion on university campuses.34 The sociologists Bradley Campbell and Jason Manning have described the rise of a Culture of Victimhood, in which prestige comes not from a resolve to retaliate against threats (a Culture of Honor) or an ability to control one’s emotions (a Culture of Dignity) but from a claim to have been victimized on the basis of race or gender, a grievance that is predictably ratified and redressed by the campus bureaucracy.35 And since any of these dynamics can weave a network of pluralistic ignorance enforced by denunciation mobs, we can’t know how many intimidated students would privately disavow intellectual orthodoxy and the culture of victimhood but are afraid to say so out of a mistaken fear that everyone else avows it.

Some of this regression is a paradoxical byproduct of the fantastic progress we have made in equality. Vanishingly few people in universities actually hold racist, sexist, homophobic, or transphobic attitudes (though they may have different views on the nature of these categories or the causes of group differences). That means that accusations of racism, sexism, homophobia, and transphobia can be weaponized: since everyone reviles these bigotries, they can be used to demonize adversaries, which in turn spreads a terror of being demonized. The accusations are uniquely noxious because it is virtually impossible to defend oneself against them. “Some of my best friends are X” is risible, and testimony about one’s unprejudiced bona fides or a track record of advancing the careers of women and minorities is not much more exculpatory. This places temptation in people’s paths to denounce others for bigotry before they are denounced themselves: it is one of the few means of pre-emptive self-defense.

Should we care about what happens in the universities? It’s sometimes said that academic disputes are fierce because the stakes are so small. In fact, the stakes are significant. The obvious one is whether universities are carrying out their fiduciary duty to advance knowledge in return for their massive absorption of society’s resources and trust. Another is their creeping influence on the rest of society. As Andrew Sullivan wrote in 2018, “we all live on campus now.”36 Political correctness and social justice warfare have descended from the ivory tower and infiltrated tech, business, healthcare, and government.

Still worse, intolerance on campus is corroding the credibility of university research on vital topics such as climate change and gun violence. Skeptics on the right can say, “Why should we be impressed if climate scientists are unanimous that human activity is threatening the planet? (Or on any other issue?) They work in universities, which everyone knows are echo chambers of PC dogma.”

A final danger of allowing universities to repress open debate is that it sets off equal and opposite backlashes. The regressive left is an incubator of the alt-right. I’ve seen it happen, including to former students. When they see that certain opinions are inexpressible, when they see speakers being deplatformed and people being assaulted or demonized for citing certain facts or advancing certain ideas, they conclude, “You can’t handle the truth!” Since they can’t discuss heterodox ideas with students and faculty in universities, they retreat into an alternative universe of discourse, mainly internet discussion groups, in which these ideas harden and grow more extreme in the absence of critical engagement. When the nuanced, statistical, multifactorial, qualified, tentative, and ethically sensitive versions of taboo hypotheses are squelched on campus, the simplistic, all-or-none, single-factor, exaggerated, invidious versions blossom outside it. This happens in discussions of capitalism, the causes of being transgender, and differences between ethnic groups and sexes.

So we must safeguard the truth- and rationality-promoting mission of universities precisely because we are not living in a post-truth era. Humans indeed are often irrational, but not always and everywhere. The rational angels of our nature can and must be encouraged by truth-promoting norms and institutions. Many are succeeding, despite what seems like a growth in reason inequality. Universities, as they become infected with political conformity and restrictions on expressible ideas, seem to be falling short in their mission, but it matters to society that they be held to account: so they can repay the perquisites granted to them, secure the credibility of their own research on vital issues, and inoculate students against extreme and simplistic views by allowing them to evaluate moderate and nuanced ones.

About the Author

Dr. Steven Pinker is an experimental psychologist who conducts research in visual cognition, psycholinguistics, and social relations. He grew up in Montréal and earned his BA from McGill and his PhD from Harvard. Currently Johnstone Professor of Psychology at Harvard, he has also taught at Stanford and MIT. He has won numerous prizes for his research, his teaching, and his nine books, including The Language Instinct, How the Mind Works, The Blank Slate, The Better Angels of Our Nature, and The Sense of Style. He is an elected member of the National Academy of Sciences, a two-time Pulitzer Prize finalist, a Humanist of the Year, a recipient of nine honorary doctorates, and one of Foreign Policy’s “World’s Top 100 Public Intellectuals” and Time’s “100 Most Influential People in the World Today.” He is Chair of the Usage Panel of the American Heritage Dictionary, and writes frequently for the New York Times, the Guardian, and other publications. His most recent book is Enlightenment Now: The Case for Reason, Science, Humanism, and Progress.

References

This article is based on the keynote address delivered to the annual conference of the Heterodox Academy in June of 2019. https://bit.ly/2NZCOnS
Nagel, Thomas. 1997. The Last Word. New York: Oxford University Press.
Cortada, James W. and William Aspray. 2019. Fake News Nation: The Long History of Lies and Misinterpretations in America. Rowman & Littlefield.
Guess, Andrew, Brendan Nyhan, and Jason Reifler. 2018. “Selective Exposure to Misinformation: Evidence from the Consumption of Fake News During the 2016 U.S. Presidential Campaign.” European Research Council. https://bit.ly/2CI7hA3
Tooby, John and Irv DeVore. 1987. “The Reconstruction of Hominid Evolution Through Strategic Modeling.” In The Evolution of Human Behavior: Primate Models (ed. W. G. Kinzey). Albany, NY: SUNY Press.
Chagnon, Napoleon. 1996. Yąnomamö: The Fierce People. New York: Harcourt Brace, 63–64.
Liebenberg, Louis. 1990. The Art of Tracking: The Origin of Science. Cape Town: David Philip.
Liebenberg, Louis. 2013. “Tracking Science: The Origin of Scientific Thinking in Our Paleolithic Ancestors.” Skeptic, Vol. 18, No. 3.
Liebenberg 2014, 191–92.
Dick, Philip K. 1985. I Hope I Shall Arrive Soon. New York: St. Martin’s Press.
Nyhan, Brendan and Jason Reifler. 2018. “The Roles of Information Deficits and Identity Threat in the Prevalence of Misperceptions.” Journal of Elections, Public Opinion and Parties, Vol. 29, No. 2, 222–244.
Kahneman, Daniel. 2011. Thinking, Fast and Slow. New York: Farrar, Straus & Giroux.
Gigerenzer, Gerd. 2007. Gut Feelings: The Intelligence of the Unconscious. New York: Viking.
Simon, Herbert A. 1984. Models of Bounded Rationality. MIT Press.
Tooby, John. 2017. “Coalitional Instincts.” Edge.org. https://bit.ly/2iIhEqD
Kahan, Dan M. 2017. “On the Sources of Ordinary Science Knowledge and Extraordinary Science Ignorance.” In The Oxford Handbook of the Science of Science Communication (ed. Kathleen Hall Jamieson, Dan M. Kahan, and Dietram Scheufele). New York: Oxford University Press, 35–49.
In his words, the Tragedy of the Risk Perception Commons; I have broadened the term.
Kahan, Dan M. 2012. “Cognitive Bias and the Constitution of the Liberal Republic of Science.” Yale Law School, Public Law Working Paper 270. https://bit.ly/2Sby7FS
Prentice, D. A. and D. T. Miller. 1993. “Pluralistic Ignorance and Alcohol Use on Campus: Some Consequences of Misperceiving the Social Norm.” Journal of Personality and Social Psychology, February, 64(2): 243–256. https://bit.ly/2xPTenT
Lambert, Tracy A., Arnold S. Kahn, and Kevin J. Apple. 2003. “Pluralistic Ignorance and Hooking Up.” The Journal of Sex Research, Vol. 40, No. 2, May, 129–133.
Macy, Michael W., Robb Willer, and Ko Kuwabara. 2009. “The False Enforcement of Unpopular Norms.” American Journal of Sociology, Vol. 115, No. 2, September, 451–490.
Mackay, Charles. 1841/1852/1980. Extraordinary Popular Delusions and the Madness of Crowds. New York: Crown, 559.
See for example: Baron, J. 1993. “Why Teach Thinking?” Applied Psychology, 42, 191–237.
I discuss these at length in: Pinker, Steven. 2018. Enlightenment Now: The Case for Reason, Science, Humanism, and Progress. New York: Viking. See Chapter 21.
Baron, Jonathan. 2000 (3rd ed.). Thinking and Deciding. Cambridge: Cambridge University Press.
Madison, James. 1788. “The Federalist No. 51: The Structure of the Government Must Furnish the Proper Checks and Balances Between the Different Departments.” Independent Journal, Wednesday, February 6.
Tetlock, P. E. and D. Gardner. 2015. Superforecasting: The Art and Science of Prediction. New York: Crown.
Thaler, Richard H. and Cass R. Sunstein. 2008. Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven: Yale University Press.
Lewis, Michael. 2003. Moneyball: The Art of Winning an Unfair Game. New York: W. W. Norton. https://nyti.ms/2mmMN5x
Caplan, B. 2018. The Case Against Education: Why the Education System Is a Waste of Time and Money. Princeton: Princeton University Press.
Kors, A. C. and H. A. Silverglate. 1998. The Shadow University: The Betrayal of Liberty on America’s Campuses. New York: Free Press.
Lukianoff, G. 2012. Unlearning Liberty: Campus Censorship and the End of American Debate. New York: Encounter Books; see also Enlightenment Now, 373–4, and the websites of the Foundation for Individual Rights in Education (thefire.org) and the Heterodox Academy (heterodoxacademy.org).
From Franklin Pierce Adams.
Lukianoff, Greg and Jonathan Haidt. 2018. The Coddling of the American Mind. New York: Penguin Press.
See W. Wilkinson, The Density Divide: Urbanization, Polarization, and Populist Backlash. Niskanen Center, June 2018.
Campbell, Bradley and Jason Manning. 2018. The Rise of Victimhood Culture. Palgrave Macmillan.
Sullivan, Andrew. 2018. “We All Live On Campus Now.” New York, Feb. 9. https://nym.ag/2LkVn3y