Human Brain Threat to Democracy?

The evidence that humans are irrational continues to mount. What does this mean for self-governance?

James Joyner

Joe Keohane argues in the Boston Globe that humans are predisposed to use information to confirm their existing beliefs, which makes democratic governance impossible.

It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight. In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper. “The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.

There’s much more to the piece, which constitutes a selective review of the political behavior literature. That was never my research interest, and my readings in the subfield are something like 17 years out of date, but Keohane’s presentation comports well with my observations over the past decade or so. Certainly, large numbers of people believe that Bush promised Iraq would be easy, that we found WMD in Iraq, that Obama was born in Kenya, and many other things that simply don’t jibe with the known facts.

I’m not sure what the implications of any of this are for representative democracy. Perhaps it provides support to those who argue that government by referendum is a bad idea. But does it somehow render us unable to vote for politicians who will make public policy decisions for us? I don’t see how.

There’s not much doubt that most voters are poorly informed. And the cited studies would seem to give further reason to doubt the influence of negative television advertising.

But, even if people are slow to evaluate facts, that doesn’t mean they don’t form impressions that are close enough to reality. Certainly, they’re pretty good at punishing politicians who underperform expectations and replacing them. That this isn’t always an entirely rational process doesn’t alter the fact that it tends to reward the party in charge during good times and punish it during bad times.

Additionally, political parties serve as a pretty fair proxy for all manner of things that actually matter to citizens. So, even poorly informed voters can pick candidates whose values more or less match up with their own and, every four years, either keep the current president or his party in power or toss them out. In the off years, they can either choose candidates who encourage the president to stay the course or signal that it’s time to try something new.

Finally, it’s not at all clear what Keohane’s preferred alternative is. Churchill’s old saw about democracy being the worst form of government except for all the others that have been tried remains true.