The rationalization camp, which has gained considerable prominence in recent years, is built around a set of theories contending that when it comes to politically charged issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth. According to this view, political passions essentially make people unreasonable, even — indeed, especially — if they tend to be good at reasoning in other contexts. (Roughly: The smarter you are, the better you are at rationalizing.)

Some of the most striking evidence used to support this position comes from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change was greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests. Apparently, more “analytical” Democrats were better able to convince themselves that climate change was a problem, while more “analytical” Republicans were better able to convince themselves that climate change was not a problem. Professor Kahan has found similar results in, for example, studies about gun control in which he experimentally manipulated the partisan slant of information that participants were asked to assess.

The implications here are profound: Reasoning can exacerbate the problem, not provide the solution, when it comes to partisan disputes over facts. Further evidence cited in support of this argument comes from a 2010 study by the political scientists Brendan Nyhan and Jason Reifler, who found that appending corrections to misleading claims in news articles can sometimes backfire: Not only did corrections fail to reduce misperceptions, but they also sometimes increased them. It seemed as if people who were ideologically inclined to believe a given falsehood worked so hard to come up with reasons that the correction was wrong that they came to believe the falsehood even more strongly.

But this “rationalization” account, though compelling in some contexts, does not strike us as the most natural or most common explanation of the human weakness for misinformation. We believe that people often just don’t think critically enough about the information they encounter.

A great deal of research in cognitive psychology has shown that a little bit of reasoning goes a long way toward forming accurate beliefs. For example, people who think more analytically (those who are more likely to exercise their analytic skills and not just trust their “gut” response) are less superstitious, less likely to believe in conspiracy theories and less receptive to seemingly profound but actually empty assertions (like “Wholeness quiets infinite phenomena”). This body of evidence suggests that the main factor explaining the acceptance of fake news could be cognitive laziness, especially in the context of social media, where news items are often skimmed or merely glanced at.