We’re In an Epidemic of Mistrust in Science

Academia isn’t immune to the scourge of misinformation

A family physician prepares a measles vaccine during a consultation in Bucharest, Romania, on April 16, 2018. Photo by Daniel Mihailescu/AFP via Getty

Dozens of infants and children in Romania died recently in a major measles outbreak, fueled in part by prominent celebrities campaigning against vaccination. The trend parallels that of Europe as a whole, which suffered a 400 percent increase in measles cases from 2016 to 2017. Unvaccinated Americans traveling to the World Cup may well bring the disease back to the United States.

Of course, we don’t need European travel to suffer from measles. Kansas just experienced its worst measles outbreak in decades, driven largely by children and adults in a few unvaccinated families.

Just like in Romania, parents in the United States are fooled by the false claim that vaccines cause autism. This belief has spread widely across the country and leads to a host of problems.

Measles was practically eliminated in the United States by 2000. In recent years, however, measles outbreaks have been on the rise, driven by parents in a number of communities failing to vaccinate their children. We should be especially concerned because our president has frequently expressed the false view that vaccines cause autism, and his administration has pushed back against funding “science-based” policies at the Centers for Disease Control and Prevention.

These illnesses and deaths are among many terrible consequences of the crisis of trust suffered by our institutions in recent years. While headlines focus on declining trust in the media and government, science and academia are not immune to this crisis of confidence, and the results can be deadly.

Consider that in 2006, 41 percent of respondents in a nationwide poll expressed “a lot of confidence” in higher education. Fewer than 10 years later, in 2014, only 14 percent of those surveyed reported “a great deal of confidence” in academia.

What about science as distinct from academia? Polling shows that the number of people who believe science has “made life more difficult” increased by 50 percent from 2009 to 2015. According to a 2017 survey, only 35 percent of respondents have “a lot” of trust in scientists; the number of people who trust scientists “not at all” increased by over 50 percent from a similar poll conducted in December 2013.

This crumbling of trust in science and academia forms part of a broader pattern, what Tom Nichols called the death of expertise in his 2017 book of the same name. Growing numbers of people claim their personal opinions hold equal weight to the opinions of experts.

Should We Actually Trust Scientific Experts?

While we can all agree that we do not want people to get sick, what is the underlying reason the opinions of experts — including scientists — deserve more trust than those of the average person when it comes to evaluating what is true?

The term “expert” refers to someone who has extensive familiarity with a specific area, as shown by commonly recognized credentials, such as a certification, an academic degree, publication of a book, years of experience in a field, or some other way that a reasonable person may recognize an “expert.” Experts are able to draw on their substantial body of knowledge and experience to provide an opinion, often expressed as “expert analysis.”

That doesn’t mean an expert opinion will always be right; it’s simply much more likely to be right than the opinion of a nonexpert. The underlying principle here is probabilistic thinking: our ability to predict current and future reality based on limited information. Thus, a scientist studying autism would be much more likely to accurately predict the consequences of vaccination than someone who has spent 10 hours Googling “vaccines and autism.”

This greater likelihood of experts being correct does not at all mean we should always defer to experts. First, research shows that experts do best in evaluating reality in environments that are relatively stable over time and thus predictable, and when the experts have a chance to learn about the predictable aspects of this environment. Second, other research suggests that ideological biases can have a strongly negative impact on the ability of experts to make accurate evaluations. Third, material motivations can sway experts to conduct an analysis favorable to their financial sponsor.

However, while individual scientists may make mistakes, it is incredibly rare for the scientific consensus as a whole to be wrong. Scientists get rewarded in money and reputation for finding fault with statements about reality made by other scientists. Thus, when the large majority of them agree on something — when there is a scientific consensus — it is a clear indicator that whatever they agree on accurately reflects reality.

The Internet Is for…Misinformation

The rise of the internet, and more recently of social media, is key to explaining the public’s declining confidence in expert opinion.

Before the internet, the information accessible to the general public on any given topic usually came from experts. For instance, scientific experts on autism were invited to discuss the topic in mainstream media, large publishers published their books, and those same experts wrote the encyclopedia articles on the subject.

The internet has enabled anyone to be a publisher of content, connecting people around the world with any and all sources of information. On the one hand, this freedom is empowering and liberating, with Wikipedia being a great example of a highly curated and accurate source on the vast majority of subjects. On the other hand, anyone can publish a blog post making false claims about links between vaccines and autism. If they are skilled at search engine optimization or have money to invest in advertising, they can get their message spread widely.

Unfortunately, research shows that people lack the skills to differentiate misinformation from accurate information. This skills gap has clear real-world effects: U.S. adults who saw fake news stories about the 2016 U.S. presidential election believed them 75 percent of the time. And the more often someone sees a piece of misinformation, the more likely they are to believe it.

Blogs publishing falsehoods are bad enough, but the rise of social media has made the situation even worse. Most people reshare news stories without reading the actual article, judging a story’s quality by its headline and image alone. No wonder research indicates that misinformation spreads as much as 10 times faster and farther on social media than accurate information. After all, the creator of a fake news item is free to devise the most appealing headline and image, while credible sources of information have to stick to factual headlines and images.

These problems result from the train wreck of human thought processes meeting the internet. We all suffer from a series of thinking errors, such as confirmation bias, our tendency to look for and interpret information in ways that conform to our beliefs.

Before the internet, we got our information from sources like mainstream media and encyclopedias, which curated the information for us to ensure it came from experts, minimizing the problem of confirmation bias. Today, the lack of curation means thinking errors are causing us to choose information that fits our intuitions and preferences, as opposed to the facts. Moreover, some unscrupulous foreign actors — such as the Russian government — and domestic politicians use misinformation as a tool to influence public discourse and public policy.

The large gaps between what scientists and the public believe about issues such as climate change, evolution, GMOs, and vaccination exemplify the problems caused by misinformation and lack of trust in science. Such mistrust results in great harm to our society, from outbreaks of preventable diseases to highly damaging public policies.

What Can We Do?

Fortunately, there are proactive steps we can take to address the crisis of trust in science and academia.

For example, we can uplift the role of science in our society. The March for Science movement is a great example of this effort. First held on Earth Day in 2017 and repeated in 2018, this effort involves people rallying in the streets to celebrate science and push for evidence-based policies. Another example is the Scholars Strategy Network, an effort to support scholars in popularizing their research for a broad audience and connecting scholars to policymakers.

We can also fight the scourge of misinformation. Many world governments are taking steps to combat falsehoods. While the U.S. federal government has dropped the ball on this problem, a number of states have passed bipartisan efforts promoting media literacy. Likewise, many nongovernmental groups are pursuing a variety of efforts to fight misinformation.

The Pro-Truth Pledge combines the struggle against misinformation with science advocacy. Founded by a group of behavioral science experts (including myself) and concerned citizens, the pledge calls on public figures, organizations, and private citizens to commit to 12 behaviors, listed on the pledge website, that behavioral science research shows correlate with truthfulness. Signers are held accountable through a crowdsourced reporting and evaluation mechanism and gain reputational rewards for their commitment. The scientific consensus serves as a key measure of credibility, and the pledge encourages pledge-takers to recognize expert opinions as more likely to be true when the facts are disputed. More than 500 politicians have taken the pledge, including state legislators Eric Nelson (PA) and Ogden Driskill (WY) and Congress members Beto O’Rourke (TX) and Marcia Fudge (OH).

Two research studies at Ohio State University demonstrated, with strong statistical significance, that taking the pledge changes the behavior of pledge-takers to be more truthful. Thus, taking the pledge yourself, and encouraging the people you know and your elected representatives to do the same, is an easy way to both fight misinformation and promote science.

Conclusion

I have a dream that, one day, children will not get sick with measles because their parents put their trust in a random blogger instead of extensive scientific studies. I have a dream that schools will teach media literacy, and people will know how to evaluate the firehose of information coming their way. I have a dream that we will all know that we suffer from thinking errors and will watch out for confirmation bias and other problems. I have a dream that the quickly growing distrust of experts and science will seem like a bad dream. I have a dream that our grandchildren will find it hard to believe our present reality when we tell them stories about the bad old days.

To live these dreams requires all of us who care about truth and science to act now, before we fall further down the slippery slope.

Our information ecosystem and credibility mechanisms are broken. Only a third of Americans trust scientists, and most people can’t tell the difference between truth and falsehood online. The lack of trust in science — and the excessive trust in persuasive purveyors of misinformation — is perhaps the biggest threat to our society right now. If we don’t turn back from the brink, our future will not be a dream: It will be a nightmare.