Trump appears to put far less stock in public attribution. He has repeatedly questioned whether digital investigators, from intelligence agencies or private companies, can piece together enough evidence after a cyberattack to determine with any accuracy where it came from, despite the fact that experts regularly track down attackers by gathering digital evidence.

This attitude has trickled down to the general public. Over the weekend, two reporters for The New York Times asked Trump supporters in Louisiana and Indiana for their reactions to the intelligence community’s hacking report. Their responses ranged from skepticism (“It seems silly”) to total rejection (“I don’t believe it”).

This erosion of public confidence in analysts’ ability to identify hackers is dangerous. “Mistrust of attribution would make hacking easier, since it means retribution is harder: You need to have attribution for retribution, both to know that you are retaliating against the right actor and to convince the public you are justified in doing so if it is a public retaliation,” wrote Nicholas Weaver, a professor and security researcher at the University of California, Berkeley, in an email. “The former is unaffected, but the latter is compromised by needless mistrust.”

That mistrust spread quickly. Two years ago, the only people who concerned themselves with fact-checking cyberattack attributions were top security experts like Bruce Schneier, who wrote an article in The Atlantic arguing that the government didn’t have enough evidence to connect the Sony hack to the North Korean government. (He was convinced later that month, when the Times reported that U.S. intelligence agencies were also relying on secret evidence from the NSA and from sources inside North Korea to back up their claims.) Now, Trump’s public disavowals of hacking analyses have made it popular to question Russia’s involvement.

The growth of public mistrust in cyber-attribution mirrors the way the language of doubt has taken hold around climate science and the trustworthiness of mainstream news reports. Fewer than half of Americans believe that climate change is the result of human activity—the conclusion of the overwhelming majority of scientists—and just under a third say they have “a great deal” or “a fair amount” of trust in the news media. A third is also roughly the proportion of Americans who say they believe Russia influenced the 2016 election.

Last week, danah boyd, a scholar of online communications and the founder of Data & Society, wrote that a generation of media-literacy teachings encouraging Americans to question sources and do their own research may have backfired. “Doubt,” boyd says, “has become [a] tool.”

She argues for the necessity of relying on trusted sources of information:

I believe that information intermediaries are important, that honed expertise matters, and that no one can ever be fully informed. As a result, I have long believed that we have to outsource certain matters and to trust others to do right by us as individuals and society as a whole. This is what it means to live in a democracy, but, more importantly, it’s what it means to live in a society.

But people who don’t have the tools to separate bad information sources from good ones may choose unreliable sources, or might be inclined to doubt them all. And when people in power reinforce the notion that experts can’t be trusted—whether it’s climate scientists, journalists for major publications, or medical researchers with advanced degrees—confusion only spreads further. Healthy skepticism turns to toxic, blanket cynicism.