During the campaign — and into his presidency — Donald Trump repeatedly exaggerated and distorted crime statistics. “Decades of progress made in bringing down crime are now being reversed,” he asserted in his dark speech at the Republican National Convention in July 2016. But the data here is unambiguous: FBI statistics show crime has been going down for decades.

CNN’s Jake Tapper confronted Trump’s then-campaign manager, Paul Manafort, right before the speech. “How can the Republicans make the argument that somehow it’s more dangerous today, when the facts don’t back that up?” Tapper asked.

“People don’t feel safe in their neighborhoods,” Manafort responded, and then dismissed the FBI as a credible source of data.

This type of exchange — where a journalist fact-checks a powerful figure — is an essential task of the news media. And for a long time, political scientists and psychologists have wondered: Do these fact checks matter in the minds of viewers, particularly those whose candidate is distorting the truth? Simple question. Not-so-simple answer.

In the past, the research has found that not only do facts fail to sway minds, but they can sometimes produce what’s known as a “backfire effect,” leaving people even more stubborn and sure of their preexisting belief.

But there’s new evidence on this question that’s a bit more hopeful. It finds backfiring is rarer than originally thought — and that fact-checks can make an impression on even the most ardent of Trump supporters.

But there’s still a big problem: Trump supporters know their candidate lies, but that doesn’t change how they feel about him. Which prompts a scary thought: Is this just a Trump phenomenon? Or can any charismatic politician get away with being called out on lies?

Earlier studies found that not only do fact-checks not work, but they can actually backfire

In 2010, political scientists Brendan Nyhan and Jason Reifler published one of the most talked about (and most pessimistic) findings in all of political psychology.

The study, conducted in the fall of 2005, split 130 participants into groups who read different versions of a news article about President George W. Bush defending his rationale for engaging in the Iraq War. One version merely summarized Bush’s rationale — “There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks.” Another version of the article offered a correction that, no, there was not any evidence Saddam Hussein was stockpiling weapons of mass destruction.

The results were stunning: Staunch conservatives who saw the correction became more likely to believe Hussein had weapons of mass destruction. (In another experiment, the study found a backfire on a question about tax cuts. On other questions, like on stem cell research, there was no backfire.)

“Backfire is a pretty radical claim if you think about it,” Ethan Porter, a political scientist at George Washington University, says. Not only do attempts to correct information not sink in, but they can actually make conflicts even more intractable. It means earnest attempts to educate the public may actually make things worse. So in 2015, Porter and a colleague, Thomas Wood at the Ohio State University, set out to try to replicate the effect for a paper (currently under peer review at the journal Political Behavior).

And among 8,100 participants — and on the sort of political questions that tend to bring out hardline opinions — Porter and Wood hardly found any evidence of backfire. (The one exception, interestingly, was the question of weapons of mass destruction in Iraq. But even on that, the backfire effect went away when they tweaked the wording of the question.)

“There’s no evidence that backfire describes a common reflex of Americans” when it comes to facts, Porter assures me. (Nyhan, for his part, never asserted that backfire was ubiquitous, just that it was a possible and particularly consequential result of fact-checking.)

Stories of failed replications in social psychology often grow ugly, with accusations of bullying and scientific misconduct flying in both directions. But in this story, researchers decided to team up to test the idea again.

“If you believe in social science, this is an ideal way to resolve a dispute”

The fact that Nyhan and Reifler’s breakthrough study didn’t replicate isn’t a shocker. This happens all the time in science. One group of researchers publishes a breakthrough finding. Another lab tries to replicate it, and fails.

But instead of feuding, Nyhan, Reifler, Porter, and Wood came together to conduct a new study.

“If you believe in social science, this is an ideal way to resolve a dispute,” Porter says. “If we can devise an experiment together, then the results are going to have something meaningful to say about our differing understandings of the world.”

So the four researchers collaborated on two experiments with a wide range of people as subjects, including Trump and Hillary Clinton supporters.

The first experiment drew on Trump’s exaggerations of crime statistics.

In the experiment, participants read one of five news articles. One was a control article about bird watching. Another just contained a summary of Trump’s message without a correction. The third was an article that included a correction. The fourth included a correction, but then also a line of pushback from onetime Trump campaign manager Paul Manafort, who said the FBI’s statistics were not to be trusted. The fifth included a line where Manafort really laid into the FBI, saying, “The FBI is certainly suspect these days after what they just did with Hillary Clinton.”

The thinking here: If anyone should be able to incite a backfire effect among Trump supporters, it’s Trump’s campaign manager. Manafort gives Trump supporters cover. They can reject the correction and cite one of the most influential figures in the campaign. And if there’s a time backfire ought to occur, it’s during a presidential campaign, when our political identities are fully activated.

But it didn’t happen. On average, all the study’s participants were more likely to accept the correction when they read it. Trump supporters were more hesitant to accept it than Clinton supporters. But that’s not backfire; that’s reluctance. Manafort’s assertion that the FBI statistics were not to be trusted didn’t make much of a difference either.

“Everyone’s beliefs about changing crime over the last 10 years became more accurate” in the face of a correction, Nyhan says.

The research group then conducted a second experiment during the presidential debates. This one was conducted in near-real time: On the night of the first presidential debate, the group ran an online study with 1,500-plus participants.

The study focused on one Trump claim in particular. Trump said “thousands of jobs [are] leaving Michigan, Ohio ... they’re just gone.”

This, again, isn’t true. The Bureau of Labor Statistics found that both states created 70,000 new jobs in the previous year. Half of the participants saw the correction; the other half did not.

Again, the researchers found no evidence of backfire. It’s worth underscoring: This was on the night of the first presidential debate. It’s the Super Bowl of presidential politics. If corrections aren’t going to backfire during a debate, when will they?

Facts sink in. But they don’t matter. Let that sink in.

In both experiments, the researchers found no instances of backfire. Instead, they found that corrections did what they were intended to do: nudge people toward the truth. Trump supporters were more resistant to the nudge, but they were nudged all the same.

But here’s the kicker: The corrections didn’t change their feelings about Trump (when participants in the corrections conditions were compared with controls).

“People were willing to say Trump was wrong, but it didn’t have much of an effect on what they felt about him,” Nyhan says.

So facts make an impression. They just don’t matter for our decision-making, a conclusion that psychological science reaches again and again.

(And if you’re thinking, “How could one short experimental manipulation really change how much participants like Trump?” know that other research shows it’s possible. Notably, studies conducted during the election found that just reminding white voters they may be a racial minority one day increased support for Trump.)

“The big question is: To what extent do those results generalize beyond Trump himself?” says Nyhan. “Many of his supporters may have had to come to terms with his record of misstatements by the time this study was conducted.” (The researchers did not test any fact-checks of Hillary Clinton talking points.)

Nyhan doesn’t place blame on Trump supporters themselves; it’s just human nature to stand by our political party’s candidates. But he says there’s something wrong with the institutions, norms, and party leaders that enable the rise of candidates who constantly lie.

In a heartening way, the study does show it’s possible to change a person’s mind

At least it’s nice to know that facts do make an impression, right? On the other hand, we tend to avoid confronting facts that run counter to our political allegiances. Getting partisans to confront facts might be easy in the context of an online experiment. It’s much harder to do in the real world.

These results have not yet been peer-reviewed or published in an academic journal — so treat them as preliminary. But I did run them by several political science and psychology researchers for a sniff test.

“These two experiments are well done, and the data analysis appears to be straightforward and correct: we observe clear movement on subjects’ beliefs as a result of factual corrections,” Alex Coppock, who researches political decision-making at Yale, writes in an email. “This piece is nice because it adds to the (small but growing) consensus that backfire effects, if they exist at all, are rare.”

Others commended the researchers for collaborating in the face of conflicting results. “I think this is exactly how the scientific process should operate as we try to explain human behavior,” Asheley Landrum, who researches politically motivated reasoning at Texas Tech, writes. “Social scientists, arguably, should be even more aware of motivated reasoning, recognizing that it also occurs in scientists.”

Nyhan’s research is about seeing if attitude change is possible. And this research often comes to frustrating ends. In one study, he and Reifler tested out four different interventions to try to nudge vaccine skeptics away from their beliefs. None made a difference. Though attitude change is elusive, he at least found a little of it in himself.

“Jason [Reifler] and I have definitely updated our beliefs about the prevalence of the backfire effect,” Nyhan says. He won’t say it’s been debunked. But he’s moving in that direction.