A CBS poll out today finds that 47 percent of Americans — and a full 74 percent of Republicans surveyed — believe it’s “likely” or “somewhat likely” that President Donald Trump’s offices were wiretapped during the 2016 presidential campaign.

There is of course no credible evidence that the Obama administration wiretapped Trump Tower during the campaign. As Vox has outlined, both “the heads of the FBI and NSA categorically denied President Donald Trump’s tweets claiming that President Barack Obama ordered the US intelligence community to wiretap Trump Tower.”

But these latest poll findings are a reminder of an uncomfortable truth about people: They generally don’t base their opinions on a careful analysis of evidence. And it’s not because they are “stupid” or willfully ignorant. It’s because our brains aren’t great at rationally analyzing facts, especially in fractured, ideologically polarized times.

This frustrating trend of people thinking in terms of what supports their party, rather than facts, keeps showing up. Trump voters were more likely to say Trump’s inauguration drew a larger crowd than Obama’s, despite photos that clearly showed otherwise.

It infects liberals too. When Gallup polled Americans the week before and the week after the presidential election, Democrats and Republicans flipped their perceptions of the economy. Nothing had actually changed about the economy. What changed was which team was winning.

One of the key reasons is an idea called “politically motivated reasoning” — it’s the idea that our brains have something of an immune system for uncomfortable thoughts. We use our intelligence to protect the groups we belong to first, and reason objectively second. (Read more about politically motivated reasoning here.)

But that’s not all.

It’s extremely easy to get people to accept a lie

Recently I had a conversation with Roddy Roediger, one of the nation’s foremost experts on learning and memory. In his experiments, he shows how even small suggestions from others can push us to remember whole scenes and experiences differently.

And overall, there are three key principles that make a piece of false information more believable.

1) Plausibility — or at least the perception of plausibility.

It’s plausible that Obama — or some arm of the federal government — wiretapped Trump. An ongoing story during the Obama presidency centered on the revelations of the Edward Snowden documents that showed the government has hugely powerful capabilities for mass surveillance.

Roediger has demonstrated the power of plausibility in a simple experiment. After a study participant is walked through a kitchen, a second participant (secretly working for Roediger) suggests they both recall seeing a toaster on the counter. Toasters plausibly exist in kitchens. But in this case, there was no toaster.

“We tested the real subjects later and we even told them, ‘Look, the person you were working with made a bunch of mistakes, so really just rely on your own personal memory for the scene’ — they still recalled the toaster,” Roediger explains.

So given that Trump’s assertion seems plausible, and Republicans are likely to consider him a credible source, the “wiretapping” claim sticks.

2) Suggestions and innuendo can be just as convincing as assertions.

Trump doesn’t need to directly endorse an idea or conspiracy to spread acceptance of a lie. He can lead people to a conclusion simply by suggesting it.

“It's like what lawyers try to do in court for their prosecutors or defendants,” Roediger says. “You tell a very powerful story that leads to a certain conclusion, although you never state [that conclusion].”

In the lab, Roediger will show people sentences like, “The karate champion hit the cinderblock,” or, “The baby stayed awake all night.”

“You test people the next day, and you say: ‘The karate champion broke the cinderblock,’ or, ‘The baby cried all night,’” he says. “They'll accept those sentences as being yes, that's what I heard you say yesterday.”

(In the case of wiretapping, Trump made the claim directly in a series of tweets, à la “Just found out that Obama had my ‘wires tapped’ in Trump Tower.” But some of his mistruths have been subtler. Consider when he alleged a connection between Ted Cruz’s father and the JFK assassination. He never directly said Rafael Cruz was in a conspiracy, but he questioned what the older Cruz “was ... doing with Lee Harvey Oswald shortly before the death.”)

3) The more we engage with the lie, the more we misremember.

There’s another reason so many may believe in the wiretapping claim: It’s been in the news a lot. And that amplifies its power.

“When you see a news report that repeats the misinformation and then tries to correct it — you might have people remembering the misinformation because it's really surprising and interesting, and not remembering the correction,” he says.

The act of retrieving a memory and talking about it with friends makes it all the more sticky. “If I remember an event poorly and you seem to remember it really well, well, I'll update my memory using what you're saying, and that's very adaptive,” Roediger says. “But if you happen to get it all wrong, I'll update my memory with the wrong stuff too.”

On the extreme end, this leads to odd scenarios where entire groups of people have a memory of an event that never happened (check out this fascinating story about a group of people who remember seeing a movie that never existed).

Altogether, it’s easier than ever before to create false memories shared by entire groups of people. Misinformation is everywhere — outright fake stories get shared by thousands — and online social networks help spread and reinforce it. It’s amazing we can still agree on anything at all.

Further reading: understanding human psychology in the age of Trump