If you look at polling data, there are a few issues on which Republican voters seem to have changed their beliefs since Donald Trump began his campaign for the presidency.

In 2015, just 12 percent of Republicans held a favorable view of Russian President Vladimir Putin, according to Gallup. Now 32 percent of Republicans like him, the firm found in a February poll.

Or take the issue of free trade: Historically, conservatives have been in favor of it. But Republican support for free trade dropped from 56 percent in 2015 to just 36 percent in 2017, according to Pew.

It’s easy to look at these changing poll numbers and see something blatantly hypocritical — that these Americans are knowingly giving in to Trump rhetoric praising Putin and belittling free trade, betraying their former ideals.

But new research from psychology suggests something else is probably going on: Many political beliefs are fickle, and people probably don’t realize it when they change their minds.

Michael Wolfe, a memory and learning researcher at Grand Valley State University in Michigan, recently published an experiment in the Quarterly Journal of Experimental Psychology that found that when people change their minds on a subject, they have a hard time recalling that they ever felt another way.

It’s an intriguing finding in part because it affirms that people think their beliefs are more stable than they actually are, which means they may be less open to information that conflicts with those beliefs.

It’s also further evidence that despite what we may think, we don’t hold consistent ideological views. We tend to agree with whatever our leaders agree with, which is particularly worrying with Donald Trump as president, as I described in a recent story about how conservatives are realigning their views with his.

Why we don’t remember when we change our minds

Wolfe ran the study on a sample of a few hundred college students, using as a topic the effectiveness of spanking as a disciplinary measure. This subject was chosen for a few reasons: It’s one that many people have an opinion on, but it’s not so partisan or political that people would be totally unwilling to change their beliefs. It’s also a topic on which it’s relatively easy to find evidence both for and against.

First, Wolfe and his co-author asked the participants to rate, on a scale of 1 to 9, whether they believed spanking is effective. A few months later, they brought the participants into the lab to read arguments for or against spanking. After the reading, the students were asked to again rate their feelings about spanking. But here’s the key: They were also asked to recall what they first thought about spanking, several months back.

On average, the students changed their minds when they read an argument that ran counter to their initial belief. But most didn’t remember that they had. It was just easier to remember the text they’d just read than to think back on their past opinions.

“We don’t go in and grab a memory like opening up a word file or reading it off a tape,” Wolfe explains. “But rather, if you ask a person at a particular time to report their belief, they construct their belief at that moment based on a combination of things that are easily available to them at that time.”

Memories aren’t retrieved; they’re constructed with cognitive shortcuts. And when memories are constructed, we can’t easily see the seams. We don’t notice that they’ve changed.

“When people try to remember a previous belief, information that’s available at a moment biases their ability to remember this old information,” Wolfe says. “They end up thinking their current belief is very similar to their previous belief.”

Here’s the key chart from the study. On the right, it shows that when participants read a text that counters what they initially believed, their ratings shift by around 2 points on the 9-point scale toward the text’s position. Then look at the “recollection” scores. That’s their guess for what their original answer was. It’s much more similar to their post-reading answer than it is to the initial response.

Other studies show we’re biased to believe that our past selves are more similar to our current selves than they actually are. People in romantic relationships that have gone sour tend to forget that they were ever happy with their partners. Our memories of the past are constructed with information available in our immediate present, and we often confuse immediacy and familiarity with truth. Merely repeating a lie can make it more readily accepted as true.

There’s also this fun study, published in PLOS One in 2013. In it, researchers gave participants an opinion poll to fill out, and then sneakily changed their answers. When they gave back the polls, the participants didn’t realize their answers had been changed. “A full 92 percent of the participants accepted and endorsed our altered political survey score,” the researchers concluded.

This has huge implications for how we interpret public opinion polls

There are some caveats to Wolfe’s experiment. First is that it was conducted on college students, who aren’t necessarily representative of the rest of the population. Second, this topic — meta-awareness of belief change — isn’t studied all that often, so some follow-up is needed.

And third is that college students may not care all that much about spanking, can be easily swayed, and may not be motivated to recall their earlier answers. We might be more aware of changing our minds on a topic more deeply connected to our identity — like access to abortion or belief in climate change.

But that third point isn’t all that much of a caveat, if you consider how we generally think about wonky issues in politics — which is not all that often or all that deeply. As political scientist Gabriel Lenz finds in a forthcoming paper in the Journal of Politics, only about 20 to 40 percent of the public holds stable views on policy. Many of us just take cues from our leaders and our parties in forming our opinions.

In this light, it’s not so surprising that many more Republicans are now against free trade.

“A lot of people don’t know how the parties describe themselves,” Lenz says. “And when they learn my party’s conservative [or is against free trade, etc.], they start saying they’re conservative too. ... It’s very hard to find instances where prior policy views seem to drive later voting or decisions.”

This revelation, that our memories are colored by how we feel in the present, casts public opinion polls in a new light.

Recently, I reported on a public opinion poll that found 59 percent of respondents said we’re currently living through the lowest point in US history that they could remember. In the article, I didn’t consider the possibility that the participants were misremembering — that their immediate displeasure with the state of the country masked any memory of worse times.

The lesson here: When people answer public opinion polls, they rely on mental shortcuts. They may replace a hard question (“How do I feel about international trade taxes?”) with an easy question (“What team am I on, and how would they answer this question?”).

“These shortcuts can be political ideology; it could be religiosity, deference to scientific authority,” says Dominique Brossard, who studies public opinion at the University of Wisconsin. “People don’t see themselves as being irrational doing this.”

So that big public opinion shift among Republicans I mentioned at the top of this piece? It’s both meaningful and not. It’s not meaningful because the participants in the survey, to some extent, are just mirroring Trump’s priorities as a leader. But it is meaningful because politicians like Trump can use the public opinion poll as proof that there’s a groundswell of support for these issues. Likewise, it’s provocative to say that a majority of Americans believe this is the lowest point in recent US history. But without a record of how these people actually felt decades ago, we can’t know.