If you go by public surveys, American racism has been in decline for decades. If you look at the news, that trend line is hard to believe. The Southern Poverty Law Center has described what it calls the “Trump Effect,” a measurable rise in bigotry, assault, and intimidation even just since this year’s election. News organizations, including Slate, have been tracking what appears to be a spike in hate crimes and incidents of harassment. Maybe that decline in racism in public surveys didn’t reflect a change in attitudes. Perhaps the ascendancy of Donald Trump shows that white Americans only learned, for a time, to keep that prejudice to themselves.

But another diagnosis of our current plight suggests that racism really had been in decline until recently. Some have framed this as the “P.C. backlash” theory—the idea that “politically correct bullshit,” in Bill Maher’s phrasing, inspired millions of Americans to vote for Trump. To put that in a less Breitbart-friendly way, the idea is that the desire to counter racism might itself end up fomenting prejudice. Based on what we know about the human mind and the psychology of bias, should this “backlash” explanation of the Trump Effect carry any weight?

The notion of an automatic and unconscious psychic rebound has long been baked into pop psychology under the rubric of “repression.” According to a line of research pioneered by social psychologist Daniel Wegner in the late 1980s, something similar can happen on a conscious level, too. The more you try to banish something from your mind, the more you dwell on its absence, which in turn can set the stage for its return. Just as aggressive dieting can lead to overeating, Wegner argued, too much effort spent suppressing a thought can make that thought more intense. When he asked subjects not to think about an old flame, for example, he found they ended up more lovelorn and preoccupied than they were before. As the first wave of anxiety over political correctness broke in the early 1990s, researchers began to wonder if Wegner’s theory might apply to the stifling of racial prejudice. In 1994, psychologists in Cardiff, Wales, produced evidence that the act of deliberately suppressing your stereotypes could end up making you more biased than you were before—a backlash-type phenomenon they called “stereotype rebound.”

The Welsh team’s findings, and Wegner’s, too, should be viewed with skepticism. As recent replication projects have demonstrated, research methodologies that seemed perfectly appropriate in 1994 would be scoffed at today. But even if support for Wegner’s “thought suppression” has been exaggerated, several other lines of evidence hint at a similar effect. By the 1990s, psychologists understood that freely stated racial prejudice against black Americans was on the wane. Yet it seemed obvious that racism hadn’t really gone away. Instead, it had carried on as a subtler sickness, one less openly expressed but still driving the behavior of institutions and individuals alike. So researchers tried to figure out a way to measure all the bias that was being bottled up and determine whether it might start to leak.

One approach tested people’s “implicit bias,” the kind you wouldn’t even know you had, and found that it’s widespread. Even those of us who believe we’re free of prejudice harbor these automatic attitudes, absorbed from the media, early-life experiences, and other sources. (During one of this year’s presidential debates, Hillary Clinton called implicit bias “a problem for everyone.”)

Another tried to identify and understand the people who might be conscious of their prejudice but chose to hide it in surveys. Most people, when asked, will deny having racist views. It’s less obvious what drives that denial—is it a set of deeply held beliefs about the world or just a desire to fit in? In 1998, Ashby Plant and Trish Devine of the University of Wisconsin put out a simple tool for sussing out this distinction. They asked college students to rate their agreement with two different sets of statements. The first tried to get at their internal motivations—for example, how much would the students agree that “Being nonprejudiced is important to my self-concept”? The second set referred to more outward-looking motivations, e.g., “I try to hide any negative thoughts about black people in order to avoid negative reactions from others.”

Plant and Devine found that people varied in their reasons for disclaiming racial prejudice: Some felt both internal and external pressure, others only one of the two. They followed up a few years later with a study of “the implications of complying with pro-black pressure” for people in each group. The results probably won’t be shocking: The people driven mainly by external pressure (and less so by internal values) showed signs of “backlash,” at least in relative terms. They reported feeling more annoyed than the other students by the “P.C. standards” on campus and more angry or threatened by pressure to promote diversity.

How would this play out over time? Another prejudice researcher, Chris Crandall, proposed that a person’s motivations shift as he becomes more invested in the norms of his community. A freshman might show up on campus and find his views on, say, the role of women to be at odds with those of his peers. At first, he’d be motivated to conceal those views for the sake of avoiding conflict. Then, as he came to identify with the culture of the university, he’d adopt its values to the core. In a sense, that’s the point of having anti-racist standards, either on a college campus or in the society at large: They’re supposed to work on people from the outside in, first pushing prejudice into hiding and then helping drive that bias away for good.

The story of Derek Black, the white nationalist who disavowed his deeply held beliefs after attending a liberal college, is an example of how this can happen in practice. But when norms are too restrictive, Crandall says, the process doesn’t always work as intended. Indeed, there may be a kind of sweet spot for minimizing both prejudice and the possibility of backlash. In the context of a university, administrators should be “somewhat more disapproving of prejudiced speech than their students,” he argues, but warns that, “as your norms get more and more stringent, more and more people will reject them. You crank it up more and more, and then you just lose people.”

One study from 2011 showed how certain types of campus pressure can end up making college kids express more prejudice than they would otherwise. At the University of Toronto, social psychologists Lisa Legault, Michael Inzlicht, and Jennifer Gutsell drafted two versions of a “prejudice-reduction” brochure and handed each to several dozen undergrads, none of whom were black. The first appealed to the students’ internal motivations: “You are free to choose to value nonprejudice,” it said. “In today’s increasingly diverse and multicultural society, such a personal choice is likely to help you feel connected to yourself and your social world.” The other brochure was more prescriptive, even threatening, in its tone: “We should all refrain from negative stereotyping,” it read. “It is, after all, the politically and socially correct thing to do, and it’s something that society demands of us.” When Legault, Inzlicht, and Gutsell measured the students’ prejudice against black people with a standard questionnaire, the ones who’d received the more prescriptive brochure displayed more prejudice, on average, than the students who got either no brochure or the one with a more forgiving tone. (Inzlicht, a vocal advocate for improving research methods in his field, notes that while the results from this paper seem robust, they haven’t yet been replicated by an independent lab and should thus be treated as provisional.)

All this research seems to validate the old, intuitive (and rather glib) idea that telling people what to do sometimes makes them do the opposite. But there could be something else at play. When we call out racial prejudice—or even claim, as Clinton did, that “implicit bias is a problem for everyone”—we may reinforce the idea that it’s normal to be prejudiced. And if there’s one thing psychology studies have demonstrated repeatedly, it’s that most people want to act normal.

Lots of research shows you can influence behavior by telling people how others in their community are likely to behave and nudging them to follow suit. Ten years ago, Arizona State University psychologists Robert Cialdini and Linda Demaine applied this notion to the Petrified Forest National Park, where visitors had been taking bits of wood in spite of clearly posted rules. The researchers wanted to know if they could reduce this misbehavior by tweaking the park’s anti-theft message. Over a span of five weekends, they placed more than 1,000 bits of wood along the visitor paths as bait and swapped out different signs to see what happened.

Cialdini, Demaine, and their colleagues found the lowest rates of theft—1.5 percent of all the baits—when their signs instructed people not to steal, and added the following factual statement: “The vast majority of past visitors have left the petrified wood in the park, preserving the natural state of the Petrified Forest.” (Though the forest was losing 14 tons of wood per year, only about 1 in 20 visitors were stealing from the park.) When the anti-theft signage focused on the miscreants instead, noting that “many past visitors have removed the petrified wood from the park, changing the state of the Petrified Forest,” rates of theft shot up to 8 percent. In other words, signs implying (and decrying the fact) that a lot of people steal seemed to make the stealing worse.

People may have a similar, unfortunate response to anti-prejudice campaigns. A pair of business school professors, Michelle Duguid and Melissa Thomas-Hunt, tested this idea in several different ways for a paper published in 2015. For one experiment, they showed students a photo of an older man and had them write a story about a typical day in his life, while warning them to avoid stereotypes. Half the students were informed that, according to a very influential body of psychological research, “the vast majority of people” are afflicted by “stereotypical preconceptions” and related bias. The other half was told that a very influential body of research finds the opposite, that “very few people” are biased.

When the researchers looked at the students’ stories—measuring the extent to which they relied on notions that old people are fragile, dependent, and so forth—they found a clear difference: People who’d been told that stereotypes are common were more likely to indulge in stereotypical descriptions. Duguid and Thomas-Hunt were able to replicate this effect in several different groups of 300 subjects each, including both working adults and undergraduates, and when asking their subjects to avoid stereotypes about women and people who are overweight. “Publicizing the notion that everyone stereotypes might create a descriptive social norm for stereotyping,” they concluded. “Ironically, the very approach purported to reduce stereotyping may backfire and actually increase its occurrence.”

Though Donald Trump may have gotten 2 million fewer votes than Hillary Clinton, his election to the presidency seemed to validate the values he has come to represent. It made them seem more normal. All throughout the campaign, Trump’s most intense supporters claimed their candidate only “said what everyone is thinking,” even as they acknowledged that those thoughts weren’t always pretty. Like the thieves in the Petrified Forest, these people gleaned that their own beliefs about immigrants and Muslims, for example, weren’t so aberrant—that they might just align with everybody else’s.

There are many good reasons for journalists to highlight postelection acts of hate and racial prejudice, starting with the role such publicity plays in supporting current and potential victims. But it’s possible that outrage over Trump’s ascendancy and the Trump Effect could intensify the backlash, by reinforcing the racists’ own, misleading premise—that their views have become the norm. When the New York Times gives A1 over to neo-Nazis shouting “Heil victory” in a Washington, D.C., conference room (and when the video of that rally goes viral), the actions of several hundred racists may get amplified to the point where, for those of a certain inclination, it comes to stand in for the beliefs of 320 million other people.