Critically, this survey drew its respondents from the same general pool, which had all answered demographic questions beforehand. The only variation was the mode of interview.

The findings, released today: The Trump mode effect is definitely real. Just over 38 percent of people who answered via a web form said they supported Trump, compared with 32 percent of their peers who spoke to a call-center employee, a 6-percentage-point gap. Among college-educated respondents, that gap widened to 9 percentage points.

A similar split held true for registered voters who participated in previous elections, indicating that politically engaged people may also be more reluctant to tell a pollster their true opinion of Trump. One alternate explanation for the gap in levels of support for Trump registered by different polls has been their varying definitions of likely voters; live-interview polls also tend to use more restrictive definitions, making the effects hard to tease apart. These results, though, imply that mode effects play a larger role than likely-voter screens in the discrepancies.

“It suggests that the online polls that report higher levels for Trump have something going for them,” said Kyle Dropp, Morning Consult’s co-founder and author of the study. “It’s not clear to us whether the social desirability effects are going to grow or wane over the course of the campaign—it’s hard to say whether it’s encouraging or discouraging for Trump.”

All this would seem to reinforce the idea that the gap is driven by social expectations. And yet the study found no noticeable differences based on education with automated interactive-voice-response (IVR) systems, and it found that those with no more than a high-school education were actually more likely to tell live interviewers they supported Trump than to register those views with automated systems. So it’s difficult to stitch a consistent narrative out of these findings.

Morning Consult acknowledges its analysis has flaws. Researchers recruited poll respondents online, potentially skewing the sample younger, and they compensated them. And some participants may have been turned off by the second step of calling an interviewer or filling out another survey, dropping out of the process.

But the mode effect has been studied before. Earlier this year, Pew Research Center found that people polled over the phone by live interviewers were far more likely to say they were satisfied with their family life, or to agree that gays and lesbians face discrimination. Conversely, Internet respondents were more likely to voice an unfavorable opinion of Michelle Obama or Sarah Palin.

From the report:

The social interaction inherent in a telephone or in-person interview may also exert subtle pressures on respondents that affect how they answer questions. Respondents may feel a need to present themselves in a more positive light to an interviewer, leading to an overstatement of socially desirable behaviors and attitudes and an understatement of opinions and behaviors they fear would elicit disapproval from another person.

Pew found the largest mode differences on questions where “social desirability” was most likely to play a role, such as opinions on public figures or personal issues. (They found almost no difference with fact-based questions, such as whether the respondent has a driver’s license or a passport.) This supports Dropp’s embarrassment theory—no one wants to tell a stranger that they’re miserable, or that they support a man many consider a buffoon.

What does this mean for Trump? It’s hard to say. His success online could indicate deeper support for his candidacy than polls currently suggest, as Henry Olsen recently wrote. But consider the first real test for Trump: the Iowa caucuses, where residents meet at public buildings and plead their candidates’ cases before casting a secret ballot. If some Iowans are already too embarrassed to share their true thoughts with a call-center operator, imagine how they’ll feel telling their neighbors.