The researchers then asked participants, for each version of the post, to rate factors like "how much they agreed with the message," "how accurate they found it," "how much they liked the writer," and, significantly, how likely they were to share the post with others -- either to propagate it or to argue against it.

Their findings? "Frequent users are particularly disposed to be influenced by negative racial messages." The group of more-frequent Facebook users didn't differ from others in their reaction to the egalitarian message. But those users "were more positive toward the messages with racist content -- particularly the superiority message."

[Chart courtesy of Shannon Rauch]

So, oof. This is, to say the least, troubling. And yet it's also not fully surprising. The study, in fact, confirms the hypothesis that Rauch and Schanz started with: "We predict," they noted, "that due to potential chronic traits and/or their adaptation to a Facebook culture of shallow processing and agreement, frequent Facebook users are highly susceptible to persuasive messages compared to less frequent users."

Facebook, for all the unprecedented connection it fosters among previously atomized people, fosters a very particular kind of connection: one that is mediated, at all times, by Facebook. And one that therefore makes very particular kinds of assumptions about how and why people connect in the first place. Facebook "connection" is defined -- semantically, at least -- by friendship. ("Facebook friends," "friending people," etc.) While it doesn't assume that every connection is an actual friend, in the narrow and maybe even old-fashioned sense of the word, Facebook's infrastructure does assume esteem among people who friend each other. (Compare that to LinkedIn, or even Twitter, which tend to take a much more pragmatic view of human interaction.)

Facebook, as a result, is structured as an aggressively upbeat place. And one potential cause of that is an overall atmosphere of social complicity. You can argue on Facebook, but arguing is not really encouraged. And the interactions Facebook fosters as it expands -- the status updates, the information sharing, the news consumption -- stem from that default-positive place. "Like," but not "Dislike." "Recommend," but not "Reject."

This is, of course, mostly to the good. The Internet has enough vitriol as it is; who wants to be on a site where everyone's disliking things? The question, though, is whether complicity leads to complacency. Particularly when it's made structural, as it so often is within Facebook's environment. Does this resolutely uncritical atmosphere harm people's ability to think critically?

While "it's all correlational right now," Rauch told me, the results she and Schanz got in their research could be due to the "atmosphere of agreement that Facebook provides" -- which, in turn, could lead to a "tendency for shallow processing" when it comes to the information being consumed on Facebook. Again, though: correlational. Heavy users of Facebook tend to use the site because of a desire for social inclusion. In that context, the study suggests, those users are primed to agree with fellow users rather than to criticize the information those users share. And not just in terms of their public interactions, but in terms of their private beliefs. This potent combination -- "a need to connect and an ethos of shallow processing" -- provides a warm, moist breeding ground for the spread of opinions, publicly and not-so-publicly. Racist ones among them.