The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

Countering science denialism. Arguing with anti-vaxxers, flat-earthers, and climate change deniers may feel futile, but research just published in Nature Human Behaviour suggests it can actually be effective.

Philipp Schmid and Cornelia Betsch of the University of Erfurt in Germany conducted six online experiments with 1,773 subjects to see how to “counter arguments of denial at the very moment that they reach an audience, that is, rebutting deniers in public discussions.” These discussions may take place on social media or TV. “Science advocates” have often been reluctant to enter into these discussions at all, worrying they’ll do more harm than good. Here’s what Schmid and Betsch tested, per a write-up of their research by Sander van der Linden, also published in Nature:

All participants were first exposed to an interview with a science denier. Afterwards, they were randomly assigned to a topic-based rebuttal, a technique-based rebuttal or both. A topic-based rebuttal presented the reader with scientific facts countering the misinformation, and the technique-based rebuttal exposed the logical fallacies in the deniers’ persuasion technique. Thus, depending on the exact condition, a science advocate was either present or absent and (if present) responded to the denier using a topic-based rebuttal, a technique-based rebuttal or a combination of both approaches. Attitudes towards vaccines and intentions to vaccinate were measured both before and after exposure. Attitudes and intentions to mitigate climate change were measured in the same way.
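The design just quoted can be sketched as a toy assignment loop. The condition labels are paraphrased from the write-up; the attitude measure is a placeholder stand-in, not the authors’ actual survey instrument:

```python
import random

# Toy sketch of the study design described above: participants see a science
# denier, are randomly assigned to a rebuttal condition, and have their
# attitudes measured before and after exposure. Condition labels are
# paraphrased; measure_attitude() is a placeholder, not the real instrument.
CONDITIONS = [
    "no advocate",
    "topic-based rebuttal",
    "technique-based rebuttal",
    "topic + technique rebuttal",
]

def measure_attitude(rng):
    # Placeholder: the real study used validated survey scales.
    return rng.gauss(0, 1)

def run_participant(rng):
    attitude_before = measure_attitude(rng)
    condition = rng.choice(CONDITIONS)  # random assignment
    # ...participant reads the denier interview plus any assigned rebuttal...
    attitude_after = measure_attitude(rng)
    return condition, attitude_after - attitude_before

rng = random.Random(42)
condition, shift = run_participant(rng)
print(condition, round(shift, 2))
```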

Schmid and Betsch found that ignoring the deniers is dangerous: “Public discussions with a science denier have a damaging effect on the audience,” they write. After reading or listening to anti-vaxxers, for instance, subjects had less positive attitudes toward vaccination and said they were less likely to vaccinate; this was exacerbated “when no advocate for science was present.”

But, van der Linden writes, “providing either a topic-oriented rebuttal (for example, approved vaccines are a safe way to avoid disease) or unmasking the denier technique (for example, “impossible expectations,” i.e. no medical product is 100% safe) significantly and meaningfully reduces the negative influence of denialist claims. Interestingly, both techniques proved to be about equally effective.”

Offering rebuttals was especially effective among “vulnerable subgroups,” such as people in the U.S. who identify as conservative. “Technique rebuttal reduces the influence of the denier for liberal and conservative participants, but the effect was especially strong for conservative participants,” Schmid and Betsch write.

The authors didn’t find evidence of a backfire effect — the idea that simply hearing pro-vaccination or pro–climate-change-is-real arguments causes people who don’t believe in those things to double down on their previously held beliefs.

“Audiences that were most vulnerable to messages of denial (individuals with low vaccine confidence and U.S. conservatives) benefitted the most from topic and technique rebuttal,” Schmid and Betsch write. “Thus, an advocate from science does not need to back off from audiences that are assumed to be difficult to convince.”

Of course, showing up to the debate isn’t a magical cure-all. “Facing deniers in public debates can only be one building block in the concerted effort to fight misinformation,” the authors write. But showing up — wading into the comments of that high school classmate’s Facebook post even if you think it’s pointless — seems to actually do some good, and “not turning up to the discussion at all seems to result in the worst effect.” (One exception is actual in-person debates, rather than the Twitter-fight variety: “If the advocate’s refusal to take part in a debate about scientific facts leads to its cancellation, this outcome should be preferred so as to avoid a negative impact on the audience,” since hearing anti-vaccine and anti-climate-change arguments affects audiences even when a science advocate is also present.)

“Pre-bunking” works. Researchers created the Bad News Game, a browser-based game in which players pretend to be a fake news creator:

Players gain followers and credibility by going through a number of scenarios, each focusing on one of six strategies commonly used in the spread of misinformation [impersonating people online, using emotional language, polarization, conspiracy theories, discrediting opponents, and trolling people online]. At the end of each scenario, players earn a specific fake news badge…Players are rewarded for making use of the strategies that they learn in the game, and are punished (in terms of losing credibility or followers) for choosing options in line with ethical journalistic behavior. They gradually go from being an anonymous social media presence to running a (fictional) fake news empire.
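The reward structure in that description can be sketched as a toy score tracker. The strategy names come from the passage above; the point values and class design are invented for illustration and are not the actual game’s scoring:

```python
# Toy sketch of the Bad News Game's reward structure (illustrative only:
# strategy names come from the description above; point values are invented).
MISINFO_STRATEGIES = {
    "impersonation", "emotional language", "polarization",
    "conspiracy theories", "discrediting opponents", "trolling",
}

class Player:
    def __init__(self):
        self.followers = 0
        self.credibility = 50
        self.badges = []

    def choose(self, option):
        """Reward misinformation tactics; punish ethical-journalism choices."""
        if option in MISINFO_STRATEGIES:
            self.followers += 100
            self.credibility += 5
            self.badges.append(option)  # earn the fake news badge
        else:  # an option in line with ethical journalistic behavior
            self.followers -= 50
            self.credibility -= 10

p = Player()
p.choose("polarization")            # rewarded
p.choose("fact-check your sources") # punished
print(p.followers, p.credibility, p.badges)
```

The inverted incentive (ethics lose you points) is the point of the exercise: it forces players to reason from the manipulator’s perspective, which is the “weak dose” the inoculation metaphor below refers to.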

“We wanted to see if we could pre-emptively debunk, or ‘pre-bunk,’ fake news by exposing people to a weak dose of the methods used to create and spread disinformation, so they have a better understanding of how they might be deceived,” said Sander van der Linden, director of the Social Decision-Making Lab at the University of Cambridge (and author of the summary of the Nature article above). The study was published in Palgrave Communications.

After playing the game, players were asked to take a survey that measured their ability to “recognize misinformation strategies in the form of misleading tweets and news headlines.”

The game was open to anyone; the researchers ultimately got about 14,200 participants in a sample that was “skewed toward males (75%), higher educated (47%), younger (18–29, 47%), and somewhat-to-very-liberal (59%) individuals. Nonetheless, the sample size still allowed us to collect relatively large absolute numbers of respondents in each category.”

Ultimately, the researchers found “preliminary evidence that the process of active inoculation through playing the Bad News game significantly reduced the perceived reliability of tweets that embedded several common online misinformation strategies…active inoculation does not merely make participants more skeptical, but instead trains people to be more attuned to specific deception strategies.” Moreover, there were not significant differences “in inoculation-effects across genders, education levels, age groups, or political ideologies.”

The University of Cambridge notes that “the team have translated the game into nine different languages, including German, Serbian, Polish and Greek. WhatsApp have commissioned the researchers to create a new game for the messaging platform. The team have also created a ‘junior version’ for children aged 8-10, available in ten different languages so far.”

You can play the game here.

