An online game that tasks players with crafting and distributing propaganda helps them to better detect disinformation after 15 minutes of play, a new study found.

Bad News, a browser-based game developed last year by researchers at the University of Cambridge, lets players take the role of the "bad guy" in creating and sharing misleading information online, from impersonating elected officials to peddling conspiracy theories.

For three months after the game launched, researchers gave players the option of also participating in a study, which would prompt them to rate how reliable they found a series of headlines — some were real news, and some disinformation.

After playing the game for 15 minutes, players ranked the disinformation 21 per cent less reliable, on average, than before they played the game, according to the study published last week in Palgrave Communications, a peer-reviewed academic journal. Their ranking of the real news didn't change.

"We've shown some moderate effects — I wouldn't characterize them as huge — but they're moderate and they were very robust and persistent and statistically significant," said Sander van der Linden, co-author of the study and director of the Cambridge Social Decision-Making Lab. "I was actually quite surprised by that and quite encouraged."

In the game, which was developed in partnership with Dutch media collective DROG and design agency Gusmanson, players build up a propaganda empire online. They use different tactics, such as stoking fears or playing on polarization, to build credibility and attract more followers, which earns them badges. The researchers chose these tactics based on real world examples of strategies used by disinformation networks.

To build credibility and followers, players can choose what kind of content to share, such as this conspiracy theory meme. (Screengrab/Bad News)

Creating content that preys on readers' emotions, for example, is one tactic that can earn a badge in the game. It's a method that Jestin Coler, who previously ran disinformation sites for profit, has said he used.

"Stories aim to create an emotional response to get readers to share content," Coler wrote last year. "That emotional response can be one of hope, inspiration, anger, fear, etc., but the end goal is the share. While reaching a single reader is nice, reaching that reader and their hundred(s) of contacts is far nicer."

By exposing players to these tactics in a game setting, the researchers aimed to inoculate them against disinformation in the real world, and the study results suggest the approach works.

There were some limitations to the study. For one, the participants were self-selecting: people who had an internet connection and happened to come across the game, either through the university press release or a news article.

Participants were also aware of the game's purpose, which likely made them more alert than usual.

"If people know that they're meant to be looking out for instances of deception, they're going to be paying much more attention in an environment like this game than they would ordinarily," said Jon Roozenbeek, co-author of the study and a Cambridge researcher.

Even so, Roozenbeek said, the measured difference shows the game still has an effect, one that could translate beyond the game and help people more easily detect disinformation in the real world.