During the 2016 election cycle, Russian trolls used fake personas to share politically polarizing, pro- and anti-vaccination messages, researchers report.

Researchers identified nine types of the trolls’ personas, from fake Black Lives Matter activists to fake boosters of Donald Trump, and examined the extent to which those persona types discussed and played into ideas about vaccination, and how they did so.


The study encompassed more than 2.8 million tweets published by 2,689 accounts operated by the Russian Internet Research Agency (IRA) from 2015 to 2017.

“There is a real danger of health topics being politicized and used as propaganda tools. If that happens for topics such as coronavirus, people would be inclined to evaluate the importance and veracity of health messages—from either health experts, politicians, or trusted media outlets—based on how it reflects their political leanings,” says coauthor Yotam Ophir, an assistant professor of communication at the University at Buffalo.

“If people perceive health topics as being aligned with a political agenda, whether it’s left or right, then they will consequently begin to lose trust in health organizations and question their objectivity.”

“We demonstrate how IRA accounts discussed vaccines not only to sow discord among people of the United States but also to flesh out the personalities of their ‘American’ accounts in a credible way,” write Ophir and coauthors.

Why it matters now

Although these polarizing vaccination tweets made up a small portion of the messaging from the Russian trolls over the three years, these accounts used pro- and anti-vaccination tweets to help establish realistic-seeming partisan identities. With these tweets, the trolls could potentially affect attitudes, promote vaccination hesitancy, and magnify health disparities, the researchers say.

“Russian trolls worked to polarize Americans on a health topic that is not supposed to be political,” says Ophir. “As our nation deals with the coronavirus pandemic, that type of politicization poisons the well of crisis communications for COVID-19 in ways that create tensions, mistrust and, potentially, a lack of intention to comply with government orders and health directives.”

“I don’t believe the Russians wanted to sow discord around vaccines specifically, but rather chose to harness social tensions around vaccines in order to make the Republican characters they created appear more Republican and the Democratic characters they created to appear more Democratic,” Ophir says. “This intensifies a recently emerging divide where one previously did not exist.”

The Russians’ intentions in this particular case, however, don’t matter when considering the implications for public health, according to Ophir. What is pertinent is that the IRA used a public health topic to serve its own strategic and political needs, targeting Republicans and Democrats with different messages. If that approach proves effective, the Russians could ramp up their misinformation campaign, turning what may have been an unplanned outcome into a more persistent and focused effort.

“In recent years, we see the change already with Republicans starting to lose trust in vaccines while Democrats seem unmoved,” Ophir says. “Again, I don’t think the Russians care about vaccines, but along the way they created and intensified this emerging divide.

“Now they can target each party with different messages, spreading misinformation unequally, targeting susceptible groups with lower trust in government and science.”

Vaccine disinformation and American personas

The researchers expanded on past studies of Russian attempts to sow discord during the election and IRA tweets on vaccination. The current work covered the full set of IRA Twitter activity over the three years, ultimately analyzing 2.8 million tweets. Among those, the researchers identified 1,968 polarized vaccination tweets.

“We first used unsupervised machine learning to map the various topics IRA accounts were talking about,” says lead author Dror Walter, an assistant professor of communication at Georgia State University.

“We used network analysis to group together accounts that tended to discuss the same topics and used the same language. With this method we were able to identify nine different groups of users, which we call ‘thematic personas.’ We then analyzed computationally and manually how each group discussed the issue of vaccines.”
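The pipeline Walter describes, grouping accounts by shared topics and shared language, can be illustrated with a deliberately simplified sketch: represent each account by its word usage, connect accounts whose language is similar, and treat connected groups as candidate “thematic personas.” The actual study used unsupervised topic modeling and network analysis at a far larger scale; the toy accounts, sample tweets, and similarity threshold below are illustrative assumptions, not the study’s data or methods.

```python
# Simplified stand-in for the study's pipeline: bag-of-words similarity
# between accounts, a similarity graph, and connected components as
# candidate "thematic personas." All accounts and text are invented.
from collections import Counter
from itertools import combinations
from math import sqrt

accounts = {
    "acct_news": "breaking news election results senate vote news update",
    "acct_wire": "breaking news poll numbers election night update",
    "acct_celeb": "celebrity gossip red carpet pop star concert tour",
    "acct_teen": "concert tour tickets red carpet celebrity style",
}

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

bags = {name: Counter(text.split()) for name, text in accounts.items()}

# Build an undirected similarity graph as an adjacency dict.
edges = {name: set() for name in accounts}
for a, b in combinations(accounts, 2):
    if cosine(bags[a], bags[b]) > 0.5:  # illustrative threshold
        edges[a].add(b)
        edges[b].add(a)

def components(adj):
    """Connected components of the similarity graph = candidate personas."""
    seen, groups = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, group = [start], set()
        while stack:
            node = stack.pop()
            if node in group:
                continue
            group.add(node)
            stack.extend(adj[node] - group)
        seen |= group
        groups.append(group)
    return groups

personas = components(edges)
print(sorted(sorted(g) for g in personas))
# → [['acct_celeb', 'acct_teen'], ['acct_news', 'acct_wire']]
```

On this toy data the two news-style accounts cluster together and the two celebrity-style accounts cluster together, mirroring (in miniature) how the researchers’ method surfaces groups of accounts that talk about the same things in the same language.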

Among those personas, for instance, were one “thematic community” focused on tweeting links to hard news updates and another focused on soft news; one clearly pro-Trump and one clearly anti-Trump; one that specialized in youth talk and celebrities; one that imitated African American users in topics (Black Lives Matter activism) and language; and one that focused on Ukraine. Others focused on international topics, retweets, and trendy “hashtag games.”

The researchers found striking differences in the ways different personas talked about vaccines, with the biggest differences falling across political lines. They say that the trolls attempted to cater to audiences of different political inclinations with targeted messages based on their perceived opinions about vaccines. Simply put, pro-Trump personas and African American personas were much more likely to express anti-vaccine sentiment than the anti-Trump, liberal personas.

Specifically, of the accounts using the pro-Trump persona, 17% mentioned vaccines at least once and more than half of those tweets were anti-vaccination. Among the accounts adopting the liberal, anti-Trump persona, only 2% mentioned vaccines; about half of those tweets were neutral on vaccination and over a third were pro-vaccine.

About 11% of the accounts imitating African American users mentioned vaccines. While the tweets mentioning vaccines made up a very small percentage of the total from these accounts, the sentiment among tweets that did discuss vaccines was roughly balanced, though slightly more negative than positive.

Fears for the future

“As COVID-19 spreads disease and death across the globe and scientists race to develop treatments and a vaccine against it, I expect the Russian discourse saboteurs to resurrect the behaviors we isolated in this study,” says coauthor Kathleen Hall Jamieson, the author of Cyberwar: How Russian Hackers and Trolls Helped Elect a President (Oxford University Press, 2018) and director of the University of Pennsylvania’s Annenberg Public Policy Center.

“I have reason to strongly believe, though we don’t have the data,” says Ophir, “that Russia and other countries who try to interfere in our political discourse will use coronavirus to spread misinformation and rumors to solidify the relationships they’re building with new troll accounts that replace the ones removed by Twitter.

“The virus is not political, but when any health topic becomes a political matter at the expense of fact, the result is to base conclusions and make decisions, such as whether to social distance or not, on party loyalty, not science.”

“Even if small in magnitude, the intentional Russian spread of anti-vaccine discourse targeted at specific subpopulations that are susceptible to it (i.e., pro-Trump users and African Americans on Twitter) could be the beginning of a new front in the ongoing informational cyberwar,” the researchers write.

The analysis appears in the American Journal of Public Health. The Annenberg Public Policy Center supported the research.

Source: Penn, University at Buffalo