J.M. Berger is a fellow with the Alliance for Securing Democracy.

Over the past year, Americans have been hearing a lot about Russian attempts to sway public opinion in the United States using manipulative tactics on social media. But in the abstract, it can be hard to understand exactly what that means.

To help illuminate the issue, the Alliance for Securing Democracy recently unveiled Hamilton 68, an interactive dashboard displaying the near-real-time output of Russian Influence Operations on Twitter—or RIOT, if you’re a fan of on-the-nose acronyms. The dashboard is the product of a research collaboration that includes me, Clint Watts, Aaron Weisburd, Jonathon Morgan and the German Marshall Fund.

The network promotes a selective worldview: Western societies in decline, suffused with crime, chaos and conspiracy, and a Russia (and a Russian president) exemplifying strength and integrity. It produces some original content and amplifies content produced by others, scouring the internet for messages that tear down confidence in democracies while absolving Russia and its allies of any hint of wrongdoing.

That said, RIOT is a complex affair. Influence operations are subtle by design, and those behind them don’t usually reveal themselves. The dashboard shows a mix of Twitter account types, including openly Russian-backed media such as Russia Today (RT) and Sputnik, as well as 600 Twitter accounts linked to a less obvious influence network, which exists to amplify opinions and information that favor the Kremlin’s preferred narratives.

The output of this network is not comprehensive by any means, but it is representative of the content shared by thousands more social media accounts on Twitter, Facebook, YouTube and virtually every other major social media platform. A detailed description of how this network was identified can be found here.

The network monitored by the dashboard amplifies three types of content: content openly created by Russia; content of unclear origin, which Russia may or may not have had a hand in creating or shaping, whether directly or indirectly (for example, by driving traffic to certain themes); and content created by third parties not formally connected to Russia but reflecting the messaging priorities of the influence network.

While there has been much speculation about Russia’s role in creating “fake news,” it is safest to assume, until proven otherwise, that the majority of content promoted by the network falls into the first or third category. If that assumption is correct, then most of the content amplified by the network is produced by third parties. The accounts in the dashboard were chosen because they reliably amplify the themes of the broad network linked to Russian influence campaigns. Some are almost certainly in the employ of the Kremlin, but others may be more loosely connected. What we can say with certainty is that the accounts we monitor were selected because they act in a synchronized manner to promote messages we could connect back to overt Russian media and themes. We excluded one subnetwork of accounts that synchronized with this network’s activity because technical clues suggested it may have a different origin, and we want to research it further.

So what are the major themes and stories promoted by RIOT?

Pro-Russia Content

Russia produces an extraordinary amount of directly attributable media, such as RT, Sputnik and affiliated entities. This material is heavily promoted by the network, but it represents only a portion of the total content.

The official outlets promote Russia in general, as one might expect, with recent headlines like “Futuristic Russian bridge recognized as international masterpiece” and “Putin’s Fishing Trip: How Russian Leader Chased Giant Pike.” They also seek to promote Russian foreign policy interests in fairly obvious ways, with stories like “New ‘Planned’ Anti-Russia Sanctions Aimed to Make US Global Energy Monopolist,” “VP Pence’s Balkan remarks show Washington sliding into ‘primitive Cold War cliches’” and “Moscow Considers EU Commissioner Attack on Russian Law on NGOs Unfounded.”

Below the surface of this output, the RIOT network selects stories to promote and amplify. Often these stories come from the attributed outlets: from Aug. 1 to Aug. 9, the RIOT accounts linked to RT.com more than 1,300 times and to SputnikNews.com nearly the same number of times. The true figures are almost certainly higher, because we could not efficiently resolve the many link shorteners (such as bit.ly and trib.al) used by the network.
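The kind of link tally described above can be sketched as a simple per-domain count over tweet text. This is a hypothetical illustration, not the dashboard’s actual pipeline: shortened links (bit.ly, trib.al) can only be bucketed together, not attributed to a destination, without following their redirects, which is why figures like the RT.com count are lower bounds.

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Domains whose links hide the true destination behind a redirect.
# (Illustrative list; the real network uses many more shorteners.)
SHORTENERS = {"bit.ly", "trib.al", "t.co"}
URL_RE = re.compile(r"https?://\S+")

def tally_link_domains(tweets):
    """Count outbound links per domain; shortened links are bucketed together."""
    counts = Counter()
    for text in tweets:
        for url in URL_RE.findall(text):
            domain = urlparse(url).netloc.lower()
            if domain.startswith("www."):
                domain = domain[4:]
            counts["(shortened)" if domain in SHORTENERS else domain] += 1
    return counts
```

Anything landing in the “(shortened)” bucket would need an extra redirect-following pass to attribute, which is the inefficiency noted above.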

The most-shared RT story over that nine-day period was a summary of a Facebook post by filmmaker Oliver Stone, condemning U.S. sanctions against Russia and claiming that U.S. intelligence agencies are engaged in a “false flag” war against Russia. False flags and conspiracy theories abound throughout the content of the network.

The top Sputnik story shared by the RIOT network was one of six stories attacking the Hamilton 68 dashboard itself. This particular story quoted pro-Russia Twitter users who took a dim view of the effort. Another Sputnik story quoted the infamous former Russian ambassador to the U.S., Sergey Kislyak. Other stories prominently featured in the network focused on Russian military cooperation with Iran and China.

While the Oliver Stone story was the second-most shared individual story, the majority of content in this category was amplified material created by third parties not necessarily linked to Russia, but whose content could be deployed to emphasize pro-Russian themes.

These included links from both the political right and the left, as well as more traditional news outlets. A recent story from Consortiumnews decried anti-Russia hysteria and promoted a documentary “debunking” Sergei Magnitsky, whose story has recently risen to prominence thanks to Donald Trump Jr.’s Trump Tower meeting with Russians during the 2016 campaign.

Not all of the stories linked by the network were recent. One prominently linked story was a 2016 Daily Mail article claiming the FBI tried to frame Julian Assange. Another was a 2015 article on the left-leaning Truthout accusing the U.S. of sparking the Syrian civil war, citing WikiLeaks as its source.

It is important to note here again that we are not asserting Russia is responsible for creating or shaping this content, except possibly by sending clicks to the sites in question. The Russian influence network amplifies content found online that suits its narratives, for obvious reasons. Amplifying content created by third parties is a powerful tool because it distances Russia from the content and conveys a breadth of support for the themes. It’s also just plain easier to jump on someone else’s content or trend than to create your own out of thin air.

One of the most prevalent themes pushed by RIOT is the promotion of conspiracy theories that muddy the waters regarding any wrongdoing by Russia or its allies, particularly the Syrian regime. This material is promoted heavily over social media, with occasional help from the attributed outlets. Examples over the past year include conspiracy theories seeking to discredit Bana al-Abed, a young girl in Syria who tweeted about the civil war with assistance from her mother; theories casting doubt on reports of chemical attacks by the Syrian regime; and the Seth Rich conspiracy theory, which conveniently exonerates Russian hackers of being the source of the WikiLeaks dump of Democratic National Committee emails during the 2016 election.

While there are examples of RIOT promoting voices from the far left, as noted above, this activity appears to be dwarfed by the scale at which it promotes the far right, both in the United States (which is the main focus for content documented in the dashboard) and in Europe.

Far-Right Content

For three consecutive days in August, the most-retweeted Russia Today items recorded by the dashboard were scaremongering videos appearing to show refugees swarming into Spain, along with a story alleging that the German government is suppressing news of refugee crimes.

Such click-bait content is only the tip of the iceberg of the RIOT network’s support for far-right movements around the globe, including within the United States. On their own sites and social media accounts, RT and Sputnik tread relatively carefully in their flirtation with the far right, and they devote a significant amount of space to the far left as well.

But support for the far right is a much bigger component of the influence network, and it is much closer to the surface. It is important to emphasize that our analysis of these networks was not based on the political views of their participants. When we analyzed activity related to Russian influence, we found a large number of very active accounts that present themselves as part of the "alt-right" but are closely linked to the RIOT network.

Some of these accounts are likely controlled by someone in Russia, but some users eagerly participate for their own reasons. The accounts monitored in the dashboard are there because they reliably amplify content related to Russia and its messaging themes. As noted above, we believe we have identified other influence operations promoting the "alt-right" on social media that likely have a different point of origin, but the subnetwork identified here is aligned with the RIOT network. We are continuing to research this issue in an effort to better quantify and further contextualize how these right-wing and pro-Russia networks intersect and overlap.

In the United States, this overlap is primarily focused on the "alt-right," the vaguely defined anti-immigrant, white supremacist movement whose center of gravity is firmly anchored on social media and the Web. While the "alt-right" has a very real base of support in the United States, it also enjoys deep and undisputed ties to Russia, many of which can be found offline in the real world. So it should not be surprising that Russian influence networks are part of the "alt-right" mix online.

In Germany, RIOT amplifies false reports about crime by refugees, through a variety of outlets, including anonymous Twitter accounts. In the U.S., the network amplifies many of the topics discussed by the "alt-right," but with a special focus on issues that are incidentally useful to Russian interests, such as the aforementioned Seth Rich conspiracy theories, which seek to divert blame away from Russia for the hack of the DNC’s emails. Stories related to Rich were tweeted more than 200 times from Aug. 1 to Aug. 9.

During the same period, the most-tweeted theme reflecting "alt-right" interests was a series of reports on Donald Trump’s national security adviser H.R. McMaster and his relationship with Susan Rice. We identified 38 distinct URLs from 21 different websites, tweeted at least 144 times, referencing Rice’s relationship with McMaster. Virtually all of the links we could identify as discussing McMaster pointed to hyperpartisan websites or so-called fake news websites. The RIOT network linked more than 560 times to content on Breitbart.com, including 32 distinct links related to McMaster, along with a host of other issues.

Hashtags such as #firemcmaster similarly reflected that theme. The hashtag likely originated on Reddit before being adopted by the "alt-right" on Twitter and subsequently amplified by both the RIOT network and a variety of other bots of unclear origin (not included in the dashboard).

Content related to President Trump featured heavily in the network, including a number of mainstream pieces that reflected well on his administration and also coincided with Russian foreign policy priorities (such as a Washington Post article on gains against ISIS, including cooperation with Russia, and a Weekly Standard article praising the administration’s decision to end CIA assistance to Syrian rebels).

While many of the links tweeted by the accounts we monitored were aimed at the less overtly racist portion of the "alt-right," the network tweeted 76 distinct URLs relating to conspiracy theories about the Jewish financier George Soros, a topic for which fresh content appears almost every day and in almost every corner of the worldwide Russian influence network. Most of the stories linked by this network left the issue of anti-Semitism implicit, although those undertones have been more openly discussed in recent months.

The overall hashtag counts reflect the mix of issues covered by the full network. While #MAGA was the top hashtag by a wide margin, the next several top performers showed a mix: #Syria was No. 2, followed by #Russia, #US, #Trump, #Ukraine, #Venezuela, #ISIS and #FakeNews. #FireMcMaster was 14th.

Where the "alt-right" influence can be more clearly seen in the dashboard is in trending hashtags and topics, which the "alt-right" subnetwork tends to employ aggressively. Here, content related to Rice, McMaster and Rich tends to show up frequently, as the network responds to the latest post on the topic. Other frequent hashtags and topics include the “Deep State” and attacks on mainstream independent media outlets, most notably a dramatic run of attacks on CNN in July, which took place in a variety of online locations favored by the "alt-right."

What Does It All Mean?

The activity of the accounts monitored by the dashboard, and of thousands more that are not included in this dataset, is prolific. But what does it accomplish?

The question is extremely important, as the United States and Europe grapple with the onslaught of disinformation and opinion manipulation. We’re working on research to put some specifics on this question, but some aspects are already clear.

One thing that we can see in the data is that running an influence operation is a costly activity, not necessarily in money, but certainly in terms of the resources needed just to keep the storylines alive and trending.

Our initial survey suggests that it takes a lot of RIOT—including professional trolls, bots and some number of participating true believers—to sway relatively few people. The “influence” in an influence campaign is more than just sending out a lot of tweets so people see content that reflects your priorities. It also involves a lot of less visible work, using many different low-profile accounts to send modest numbers of likes, retweets and engagement that incentivize and reward real people in your audience for posting their own contributions to the echo chamber.

Efficient or not, these efforts do convince at least some people. Many of us who encounter this content on social media can point to friends, family members or high school classmates who have bought into the same narratives that Russia seeks to amplify—that democracy is broken and diversity is dangerous. Some of those real people believe and repeat fake news. Some openly admire Vladimir Putin’s autocracy and Russia’s policies. And some vote based on their beliefs—whether about Rich’s death, or Clinton’s health or the existence of a pedophilia ring in a pizza shop.

Not everyone subscribes to pro-Russian views because of an influence campaign, online or off. Americans and people in general have incredibly diverse views. Some people sincerely hold beliefs that correspond to the views promoted by Russia and came to those views through a different route. And in America, they are free to do so. We cherish that freedom. The Hamilton 68 dashboard and other unaffiliated efforts to expose Russian influence campaigns are not trying to tell people what to think. Our efforts have not silenced a single voice, and they will not.

But people have a right to know when someone is trying to manipulate them. By talking about these campaigns, we can help people understand the origins of the information they consume, and the tactics Russia employs in its campaign to sway their opinions. Armed with that knowledge, some people will certainly continue to support Russia-friendly policies and attitudes. They are welcome to do so.

But other people may decide that they don’t like being manipulated—a very American attitude indeed. Such people may find it helpful to understand the forces trying to shape the information environment that surrounds them—to “question more,” as it were. It is for them that we do this work.

