Ben Nimmo, senior fellow for information defense at the Atlantic Council's Digital Forensic Research Lab, speaks at the Atlantic Council Disinfo Week event in Brussels, Belgium on March 8, 2019.

When trying to stop the spread of disinformation by malign foreign and domestic actors online, “it’s not enough to do the fact-checking,” according to Ben Nimmo, senior fellow for information defense at the Atlantic Council’s Digital Forensic Research Lab. To really kill the power of the disinformation, “we have to do the storytelling,” he argued.

Speaking at the Atlantic Council’s Disinfo Week event in Brussels, Belgium, on March 8, Nimmo suggested that too many policymakers treat disinformation as an information warfare problem rather than as “narrative warfare.” It is not access to better or newer information that makes Russian and domestic extremist propaganda successful online, Nimmo said; quite the contrary. “We have the facts,” Nimmo explained, but “they have the stories.”

In many fake-news social media posts, the information is clearly distorted, but that is not the real point, Nimmo explained. “It’s much easier to tell a story when you are not [bound] by the facts,” he said. Rather, the narratives pushed by the bots and trolls are the more dangerous weapons. “They try to make you so angry and so scared that you stop thinking,” according to Nimmo. “And once you stop thinking, they’ve got you.”

Fact-checking alone, therefore, is simply not enough to counter disinformation, as it does nothing to make the narrative less attractive. Daniel Kimmage, acting coordinator of the US Department of State’s Global Engagement Center, which leads the United States’ counter-propaganda and counter-disinformation efforts, agreed that stopping at fact checks is not enough. “We should not be shy about publicly supporting what makes us strong,” he said, suggesting that Western governments need to make sure that their stories of peace, security, and freedom are also being broadcast on social media. Importantly, Kimmage said that global democracies should work together in these efforts, as “we are united in this fight; our adversaries are largely alone.”

Nimmo suggested that media groups and civil society organizations, such as the Atlantic Council’s Digital Forensic Research Lab, can also push out their own narratives, which can be just as compelling as those pushed out by the propagandists. “We in the open-source community have to speak [to readers’] interest and curiosity…and we have to seek to empower,” he said. Nimmo suggested that counterdisinformation experts should expose propaganda and disinformation campaigns in the style of whodunits: taking the reader step-by-step through the process of discovering and identifying the sources of the disinformation and those spreading it. “Share the how,” Nimmo argued. “[Take] the reader on the journey.”

Nimmo demonstrated how this could be done by detailing the identification of a false Russian news story about a Russian fighter jet that supposedly disabled a US missile cruiser in the Black Sea in 2014. Nimmo took the audience step-by-step through his process of analyzing the Russian media reports to first identify the story as fake and then trace it to its source. He eventually revealed that the story had originated on a Russian satirical humor website.

Readers are more likely to engage with content presented this way than with dry fact checks, Nimmo argued, and will gain trust in the media and civil-society fact-checkers who show their work. Importantly, Nimmo added, experts can “actually teach [readers] the skills in advance…then it is them who is being Sherlock Holmes.”

Nimmo explained that it is easy to teach readers how to identify and expose disinformation, mainly by teaching them the “three A’s” of disinformation: activity, anonymity, and amplification. Malign online actors often use social media profiles that post much more often than is humanly possible (activity), have no verifiable personal information (anonymity), and primarily share content that is not original to that specific profile (amplification). Nimmo added that he has taught the three A’s to elementary school children, who quickly grasped the concept and became bot finders themselves.
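The three A’s lend themselves to a simple screening heuristic. Here is a minimal sketch in Python of how such a check might look; the profile fields and numeric thresholds are illustrative assumptions for this example, not figures given in Nimmo’s talk:

```python
from dataclasses import dataclass


@dataclass
class Profile:
    posts_per_day: float    # average posting rate over the account's lifetime
    has_real_name: bool     # any verifiable personal details on the profile?
    has_bio_or_photo: bool  # a filled-in bio or an original profile photo?
    share_ratio: float      # fraction of posts that reshare others' content

# Illustrative thresholds (assumptions chosen for this sketch):
ACTIVITY_THRESHOLD = 72.0      # more posts per day than a person plausibly writes
AMPLIFICATION_THRESHOLD = 0.9  # 90%+ of posts are reshares, not original content


def three_as_flags(p: Profile) -> dict:
    """Score a profile against the three A's: activity, anonymity, amplification."""
    return {
        "activity": p.posts_per_day > ACTIVITY_THRESHOLD,
        "anonymity": not (p.has_real_name or p.has_bio_or_photo),
        "amplification": p.share_ratio > AMPLIFICATION_THRESHOLD,
    }


def looks_like_bot(p: Profile) -> bool:
    # Treat a profile as suspect only when all three indicators fire together;
    # any single signal on its own has innocent explanations.
    return all(three_as_flags(p).values())
```

Requiring all three flags at once mirrors the point of the heuristic: a prolific poster, an anonymous account, or a heavy resharer is not suspicious in isolation, but the combination is the pattern Nimmo teaches readers to spot.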

“If you can teach one person to do this,” Nimmo argued, “you can make them resilient to disinformation, but you are [also] going to make our whole [counterdisinformation] community that much bigger and that much stronger.” The more Western civil society and media groups can transform the presence of propaganda online from dangerous distortions into fodder for compelling mystery dramas, Nimmo said, “the greater chance we will have of actually not just dealing with information warfare, but of dealing with narrative warfare, which is a much more dangerous threat.”

David A. Wemer is assistant director, editorial, at the Atlantic Council. Follow him on Twitter @DavidAWemer.