They hide behind Twitter hashtags, Facebook ads and fake news stories. They’re the work of bots and trolls, and one of the most skilled countries at deploying them is Russia. So how do these entities actually work to spread disinformation? We asked two experts.

This is St. Petersburg-based activist Ludmila Savchuk. She has tracked disinformation campaigns and even gone undercover to learn how they work. And this is Ben Nimmo, a London-based analyst who focuses on information warfare.

Let’s define what’s what. A bot is short for robot. It’s an automated social media account that operates without human intervention. During the 2016 presidential election, suspected Russian operators created bots on Twitter to promote hashtags like #WarAgainstDemocrats.

A troll is an actual human being, motivated by passion or a paycheck to write social media posts that push an agenda. In 2015, Savchuk worked undercover for over two months at a troll factory in Russia that has gone by many names, including Glavset and the Internet Research Agency. Troll accounts are usually anonymous or pretend to be someone else, like hipsters or car repairmen. But it can get even stranger. Trolls can also set up bots to amplify a message.

Facebook is one common platform for Russian trolls and bots, which, in 2016, used fake accounts to influence the U.S. election. Here’s how some experts think that played out. American officials suspect Russian intelligence agents of using phishing attacks to obtain emails damaging to the Hillary Clinton campaign. They then, allegedly, created a site called DCLeaks.com to publish them. A troll on Facebook, using the name Melvin Redick, was one of the first to hype the site, saying it contained the “hidden truth about Hillary Clinton.” An army of bots on Twitter then promoted DCLeaks, and in one case even drove the #HillaryDown hashtag into the trending topics.
Facebook believes that ads on divisive issues created by Russian trolls were shown to Americans over four million times before the election. Russian-linked trolls and bots also tried to exploit divisive issues and undermine faith in public institutions. Federal investigators and experts believe Russian trolls created Facebook groups like Blacktivist, which reposted videos of police beatings, or another, Secured Borders, which organized anti-immigrant rallies in real life.

“Today, Russia hopes to win the second Cold War through the force of politics as opposed to the politics of force.”

How can you stop them? You can’t. Even Vladimir Putin seems to agree. But ID’ing their tactics helps contain their influence. If a suspicious account is active during the workday in St. Petersburg, or posts dozens of items a day, those are red flags. Decode the anonymity: look for alphanumeric scrambles in a user’s name, and try Googling its profile picture. Look at the language: if an account makes grammar mistakes typical for Russian speakers, or changes behavior during times of strained Russian-U.S. relations, then congratulations. You might have caught a bot or pro-Kremlin troll.
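The red flags above can be sketched as a simple heuristic checker. This is only an illustration of the experts' rules of thumb, not a real bot detector; the `Account` fields and the specific thresholds are assumptions chosen for the sketch. The workday check uses the fact that St. Petersburg is on Moscow time (UTC+3), so a 9-to-6 local workday falls roughly in the 06:00–15:00 UTC window.

```python
import re
from dataclasses import dataclass

@dataclass
class Account:
    username: str
    posts_per_day: float      # average daily posting volume
    workday_share: float      # fraction of posts made 06:00-15:00 UTC (St. Petersburg workday)

def red_flags(account: Account) -> list[str]:
    """Return the heuristic red flags this account trips (thresholds are illustrative)."""
    flags = []
    # Dozens of posts a day is a sign of automation or paid posting.
    if account.posts_per_day >= 24:
        flags.append("high posting volume")
    # Nearly all activity during the St. Petersburg workday is suspicious.
    if account.workday_share >= 0.9:
        flags.append("active mainly during St. Petersburg workday")
    # An alphanumeric scramble in the username, e.g. a long run of digits.
    if re.search(r"\d{5,}", account.username):
        flags.append("alphanumeric scramble in username")
    return flags
```

For example, an account named `jenna84720163` posting 60 times a day, almost entirely during Russian business hours, trips all three flags, while a typical human account trips none.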