Whenever the Internet Research Agency is in the news, I get a sinking feeling in my stomach. I was one of the first U.S. journalists to report extensively on the St. Petersburg-based “troll farm,” which was named in the indictment that Robert Mueller, the special counsel investigating Russian interference in the 2016 election, issued last Friday. As a result, I am often portrayed as an expert on the Internet Research Agency and Russian online propaganda. In this, I am not alone. The endless unfurling of the Trump-Russia story has occasioned an explosion in the number of experts in “information warfare,” “online influence operations,” “disinformation,” and the like. One reason for this is that the Russians’ efforts tend to be framed as a kind of giant machine, in which talking points generated by the Kremlin are “amplified” through a network of bots, fake Facebook pages, and sympathetic human influencers. The machine, we are told, is so sophisticated that only an expert, well-versed in terms such as “exposure,” “feedback loops,” and “active measures,” can peer into the black box and explain to the layperson how it works.

The thing is, I don’t really want to be an expert on the Internet Research Agency and Russian online propaganda. I agree with my colleague Masha Gessen that the whole issue has been blown out of proportion. In the Times Magazine article that supposedly made me an authority, I detailed some of the Agency’s disturbing activities, including its attempts to spread false reports of a terrorist attack in Louisiana and to smear me as a neo-Nazi sympathizer. But, if I could do it all over again, I would have highlighted just how inept and haphazard those attempts were. That the Agency is now widely seen as a savvy, efficient manipulator of American public opinion is, in no small part, the fault of experts. They may derive their authority from perceived neutrality, but in reality they—we—have interests, just like everyone else. And, when it comes to the Trump-Russia story, those interests are often best served by fuelling the fear of Kremlin meddling. Information-security consultants might see a business opportunity in drawing attention to a problem to which they (for a fee) can offer a solution. Think-tank fellows may seek to burnish their credentials by appearing in news articles—articles written by journalists who, we all know, face many different kinds of pressures to promote sensational claims. (How viral is the headline “Russian Internet Propaganda Not That Big a Deal”?) Even academic researchers, to secure funding, must sometimes chase the latest trends.

But couldn’t I be the sort of expert who tries to downplay the problem, offering a counterweight to others’ opinions? This might be appealing if the issue were being hashed out in obscure scholarly journals, rather than in an atmosphere in which every skeptical utterance about Trump-Russia becomes pro-Trump propaganda. Rob Goldman, Facebook’s vice-president for advertising, learned this lesson the hard way. Late last Friday, he argued on Twitter that, because the majority of the Internet Research Agency’s Facebook ads were purchased after the election, the group’s goal must have been not to elect Donald Trump but “to divide America by using our institutions, like free speech and social media, against us.” Perhaps Goldman hoped that, by portraying the Russians’ machinations as nonpartisan, he could appear to take the problem of online disinformation seriously without offending Trump’s supporters. But Goldman’s caution backfired. Trump triumphantly retweeted him, writing, “The Fake News Media never fails. Hard to ignore this fact from the Vice President of Facebook Ads, Rob Goldman!” In the next few days, Goldman was pilloried by the President’s critics; many pointed out that, according to the Mueller indictment, the Agency’s specific aim was to undermine Hillary Clinton and boost Trump. Goldman later apologized to his company in an internal message.

You can see how wielding my expertise has always felt like a lose-lose proposition. Either I could stay silent and allow the conversation to be dominated by those pumping up the Russian threat, or I could risk giving fodder to Trump and his allies. So, last week, when the Agency once again became the focus of the Trump-Russia story, I ignored the many media requests in my in-box and wrote a couple of short articles instead, including one about a brief telephone conversation I’d had with the alleged executive director of the Agency, Mikhail Burchik. Then, on Monday afternoon, I received an e-mail from a booker for “All In with Chris Hayes,” on MSNBC. They wanted to have me on to talk about Burchik. Figuring, naïvely, that in discussing this one development I’d be able to avoid dealing with knottier questions, I agreed.

The segment began innocuously enough. Hayes asked me about an appearance I had made on the Longform podcast, in 2015, in which I mentioned offhand that many of the accounts I had followed while reporting my Times Magazine story had switched from posting negative information about Obama to posting positive information about Trump. The Agency, I’d suggested with a laugh, must be pursuing “some kind of really opaque strategy of electing Donald Trump to undermine the U.S.” The fact that I was considering this possibility struck me at the time as a worrying sign that I had internalized the paranoia that defines Russian propaganda itself, which sees in every bad thing that happens to Russia the hidden hand of the United States. Both the Trump campaign and the idea of a Russian troll operation to elect him seemed like a joke back then, and I said as much to Hayes.

The last question was the one I had hoped to avoid. “It seems like, in some ways, it’s a remarkably effective model,” Hayes said, referring to the Agency’s operation. “You don’t have to pull off some enormous thing. You just have to kind of be in people’s consciousness enough, constantly, in this sort of irritant way, with ninety people you’re paying, running an operation that doesn’t cost that much money. It does seem like a good bang for your buck.” I disagreed. I said I didn’t think that what amounted to a social-media marketing campaign—one whose supposed architects had a rudimentary grasp of the English language—could sow so much discord on its own. One could argue that ninety people is about what it would take to run the digital operation of a modern Presidential campaign—to shift votes in a candidate’s favor. But numbers tell only a part of the story. In the indictment, Mueller’s team reveals that the Agency didn’t discover the idea of targeting “purple states” until June, 2016, when a Texas-based conservative activist introduced them to the term. Cambridge Analytica this is not.

The morning after the Hayes interview, I woke up to find that a journalist named Aaron Maté had clipped the video and tweeted it, along with the comment “OMG, a sober/informed Russia take on MSNBC!” (Last April, Maté argued in The Intercept that Rachel Maddow, the network’s most popular host and a strong advocate of the notion that the Trump campaign colluded with Russia, was leading her viewers on “a fruitless quest.”) The clip, which I retweeted, spread faster than anything I’d written or said about the Agency since the original article. Within a few minutes, I had been retweeted by Julian Assange, the founder of WikiLeaks, who relentlessly promotes skepticism about Russian influence. (WikiLeaks, of course, played a role of its own in the 2016 election.) After Assange, various right-wing social-media influencers piled on, including Jack Posobiec, a pusher of the Pizzagate conspiracy. Some current and former employees of RT, the Kremlin-backed news network, picked the clip up, too. It was also shared by many journalists and liberals who cast it as a welcome bit of reason amid the rising frenzy. Still, I could feel my words slipping away, becoming the foundation for someone else’s shakily constructed argument. The fact that I had been given the rare opportunity to share an opinion on national television seemed pretty much cancelled out by the ways its online audience had put it to use.