Vladimir Putin. Photo: Alexey Druzhinin/AFP/Getty Images

With pressure on Facebook mounting in anticipation of general counsel Colin Stretch’s testimony in front of the House and Senate Intelligence Committees, it can be helpful to step back and take stock of what we, the public, know, and what we don’t. We know, for example, that an organization linked to the Russian government (likely the infamous Internet Research Agency) bought ads on Facebook between 2015 and 2017, with the apparent intent of stoking anger and partisanship. We know that the ads concerned wedge issues like immigration, the Second Amendment, and police brutality; we even know what some of the Russian pages and accounts were. And we know that around 3,000 ads were purchased at a cost of around $100,000. Other platforms, like Google and Outbrain, are investigating Russian ad buys on their networks as well.

Here’s what we don’t know: whether or not, and to what extent, those ads were effective at swinging votes.

Digital advertising is a complicated business, and it can be a bit of a black box. Even with the absurd amount of data collected by Facebook and its various third-party partners, it can be hard to pin down the actual effects of a given campaign — and everyone is incentivized to play up or play down results. A fantastical theory recently floated by a marketing agency claimed that with just $42,800, a dedicated campaign could have swayed the 10,704 Michigan voters who won Trump the state. It’s a fun theory, but one that represents an absolute edge-case scenario, in which Facebook ads are supernaturally effective and persuasive. People who’ve worked in digital advertising are more dismissive: Antonio García Martínez, a former Facebook product manager and author of Chaos Monkeys, called theories like this “utter bullshit.”
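To see why practitioners scoff, it helps to spell out what the agency’s theory implies. A quick back-of-envelope sketch, using only the two figures cited above (the arithmetic here is mine, not the agency’s):

```python
# Implied cost-per-vote of the marketing agency's theory,
# using only the figures cited above.
claimed_budget = 42_800   # dollars the agency says could have swung Michigan
michigan_margin = 10_704  # Trump's margin of victory in Michigan

cost_per_flipped_vote = claimed_budget / michigan_margin
print(f"${cost_per_flipped_vote:.2f} per flipped vote")  # prints "$4.00 per flipped vote"
```

Four dollars to change a mind and a vote is the “supernaturally effective” assumption doing all the work in the theory — which is exactly the part García Martínez calls bullshit.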

Part of the problem with assessing the effect of Russian ads is that Facebook itself sends mixed messages. According to the company, an estimated 10 million people saw at least one of the ads (a quarter of the ads were never shown to any users at all). Facebook rarely makes raw data public, and its public metrics can be misleading. A video “view,” for instance, is registered after just three seconds of watching, even with the sound off. So 10 million people “saw” the ads — but the number who actually absorbed what they saw is totally unclear. (And that’s setting aside that Facebook has, on several occasions now, admitted to accidentally misreporting its own metrics.)

At the same time, Facebook brags about its ability to influence voters. Its business section is full of case studies about political campaigns. According to Facebook, Republican governor Rick Scott, running for reelection in 2014, used Facebook to create a “22% increase in Hispanic support,” which the case study calls “a deciding factor” in his win.

Obviously, the ad platform’s effectiveness in political campaigns lies somewhere between all-powerful and house of cards. But so long as Facebook holds back its own data, the rest of us won’t be able to tell where on that spectrum it falls. “Facebook probably has the data that would help us understand whether or not there was an impact,” Yochai Benkler of Harvard’s Berkman Klein Center for Internet & Society said in a recent interview with Select All. If Facebook released, for example, precise data about which geographical and demographic groups were targeted by Russian ads, third-party researchers could compare that information to actual and expected turnout in last year’s election. “They have the data that will help us understand,” Benkler says. “They’re not releasing it in any significant way.”

Absent that data, the best we can do is make educated assessments based on past experience. And to anyone who’s worked in online advertising or social-media management, the $100,000 spent by the Russian government is laughably small, no matter how precisely targeted. In contrast, the official Trump campaign spent $90 million on digital ads — and, unlike the Russians, had assistance from Facebook employees to target and deploy them effectively. “There’s no way $100,000 in ad budget impacted the election. It’s ridiculous,” García Martínez said.
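The gap in scale is worth making concrete. A quick comparison using only the two spending figures above (the calculation is mine):

```python
# How the reported Russian ad buy compares to the official
# Trump campaign's digital spending, per the figures above.
russian_ad_spend = 100_000        # reported Russian-linked Facebook ad buy
trump_digital_spend = 90_000_000  # official Trump campaign digital-ad spend

share = russian_ad_spend / trump_digital_spend
print(f"{share:.2%} of the Trump campaign's digital budget")  # prints "0.11% of the Trump campaign's digital budget"
```

Roughly a tenth of one percent — and that smaller budget was deployed without the hands-on targeting help Facebook gave the official campaign.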

None of this means that Facebook doesn’t need regulation. The Russian government secretly paying for political advertisements aimed at influencing a domestic election is a clear violation of U.S. sovereignty — a real problem that Facebook, and the government, must investigate and address. To their credit, both legislators and Facebook executives appear to understand the need to confront the issue. Days after Facebook disclosed its findings, Mark Zuckerberg announced that the company would now require full public disclosure of an ad’s sponsor and target audience. This week, Senators Amy Klobuchar and Mark Warner introduced the Honest Ads Act, a bill that would bring digital political ads in line with the more stringent regulations that govern political ads on radio and television.

At the same time, the focus on Russian ads, no matter how well-intended, ignores the many other troubling aspects of Facebook’s influence on the election. For one, buying ads isn’t really what the Internet Research Agency does. Loosely referred to as a “troll operation,” its employees specialize in creating and performing as “sock puppets”: seemingly normal commenters and posters who are actually operating with ulterior motives.

Sock puppets, not nefariously developed and purchased political advertisements, are the IRA’s bread and butter, as recent reports out of Russia corroborate. According to former employees, “the Internet Research Agency targeted U.S. audiences in part by posting provocative ‘comments’ pretending to be from Americans on newspaper articles that appeared on the websites of the New York Times and Washington Post.”

Those comments and other reported elements of the IRA’s intelligence operation — Trump rallies organized by Russian sock puppets, for example — portray a secretive attempt to harness not the ad-buying tools of these companies, but their completely free-to-use network effects. In form and function, many of the things that the IRA is described as doing on social media and in comment sections are not all that different from what anyone else does. Russian trolls (and their profit-minded cousins in “fake news”) “didn’t make up stuff that wasn’t already part of the folklore of the right,” Benkler said. They just “circulated and cut and paste and created all sorts of remixes of all the same sets of stories that were already circulating widely, on the right anyway.” And that’s far more concerning.

At the peak of its influence campaign, the IRA had about 90 people focused on the United States, though that number has apparently since dropped to 50. When social-media companies cracked down on IRA sock-puppet accounts over the last two months, they suspended 118 communities that could reach an estimated 6 million users. Among them was a Twitter account masquerading as the Tennessee GOP, @TEN_GOP, which had 136,000 followers — ten times as many as the actual Twitter account of the state’s Republican Party. The real state party had reported the fake account to Twitter three times since September 2016, and yet the company was unresponsive.

According to BuzzFeed:

All told, the account was quoted dozens of times across conservative news outlets. Fox News quoted an @TEN_GOP tweet in at least three stories, including one syndicated by the Daily Caller. The Daily Caller itself quoted it in six stories. Breitbart mentioned it in seven; Infowars in four; RedState in eight.

The Gateway Pundit, another conservative outlet, cited the Russian account in 19 different stories, ranging from one about a motorcyclist who drove through an anti-Trump protest, for which he was arrested, to a story about how it was unfair that banks had stopped lending money to French nationalist presidential candidate Marine Le Pen.

The account’s tweets often derided African-Americans, Muslims, and immigrants.

This kind of free movement of misinformation between disingenuous and malevolent outside actors, passionate true believers, and the reading and voting public should be the real concern of anyone trying to assess the effects of social media on the political process — not a relatively small number of easy-to-identify (and easy-to-regulate) advertisements. A few weeks ago in a press conference, Senator Mark Warner, who is helping lead the Senate Intelligence Committee’s Russia inquiry, said that he was more concerned about sock puppets than ad buys. García Martínez agrees. “It’s the organic posts masquerading as reality that nobody has to pay for, that, to me, is the bigger concern,” he said.

Put another way, what we should worry about isn’t what Facebook was paid to do, but what it did for free. Clinton outraised and outspent Trump substantially over the course of the campaign, but his earned media — the chatter he generated — was worth upwards of a billion dollars, according to García Martínez. “The ability for Facebook to amplify that sort of message? That’s the scary, high-value thing.”

And what’s dangerous isn’t just that false stories and conspiracy theories can travel up what Benkler calls the “attention backbone” of social media. It’s the corrosive effect of these “inauthentic” accounts, which Facebook has no easy way of dealing with, and of an attention economy that handed Trump a free campaign; in disclosing the Russian ads, Facebook stated plainly that it views the ability to communicate across borders as a strength of the system, not a liability. There is a growing tendency for people to label anyone they don’t agree with online as a bot. A few months ago, an enterprising college student retweeted by Trump was accused of not existing simply because her profile picture was a stock photo (raise your hand if you’ve ever used an avatar that wasn’t actually your own face). During the campaign, American trolls tried to suppress the vote by creating fake promotional images telling Clinton supporters that they could vote by text. They didn’t need to set up a Facebook ad campaign; they spread the images around the internet at no cost.