When Young Mie Kim began studying political ads on Facebook in August of 2016—while Hillary Clinton was still leading the polls—few people had ever heard of the Russian propaganda group known as the Internet Research Agency. Not even Facebook itself understood how the group was manipulating the platform's users to influence the election. For Kim, a professor of journalism at the University of Wisconsin-Madison, the goal was to document how the usual dark money groups target divisive election ads online, the kind that would be more strictly regulated if they appeared on TV. She didn't know then that she was walking into a crime scene.

Over the last year and a half, mounting revelations about Russian trolls' influence campaign on Facebook have dramatically altered the scope and focus of Kim's work. In the course of her six-week study in 2016, Kim collected mounds of evidence about how the IRA and other suspicious groups sought to divide and target the US electorate in the days leading up to the election. Now, Kim is detailing those findings in a peer-reviewed paper published in the journal Political Communication. The researchers couldn't find any trace, in federal records or online, of half of the 228 groups they tracked that purchased Facebook ads about controversial political issues in that six-week stretch. Of those so-called "suspicious" advertisers, one in six turned out to be associated with the Internet Research Agency, according to the list of accounts Facebook eventually provided to Congress. What's more, the paper shows that these suspicious advertisers predominantly targeted voters in swing states like Wisconsin and Pennsylvania.

"I was shocked," says Kim, now a scholar in residence at the Campaign Legal Center, of the findings. "I sort of expected these dark money groups and other unknown actors would be on digital platforms, but the extent to which these unknown actors were running campaigns was a lot worse than I thought."

Suspicious Groups

To conduct her research, Kim solicited volunteers to install a custom-built ad-tracking app on their computers. Kim describes the software as similar to an ad-blocker, except it would send the ad to the research team's servers rather than block it. Kim whittled the pool of volunteers to mirror the demographic, ideological, and geographic makeup of the United States voting population at large. She ended up with 9,519 individuals altogether, who saw a total of 5 million paid ads on Facebook between September 28 and November 8, 2016.

'The extent to which these unknown actors were running campaigns was a lot worse than I thought.' Young Mie Kim, University of Wisconsin-Madison

From that massive pool, Kim took a random sample of 50,000 ads and searched for any that touched on one of eight politically sensitive topics: abortion, LGBT issues, guns, immigration, nationalism, race, terrorism, and candidate scandals (for example, Donald Trump's Access Hollywood tape or Hillary Clinton's private email server). After throwing out ads placed by the candidates or super PACs, the researchers were left with 228 individual groups. Kim then returned to the larger pool of 5 million ads to find all of the issue-based ads associated with those groups.

In total, groups that had never filed a report with the Federal Election Commission placed four times as many ads as groups that had. The FEC has historically failed to enforce rules about political ad disclosures online, and only recently voted to expand those disclosure requirements. That has allowed digital political ads—including the ones affiliated with the Internet Research Agency—to proliferate with no regulatory oversight.

Kim's research showed that, in fact, these unregulated ads made up the majority of issue-based ads on Facebook during the course of her study. When asked for comment, Facebook referred WIRED to Mark Zuckerberg's testimony before Congress.

Among the groups that were not associated with any FEC records, Kim went on to differentiate between run-of-the-mill dark money groups (think: non-profits and astroturf groups) and what she called "suspicious" groups. The latter had Facebook Pages or other landing pages that had been taken down or hadn't been active since election day. These suspicious groups also had no IRS records or online footprint at all. "Some groups, we were never able to track who they were," Kim says.