It's not hard to find extremists on the internet. But it's really hard to find out who's most successful at spreading extremism, which can make counteracting their influence difficult. Now a pair of researchers think they've figured out how to do it – which could make extremist threats easier to identify and block.

The researchers also discovered some peculiar data about how extremists on both the far right and left use Twitter and how online extremist networks are organized. In a new report, terrorism analyst J.M. Berger and his co-author Bill Strathearn found that traditional leaders on the far right are losing influence to new forms of extremist media, spread online by a small group of influential activists who are relative unknowns but can communicate with a much larger audience of potential recruits. These activists are even attempting to make inroads into mainstream politics.

The team began with 12 Twitter accounts owned by prominent self-identified white supremacists, with a combined total of 3,542 Twitter followers. These accounts belonged to groups and individuals such as the white supremacist ideologue David Duke, various Ku Klux Klan factions, and neo-Nazi clubs like the Aryan Nations, American Nazi Party and the American Freedom Party. Next, the team zeroed in on the followers, 44 percent of whom espoused what the team considered explicitly white supremacist views.

The team looked at which of those followers interacted with others the most and who had the most influence (meaning whose tweets were retweeted most often). The team also examined which websites the followers linked to and which hashtags were most popular.
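The basic bookkeeping behind that kind of analysis is straightforward. Here's a minimal sketch – not the authors' actual code, and using invented sample data – of how you might tally retweet-based influence, hashtag popularity, and linked domains from a collection of tweets:

```python
from collections import Counter

# Hypothetical tweet records; real analyses would pull these from the
# Twitter API. Field names here are assumptions for illustration only.
tweets = [
    {"author": "user_a", "retweets": 40, "hashtags": ["tcot"], "links": ["example.com"]},
    {"author": "user_b", "retweets": 2,  "hashtags": ["gop"],  "links": []},
    {"author": "user_a", "retweets": 15, "hashtags": ["tcot", "teaparty"], "links": []},
]

influence = Counter()       # total retweets earned per author
hashtag_counts = Counter()  # how often each hashtag appears
link_counts = Counter()     # which domains get linked most

for t in tweets:
    influence[t["author"]] += t["retweets"]
    hashtag_counts.update(t["hashtags"])
    link_counts.update(t["links"])

print(influence.most_common(1))       # top author by retweets
print(hashtag_counts.most_common(2))  # most popular hashtags
```

The report's actual metrics are more elaborate (weighing interactions as well as retweets), but the core idea is the same: aggregate per-user engagement and rank.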

The team concluded that a small number of ideologues were highly influential within the group, while most participants were dabblers – a kind of 90-9-1 rule for internet skinheads: 90 percent are lurkers and rarely contribute, 9 percent contribute some of the time, and 1 percent do most of the talking and effectively control the conversation. A full list of the most influential is included in the authors' report (.pdf), published by the International Centre for the Study of Radicalisation and Political Violence, a London think tank.
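To see what such a skewed distribution looks like, here's a toy illustration of the 90-9-1 pattern. The post counts below are invented, not drawn from the report; the point is simply that a handful of users can account for a disproportionate share of the conversation:

```python
# 100 hypothetical users: two heavy posters, a few occasional
# contributors, and a large majority of silent lurkers.
posts = [200, 150] + [10] * 9 + [0] * 89
posts.sort(reverse=True)

total = sum(posts)
top_1_percent = posts[: max(1, len(posts) // 100)]  # top 1% of users
share = sum(top_1_percent) / total
print(f"Top 1% of users produce {share:.0%} of posts")
```

Even in this made-up sample, the single most active user (the top 1 percent of 100) produces nearly half of all posts, which is the shape of conversation the authors describe.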

It might sound obvious, but that's good news. "In short, the vast majority of people taking part in extremist talk online are unimportant," the authors write in the report. "They are casually involved, dabbling in extremism, and their rhetoric has a relatively minimal relationship to the spread of pernicious ideologies and their eventual metastasization into real-world violence."

The most prominent white supremacist leaders also suck at promoting themselves. Instead, their followers preferred to link to other websites like WhiteResister, Infowars and the white supremacist Council of Conservative Citizens. This suggests the old guard of American organized racism is "not generating daily buzz, on Twitter at least," and that "these well-known leaders of white nationalism in the United States may be losing touch with their constituents."

Here's the bad news. The most influential Twitter followers among the sample are "highly committed white nationalists unlikely to be swayed by intervention." Influential users are also "actively seeking dialogue with conservatives" through hashtags #tcot (or top conservatives on Twitter), #teaparty and #gop, as well as frequently linking to mainstream conservative websites. But only 4 percent of users identified as mainstream conservatives, which suggests the hashtags "are driven more by white nationalists feeling an affinity for conservatism than by conservatives feeling an affinity for white nationalism."

This affinity can be countered in part, according to the authors, with several tactics. First, they claim their metrics can be used to identify casual followers, "whose interactions indicate an interest in an extremist ideology but not a single-minded obsession with it." Anti-racist activists – and mainstream conservatives in particular – could focus on these fence-sitters, keep tabs on them, and try to pull them back before they become radicalized.

Blocking content is harder, and there are no clear guidelines, but many services like Twitter have begun blocking some neo-Nazi content, though it's an uphill battle. Identifying the most diehard users first, and then blocking them, could cause the rest to wither away. Another option, the researchers suggest, is more targeted blocking of certain YouTube videos, which feed extremists' Twitter feeds.

The authors encountered a similar phenomenon when studying left-wing anarchists, though with some differences. Anarchists on Twitter still had a small number of highly influential users in control of the conversation, but the distribution wasn't as sharply concentrated as it was among the white supremacists. Anarchists on Twitter are less likely to identify as strongly with anarchism as white supremacists do with their ideology, and are more likely to identify with multiple political ideologies, including other extremist ones. They're also more likely to rely on mainstream websites for information, and the websites they prefer are mostly politically liberal.

But the anarchists' hashtags show little sign of interest in mainstream liberals as potential allies. (Their hashtags are mostly Occupy Wall Street-related.) This may reflect an ideological difference. Anarchists are "fundamentally opposed to political institutions, compared to white nationalism, which is not opposed to institutions per se," the authors write.

Nonetheless, however different extremist ideologies might seem, their behavior is strikingly similar across these boundaries. Breaking their spell has to begin with finding out who's who.