Support for President Trump is the single most common trait among Twitter users who identify as part of the “alt-right,” a loosely defined faction of far-right extremists who advocate for a white supremacist political agenda, according to “Alt-Right Twitter Census,” a new research paper by J.M. Berger published by VOX-Pol, a European Union-funded research network that studies violent online extremism.

One of the difficulties in studying and understanding the alt-right is the group’s nebulous boundaries, especially since the term “alt-right” was coined by its own members to define themselves. This research paper argues, however, that the trait that most consistently appears among people who identify as “alt-right” is support for Trump.

“Support for Donald Trump’s presidential campaign provided a crucial, in-group definition that united a fractious group of far-right ideologues, nationalists, white nationalists, anti-Semites, homophobes, transphobes, misogynists, Islamophobes, libertarians and anti-establishmentarians,” the paper reads. “The Trump-supporting definition of the alt-right in-group allows some targets of alt-right discrimination – including a handful of African-Americans, Jews and LGBTQIA individuals – to operate within the in-group and sometimes even rise to positions of prominence.”

The research focused on 41 accounts that self-identify as "alt-right" in their Twitter handle, display name, or bio, and the 29,913 users who follow those 41 accounts. (Twitter’s API has an approximate upper limit of 50,000 users that can be downloaded and studied in a single data set relatively quickly.) Berger said that several hundred accounts had been on his casual radar for alt-right activity for several years, but he homed in on users who self-identify as alt-right in order to set boundaries on the far-right users included in the study, and because he needed a “pragmatically identifiable starting point.” The study analyzed the 200 most recent tweets from the network of 29,913 accounts starting on April 8, 2018, and this process was repeated several times through June 2018.
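The paper does not publish its collection code, but the seed-and-followers approach it describes can be sketched roughly as follows. This is an illustrative toy only: `fetch_followers` is a hypothetical stand-in for a real Twitter API client, and the cap mirrors the practical ~50,000-user limit the paper mentions.

```python
def collect_follower_network(seed_accounts, fetch_followers, cap=50_000):
    """Union the followers of each seed account, deduplicated,
    stopping once an approximate cap is reached (larger crawls
    become slow under Twitter's API rate limits)."""
    network = set()
    for seed in seed_accounts:
        for follower_id in fetch_followers(seed):
            network.add(follower_id)
            if len(network) >= cap:
                return network
    return network

# Stub standing in for a real API client, for illustration only.
def fake_fetch_followers(seed):
    sample = {"seed_account_1": [1, 2, 3], "seed_account_2": [2, 3, 4]}
    return sample.get(seed, [])

network = collect_follower_network(["seed_account_1", "seed_account_2"],
                                   fake_fetch_followers)
print(sorted(network))  # overlapping followers are counted once
```

Deduplicating across seeds matters here: accounts that follow several of the 41 seed accounts should appear only once in the 29,913-user network.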

According to the research paper, @realdonaldtrump was the most “influential” account across all 29,913 users; influence essentially measures how many Twitter interactions (likes, replies, and retweets) a single account is associated with. Meanwhile, @richardbspencer was the most influential account among the 41 individuals who self-identified as “alt-right” in their Twitter profiles. The highest-volume category of content tweeted by users in the data set was pro-Trump content, followed by white nationalist content.
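An interaction-count measure like the one described can be sketched as a simple tally over interaction records. The record format and function below are illustrative assumptions, not the paper's actual method, which may weight interaction types differently:

```python
from collections import Counter

def influence_scores(interactions):
    """Count how many interactions (likes, replies, retweets)
    each target account receives: a crude proxy for 'influence'."""
    scores = Counter()
    for actor, target, kind in interactions:
        scores[target] += 1
    return scores

# Made-up sample records: (who acted, whom it targeted, interaction type)
sample = [
    ("user_a", "@realdonaldtrump", "retweet"),
    ("user_b", "@realdonaldtrump", "like"),
    ("user_b", "@richardbspencer", "reply"),
]
scores = influence_scores(sample)
print(scores.most_common(1))  # the account with the most interactions
```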

Across all 29,913 far-right users, “MAGA” (“Make America Great Again”) was the most common phrase to appear in a user’s bio, and #MAGA was the most commonly used hashtag (although at least 2,000 distinct hashtags referred to Trump). There were also spikes in the creation of accounts identifying as “alt-right” in their bios during crucial moments of Trump’s campaign. In 2016, more than 5,000 alt-right accounts were created; in 2017, more than 6,500. Notably, January 2017, the month of Donald Trump’s inauguration, saw 748 alt-right account creations, the highest monthly total both at the time and to date.

The study notes that far-right extremist ideologies like racism, Islamophobia, xenophobia, and Nazism predate the Trump presidency. However, support for Trump gave a diverse set of people with far-right views a common point of belief to rally around. In other words, the alt-right is not an “ideology” per se; it’s an axis of belief.

In an email to Motherboard, Berger said it would be fair to say that the election of Donald Trump was simultaneously enabled by, and a cause of, the growth of the alt-right.

“What is less quantifiable in general, but more clear in the context of this study, is that support for Trump is really the dominant element that unites this diverse coalition of extremists,” Berger said. “I won't try to argue the counterfactual, but it's legitimate to ask whether this movement would hold together without the unifying factor of support for President Trump. It's possible some other anchor would emerge in his absence, but the movement would look a lot different without him.”

These findings are also significant because they represent one of the only forms of “pro-” or positive branding among people on the far right. As the research paper notes, people on the far right often define themselves in terms of the things they oppose. “Top word pairs in user self-descriptions included ‘anti-EU’, ‘anti-Islam’, ‘anti-globalist’, ‘anti-feminist’ and ‘anti-Zionist,’” the paper reads.

Ideology lies on a spectrum, so it’s worth noting that by focusing on self-identifying members of the “alt-right,” Berger selected people with relatively developed extremist beliefs. As explained by Robert Evans for bellingcat, “red-pilling,” the process by which people develop racist, sexist, anti-Semitic, Islamophobic, and xenophobic beliefs, often occurs gradually. For some people, as Evans explains, support for Trump can be an entryway into subcommunities such as the /r/the_donald subreddit or the /pol/ board on 4chan. These communities often host hateful content that’s veiled in irony or presented as a meme, but over time, the joke stops being a joke, and people become radicalized.

“I should note that one of the more frustrating elements about covering the fascist right is that much of what they say sounds ridiculous and makes them appear less than serious,” Evans wrote. “This is why it is important to remember that these groups have a body count and represent a real threat. Their absurdity does not negate their danger.”

These subreddits and boards also provide direct pathways to Nazi Discord servers, as reported by Evans, which in turn have concrete ties to extremist groups such as Atomwaffen, a white supremacist terrorist group that has been associated with several hate crimes and murders in the US.

According to the Alt-Right Twitter Census, YouTube was the site most commonly linked to in the tweets examined in the data set, appearing in 73,645 tweets. The second most-linked site, Facebook, was linked just 23,387 times. This isn’t unexpected: YouTube is known to play a role in radicalization. Per a recent report by Data & Society’s Becca Lewis, collaborations between mainstream libertarian or conservative figures such as Joe Rogan or Dave Rubin and more far-right figures make viewers more likely to engage with fringe, extremist content, which can set them down a path toward radicalization.

Berger told Motherboard in an email that in the data set, there was clearly “some significant number” of account followers that were artificial bots, created in order to boost the perceived popularity or impact of certain accounts. Berger said that he doesn’t feel comfortable naming an exact figure; however, he said that he doesn’t believe that bots have an impact on the core ideological links examined in the study. “Typically, [bots] would have an amplifying effect rather than introducing totally new content.”

There have been some efforts to curb the influence of the alt-right on Twitter. In late 2016, Twitter cracked down on Trump-supporting bots and suspended influential fringe accounts, some with more than 10,000 followers. Earlier this year, Twitter made another sweeping effort to weed out bot accounts: more than 70 million accounts were suspended in May and June 2018 alone. However, this wasn’t just an attempt to cleanse the platform of the alt-right; it was also a response to congressional scrutiny of Russian influence in the 2016 election. Still, Berger told Motherboard that in order to meaningfully address the alt-right, we need to understand exactly how it works.