When looking at far-right reactions to the outcome of the latest National Action trial, I was shocked by the amount of sympathy, glorification and belittlement that I came across. “What a lad,” one 4chan user wrote about the convicted National Action member Mikko Vehvilainen, who wanted to prepare for the coming “race war” by establishing an all-white enclave in a Welsh village. “Police found Hitler stickers in his home. Made me chuckle,” another commented. One user even called the terrorist organisation “the only truly legit nationalist group in Britain”, and another complained that “they treat National Action like they’re the IRA, despite the fact that they have done literally nothing.”

These overt expressions of sympathy for a group that celebrated the murder of Jo Cox, called for a “white jihad” and plotted a terrorist attack against the Labour MP Rosie Cooper should ring alarm bells. National Action is an exceptionally violent and hateful group, but it would be naive to believe that jailing its members has eroded the threat from far-right militancy in the UK. In the past year, sympathisers of the organisation have geared up in an attempt to normalise antisemitic, racist and anti-democratic ideologies, sometimes hiding extremely hateful messages behind ironic memes and satirical posts to escape prosecution.

The widespread approval and encouragement these actors receive from more mainstream audiences signal that National Action is only one dimension of a much bigger challenge: the injection of extremist views, dehumanising language and apocalyptic visions of the future into the heart of our society. To take one platform, extremism researcher JM Berger estimates that the “alt-right” network on Twitter currently exceeds 100,000 users – but points out that this is an absolute baseline minimum.

Many of the elements featured in National Action’s ideology and vision – from the idea of an approaching civil war of races, cultures or religions to conspiracy theories about the “global political elites” and “the mainstream media” – are also present in much larger far-right communities such as the “alt-right”, identitarian and counter-jihad circles. Extremist recruiters and influencers have learnt to use these ideological overlaps as the lowest common denominators to “unite the right”. Frustrations about free speech limitations on social media have been among the most commonly exploited grievances that have paved the way for far-right extremist groups to spread conspiracy theories and radicalise new audiences.


In the wake of the Charlottesville rally in the summer of 2017, Twitter shut down hundreds of white supremacist accounts, the gaming app Discord closed several related channels and the world’s most prominent neo-Nazi webpage, Daily Stormer, received 24 hours’ notice before losing its domain. Although these measures helped to reduce the reach of white supremacist movements in the short term, they also set in motion a mass exodus of far-right sympathisers to ultra-libertarian and extremist platforms, which avoid takedown measures at any cost, refuse to engage in any anti-hate speech efforts and have gradually turned into central hubs for far-right mobilisation and (dis)information sharing.

Not only are swastikas in profile pictures, Hitler’s initials in usernames and antisemitic conspiracy theories in news feeds the norm on these extremist online safe havens, but you can also still find members and propaganda materials of militant neo-Nazi organisations. For example, Andrew Clarke, a self-described “former National Action activist & spokesman”, still has a profile on the far right’s Twitter equivalent, Gab, while National Action speeches and propaganda videos can still be found on Bitchute, the far right’s YouTube replacement.

In March 2018, a speech about the “white genocide” by the former group leader Benjamin Raymond was published, and in June 2018 a National Action promotional film was shared on Bitchute. The video, which claims in its title to be “Banned in the UK”, shows members of the terrorist group performing the Hitler salute and was viewed more than 400 times.

Gab and Bitchute are only part of a quickly evolving alternative social media landscape. After far-right users had their accounts closed down by the crowdfunding platform Patreon, many migrated to its extremist equivalent, Hatreon. Some stopped consulting Wikipedia, instead opting for Metapedia, where the Holocaust only took place according to “politically correct history”. Meanwhile, Bitchute, DTube and Pewtube established themselves as “censorship-free alternatives” to YouTube, attracting mainly conspiracy theorists and Holocaust deniers. Even dating platforms such as Wasp Love, which is exclusively for white Anglo-Saxon Protestants (Wasps), have been created – and serve the far right as an alternative to Tinder.

At the Institute for Strategic Dialogue we have worked with governments and tech firms to track incitements to violence and explicitly racist content on the internet and to accelerate their removal. The echo that National Action created across the internet shows that online extremism needs to be seen as a cross-platform threat, one that will require a holistic approach and close cooperation between policymakers, the tech sector and researchers.

But ultimately, banning extremist groups and forcing platforms to remove harmful content can only be one dimension of the solution. It is much more important to immunise young people against extremist attempts to instrumentalise their fears, twist their perceptions and hijack their desires. Media literacy and digital citizenship training could be a first step to help them to brace themselves against the manipulative tactics used by extremist movements.

• Julia Ebner, an Austrian journalist, is a researcher at the London-based Institute for Strategic Dialogue