Karen Kornbluh is a senior fellow and director of the Digital Innovation and Democracy Initiative at The German Marshall Fund, a non-partisan think tank and grant-making institution dedicated to deepening transatlantic ties. She is also a member of the board of the US Agency for Global Media and was previously the US ambassador to the Organization for Economic Cooperation and Development. The opinions expressed in this commentary are her own.

Since March of this year, at least three mass shootings have been announced on the website 8chan in screeds seemingly designed to spread dangerous ideology. The site, whose own founder has urged that it be shut down, provides a home for white supremacists to recruit, radicalize and plan. Yet Cloudflare, the company that provided 8chan with web hosting and security services, didn't terminate 8chan's account until earlier this month. 8chan has since struggled to stay online.

While the result is, for now, an improvement, Cloudflare should have terminated the site's account back in March, after the shooting in Christchurch, New Zealand, despite feeling "incredibly uncomfortable about playing the role of content arbiter." Internet companies have responsibilities beyond connectivity and must do better at assuming them. Hosting providers and social media platforms should refuse to be megaphones for speech that supports violent extremist ideologies. Platforms should remove all content that clearly incites violence, no matter who posts it, as well as white supremacist and nationalist content. When sites refuse to remove this content, as 8chan did, hosting companies should refuse to provide them service.

And if rogue websites do find a home, major social media platforms like Facebook, Google and Twitter should work together to block access to those sites from within their own services. Doing so would deny white supremacists the opportunity to spread their violent extremist ideology to the larger platforms' broader user bases. Google has already removed 8chan from its search results.

The prevailing ethos of internet hosting and connectivity companies has long been to leave content up and sites connected, preserving users' rights to free expression. But with white supremacy becoming a growing terror threat around the world, it's time to think through the circumstances under which it is acceptable to allow such content to stay online.

Of course, internet companies don't look the other way for all content. Since passage of the Protection of Children from Sexual Predators Act in 1998, they have been required to report child pornography to the National Center for Missing and Exploited Children. And for sex trafficking, other content that violates federal criminal law, and intellectual property violations, these companies do not have immunity under Section 230 of the Communications Decency Act, which famously protects them from liability for the other content they carry. As a result, internet companies do in fact regularly take down child pornography, sex trafficking and Islamic terrorist content when they become aware of it, and report it to the FBI.