BEFORE THE death of Heather Heyer in Charlottesville, the Daily Stormer — a neo-Nazi website involved in organizing the white supremacist rally that led to her killing — was easy to find: all you had to do was type in the Web address. Now the site has all but vanished from the Internet. That's due to the decision of a handful of Internet companies to reject the publication as a customer in the wake of Charlottesville — a reasonable choice that nevertheless raises difficult questions about limiting speech online.

After the Daily Stormer published a post crowing over Ms. Heyer's death, the company hosting the website and providing it with a domain name withdrew its services, booting the site offline. The website bounced from service to service as each rejected it in turn. Then, Cloudflare — a company that provides protection from cyberattacks — pulled the plug as well. Without Cloudflare's support, hackers have knocked the website offline each time it's tried to reemerge. Currently, the site exists only on a hidden corner of the "dark web," off-limits to casual browsers.

While Cloudflare didn't block the website from the Internet per se, any site without the protections it offers is vulnerable to being kicked offline by vigilante hackers. And though users banned from social-media platforms can always migrate to a new service, the Web's infrastructure is made up of a relatively small number of companies such as Cloudflare. The fewer alternatives there are for services such as domain name registration and cyberattack protection, the more ability each provider has to decide which websites should be online.

As a space maintained by private enterprise, the Web is outside the scope of First Amendment protections. But freedom of expression has always been a key cultural, if not legal, value of the Internet — and that free flow of information has made the Web into a vibrant forum central to democratic life.

Nevertheless, in recent years, the Internet has also begun to reckon with the danger posed by certain kinds of speech. Social-media sites such as Facebook and Twitter are working to limit harassment, calls to violence and disinformation across their platforms. Companies that make up the Web's infrastructure now face the question of whether to take a similar approach.

Businesses such as Cloudflare have no legal obligation to provide service to neo-Nazis. But the more power these companies have to determine unilaterally whose website gets to be online, the more carefully they should wield that power, weighing the value of free speech against its dangers. Developing clear, transparent standards for refusing a customer would be a valuable first step.

The U.S. government should hold back from weighing in on the scope of those standards. Government regulation defining which websites may be removed from the Internet would risk legitimizing the repressive tactics of governments like those of China and Russia, which censor the Web in the name of preventing harm. In the absence of government action, Internet companies should take care to be honest and open about their decision-making.