Senator Josh Hawley (R, Mo.) on Capitol Hill in Washington, D.C., January 3, 2019 (Aaron P. Bernstein/Reuters)

If a book contains libelous statements, its author can be sued. So can the publisher who chose to release it. But libraries and stores that stock the book are generally safe. The broad pattern: the more oversight you exercise over speech that isn't yours, the more likely you are to be held responsible for it.

How that principle should apply to websites where users post their own writing is a tricky question. In 1995, interpreting then-current law, a New York state court found in Stratton Oakmont v. Prodigy that because the company Prodigy exercised considerable oversight over its message boards, it could be considered a "publisher" and held liable for what was on those boards.

But shortly thereafter, Congress chose to draw the line differently. In Section 230 of the 1996 Communications Decency Act, it stated that websites could not be held liable for posts provided by third-party individuals, no matter how much moderating, curating, and censoring the sites did.

I don't agree with Senator Josh Hawley's approach to this provision. He would yank Section 230 protection from any major tech company that couldn't prove, to the satisfaction of a supermajority of Federal Trade Commission commissioners, that it doesn't discriminate against political parties and viewpoints. This poses obvious First Amendment problems and gives unelected bureaucrats far too much power over private companies.

But it is worth thinking about whether Section 230 drew the line in the right place for today's Internet, which is no longer a fledgling technology but a major part of the economy increasingly controlled by a few humongous companies. It isn't at all obvious that these companies should bear no responsibility for what is posted to their sites, especially when they already extensively censor those very sites. And it wouldn't be unreasonable, at least for the largest of these companies, to condition Section 230 immunity on sites' operating as open platforms, removing material and kicking off users only in limited circumstances (illegal postings, spam, etc.). Perhaps courts should also get new guidance on how to treat websites operating outside of Section 230, so it's clear from the outset what those sites are expected to do to prevent or remove illegal posts.

Could we actually write a law that spells out such a rule in a workable way? I don’t know. I wouldn’t bet on it. But I do know Hawley’s bill isn’t it — and that I’m nonetheless grateful to him for at least starting the conversation.