The full text of this Article may be found by clicking on the PDF link to the left.

Private online platforms play an increasingly essential role in free speech and participation in democratic culture. But while it might appear that any internet user can publish freely and instantly online, many platforms actively curate the content posted by their users. How and why these platforms moderate speech is largely opaque.

This Article provides the first analysis of what these platforms are actually doing to moderate online speech under a regulatory and First Amendment framework. Drawing on original interviews, archived materials, and internal documents, this Article describes how three major online platforms — Facebook, Twitter, and YouTube — moderate content, and it situates their moderation systems within a broader discussion of online governance and the evolution of free expression values in the private sphere. It reveals that private content-moderation systems curate user content with an eye to American free speech norms, corporate responsibility, and the economic necessity of creating an environment that meets the expectations of their users. To accomplish this, platforms have developed a detailed system, rooted in the American legal tradition, that combines regularly revised rules, trained human decisionmaking, and reliance on a network of external influence.

This Article argues that to best understand online speech, we must abandon traditional doctrinal and regulatory analogies and understand these private content platforms as systems of governance. These platforms are now responsible for shaping and allowing participation in our new digital and democratic culture, yet they have little direct accountability to their users. Future intervention, if any, must take into account how and why these platforms regulate online speech in order to strike a balance between preserving the democratizing forces of the internet and protecting the generative power of our New Governors.

* Ph.D. in Law Candidate, Yale University, and Resident Fellow at the Information Society Project at Yale Law School. Research for this project was made possible with the generous support of the Oscar M. Ruebhausen Fund. The author is grateful to Jack Balkin, Molly Brady, Kiel Brennan-Marquez, Peter Byrne, Adrian Chen, Bryan Choi, Danielle Keats Citron, Rebecca Crootof, Evelyn Frazee, Tarleton Gillespie, Eric Goldman, James Grimmelmann, Brad Greenberg, Alexandra Gutierrez, Woody Hartzog, David Hoffman, Gus Hurwitz, Thomas Kadri, Margot Kaminski, Alyssa King, Jonathan Manes, Toni Massaro, Christina Mulligan, Frank Pasquale, Robert Post, Sabeel Rahman, Jeff Rosen, Andrew Selbst, Jon Shea, Rebecca Tushnet, and Tom Tyler for helpful thoughts and comments on earlier versions of this Article. A special thank you to Rory Van Loo, whose own paper workshop inadvertently inspired me to pursue this topic. Elizabeth Goldberg and Deborah Won provided invaluable and brilliant work as research assistants.