And yet there is no sign of these people on a platform like Facebook or Twitter. One can register complaints, but the facelessness of the bureaucracy is total. That individual people are involved in this work has only recently become better known, thanks to scholars like Roberts, journalists like Adrian Chen, and workers in the industry like Rochelle LaPlante.

In recent months, the role that humans play in organizing and filtering the information that flows through the internet has come under increasing scrutiny. Companies are trying to keep child pornography, “extremist” content, disinformation, hoaxes, and a variety of unsavory posts off their platforms while continuing to keep other kinds of content flowing.

They must keep the content flowing because that is the business model: Content captures attention and generates data. They sell that attention, enriched by that data. But what, then, to do with the pollution that accompanies the human generation of content? How do you deal with the objectionable, disgusting, pornographic, illegal, or otherwise verboten content?

The one thing we know for sure is that you can’t do it all with computing. According to Roberts, “In 2017, the response by firms to incidents and critiques of these platforms is not primarily ‘We’re going to put more computational power on it,’ but ‘We’re going to put more human eyeballs on it.’”

To examine these issues, Roberts pulled together a first-of-its-kind conference on commercial content moderation last week at UCLA, in the midst of the wildfires.

For Roberts, the issues of content moderation don’t merely touch on the cost structure of these internet platforms. Rather, they go to the very heart of how these services work. “What does this say about the nature of the internet?” she said. “What are the costs of vast human engagement in this thing we call the internet?”

One panel directly explored those costs. It paired two people who had been content moderators: Rasalyn Bowden, who became a content-review trainer and supervisor at Myspace, and Rochelle LaPlante, who works on Amazon Mechanical Turk and is the cofounder of an organizing platform for people who work on that platform, MTurkCrowd.com. They were interviewed by Roberts and a fellow academic, the University of Southern California’s Safiya Noble.

Bowden described the early days of Myspace’s popularity, when the company was suddenly overwhelmed with inappropriate images, or at least images it thought might be inappropriate. It was hard to say what should be on the platform because there were no actual rules. Bowden helped create those rules, and she held up to the crowd the notebook where those guidelines were stored.

“I went flipping through it yesterday and there was a question of whether dental-floss-sized bikini straps really make you not nude. Is it okay if it is dental-floss-size or spaghetti strap? What exactly made you not nude? And what if it’s clear? We were coming up with these things on the fly in the middle of the night,” Bowden said. “[We were arguing] ‘Well, her butt is really bigger, so she shouldn’t be wearing that. So should we delete her but not the girl with the little butt?’ These were the decisions. It did feel like we were making it up as we were going along.”