Online platforms continue to absorb more and more of our society, while the companies in charge neglect the human beings tasked with cleaning up the mess that’s left behind.


Last week, YouTube did something unprecedented. Awash in criticism over the discovery of a network of child predators using the platform’s comment sections to share timestamps and screenshots of underage users from implicitly sexual angles, the company disabled comments on almost all videos featuring minors. Only a small number of channels featuring minors would be able to stay monetized — as long as they “actively moderate their comments.” The decision, made by a company that has long stressed the importance of algorithms, seems a tacit acknowledgement that human moderation is currently the best solution for policing harmful content.

Moderating content and comments is one of the most vital responsibilities on the internet. It’s where free speech, community interests, censorship, harassment, spam, and overt criminality all butt up against each other. It has to account for a wide variety of always-evolving cultural norms and acceptable behaviors. As someone who has done the job, I can tell you that it can be a grim and disturbing task. And yet the big tech platforms seem to place little value on it: The pay is poor, the workers are often contractors, and the job is frequently described as something best left to the machines.

Last week, for instance, The Verge published an explosive look inside the facilities of Cognizant, a Facebook contractor that currently oversees some of the platform’s content moderation efforts. In the story, employees who requested anonymity for fear of losing their jobs described the emotional and psychological trauma of their work. Some smoked weed during breaks to calm their nerves. Others described being radicalized by the very content they were charged with policing. Most made just $28,000 a year. Facebook moderators in developing countries like India are even worse off, according to a recent Reuters report. Contractors at Genpact, an outsourcing firm with offices in the southern Indian city of Hyderabad, each review about 2,000 posts over the course of an eight-hour shift. They make about $1,400 a year; that’s roughly 75 cents an hour.


It’s clear that platforms like Facebook and YouTube regard human moderators as something they can eventually optimize away. Last spring, while being questioned in front of Congress, Mark Zuckerberg referenced artificially intelligent moderation more than 30 times. In the meantime, though, the human moderators at Facebook and YouTube spend their days getting high to numb themselves so they can keep scrubbing suicides from our News Feeds. These companies keep telling us to ignore the trash in the streets, saying it’ll all get better once they can figure out how to get the garbage trucks to drive themselves. The mega-platforms that have turned the world into one giant comment section have shown time and time again that they have little interest in hiring real people to moderate it. Every week some new scandal flares, and we watch as Facebook, YouTube, and Twitter play whack-a-mole, promising us this sort of thing won’t happen again. Meanwhile, the thin layer separating platforms like Facebook and YouTube from complete 4chan-style chaos is cracking. When the barbarians are already inside the gates, you don’t tell the villagers to stay tuned for an algorithmic solution.



Maybe one day AI will be able to instantly and effectively police the whole internet, but in the meantime, we’re all trapped in an endless comment thread from Reddit’s /r/The_Donald without a mod in sight. Community moderators, content moderators, audience development editors — they’re all shades of the same extremely important role, one that has existed since the birth of the internet. It’s the person who looks at what’s being posted to a website and decides whether a piece of content or a user should stay or be taken down. It’s like combining a sheriff and a librarian. Usenet, one of the first real online communities, had moderators for some of its newsgroups. Subreddits have them. Discord servers have them. Online communities have always been defined, in some way, by their moderators. The comedy website Something Awful famously kept a huge list of every banned user and why they were banned. The feminist humor site the Toast used to have one of the nicest, most uplifting comment sections on the internet. For a while, one of the strictest, most heavily censored communities on the internet was the Neopets message board, with mod drama kicking off weekly.

For about nine months, I worked as BuzzFeed’s comment moderator. Every day, I’d come in and pull up a feed showing every comment on the website. The feed held about 100 comments at a time; when I asked how many I should clear a day, I was told about nine refreshes was normal. BuzzFeed’s system in 2012 was actually a lot more intuitive than other moderation feeds I’ve seen. A toolbox hovered to the right of the screen, letting me block comments as I saw them or ban users outright. My favorite tactic was a technique called “shadow banning,” where a banned user doesn’t realize that they’re the only one who can still see their own comments. For the most part, I would delete spam, give fun stickers to funny commenters, and scan for hate speech.
At the time, BuzzFeed was still pretty small, so it wasn’t a particularly difficult job. The hardest days, though, were when we’d get attacked by another online community. The tactic is called “brigading”: a community like Reddit or 4chan, or the neo-Nazi message board Stormfront, would flood our comment sections with gore, pornography, and hate speech. If this sort of thing happened overnight — which it usually did — I’d end up working through lunch to clean things up. After days like that, I’d usually spend my nights silently staring off into space, not because I was particularly traumatized, but because there’s really only so much vitriol and toxicity a person can absorb before it all stops meaning anything. Those brigade days are what every day is like now.