To cope, some companies have tried to replace human moderators with algorithms. The results have been mixed at best. Some of the most high-profile failures were at Facebook, where algorithms censored archaeological images of a 30,000-year-old nude figurine while allowing live video of suicides to circulate widely. Facebook promised last year to hire thousands of human moderators — and, in some cases, to provide them with trauma therapy.

Those are good first steps for disaster-response moderation, but we also need to revive what Ms. West called the tummler part of the job. It’s a tough gig, but it can be done, especially if companies admit that there is no one-size-fits-all solution for moderation.

This is why human moderators are so valuable: they can understand what’s important to the community they’re moderating. On the Reddit forum r/science, for example, moderators will delete posts that aren’t based on peer-reviewed scientific research. And on the fan-fiction site Archive of Our Own (AO3), where many people prefer to post stories under pseudonyms, members can be banned for revealing another member’s legal name.

A well-trained moderator enforces these rules not just to delete abuse, but also to build up a unique community. At AO3, for example, there is a class of moderator called a “tag wrangler,” whose job is to make sure stories are labeled properly for users who don’t want “Iron Man” fic mixed in with “Iron Giant” fic. Or “Iron Chef”! The forum is also recruiting bilingual moderators who can answer questions and post items of interest for its growing community on Weibo, China’s most popular microblogging site.

Monique Judge, an editor at the black news site The Root, told me that she and her colleagues are inundated with racist comments. But instead of banning the commenters, or deleting their words, The Root lets them stand. “We let those stay so that people can see how ignorant they are,” she said. “I feel like those comments are just our reality as black journalists. No matter what we talk about, people will say, ‘Don’t discuss this because you’re black.’”

Ms. Judge’s point is that context matters. Racist comments mean one thing in The Root’s community, where black perspectives are centered, and quite another on Twitter, where they are not.

Moderators aren’t the only ones responsible, though. They are effective only if they have the support of their employers. Anil Dash, a social critic and podcaster who runs the app development community Glitch, once argued, in an essay that has become a classic among moderators, that if a website’s comment section is full of jerks, “it’s your fault.”