Much of the internet runs on volunteer labor performed by people who often go unnoticed, such as online community moderators. When these people are recognized, it’s usually because they’ve become targets of harassment, are embroiled in a flamewar, or are accused of abusing their power.

Moderators make message boards, Reddit, Facebook groups, email listservs, and many other online communities function, and yet mainstream academics have spent little time studying good internet moderation, or the psyche of a moderator. Kat Lo, a PhD student at the University of California, Irvine, is bridging that gap by researching online communities at a time when most major platforms are trying to reckon with widespread harassment.

“Eight years ago I started moderating communities, especially the girlgamer subreddit,” Lo told me. “I was so interested in thinking about making policies that people can believe in and helping people enforce those policies in their own communities so it’s not a top-down decree.”

There is no unified theory of community management or moderation, but platforms are currently trying to keep themselves open and as impartial as possible while reckoning with various harassment campaigns, be they GamerGate, the alt-right, neo-Nazis, or the more run-of-the-mill flamewars that have long been a part of internet culture.

What’s largely happened is that people who have traditionally been marginalized by society have been marginalized online, too.

On large online platforms, harassers “feel safe because they are safe,” Lo said. “There aren’t a lot of consequences. We talk about anonymity, but that’s a misdirection: Look at Facebook comments—there’s a lack of consequences and people aren’t buying into the norms of a community and are imposing their own thoughts on what’s possible.”

On a day-to-day basis, it’s often unpaid moderators who end up keeping toxicity out of an online community. Moderators are tasked with deleting graphic images and videos, deflecting vitriol, enforcing rules, and ensuring their communities continue to function. Then, in the act of moderating, they’re often shamed by the community for censorship. It’s a thankless, difficult job.

“It’s a far more complex job than just banning people,” Lo said.

“A lot of moderators burn out. Well, we call it ‘burning out’—they’re fatigued, they’re demoralized, and they have an aversion to doing it,” she added. “But the things people are describing are symptoms of trauma. Moderators determine a lot of culture that happens on the internet and they do hold a lot of power, but simultaneously they hold a lot of trauma.”

Besides her research, Lo has begun doing volunteer crisis counseling for moderators, streamers, YouTubers, developers, and academics who have been harassed or have otherwise experienced online trauma.

“Almost everyone I’ve counseled has said ‘I didn’t know a person like you existed,’ or ‘I didn’t know anybody else could understand these problems,’” she said. “I am trying to empower people on an individual level and I’m hoping those people can use those skills to build their own communities. When you have these moments with people on a smaller scale, it makes doing this work feel sustainable.”

It’s not all bleak, of course. It’s important that academics are beginning to take these jobs seriously, and online platforms are beginning to hire community experts who can offer support for moderators and enact changes that can make entire platforms safer for everyone. Five years ago, it might have seemed crazy that a Reddit moderator would pursue a doctorate in, broadly speaking, Reddit moderation. Now, it seems absolutely imperative that more people do the same.

Humans of the Year is a series about the people building a better future for everyone. Follow along here.