Bad comments are a system failure

So why can’t you fix them like any other bug?

Good moderation has a top down view of the conversational ecosystem and the underlying technology.

Internet comments are awful. Recently, sites like Popular Science, Bloomberg Business, Reuters, Mic, The Week, re/code, The Verge, and now The Daily Dot have given up on hosting comments altogether. They blame trolls, spambots, and the shift in engagement to platforms like Facebook, Twitter, and Reddit, and you can almost hear the sighs of relief in the articles announcing these decisions.

This “What can you do? People are awful, amirite?” attitude toward comment sections is fatalistic and misguided. If you don’t want comments on your website, that’s fine; don’t have them. But don’t act like comments are some intractable problem that can’t realistically be addressed by mortals. They’re not. There are only a few reasons most internet comment sections are terrible, and there are real-world solutions to those problems. Be honest: you could fix this, but your priorities are elsewhere.

MetaFilter, a site I helped run for a decade, has maintained a community based on conversation for over fifteen years. It’s nothing but comments. It’s mostly not awful. The Daily Dot’s article claims “No one has quite figured out how to thread that needle” between having a vibrant online community and supporting all voices, yet it then goes on to say that “commenting systems take thoughtful moderation and constant development,” strongly implying they’ve decided to have neither.

MetaFilter is a community of Internet People, people who spend a lot of time online. Everyone who spends a lot of time online is online for a reason. I am online a lot because I live in a rural area, keep late hours, and want people to socialize with when my town is asleep. And I like making jokes with other nerds who understand and appreciate them. Some people work swing shift or are otherwise time-shifted, are expatriates where people don’t speak their language, are caring for family members at home, only like to interact when they can multitask, have a disability or social anxiety that makes online communication a better option for them, or are just better at communicating through text than face to face. Understanding your community of people who are heavy online users is part of learning how to manage them and help them be their best selves.

No Reset Button

As to why online conversations go badly, they’re often full of specific kinds of people: very smart people; very anxious people; very frustrated people; very verbose people; people with high IQs and low empathy, or sometimes the reverse. And there’s no reset button on their conversations. Even your corner bar closes to kick the drunks out and mop up every night. The assholes and the unruly and the people with social problems that they’re publicly self-medicating with alcohol all have to go somewhere else for a while. This doesn’t happen in most comments sections. What starts out as a small bit of grar can turn into a huge raging shitfest if left to fester unattended for a few days.

Go home internet, you’re drunk.

Having threads that close, having moderators that redirect entrenched disagreements, giving users timeouts if they can’t get with the program, all of those can help a community reset and get back on track. These are time-tested strategies that work, but they require human attention and are difficult to automate. This means resources, usually money.
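As a rough illustration of how simple these reset mechanisms can be in software, here is a minimal sketch in Python. All of the names and numbers (a 30-day thread lifetime, a one-day timeout) are hypothetical assumptions, not how MetaFilter or any particular site actually implements this:

```python
from datetime import datetime, timedelta

# Hypothetical policy knobs: threads close after a fixed window,
# and a moderator can put a member in a short timeout.
THREAD_LIFETIME = timedelta(days=30)
TIMEOUT_LENGTH = timedelta(days=1)

class Thread:
    def __init__(self, opened_at: datetime):
        self.opened_at = opened_at

    def is_open(self, now: datetime) -> bool:
        # Comments close automatically once the thread ages out,
        # so no conversation festers unattended forever.
        return now - self.opened_at < THREAD_LIFETIME

class Member:
    def __init__(self):
        self.timeout_until = None  # no timeout by default

    def give_timeout(self, now: datetime):
        # A moderator action: the member sits out for a while.
        self.timeout_until = now + TIMEOUT_LENGTH

    def can_post(self, now: datetime) -> bool:
        return self.timeout_until is None or now >= self.timeout_until

def may_comment(member: Member, thread: Thread, now: datetime) -> bool:
    # Both the thread and the member must be in good standing.
    return thread.is_open(now) and member.can_post(now)
```

The hard part, of course, is not this logic but the human judgment about when to use it, which is exactly why it resists automation.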

Context Collapse

Conflicting conversational contexts lead to a constant restatement of terms and values, and endless nitpicking about meaning versus use in that “I didn’t call you an asshole, I just said that assholes talk like that…” way. For years this has been discussed in more academic circles as “context collapse.” You have an identity and a set of ideas about the world that exists and is understood in one social context. You want to bring it to another place and not have to do a five-minute introduction about who you are and what you value every time you say anything. Other people don’t share the same preset understandings and may read more into what you are saying than you think you put there. Your jokes fall flat, or cause offense. Conversation devolves into side discussions and arguments about first principles and word definitions. People start citing the dictionary and Wikipedia and angrily talking past each other.

People need to know who they are talking to.

The community has to make decisions about its values. Are “101” discussions like What Is Feminism or What Is Racism tolerated, encouraged or out and out disallowed? Is your community a safe space with mechanisms like trigger warnings and spoiler alerts, or not? Are those expectations explicit and enforced by someone who is contactable and respected? Some initial work at creating practical and enforceable ground rules can keep every contentious discussion from turning into a first-principles slugfest.

The Lie of the Self-Moderating Community

This is the dream. Build it and they will come. They will not destroy it because it’s theirs, it’s the internet and it’s super democratic, right? But why should this work online if it doesn’t work offline? The alleged democratic nature of the online world is even more of a myth than it is in the offline world. Online, someone pays the website bills and has the passwords to the back end, someone registered the domain, someone makes money off of the site, someone has admin tools. Those people have more power than the other people in the community, including the power to decide to do nothing with those powers. The decisions they make set the tone of a community more than anything else, more than the user interface, more than the slick design, more than the user population itself.

Non-admin users can certainly build up social capital and power over time. They can become trusted users that other community members look up to and emulate. They can become power users who flag problematic content and communicate about site issues with an admin team. However, they can’t make site-wide decisions and set policy without having the keys to the store; they can’t speak for the site owners, or shouldn’t. Giving volunteer users some admin-like powers without compensating them somehow is a potentially exploitative situation for any site that makes money.
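The flagging role described above can be sketched in a few lines. This is a hypothetical illustration, not any real site’s system; the threshold of three flags and all the names are my own assumptions. The key design choice is that flags queue content for a human moderator rather than removing anything automatically, so the final call stays with the people who actually hold the admin tools:

```python
from collections import defaultdict

# Hypothetical threshold: this many distinct flaggers puts an item
# in front of a human moderator.
FLAG_THRESHOLD = 3

class FlagQueue:
    def __init__(self):
        self.flags = defaultdict(set)   # comment_id -> set of flagger ids
        self.for_review = []            # comment ids awaiting a moderator

    def flag(self, comment_id: str, user_id: str):
        # A set means each user's flag counts only once per comment.
        self.flags[comment_id].add(user_id)
        if (len(self.flags[comment_id]) >= FLAG_THRESHOLD
                and comment_id not in self.for_review):
            self.for_review.append(comment_id)
```

Flags surface problems; they don’t decide them. That keeps power users useful without handing them policy-making authority they haven’t been given, or compensated for.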

These people are at your tea party. Are you being a good host?

Moderating is about sharing as much of the top-down decision making as possible while at the same time keeping the community from eating itself. Off-the-shelf technology for facilitating moderation is rarely up to the job. Custom tools built for a community’s specific needs encourage a feeling of ownership, of buy-in, of specialness. Staff who run the site should be compensated decently and given the right tools to be effective, tools that can grow as the community grows.