reddit takes a new direction

Several months ago, reddit shook with the news that longtime subreddit /r/jailbait — dedicated to, well, you can probably guess — had been shut down by its own moderators. Yesterday, reddit shook again, with the news that a variety of other subreddits, arranged on similar topical lines, were being shut down by reddit’s admins.

Predictably, this has caused a shitstorm. It has also caused calls for bans of other subreddits which have nothing to do with sexual fetishes involving children. There are, of course, plenty of subreddits devoted to other deeply disturbing things, and as long as reddit’s getting rid of the kiddie diddlers, why not the Nazis and the wife-beaters too?

And, also predictably, there are people who find themselves in the unenviable position of defending things which are beyond the pale, by arguing that reddit should simply confine itself to following applicable laws — by removing/reporting/cooperating-in-the-prosecution-of actually illegal things — and otherwise let the subreddits moderate themselves.

I am not going to get up on my soapbox here and defend or oppose anything; I’m simply going to talk a bit about this problem, and why it’s so damned hard. People who are both wiser and better writers than I am have already covered this territory pretty thoroughly, so if you’re looking for high-quality reasoning and presentation I suggest you instead have a read of Clay Shirky’s A Group Is Its Own Worst Enemy and Neil Gaiman’s Why defend freedom of icky speech?. What follows here is nothing more than my own possibly-incoherent rambling.

Reminiscences of an old fart

First, a bit of background.

It’s 2012, and I’ve been involved in user-run online communities for nearly fifteen years now. I’ve been on just about every side of the equation — I’ve been a user, a moderator, an admin, a developer — and I’ve seen this issue, or something close enough to it, come up over and over and over again. It was coming up before I had ever been online, and it will almost certainly continue to come up for as long as something resembling the internet we know continues to exist. Deciding what will be allowed and where to draw the line (and who gets to draw that line) is the original permathread.

I originally cut my teeth on sites and services that, frankly, make 4chan and reddit (today’s all-too-frequent bogeymen) look like a knitting circle. Some of them are still around today, though they’re shadows of their former selves; others have vanished into the dustbin of internet history. All of them tried to confront this issue at some point, though I couldn’t really say whether any of them succeeded. Mostly this is because all of them, sooner or later, had the same Very Serious Discussion About What Sort of Community We Are, debating the same sorts of extreme stuff, with the same inevitable hand-wringing, the same inevitable trolling and the same inevitable decision to do one thing or another: more moderation, less moderation, factions splitting off into their own niche sites, shutting everything down or selling out. And in every case the only consistent result was that what existed after was different from what existed before. For example, I don’t know if k5 was “better” or “worse” after a bunch of the old-timers split and went over to HuSi; I don’t have any sort of special wisdom to let me judge that. I just know that k5 after was a different place than k5 before, and that after I didn’t really feel like the community I’d been a part of existed anywhere anymore.

Ultimately I moved on from my online youth, having been enriched slightly by experience, a bit of money (since I had started helping to build community sites) and a few lasting friendships. I still keep tabs on one or two things, for old time’s sake, but the novelty has long since worn off. Back then everything was new and exciting, and I was young and idealistic. Now I’m a bit older, a bit more cynical, and a lot less excitable than I once was, especially because I can’t escape the feeling that there really is nothing new under the sun.

I mention all of this not to try to claim some sort of unique perspective, or to lord it over the poor newbs who are making all the same mistakes all over again, but simply to provide some context for where I’m coming from and to reinforce that this really is not a new problem.

This shit is hard

The most relevant example I can think of, as an analogue to what’s happening with reddit right now, is LiveJournal circa 2007. Yes, it’s easy to look down our noses and sneer at poor old LiveJournal, but basically every pattern of interaction or behavior that’s worth knowing about has happened there, probably multiple times. I know this because, once upon a time, I was a somewhat active user of LJ; it was a safe, supportive and, above all, privacy-respecting space, and there was a point in my life when I desperately needed that.

But in May of 2007, LiveJournal debuted a new set of policies, and a corresponding crackdown on journals and communities, in the interest of “protecting children”. LJ had long been a target of activist groups concerned about pedophilia and child pornography, and those groups rejoiced at seeing direct action taken.

If you’re interested, the sordid details of this are mostly still hanging around online, and a bit of careful Googling will turn them up. But the crux of the issue was that, in cracking down on things which were clearly unacceptable, LiveJournal had also — they claimed, inadvertently — cracked down on a bunch of other stuff. There were silly fanfic communities caught in the dragnet. There were experimental-fiction groups caught in the dragnet. There were, both inevitably and ironically, groups dedicated to serious literary discussion of Vladimir Nabokov caught in the dragnet. And the policy went so far that — though whether any such groups were actually shut down is unclear from the sources that are still online — support groups for survivors of sexual abuse were on the wrong side of the rules.

That last one’s the killer, really.

If you sit down and think about it, it’s easy to see that, say, a support group for survivors of childhood sexual abuse must, by its nature, end up involving and allowing descriptions of horrific, often-violent sexual acts perpetrated against children. And such groups, by their nature, need to exist. In fact, they don’t just need to exist; they need to be actively encouraged and protected, because the work they do is undeniably good. So here we have a thorny problem: can we formulate a policy that allows the support group to exist, but keeps the indefensible stuff out?

This is hard, but actually it’s only half of the problem. Here’s the other half: let’s say you can come up with a policy that allows the support group to exist — hooray. But now it’s inevitable that somewhere, somehow, someone is going to end up masturbating to that material. No matter how many protective barriers you put up around those groups, they’re going to be the mother of all spank banks for some very dedicated perverts, and those people are going to get at it. And then what do you do?

It never ends

But really, that problem starts a lot earlier. A lot of Flickr users have learned, over the years, to be very careful about what sorts of photos they have publicly viewable, because sooner or later they start getting comments and favorites from people who have non-identifying usernames, never post photos, and only favorite pictures with small children in them. Anything can and will be sexualized, with sufficient motivation and/or desperation.

Some sites can solve this through privacy controls, but others — and reddit is one of these — just can’t do that without effectively ceasing to exist. For them, there are basically two options:

1. Give up and just fall back to what your lawyers say is the bare minimum to avoid being raided by the FBI, or
2. Start playing the endless game of whack-a-mole, as a small fringe comes up with ever-more-elaborate ways to get around your policies and keep doing what they do.

And, well, both of those options suck. They’re both no-win situations; one will get you repeatedly crucified for “supporting pedophiles”, and the other will just lead you further and further down the garden path until you wake up one morning and wonder why you just banned an abuse-survivor support group.

The takeaway

Most people who are discussing reddit’s policy change are doing so from an extremely naive, extremely simplified perspective. They’re arguing about things like what’s allowed by US law, or whether a policy is vague, or making broad emotional appeals, or arguing about who’s more offended than whom or suggesting other areas where broad banhammers could be applied or… well, anything that’s (relatively) straightforward and easy, rather than facing the fact that this is a gigantic, complex, scary issue with gigantic, complex, scary consequences no matter what path ends up being taken.

There are no easy ways to talk about this. There are no easy solutions. Hell, as far as anyone knows, there really aren’t any solutions of any sort. But an open, user-run and user-moderated community with minimal admin tampering is an awfully tempting dream, and lots of people have tried to make that dream real over the years, all with varying degrees of utter failure. See, for example, the famous “LambdaMOO Takes a New Direction”, and its aftermath (same page: “LambdaMOO Takes Another Direction”).

This sort of problem has always existed; these days it’s just more likely to end up in the spotlight, because the types of sites and services that run into it are getting ever larger and ever more ubiquitous. And that trend (of growth and ubiquity) isn’t showing any signs of slowing, so we’re going to see this happening — and happening very, very publicly — much more than in the past. Pretty soon we’re going to need, if not solutions, then at least genuine discussion that genuinely acknowledges how tough this stuff really is.

Meanwhile, I don’t envy reddit one bit.