Tumblr’s ban on “adult content” is a treasure trove of problems: filtering technology that doesn’t work, a law that forces companies to make decisions that make others unsafe, and the problems that arise when one company has outsized influence on speech. It’s also the story of how people at the margins find themselves pushed out of the places where they had built communities. And so Tumblr is also a perfect microcosm of the problems plaguing people on every platform.

In December of 2018, blogging platform Tumblr announced a new ban on “adult content,” a wonderfully vague term that later posts didn’t so much define as make more confusing. The ban came a month after Tumblr’s app disappeared from the Apple App Store, which, as a gatekeeper, has enforced draconian rules for app developers, exerting control over how its users get to experience the Internet. The existence of porn on Tumblr had also gotten the platform banned in Indonesia. Add to that the passage of SESTA/FOSTA, a law that makes platforms liable for what their users say and do if those things are tied to prostitution.

Companies—especially large ones—are risk-averse. So, when you’re being cut off from potential customers and could be held liable for anything related to sex, the easiest move is a blanket ban on sex and nudity. And the easiest way to enforce that ban is an over-inclusive filtering tool, one so aggressive that it winds up flagging your own examples of acceptable content. While Tumblr has never come right out and said what drove its decision, it’s a fair bet that some combination of these factors got it there.

And there’s a lot to say about the effect of this ban. About the queer and sex-positive communities that felt threatened and erased. About how hard it is to find somewhere to go that would be as safe as Tumblr had been. About how all of this is reflective of a very specific, sanitizing view of what’s acceptable online.

There’s a human cost to what Tumblr probably just sees as a business decision. The only semi-helpful result of this high-profile disaster in platform censorship is how well-publicized its failings have been.

Tumblr made what could charitably be described as “a lot” of mistakes in the rollout of the ban. It insisted that the new policy struck a balance: it banned adult content but would continue to foster a “diversity of expression” for people using the platform to discuss “art, sex positivity, your relationships, your sexuality, and your personal journey.” That sounds like a hard line to draw, right? It’s also an expensive one to police, particularly if you’re already policing other categories such as illegal activity.

So Tumblr decided to use an automated system. One that it admitted straight out would make mistakes. And make mistakes it did. The hashtag “#TooSexyforTumblr” had people sharing some of the more ridiculous and upsetting mistakes.

Which brings us to Sarah Burstein, a law professor whose Tumblr shares “new, notable, or otherwise interesting design patents.” Patents rarely contain pornography, but, again, automated filters are very bad at the job they’re supposed to do. Right after the new ban was announced, Burstein found a number of her posts being flagged. What did the filter have a problem with?

Well, a heart-shaped necklace, a boot-scrubbing design, LED jeans, troll socks, a Louis Vuitton bag, some boxes, a tire, a hanger, a flamingo floatie, shoes, pillows of all sorts, and so much more.

They’re almost exclusively black-and-white line drawings and diagrams. And who knows what setting on the filters caused the sudden deluge of flagging for Burstein. That, in itself, is a huge problem. Tumblr claims the policy is nuanced, but it is broad and clumsy, and it’s enforced by an automated system incapable of nuance, which produces absurd outcomes. Tumblr says it has to use these tools because nothing else works at scale, but that, again, misses the point: Tumblr didn’t need this policy in the first place. It already had rules against illegal content; acceding to puritanical notions of what is acceptable was a choice, not a necessity.

And because of all of this, the things users really need—consistency and clarity—are completely absent.