Earlier this summer, John Oliver called Facebook “History’s Most Profitable Data-Harvesting Machine,” which was charitable compared to Sunday night’s 20-minute takedown of the company’s hate speech policies and content moderation, especially in countries that are newly coming online.

Oliver compared Facebook unfavorably to toilets, which “make shit go away, whereas Facebook retains shit, disseminates shit to your acquaintances, and reminds you of shit from seven years ago while allowing corporations to put their shit in front of you. What I’m saying is there’s a purity and integrity to toilets that Facebook seriously lacks.”

If you haven’t been following Facebook’s various content moderation-related problems—especially outside the United States—it’s a pretty good primer. Facebook used its Internet.org and Free Basics programs to quickly expand into the developing world, became synonymous with “the internet” in many of those countries because it’s free to use, and then suddenly found itself spreading fake news and disinformation with no plan to stop it.

As Reuters and VICE News have reported, Facebook was especially unprepared in Myanmar, where the company at first did not have the ability to parse Burmese text and was extremely understaffed in moderators with cultural and language expertise. There, it has been blamed for spreading fake news and amplifying ethnic violence against Muslims by the Buddhist majority.

An August investigation by Motherboard gets a shout-out in the piece, as Oliver references one of Facebook’s many internal content moderation rules—in this case, one that specifies the narrow instances in which photoshopped anuses are allowed on the site. Facebook has similar rules for hate speech (drawing the line between what’s allowed and what isn’t in ways that are often difficult to understand), which cut across cultures, countries, political regimes, and geographic borders.