Content Moderation At Scale Is Impossible; Naughty Kids In Wuhan Edition

from the masnick's-impossibility-theorem dept

I keep trying to point out that content moderation at scale is impossible to do well for a whole variety of reasons, including the fact that sooner or later some people -- or some large groups of people -- may try to game the system in totally unexpected ways. Witness this amusing example from the London Review of Books, reporting on the situation in Wuhan, China, which was ground zero for the Covid-19 coronavirus outbreak. With everything shut down in and around Wuhan, schools have moved to online learning -- and some naughty kids seem to have worked out a way to try to get out of having to do schoolwork: getting the app the schools rely on pulled from the app store via fake negative ratings.

Schools are suspended until further notice. With many workplaces also shut, notoriously absent Chinese fathers have been forced to stay home and entertain their children. Video clips of life under quarantine are trending on TikTok. Children were presumably glad to be off school – until, that is, an app called DingTalk was introduced. Students are meant to sign in and join their class for online lessons; teachers use the app to set homework. Somehow the little brats worked out that if enough users gave the app a one-star review it would get booted off the App Store. Tens of thousands of reviews flooded in, and DingTalk’s rating plummeted overnight from 4.9 to 1.4. The app has had to beg for mercy on social media: ‘I’m only five years old myself, please don’t kill me.’

I must tip my cap to the cleverness here, but it shows, yet again, just how difficult content moderation really is. No one running an app store or other platform prepares for a situation like this. In this case, at least, it seems likely that with so many negative reviews -- and now press attention -- the platform will take notice and discount the most recent flood of reviews. But imagine having to keep track of every case where this happens, often on a much smaller, less obvious scale.

What seems easy about content moderation almost never is. Everyone thinks it's simple until they're actually running a platform.


Filed Under: content moderation, content moderation at scale, coronavirus, covid-19, dingtalk, ios, remote learning, students, wuhan

Companies: apple