An important detail is that none of these people technically work for Facebook, but for companies that Facebook contracts with. Contract work lets Facebook "scale globally" while paying workers a tenth of what it pays full-time employees. It also, by design or coincidence, shields Facebook's full-time employees (including programmers and management) from exposure to any of this. Even setting aside the horrific content, the workers' conditions sound like those of so many other hyper-stressful, micromanaged workplaces, such as Amazon distribution centers. Their time is monitored constantly, they can't leave their desks without activating a web browser extension, and they can be fired if their "accuracy" score drops below 95 percent. That last detail might sound like an appropriately rigorous standard, but "accuracy" here simply means that your superiors draw the same conclusion you do, not that your calls adhere to objective criteria.

Of course, there are theoretically "objective" criteria: Facebook's guidelines for dealing with potentially hateful, graphic, or violent content. The problem is that a strict, legalistic reading of those policies produces strange gaps. Promoting eugenics and forced sterilization is fine, for example, as long as it's directed at non-protected categories like "people with autism." It's this kind of literal-minded reading that, anecdotally, leads Facebook to moderate posts calling out racism while leaving up neo-Nazi propaganda. In fact, Facebook's neo-Nazi problem has been an ongoing one, and despite public vows to change the company's ad-targeting system, advertisers are still able to target users interested in Nazis.

Clearly, Facebook needs to do a better job. That's the same conclusion reached by a British parliamentary committee's year-long investigation into the company's business practices before and after the Cambridge Analytica scandal. After determining that Mark Zuckerberg failed to show "leadership or responsibility," the committee called for additional regulations administered by independent regulators, funded by new taxes on social-media companies. It's ambitious, but the past few years have shown that Facebook itself is completely unable to deal with Facebook's problems.