In a statement to Engadget, Facebook said it was "reviewing" the lawsuit, that it takes moderator support "incredibly seriously," and pointed to its existing assistance, including "in house" psychological and wellness support as well as similar requirements for its third-party partners. You can read the full statement below.

The lawsuit underscores the fine line Facebook must walk to police content from billions of users. It needs thousands of human moderators to quickly take down policy-violating content that might otherwise slip through the cracks, but there's only so much those people can endure. Facebook has to balance effectiveness against the mental well-being of its contractors, and this lawsuit suggests it doesn't always get that balance right.