The social network says that in recent years it has been developing artificial intelligence to spot problematic posts, but the technology isn't sophisticated enough to replace the need for significant amounts of human labour.
Facebook is under intense scrutiny from lawmakers, who have taken top executives to task in two high-profile hearings on Capitol Hill this year and are considering new regulations that would hold technology companies to a more stringent standard of responsibility for illegal content posted on their platforms.
The complaint also charges the Boca Raton, Florida-based contracting company, Pro Unlimited, Inc., with violating California workplace safety standards.
Pro Unlimited didn't respond to a request for comment.
The lawsuit does not go into further detail about Ms. Scola's particular experience because she signed a non-disclosure agreement that limits what employees can say about their time on the job. Such agreements are standard in the tech industry, and Scola fears retaliation if she violates it, the suit says. Her attorneys plan to challenge the NDA, but are holding off on providing further detail until a judge weighs in.
The suit notes that Facebook is one of the leading companies in an industry-wide consortium that has developed workplace safety standards for the moderation field. The complaint alleges that, unlike its industry peers, Facebook does not uphold the standards it helped develop.
In 2017, two former content moderators also sued Microsoft, claiming that they developed PTSD and that the company did not provide adequate psychological support.
The suit asks that Facebook and its third-party outsourcing companies provide content moderators with proper mandatory onsite and ongoing mental health treatment and support, and establish a medical monitoring fund for testing and providing mental health treatment to former and current moderators.
- Washington Post