The case is the first of its kind brought by a content moderator outside the company’s home country. In May 2020, Meta (then Facebook) reached a $52 million settlement with U.S. moderators who developed PTSD as a result of their work. But previous reporting has found that many of the company’s international moderators do much of the same work for lower wages and with less support, in countries with fewer mental health services and weaker labor protections. While moderators in the U.S. earn about $15 an hour, moderators in places like India, the Philippines, and Kenya earn far less, according to 2019 reporting by The Verge.
“The whole point of sending content moderation overseas and far away is to distance yourself from it and reduce the cost of that business function,” said Paul Barrett, deputy director of the New York University Center for Business and Human Rights, who authored a 2020 report on outsourced content moderation. Yet content moderation is critical to the platform’s continued operation, blocking content that would drive users and advertisers away. “Content moderation is a core, important business function, not a peripheral or an afterthought. But the irony is that the whole arrangement is one of buck-passing,” he said. (A summary of Barrett’s report is included as evidence in the current Kenyan case on behalf of Motaung.)
Companies in other industries that rely on outsourcing, such as apparel, would find it inconceivable today to claim they are not responsible for the conditions under which their clothes are made, Barrett said.
“I think tech companies, younger and in some ways more arrogant, think they can pull this off,” he said.
Speaking to WIRED on the condition of anonymity out of fear of reprisals, one Sama moderator said they review thousands of pieces of content a day, often taking 55 seconds or less to decide what stays on the platform and what comes down. Sometimes that content can be “some bloody, hate speech, bullying, inflammatory, sexual stuff,” they said. “You should expect anything.”
Crider of Foxglove Legal said the systems and processes that Sama moderators were exposed to — and that have been shown to be mentally and emotionally damaging — were designed by Meta. (The case also alleges that Sama, though not Meta, engaged in labor abuses through union-busting activities.)
“This is a broader complaint about the system of work that is inherently harmful, toxic and exposes people to unacceptable levels of risk,” Crider said. “The system is functionally the same whether the person is in Mountain View, Austin, Warsaw, Barcelona, Dublin, or Nairobi. So, from our perspective, the point is that the system Facebook designed is a driver of harm and a risk for people to develop PTSD.”
In many countries, especially those rooted in English common law, courts often look to judgments from similar jurisdictions to help craft their own, and Motaung’s case could serve as a blueprint for outsourced moderators in other countries, Crider said. “While it does not set any formal precedent, I hope this case can serve as a milestone for other jurisdictions considering how to deal with these large multinational corporations.”