Digital trauma

Kenya: Facebook outsourcing company accused of inhumane conditions

By Jaysim Hanspal

Posted on February 25, 2022 14:59

Meta and Facebook logos are seen in this illustration taken February 15, 2022. REUTERS/Dado Ruvic/Illustration

An investigation by TIME has revealed brutal conditions at a Facebook outsourcing company that moderates explicit content for the social media giant.

Sama, a San Francisco-based AI data company that operates an outsourced office in Nairobi, Kenya, has come under fire for its working conditions and company practices, some of which appear to come directly from Facebook headquarters.

According to TIME’s investigation, “nearly 200 young men and women from countries across Africa sit at desks glued to computer monitors, where they must watch videos of murders, rapes, suicides, and child sexual abuse”.

The moderating team, based at one of more than 20 sites worldwide that handle content moderation for Facebook, showed symptoms of PTSD arising from the explicit content and extreme quotas: Sama employees in Nairobi were expected to review each video within 50 seconds.

Facebook guidelines seen by TIME—and previously unreported—instruct content moderators to watch only the first 15 seconds of a video before marking it safe to remain on the platform.

This comes not long after Frances Haugen, a former Facebook employee turned whistleblower, testified to US senators in October that the company was aware of the site's negative impacts, including its role in “ethnic violence” in Myanmar and Ethiopia.

In response to the article, Haugen tweeted, “You can’t fix an algorithm designed to amplify hate with human moderation, and this is what happens when you try.”