
South Africa’s Daniel Motaung, the exploited Facebook moderator demanding change

South Africa’s Motaung was 27 years old and looking for his first job after university when he came across an online ad from Sama seeking Zulu speakers. He wanted to be a lawyer, but he was excited about working for an outsourcing company for Facebook and believed he could work his way up to a job at the social-media giant. However, this was not the case. “Sama used to tell us don’t talk to Facebook, don’t mention Facebook,” Motaung says. “We weren’t allowed to say we worked at Facebook or put Facebook on our CV.”

Facebook has similar relationships, and subsequent scandals, with companies like Accenture, a blue-chip consulting firm which, the New York Times reported, received $500m a year for its services.

Unpleasant surprises

“I didn’t find out about the explicit nature of the job until two weeks after I arrived in Kenya,” Motaung said. He then had to sign a non-disclosure agreement, preventing him from publicly disclosing the nature of his job. As a moderator for Facebook, contracted through Sama, he had to watch explicit content to determine whether it was fit for the social-media network.

“Sama targets poor people. When you are poor, it is difficult to negotiate things in a work environment.” Most of Motaung’s colleagues did not get permanent work visas, meaning they could not open bank accounts in Kenya. Many were enticed by the wage, which Sama advertises as “3x the minimum wage and 2x the living wage in Kenya”. In reality, Motaung received just $2.50 an hour. “It’s not easy to walk away,” he said. “You have obligations as a poor person, and I had to pay for rent in Kenya.”

Sama had previously come under fire for its low wages. In 2018, company founder Leila Janah justified the levels of pay: “One thing that’s critical in our line of work is to not pay wages that would distort local labour markets. If we were to pay people substantially more than that, we would throw everything off.”

For Sama’s moderators, the job initially seemed like an opportunity, despite the trauma they had to endure. Daniel said, “You think you’ll be able to manage, but these things break you emotionally, physically and mentally. Even during this conversation, I have a heightened fear of death.”

The first video Daniel moderated was a live beheading. “It was one of the most graphic things I have seen in my life,” he said. “When it boils down to it you will see beheadings, people raping children.”

To alleviate the evidently stressful environment, Sama hired “wellness counsellors” who provided sessions that involved playing games and other forms of entertainment. No psychotherapy or real counselling was provided, and Daniel is unsure the counsellors were even qualified.

The stress of the job led Daniel to organise a strike for better working conditions, and despite getting many of his colleagues on board he was fired within days. His outspokenness led two executives from Sama headquarters to fly to Nairobi to personally quell the strike, accusing Daniel of putting both Facebook and Sama “at risk”.

What TIME got wrong

In response to the TIME investigation, Sama released a statement called “What TIME got wrong”. The company wrote, “It is completely inaccurate to suggest that Sama employees were hired under false pretences or were provided inaccurate information regarding content moderation work.”

“To date, Sama’s work has helped lift more than 59,000 individuals out of poverty. […] To be clear, there was never a strike and the article falsely alleges that Sama does not compensate its employees fairly.” Nowhere in the statement does Sama deny Daniel’s account.

In a world where social media is so pervasive, online moderation is essential to ensuring platforms like Facebook are safe spaces for the people who choose to use them. Roughly 10% of flagged content is removed by human moderators. For Daniel, this means first ensuring moderators have safe spaces themselves. “I think ongoing mental health support would be useful because you take the trauma home with you.”

This is not the first time Facebook has come under fire for its practices. In October 2021, Facebook whistleblower Frances Haugen testified to US senators that the social-media giant was well aware of the negative impacts of the site, including “ethnic violence” in Myanmar and Ethiopia.

Daniel agrees that Facebook’s system is not effective. “If you are reviewing content in Ethiopia, for example, you would use Google. But you have to stick to your workstation”. With a review time cap of under a minute for each video, is there sufficient time to review the internet’s most explicit content?

Sama has now promised workers a 30-50% wage increase, which works out at around $2.20 per hour for a nine-hour working day. For Daniel, this is proof that wage standards were “not impossible”. He said, “If you’re looking for me to do work that will disturb me, compensate me for it.”

“Swimming in a toxic river”

Daniel is currently embroiled in a legal battle against Facebook, supported by the not-for-profit Foxglove. His confidence is fuelled by a 2020 legal settlement in which 11,250 moderators employed through outsourcing company Cognizant received $52 million in damages from Facebook over harms including widespread symptoms of PTSD. This worked out to a minimum of $1,000 in compensation each, roughly enough to cover twenty hours of therapy. As a result, Facebook promised moderation tools such as muting audio by default and changing videos to black and white to minimise distress.

Cori Crider, director of Foxglove, thinks that Daniel’s suit is a clear-cut claim of unconstitutional union-busting and wrongful termination under Kenyan labour law. She said, “You can’t hope to make Facebook safe for people if there are very few people staffed to do it, and those people work in conditions that give a very significant percentage of them PTSD.”

She describes the group of more than 150 moderators who came up against Sama executives as a “skeleton crew with the job to protect everybody” who “are swimming in a toxic river and can’t possibly cope with the volume of material Facebook want them to go through.”

Crider has worked with Facebook moderators in other countries, namely Ireland, where moderators’ lawsuits against Facebook are ongoing.

She stresses: “It speaks to the wider problem with Facebook’s decision to run its business this way, because content moderation is not some kind of side part of the business. If you’re going to run a massive platform for one plus billion people, then content moderation is your business. They are absolutely essential to Facebook’s work, but because the work is so difficult, so dangerous.”

However, the Sama case differs from the others in the level of exploitation of its workers and in the company’s B Corp certification, which gives it an ethical seal of approval that shelters it from scrutiny.

“Facebook doesn’t want to admit that that’s what the core of its business model really requires, and it doesn’t want to be liable for them or responsible for them. So it outsources the work everywhere so that it can hold these people to whom it has a moral and ultimately, legal responsibility, at arm’s length,” Crider says.

Sama did not respond to a request for comment.
