Campaigners have accused Facebook parent Meta of inflicting “potentially lifelong trauma” on hundreds of content moderators in Kenya, after more than 140 were diagnosed with PTSD and other mental health conditions.
The diagnoses were made by Dr. Ian Kanyanya, the head of mental health services at Kenyatta National Hospital in Kenya’s capital, Nairobi, and filed with the city’s Employment and Labour Relations Court on December 4.
The medical reports were filed with the court by law firm Nzili and Sumbi Associates as part of an ongoing lawsuit against Meta and Samasource Kenya – an outsourcing company that was contracted to review content for the tech giant.
Content moderators help tech companies weed out disturbing content on their platforms and are routinely managed by third-party firms, often in developing countries. For years, critics have voiced concerns about the impact this work can have on moderators’ mental well-being.
Meta declined to comment on the medical reports, citing the ongoing litigation, but said it takes the support of moderators seriously and that its contracts with third-party firms set out expectations for counseling, training and fair pay.
A Meta spokesperson added that moderators are able to customize the “content review tool” so that, for example, graphic content appears blurred or in black and white.
Samasource, now known as Sama, did not respond to a request for comment.
Kanyanya said the moderators he assessed encountered “extremely graphic content on a daily basis which included videos of gruesome murders, self-harm, suicides, attempted suicides, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions just to name a few.”
Of the 144 content moderators who volunteered to undergo psychological assessments – out of 185 involved in the legal claim – 81% were classed as suffering from “severe” PTSD, according to Kanyanya.
The class action grew out of a previous suit launched in 2022 by a former Facebook moderator, who alleged they were unlawfully fired by Samasource Kenya after organizing protests against unfair working conditions, according to UK non-profit organization Foxglove, which is supporting the case.
Last year, all 260 content moderators working at Samasource Kenya’s moderation hub in Nairobi were made redundant, “punished” for raising concerns about their pay and working conditions, Foxglove said.
The moderators involved in the current legal claim worked for Samasource Kenya between 2019 and 2023, court documents show.
In one medical record filed with the court, a content moderator described waking up in cold sweats from frequent nightmares related to the graphic content they reviewed on the job. They said this led to frequent breakdowns, vivid flashbacks and paranoia.
Another former content moderator said she developed a “fear of seeing dotted patterns” – known as trypophobia – after seeing an image of maggots crawling out of a decomposing human hand.
Martha Dark, co-executive director of Foxglove, said that “moderating Facebook is dangerous, even deadly, work that inflicts lifelong PTSD on almost everyone who moderates it.”
“In Kenya, it traumatized 100% of hundreds of former moderators tested for PTSD… Facebook is responsible for the potentially lifelong trauma of hundreds of people, usually young people who have only just finished their education,” she said in a statement provided on Friday.
Dark believes that if these diagnoses were made in any other industry, the people responsible would be “forced to resign and face the legal consequences for mass violations of people’s rights.”
This is not the first time that content moderators have taken legal action against social media giants after claiming the job traumatized them.
In 2021, a content moderator for TikTok sued the platform, saying she had developed psychological trauma as a result of her job.
The following year, TikTok was hit with another lawsuit from former content moderators.