Showing posts with label content moderators.

Thursday, February 5, 2026

‘In the end, you feel blank’: India’s female workers watching hours of abusive content to train AI; The Guardian, February 5, 2026

Anuj Behal, The Guardian; ‘In the end, you feel blank’: India’s female workers watching hours of abusive content to train AI


[Kip Currier: The largely unaddressed plight of content moderators became more real for me after reading this haunting 9/9/24 piece in the Washington Post, "I quit my job as a content moderator. I can never go back to who I was before."

As mentioned in the byline of that graphic article, content moderator Alberto Cuadra spoke with journalist Beatrix Lockwood. Maya Scarpa's illustrations poignantly give life to Cuadra's first-hand experiences and the ongoing impacts of the content moderation he performed for an unnamed tech company. I discuss Cuadra's experiences and the ethical issues of content moderation, social media, and AI in my Ethics, Information, and Technology book.]


[Excerpt]

"Murmu, 26, is a content moderator for a global technology company, logging on from her village in India’s Jharkhand state. Her job is to classify images, videos and text that have been flagged by automated systems as possible violations of the platform’s rules.

On an average day, she views up to 800 videos and images, making judgments that train algorithms to recognise violence, abuse and harm.

This work sits at the core of machine learning’s recent breakthroughs, which rest on the fact that AI is only as good as the data it is trained on. In India, this labour is increasingly performed by women, who are part of a workforce often described as “ghost workers”.

“The first few months, I couldn’t sleep,” she says. “I would close my eyes and still see the screen loading.” Images followed her into her dreams: of fatal accidents, of losing family members, of sexual violence she could not stop or escape. On those nights, she says, her mother would wake and sit with her...

“In terms of risk,” she says, “content moderation belongs in the category of dangerous work, comparable to any lethal industry.”

Studies indicate content moderation triggers lasting cognitive and emotional strain, often resulting in behavioural changes such as heightened vigilance. Workers report intrusive thoughts, anxiety and sleep disturbances.

A study of content moderators published last December, which included workers in India, identified traumatic stress as the most pronounced psychological risk. The study found that even where workplace interventions and support mechanisms existed, significant levels of secondary trauma persisted."

Sunday, April 27, 2025

‘I didn’t eat or sleep’: a Meta moderator on his breakdown after seeing beheadings and child abuse; The Guardian, April 27, 2025

The Guardian; ‘I didn’t eat or sleep’: a Meta moderator on his breakdown after seeing beheadings and child abuse

"When Solomon* strode into the gleaming Octagon tower in Accra, Ghana, for his first day as a Meta content moderator, he was bracing himself for difficult but fulfilling work, purging social media of harmful content.

But after just two weeks of training, the scale and depravity of what he was exposed to was far darker than he ever imagined."

Meta faces Ghana lawsuits over impact of extreme content on moderators; The Guardian, April 27, 2025

The Guardian; Meta faces Ghana lawsuits over impact of extreme content on moderators

"Meta is facing a second set of lawsuits in Africa over the psychological distress experienced by content moderators employed to take down disturbing social media content including depictions of murders, extreme violence and child sexual abuse.

Lawyers are gearing up for court action against a company contracted by Meta, which owns Facebook and Instagram, after meeting moderators at a facility in Ghana that is understood to employ about 150 people.

Moderators working for Majorel in Accra claim they have suffered from depression, anxiety, insomnia and substance abuse as a direct consequence of the work they do checking extreme content.

The allegedly gruelling conditions endured by workers in Ghana are revealed in a joint investigation by the Guardian and the Bureau of Investigative Journalism."

Sunday, September 15, 2024

‘I quit my job as a content moderator. I can never go back to who I was before.’; The Washington Post, September 9, 2024

Alberto Cuadra, as told to Beatrix Lockwood, The Washington Post; ‘I quit my job as a content moderator. I can never go back to who I was before.’

"Alberto Cuadra worked as a content moderator at a video-streaming platform for just under a year, but he saw things he’ll never forget. He watched videos about murders and suicides, animal abuse and child abuse, sexual violence and teenage bullying — all so you didn’t have to. What shows up when you scroll through social media has been filtered through an army of tens of thousands of content moderators, who protect us at the risk of their own mental health.

Warning: The following illustrations contain references to disturbing content."

Thursday, April 26, 2018

Facebook finally explains why it bans some content, in 27 pages; The Washington Post, April 24, 2018

Elizabeth Dwoskin and Tracy Jan, The Washington Post; Facebook finally explains why it bans some content, in 27 pages

"“We want people to know our standards, and we want to give people clarity,” Monika Bickert, Facebook’s head of global policy management, said in an interview. She added that she hoped publishing the guidelines would spark dialogue. “We are trying to strike the line between safety and giving people the ability to really express themselves.”"