Facebook content moderators sue over psychological trauma

In the summer of 2017, Chris Gray walked into Facebook’s Dublin office for his first day of work as a content moderator.

“It's one of these very trendy, California-style open offices. Bright yellow emojis painted on the wall,” Gray said. “It's all very pretty, very airy, seems very cool.”

Gray wasn’t a Facebook employee. He was a contractor hired by CPL Resources PLC in Dublin, one of several outsourcing firms Facebook works with to moderate content on its platform. He took the job hoping to move up the ranks and eventually work for Facebook. But that never happened. Instead, Gray says, the nine months he spent at CPL Resources left him with lasting psychological trauma and post-traumatic stress disorder (PTSD).

Gray and several other former contractors are now suing CPL Resources and Facebook in Ireland’s High Court over the psychological trauma they say they endured because of poor training and a lack of adequate mental health resources on the job. The lawsuit, filed last week, is bringing new scrutiny to the content moderation ecosystem that Facebook and other platforms rely on to police what users post.

Gray started working as a content moderator in July 2017. He was one of thousands hired to moderate flagged content on Facebook following a series of high-profile incidents. In April 2017, a man in Cleveland, Ohio, uploaded a video of himself gunning down an elderly stranger on the street. It stayed on Facebook for hours. Within days, a man in Thailand livestreamed the murder of his baby daughter on Facebook Live.

Facebook was scrambling to prove it was taking steps to keep posts like these off the site, and in May, CEO Mark Zuckerberg announced the company would add 3,000 people to the team that moderates content for Facebook. (Today, about 15,000 people around the world review content for Facebook, according to a company spokesperson.)

At first, Gray’s job was to keep pornography off the site. When a user or Facebook’s technology flagged a post that seemed to violate Facebook’s “Community Standards,” it would go to Gray or someone else on his team, who would review the video, photo or text and decide what to do with it: take it down, mark it with a warning, or leave it up.

“After a few months, I was moved to the high-priority queue, which is hate speech, graphic violence, bullying. Really all the nasty stuff you want to act on very quickly,” Gray said. “I really don't like to talk in detail about [the things I was reviewing, but it included] executions. Terrorists beheading people. Ethnic cleansing in Myanmar. Bestiality. I mean, you name it. All the worst of humanity, really.”

On busy days, Gray would walk into work to find 800 of these posts waiting in his queue. On good days, it was closer to 200. He had to sift through them quickly, but also carefully, because Facebook was auditing his decisions and keeping a score. Gray worked 37.5 hours a week and made about $14 per hour.

Gray was under a strict nondisclosure agreement, so he did not speak with friends and family about the disturbing things he was seeing at work. It wasn’t until a full year after he left the company, during a meeting with a journalist at a coffee shop, that Gray says he first opened up about the work he had done for Facebook.

“This was the first time I'd ever talked about all the horrible stuff I had to see. I never even discussed it with my wife. And I literally broke down and cried in a coffee shop. And I was absolutely shocked. I was bewildered. I just did not know what was happening to me,” he said.

This incident prompted Gray to go see a doctor. He was diagnosed with PTSD.

CPL Resources did not respond to several requests for comment on the lawsuit.

In a statement provided to The World, Facebook wrote, “We recognize this review work can be difficult, and we work closely with our partners to ensure that the people who do this work are supported. We require everyone who reviews content for Facebook go through an in-depth, multi-week training program on our Community Standards and have access to extensive psychological support to ensure their wellbeing.

“This includes 24/7, on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to graphic material as much as possible. This is an important issue, and we are committed to getting this right.”

Sean Burke, another former contractor who is suing, says that he did not receive the training and support Facebook claims its partners provide to workers. Burke started working for CPL Resources in 2017.

“On the first day, one of my first tickets was watching someone being beaten to death with a plank of wood with nails on it,” he said. “Within my second week on the job, it was my first time ever seeing child porn.”

Burke says he saw videos of people being decapitated and people committing suicide. Not everything he was reviewing was this disturbing, but these posts stuck with him.

Burke worked the night shift. He’d come in to hundreds of posts, do his work, get home around 3 or 4 in the morning, and try to get some sleep. When he finally did, he’d have nightmares about the things he had seen on his computer screen at work.

“You're seeing the worst that humanity has to offer, and you just become completely disheartened,” he said.

Burke felt like he needed help and support, so he paid a visit to CPL Resources’ wellness center, which offered yoga, finger-painting classes and people he could talk to.

“Unfortunately…they can't do anything to help you cope or manage with the material or the environment,” Burke said. “They're kind of just, there's a shoulder to lean [on] and cry on.”

Burke sought outside help and was prescribed anxiety medication, which he says helped him cope. But his accuracy rating fell below 98 percent, and CPL Resources did not extend his contract. (Remember, every decision he made was audited to assess whether he did a good enough job applying Facebook’s rules, which Burke says were complicated and constantly evolving.)

Facebook has said it wants to eventually automate most of this crucial content moderation work, handing it over to sophisticated algorithms. But the technology isn’t there yet. In the meantime, Cori Crider of Foxglove, a London-based nonprofit assisting with the lawsuit, says she wants better conditions for the humans doing the work.

“Facebook and other social media platforms could not exist without the labor that these people provide,” she said. “It would be unusable. You wouldn't touch it. You wouldn't set foot in it because it would just be awash in abuse and pornography and violence. And the people who are on the front lines of this battle making the platforms a place that's [usable] for us all - they're really paying the price right now.”

Crider wants Facebook and its partners to provide better mental health support to employees, and to limit how much toxic content moderators are exposed to.

“If you think, for example, about police investigating child abuse cases here in the UK, they all have…very serious psychological support and actually limits on the amount of time they're permitted to be exposed to that stuff,” Crider said. “So, if [Facebook] had just taken a little bit closer look at comparative examples of other people who do this sort of work, they could've done better.”

Gray agrees. But he says that figuring out the best way to keep the internet and the people who moderate it safe is not easy.

“It's an incredibly complex task,” Gray said. “And I don't think Facebook should be ashamed that they haven't got it right yet. I think they just need to say, ‘OK, we're learning. We're doing our best. We've made mistakes. And it appears that some people have suffered as a result of those mistakes. And we're going to make that right.’”
