Phil Griffis

Facebook Sued for Failing to Protect Content Monitors from Graphic Images

Updated: Mar 23, 2019


The content that makes it onto Facebook is bad enough. Can you imagine being one of the human beings who have to review, screen, and block the endless garbage, and worse, that the social media site bans?


A new California lawsuit provides a riveting glimpse into this world. Selena Scola was an employee of Pro Unlimited, Inc., a Florida-based contingent workforce management company that provided contract workers to Facebook. For a little less than a year, Scola worked as a Public Content Contractor at Facebook’s California offices. As she describes it, Facebook does not scrutinize content before it is posted; instead, it relies on users to report inappropriate content. Content moderators, she claims, are asked to review more than 10 million potentially rule-breaking posts a day.


But what Facebook doesn’t do, according to the suit, is adequately protect its thousands of contract moderators from the stress caused by what they have to see. She alleges that in August 2015 the company rolled out Facebook Live, which allows users to broadcast their activities in real time. According to Scola, the service has provided a platform for users to “post millions of videos, images and live-streamed broadcasts of child sexual abuse, rape, torture…, beheadings, suicide and murder.” As another moderator described it to the UK Guardian: “You’d go into work at 9:00 a.m. every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see…”


Scola claims to have witnessed thousands of such events, and alleges that she developed Post-Traumatic Stress Disorder (PTSD) as a result. She alleges that Facebook has developed safety standards to help protect moderators from the disturbing content. These include suggestions to limit the amount of time moderators are exposed, training to help moderators assess their reactions to the images, and counseling sessions. But, she claims, Facebook essentially ignores these standards and fails to train moderators on implementing them.


Ms. Scola’s suit asks the Court to certify it as a class action; if the court grants the request, the suit would be brought on behalf of all content moderators from the last three years.


She claims that Facebook was negligent and that both it and Pro Unlimited violated various California labor laws, including those that prohibit employers from requiring employees to work in workplaces that are not safe and healthful. The suit asks the court to enjoin the companies from continuing to conduct business through the unlawful acts it alleges. It also requests that the court order the defendants to establish a medical monitoring fund, which would pay for employees’ medical and psychological monitoring and treatment necessitated by their exposure to disturbing content.



WRITTEN BY



Phil Griffis obtained his first jury verdict in 1990, when he convinced a jury that a customer’s fall at his client’s store did not cause the customer’s aspiration pneumonia and stroke. In the years since he has continued to win in courtrooms across the State of Texas.
