Weeks After PTSD Settlement, Facebook Moderators Ordered to Spend More Time Viewing Online Child Abuse

Contractor Accenture has added 48 minutes to the shifts of many moderators in North America.

Facebook employees are seen at their stations during a tour of its offices, in Cambridge, Mass. on Jan. 9, 2019. Photo: Elise Amendola/AP

With the ink still drying on their landmark $52 million settlement with Facebook over trauma they suffered working for the company, many outsourced content moderators are now being told that they must view some of the most horrific and disturbing content on the internet for an extra 48 minutes per day, The Intercept has learned.

Following an unprecedented 2018 lawsuit by ex-Facebook content moderator Selena Scola, who said her daily exposure to depictions of rape, murder, and other gruesome acts caused her to develop post-traumatic stress disorder, Facebook agreed in early May to a $52 million settlement, paid out with $1,000 individual minimums to current and former contractors employed by outsourcing firms like Accenture. Following news of the settlement, Facebook spokesperson Drew Pusateri issued a statement reading, “We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We’re committed to providing them additional support through this settlement and in the future.”

Less than a month after this breakthrough, however, Accenture management informed moderation teams that it had renegotiated its contract with Facebook, affecting at least hundreds of North American content workers who would now have to increase their exposure to exactly the sort of extreme content at the heart of the settlement, according to internal company communications reviewed by The Intercept and interviews with multiple affected workers.

The new hours were announced at the tail end of May and the beginning of June via emails sent by Accenture management to the firm’s content moderation teams, including those responsible for reviewing Child Exploitation Imagery, or CEI, generally graphic depictions of sexually abused children, and Inappropriate Interactions with Children, or IIC, typically conversations in which adults message minors in an attempt to “groom” them for later sexual abuse or exchange sexually explicit images. The Intercept reviewed multiple versions of this email, apparently based on a template created by Accenture. It refers to the new contract between the two companies as the “Golden SoW,” short for “Statement of Work,” and its wording strongly suggests that stipulations in the renewed contract led to the 48-minute increase in the so-called “Safety flows” that handle Facebook posts containing depictions of child abuse.

“For the past year or so, our Safety flows (CEI,IIC) as well as GT have been asked to be productive for 5.5 hours of their day,” reads one email reviewed by The Intercept, referring to “Ground Truth,” a team of outsourced humans tasked with helping train Facebook’s moderation algorithms. “Over the last few weeks the golden sow, Accenture’s contractual agreement with Facebook, was signed. In the contract, it discussed production time and the standard that all agents will be held to.” Accenture moderators, the email continues, “will need to spend 6.3 hours of their day actively in production” — an extra 48 minutes per day spent viewing what is arguably the most disturbing content to be found on the internet.

The email then notes that Accenture is “aligning to our global partners as well as our partners in MVW,” a likely reference to Mountain View, California, where, the email suggests, moderators were already viewing such content for 6.3 hours per day. The email allowed that there could be “one offs every now and then when you are unable to meet the daily expectation of 6.3” hours of exposure, but warned against letting that become a pattern.

Pusateri, the Facebook spokesperson, told The Intercept, “We haven’t increased guidance for production hours with any of our partners,” but did not respond to questions about Accenture’s announcement itself. Accenture spokesperson Sean Conway said only that the company had not been instructed by Facebook to enact any change, but would not elaborate or explain the internal announcement.

Not only does the increase in child pornography exposure seemingly run afoul of Facebook’s public assurances that it will be “providing [moderators] additional support through this settlement and in the future,” it also contradicts research into moderator trauma commissioned by the company itself. A 2015 report from the Technology Coalition, a consortium against online child exploitation co-founded by Facebook and cited in Scola’s lawsuit, found that “limiting the amount of time employees are exposed to [child sexual abuse material] is key” if employee trauma is to be avoided. “Strong consideration should be given to making select elements of the program (such as counseling) mandatory for exposed employees,” the paper also noted. “This removes any stigma for employees who want to seek help and can increase employee awareness of the subtle, cumulative effects that regular exposure may produce.” The Accenture announcement, however, appears to fall well short of mandatory counseling: “Agents are free to seek out wellness coaches when needed,” the email states. A request for comment sent to the Technology Coalition was not returned.

Accenture’s “wellness” program is a contentious issue for Facebook moderators, many of whom say such quasi-therapy is a shoddy stand-in for genuine psychological counseling, despite the best intentions of the “coaches” themselves. Last August, The Intercept reported an instance of Accenture management pressuring a wellness coach to divulge details of a session with an employee; a recent New York University report on digital content moderators slammed Accenture’s wellness program as “inadequate care” for those primed to develop PTSD.

To some moderators, the increase in required work hours is just another in a long series of slights from Accenture and Facebook. “It makes me feel extremely unappreciated and uncared for by this company and its management, even more so than before,” said a moderator who works with child exploitation content, speaking with The Intercept on the condition of anonymity. “Management doesn’t view the content so they have no way of actually understanding or empathizing. If they have seen pieces of content every once in a while, this doesn’t compare to having to sit in front of a computer screen being immersed in it for five and a half hours, five days a week (now 6.3).”

Though 48 minutes of extra work may not seem like a profound increase, this moderator emphasized that the heinousness of the task can’t be overstated and that they fear what effects even a sub-hour increase could have over time. “It is difficult to explain the emotional impact of doing this job to anyone who hasn’t firsthand experienced it,” the source said. “The more time you spend every day immersed in the content the more it stays on your mind and is difficult to let go of after you clock out and in your time outside of work. It gets very draining quickly; it’s almost like how 48 minutes of running feels a lot longer and has more of an effect than 48 minutes of sitting or walking. It might not seem like a lot longer, but when you’re doing the work we do, it really is. We are witnessing child grooming and depraved predatory behavior firsthand, including child pornography, and all the while having to match everything to complicated and ever-changing policy and stressing about perfect quality and getting the actions right.” Another Accenture moderator who also requested anonymity echoed the fear of psychological harm: “This production increase will affect my ability to maintain the resiliency I’ve already established to safeguard my mental health from the content I view on a daily basis.”

Multiple moderators who discussed the change with The Intercept on the condition of anonymity explained that the previous 5.5-hour quota left some time at the end of the day for “wellness” — free time during which they could meditate, distract themselves, and generally decompress from the nightmarish things they had viewed that day. Under the new contract, these workers now fear their already precarious mental health could be pushed to a breaking point: “Since starting in this job almost two years ago, I’ve had many nightmares related to the content, and traumatizing images I’ve seen enter my mind regularly during my personal life off the clock,” said the child exploitation moderator.

At the same time as workers are learning they’ll be subjected to deeply disturbing content for hours more each week, many are also learning that they’re no longer eligible for the premium pay Accenture had for years provided for overnight or weekend shifts, according to interviews with contractors and Accenture communications reviewed by The Intercept. Many Accenture workers speak openly about how little the company pays relative to the horrors of the work itself, with many taking on second or third jobs to make ends meet. “I can’t afford the therapy that I need with the barely-living-wage pay that we receive,” added one moderator. “I can barely afford my rent and other bills and food.”
