Propaganda, Hate Speech, Violence: The Working Lives Of Facebook's Content Moderators

Mar 4, 2019

Without the work of social media moderators, your timelines and news feeds would feel a lot darker.

"There are people in the world who spend a lot of time just sort of uploading the worst of humanity onto Facebook and Instagram," Casey Newton, the Silicon Valley editor for The Verge, said in an interview with NPR's Scott Simon.

And the moderators contracted by Facebook are on the front lines of this fight. (Facebook is a financial supporter of NPR.)

For a recent article in The Verge titled "The Trauma Floor: The secret lives of Facebook moderators in America," a dozen current and former employees of one of Facebook's contractors, Cognizant, talked to Newton about the mental health costs of spending hour after hour monitoring graphic content.

In Newton's story, the moderators describe a low-paying, high-stress job with limited allowance to process the emotional toll of the work. Non-disclosure agreements, drawn up to protect employees from users who might be angry about a content decision, prevent the employees from talking about their work, according to Newton.

Despite continued promises from Facebook leadership to prioritize safety on the platform, the company has come under scrutiny for its failure to curb the spread of propaganda, hate speech and other harmful content. At the same time, it has also been accused of wielding too heavy a hand in censoring free speech.

One way Facebook has responded is by hiring a small army of workers, most of them contractors, to manage the flood of flagged content. Worldwide, 15,000 moderators contracted by the company spend their workdays wading through racism, conspiracy theories and violence. As Newton notes in his story, that number is just short of half of the 30,000-plus people Facebook had hired to work on safety and security by the end of 2018.

"Every piece of content that gets reported on Facebook needs to be evaluated by a moderator to see if it breaks the rules or not," Newton said. "And if a moderator makes the wrong call more than a handful of times during the week, their job could be at risk."

That's difficult when the rules are ever-changing. "They're just under tremendous pressure to try to get it right, even though Facebook is changing those guidelines on a near daily basis to account for some nuance," he said.

Newton found that some workers are so disturbed by the content that they don't finish the training required to become full-time moderators. Some moderators, he noted, went on to develop PTSD-like symptoms after leaving the company.

In one chilling account described by Newton, a Cognizant trainee using the pseudonym "Chloe" is asked to moderate a Facebook post in front of her fellow trainees.

"The video depicts a man being murdered," Newton writes. "Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe's job is to tell the room whether this post should be removed."

"Almost everyone I spoke with could vividly describe for me at least one thing they saw that continues to haunt them," he said.

Yet according to Newton, employees were regularly dissatisfied with the counseling opportunities available to them. "A common complaint of the moderators I spoke with was that the on-site counselors were largely passive, relying on workers to recognize the signs of anxiety and depression and seek help," writes Newton.

Their time is also managed down to the second, Newton said, leaving moderators with little time to reflect on the disturbing content they might see.

To track their time, the moderators must click a browser extension every time they leave their desk. They get two 15-minute breaks to use the bathroom and a 30-minute lunch break. They're also allotted nine minutes of "wellness time" per day, reserved for when they encounter particularly traumatizing content.

Perhaps the most surprising finding from his investigation, the reporter said, was that a majority of the employees he talked to started to believe some of the conspiracy theories they reviewed.

"The longer they looked at the kind of fringe conspiracies that get posted onto Facebook, the more they found themselves sympathetic to those ideas," he said. "I spoke to one man who told me that he no longer believes that 9/11 was a terrorist attack. I talked to someone else who said they had begun to question the reality of the Holocaust."

Some moderators knew their new beliefs were false, Newton said, "but they just kept telling me these videos are so persuasive and we see them all the time."

In a statement to NPR about Newton's story, Facebook spokesperson Carolyn Glanville said the company is committed to getting "such an important issue" right.

"We work with our partners to ensure they provide competitive compensation starting at $15 per hour, benefits, and a high level of support for their employees. We know there are going to be incidents of employee challenges or dissatisfaction that call our commitment into question, which is why we are taking steps to be extremely transparent about the expectations we have of our partners," the statement said.

Those steps include regularly auditing its partners. Facebook also plans to invite partner employees to a summit to discuss these issues, according to the statement.

Glanville also noted that Facebook invited Newton to visit Cognizant's Phoenix office — an offer he accepted and detailed in his story.

Bob Duncan, who heads Cognizant's content moderation operations in North America, told Newton in response to his story that recruiters describe to applicants the types of graphic content they should expect to see on the job, and provide them with examples of such content.

"The intention of all that is to ensure people understand it," Duncan told Newton. "And if they don't feel that work is potentially suited for them based on their situation, they can make those decisions as appropriate."

Duncan also told Newton the company would investigate the safety and management issues raised by moderators.

Newton said he's glad to hear that Facebook is taking these issues seriously, but he has suggestions for steps he thinks the company should take. Topping the list: Raise the salary of moderators.

Currently, Newton noted, moderators at Cognizant's Phoenix site earn about $4 above Arizona's minimum wage, roughly $15 an hour, or just $28,800 per year. By comparison, the average Facebook employee's total compensation is $240,000, according to Newton's reporting.

"When you think of other people in these similar first responder-type roles — a police officer, a firefighter, a social worker — those folks are in many cases going to be making something more like $60,000 a year," he said.

Moderators are assessing crucial questions about speech and security, he said. "They're policing the terms of our public debate."

Karina Pauletti and Lynn Kim edited and produced this story for broadcast. Emma Bowman produced this story for digital.

Copyright 2019 NPR. To see more, visit https://www.npr.org.

SCOTT SIMON, HOST:

Facebook has pledged to do better at moderating content. The social media company usually employs third-party contractors to do the job. The average moderator makes about $28,000 a year. Meanwhile, the average Facebook employee's salary is around $120,000 a year. And we want to note here that Facebook is a financial supporter of NPR. In a recent article by Casey Newton for The Verge, moderators employed by one of those contractors, Cognizant, talked about the stress of their jobs - not only low pay but high-pressure working conditions and the emotional toll of monitoring hour after hour of graphic content and conspiracy theories. Casey Newton, Silicon Valley editor at The Verge, joins us now from New York. Thanks so much for being with us.

CASEY NEWTON: Thanks for having me, Scott.

SIMON: Well, help us understand how a lot of these employees live during the workday.

NEWTON: Well, every piece of content that gets reported on Facebook needs to be evaluated to see if it breaks the rules or not. And if a moderator makes the wrong call more than a handful of times during the week, their job could be at risk. And so the folks that I spoke with said that they're just under tremendous pressure to try to get it right even though Facebook is changing those guidelines on a near daily basis to account for some nuance. And, of course, a lot of that content they're looking at is extremely graphic or disturbing. And so many of the folks that I spoke with were struggling with mental health issues months after they left the job.

SIMON: Because they have to see so much?

NEWTON: That's right. You know, there are people in the world who spend a lot of time just sort of uploading the worst of humanity onto Facebook. So almost everyone I spoke with could vividly describe for me at least one thing they saw that continues to haunt them.

SIMON: And it sounds as if during their workday, there's not a lot of time to reflect. There's not even really time to go to the bathroom.

NEWTON: That's right. One of the things that surprised me most about this story was that the moderators' time is managed down to the second. Every time they want to use the bathroom, they have to click a browser extension to let someone know that they're leaving. They also get nine minutes a day of something called wellness time, which they're supposed to use if they see something really traumatizing and need to stand up and walk away. But many of the folks that I spoke with said that wasn't really adequate to kind of emotionally process what they were seeing.

SIMON: What about the effect of seeing so many conspiracy theories?

NEWTON: Well - so this was maybe the thing that surprised me the most from my reporting was the majority of the people that I spoke with said that the longer they looked at the kind of fringe conspiracies that get posted on to Facebook, the more they found themselves sympathetic to those ideas. So I spoke to one man who told me that he no longer believes that 9/11 was a terrorist attack. I talked to someone else who said they had begun to question the reality of the Holocaust. And in some cases, these folks knew sort of how wrong that sounded. But they just kept telling me these videos are so persuasive, and we see them all the time.

SIMON: Let me share with you some words we got from Facebook, knowing we were going to interview you. We work with our partners to ensure they provide competitive compensation starting at $15 per hour, benefits and a high level of support for their employees. They went on to say that they will regularly audit their partners. They'll try to make working conditions and salaries uniform. And they're going to hold a summit on those issues and talk to employees. How do you react to their statement?

NEWTON: Well, I'm glad to hear that Facebook is taking these issues seriously. I would say if they're looking for suggestions, I'm happy to offer two. One would be to pay these folks more. And I think that would be a great place for Facebook to start when it came to compensating employees, who, in many cases, are being asked to evaluate essential questions of speech and security. They're policing the terms of our public debate. That feels like a $60,000-a-year job to me. And then the second thing they could do is just not make these employees have to raise their hand every time they want to go to the bathroom. Just treat these employees the way they treat any Facebook executive, and let them manage their own time.

SIMON: Casey Newton at The Verge, thanks so much for being with us.

NEWTON: Thank you, Scott.

Transcript provided by NPR, Copyright NPR.