YouTube limits moderators to viewing four hours of disturbing content per day

YouTube CEO Susan Wojcicki said today that the video platform has started limiting its part-time content moderators to viewing four hours of disturbing videos per day. The news, announced during a Q&A session at Wojcicki’s South by Southwest Interactive talk here in Austin, comes as companies like YouTube struggle to sift through the sheer volume of user-uploaded content and ensure it abides by their policies. Platforms including YouTube, Facebook, Reddit, and Twitter have faced criticism for subjecting low-paid contractors to content that can be extremely disturbing.
“This is a real issue and I myself have spent a lot of time looking at this content over the past year. It is really hard,” Wojcicki said about content moderation. Recent solutions the company has landed on include both limiting the hours per day contractors perform this work and providing what Wojcicki referred to as “wellness benefits.”
Federal laws that shield tech companies from liability for user-uploaded content still require them to scrub illegal videos from their platforms, which YouTube does with a blend of algorithmic and human moderation. The company deploys a system known as Content ID to identify and remove straightforward violations involving copyrighted television, film, and music. But for videos depicting violence, murder, suicide, and other disturbing subjects, YouTube relies on part-time human moderators to manually confirm what the videos contain. These people are often hired as contractors and do not have the same access to mental-health benefits as full-time Google employees. It’s unclear what types of psychological counseling part-time YouTube contractors receive under Wojcicki’s definition of “wellness benefits.”
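Content ID’s internals are proprietary, but the broad pattern the article describes, matching uploads against fingerprints registered by rights holders and escalating everything ambiguous to a person, can be sketched roughly. The Python below is a toy illustration under that assumption: the fingerprint, match_score, and triage functions, the reference database, and the 0.8 threshold are all invented for the example, and a real system would use perceptual rather than exact hashing.

```python
# Hypothetical sketch of fingerprint-based triage, in the spirit of systems
# like Content ID. YouTube's actual implementation is proprietary; the hashing
# scheme, threshold, and function names here are illustrative only.
import hashlib

def fingerprint(frames: list[bytes]) -> set[str]:
    """Reduce a sequence of video frames to a set of chunk hashes.
    Real systems use perceptual hashes robust to re-encoding; exact
    SHA-256 is a stand-in to keep the example self-contained."""
    return {hashlib.sha256(frame).hexdigest() for frame in frames}

def match_score(upload: set[str], reference: set[str]) -> float:
    """Fraction of a reference work's fingerprints found in the upload."""
    if not reference:
        return 0.0
    return len(upload & reference) / len(reference)

# Hypothetical reference database: fingerprints registered by rights holders.
reference_db = {
    "some_registered_film": fingerprint([b"frame-a", b"frame-b", b"frame-c"]),
}

def triage(upload_frames: list[bytes], threshold: float = 0.8):
    """Auto-flag clear copyright matches; route everything else to humans."""
    fp = fingerprint(upload_frames)
    for title, ref in reference_db.items():
        if match_score(fp, ref) >= threshold:
            return ("auto_remove", title)   # straightforward violation
    return ("human_review", None)           # ambiguous content needs a person

# Two of three reference frames match (score ~0.67 < 0.8), so this upload
# falls through to human review rather than automated removal.
print(triage([b"frame-a", b"frame-b", b"frame-x"]))  # ('human_review', None)
```

The point of the sketch is the fall-through branch: automated matching handles the cases it can score confidently, and everything else lands in the human review queue, which is exactly the workload the four-hour limit addresses.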
This problem may only intensify over the coming months, as YouTube has pledged to hire 10,000 people to address the limitations of its algorithms, a process Wojcicki said today was ongoing. The company has come under fire over the last 18 months or so for failing to address an increase in conspiracy theories, propaganda, fake news, and religious radicalization videos on its platform. YouTube’s flat-footed response led to a series of advertising boycotts, pushing Wojcicki and her team to begin taking the issue much more seriously by employing more human moderators.
This is why I can’t get down with calls for Facebook, YouTube, etc to increase moderation that don’t consider the human cost of exposing more low paid workers to trauma and extremely tedious labor. https://t.co/YkFEEAoAEa
— Adrian Chen (@AdrianChen) March 13, 2018
Meanwhile, YouTube continues to court controversy over its failure to catch obvious transgressions. This week, critics noted that Infowars conspiracy theory videos about the Austin bombings floated to the top of YouTube search results, something the company said it could not immediately explain.
Now, it appears YouTube is grappling on a separate front with the psychological toll its solution to these problems takes on part-time workers. Wojcicki acknowledged that even four hours per day is still a lot. We’ve reached out to YouTube for clarification on how many hours its content moderators were previously asked to watch these types of videos, and we’ll update the story when we hear back.