YouTube videos with thumbnails depicting women engaging in various sexual acts with horses and dogs populate top search results on the video platform, according to a report from BuzzFeed News. Some of these videos, which can be easily found through YouTube's algorithmic recommendation engine after searching innocuous phrases like "girl and her horse," have millions of views and have been on the platform for months.
Of course, YouTube videos actually depicting such acts would be more easily caught by the company's algorithmic filters, its user-reporting system, and its human content moderators. Harder to find and weed out are videos that use graphic and obscene images as thumbnails, alongside clickbait titles, to juice viewership and generate more ad revenue. None of the videos featuring the bestiality thumbnails appear to actually contain bestiality.
This is not an isolated problem, but rather yet another example of how the fundamental structure of YouTube can be exploited by bad actors, many of whom game the platform's rules either to generate ad revenue for click farms or for nefarious purposes. Like Facebook and Twitter, YouTube has struggled over the last couple of years with the lack of control it has over the immense amount of user-generated content it oversees every day. Though YouTube has powerful algorithms that help flag content and many thousands of contracted human moderators, it seems like every week a new issue pops up that shows just how ill-equipped the company's moderation system is to deal with content that is against its rules or otherwise illegal.
So a moderation problem that began years ago largely with copyrighted content has now expanded to include terrorism recruitment and propaganda videos, child exploitation content, and porn and other explicit material, among millions upon millions of other non-advertiser-friendly videos. YouTube has made substantial changes to its platform to appease advertisers, quell criticism, and improve the safety and legality of its product. Those changes include pledging to hire more human moderators, mass demonetizations and account bans, and updates to its terms of service and site policies.
According to BuzzFeed, YouTube began scrubbing its platform of the videos and accounts responsible for the bestiality thumbnails once the news organization notified it of the issue. In a story published by The New York Times today, YouTube said it took down a total of 8.28 million videos in 2017, with about 80 percent of those takedowns starting with a flag from its artificial intelligence-powered content moderation system. Still, so long as YouTube relies mostly on software to catch problem videos, it will have to keep manually scrubbing its platform clean of content like bestiality thumbnails and whatever other dark corners of the internet surface in public-facing search results and through its recommendation engine.
https://www.theverge.com/2018/4/24/17277468/youtube-moderation-bestiality-thumbnail-problem-ai-fix