YouTube Ramps Up Its Efforts Against QAnon-Related Content


YouTube said Thursday that it is updating its hate and harassment policies to prohibit content that targets an individual or group by alleging they are complicit in conspiracy theories such as QAnon or Pizzagate.

The Google-owned video site clarified that context is taken into account, so content such as news stories discussing those topics without targeting individuals or protected groups will not be impacted by the changes, which go into effect Thursday.

The updates Thursday follow YouTube’s move earlier this week to ban content about vaccinations that runs counter to expert consensus from local health authorities or the World Health Organization.

QAnon is a conspiracy theory claiming that high-profile Democrats and celebrities are ritually sacrificing children, and that President Donald Trump is fighting this cabal. Its followers relentlessly attack those whom they believe are part of the cabal, both public figures and private individuals.

Twitter began a purge of accounts linked to QAnon in July, while Facebook and Instagram followed suit in August.

Facebook and Instagram also formally banned ads promoting QAnon conspiracy theories at the end of last month and pledged earlier this month to remove all pages, groups and accounts representing QAnon, regardless of whether they share content related to potential violence.

YouTube began its efforts against QAnon as part of a massive overhaul of its recommendations engine in January 2019. It said those moves have since produced a 70% drop in views of prominent QAnon-related channels coming from search and recommendations, as well as an 80% drop in views coming from non-subscribed recommendations.

The company said it has removed tens of thousands of QAnon videos and terminated hundreds of channels, with a focus on those that explicitly threaten violence or deny the existence of major violent events.

YouTube said in a blog post Thursday, “Managing misinformation and harmful conspiracy theories is challenging because the content is always shifting and evolving. To address this kind of content effectively, it’s critical that our teams continually review and update our policies and systems to reflect the frequent changes.”

https://www.adweek.com/digital/youtube-ramps-up-its-efforts-against-qanon-related-content/