YouTube will put disclaimers on state-funded broadcasts to fight propaganda

Valentina Palladino

YouTube’s latest strategy to fight the spread of misinformation involves putting a disclaimer on videos from certain news sources. The online video website announced it will start labeling videos posted by state-funded broadcasters to alert viewers that the content is, in some part, funded by a government source. YouTube will begin labeling videos today, and the policy extends to outlets including the US’s Public Broadcasting Service (PBS) and the Russian government broadcaster RT.

According to a report by The Wall Street Journal, PBS videos will now have the label “publicly funded American broadcaster,” while RT will have this disclaimer: “RT is funded in whole or in part by the Russian government.”

The new policy is YouTube’s way of informing viewers about where the content they’re watching comes from, information that is often obscured or that viewers never think to seek out. “The principle here is to provide more information to our users, and let our users make the judgment themselves, as opposed to us being in the business of providing any sort of editorial judgment on any of these things ourselves,” YouTube Chief Product Officer Neal Mohan told the WSJ.

While providing more information about the sources of the news viewers watch on YouTube is helpful, Mohan’s sentiment is at odds with another strategy currently in development: YouTube is reportedly considering surfacing “relevant videos from credible news sources” when conspiracy theory videos pop up about a specific topic. For now, YouTube will withhold editorial judgment, at least until it starts deciding which news sources are deemed credible on its website. However, we don’t know if this strategy will become a reality anytime soon, as it’s still in the early stages of development.

YouTube’s decision to label all state-funded news videos comes after heavy criticism from the US government and others about big tech companies’ involvement in the spread of misinformation. Facebook, Google, and others have had to answer questions about how Russian actors were able to easily spread misinformation regarding the 2016 election to millions of Americans.

The new policy also comes after YouTube has dealt with a number of controversies surrounding inappropriate content on its website. In just the past year, YouTube went through an ad-pocalypse after advertisers found out their ads were running over extremist videos; it had to address public outcry over the distorted and inappropriate children’s content on the site (some of which misused popular children’s characters or involved the potential abuse of children themselves); and it had to set up new rules to police its biggest creators after Logan Paul uploaded a video featuring the dead body of a suicide victim.

Conspiracy theories abound

In short, it was only a matter of time before news organizations on YouTube would have to deal with new rules made specifically for them. The new labeling policy will be helpful for some YouTube viewers as it will shed a bit more light on their favored news sources. It will also show Congress that YouTube is, at the very least, trying to inform its audience of possible misinformation and propaganda coming from government-backed sources.

But general conspiracy-theory videos are just as big an issue on YouTube as government propaganda videos. The company has been tweaking its algorithm ever since conspiracy-theory videos about last year’s Las Vegas shooting populated search results immediately after the incident. However, most of the reported algorithm changes center on promoting more reputable sources rather than downgrading or hiding misleading ones.

YouTube is reportedly still working on changing its algorithm to serve more mainstream news results in news-related searches. But it’s unlikely that algorithm tweaks alone will keep conspiracy-theory videos from racking up millions of views when those misleading videos continue to pop up in viewers’ “recommended” sections.

Until now, YouTube’s algorithm for serving up content has never focused on truthfulness; it has always been geared toward delivering the videos viewers are most likely to click on next. It’s unclear (and likely will be for quite some time) whether the new changes will successfully steer users away from sensationalized and inaccurate conspiracy videos.

https://arstechnica.com/?p=1253577