SCOTUS weighs first case testing Big Tech liability for recommending content

A key protection shielding social media companies from liability for hosting third-party content—Section 230 of the Communications Decency Act—is set to face its first US Supreme Court challenge.

The question before the court is whether Google-owned YouTube can be held liable for aiding and abetting ISIS terrorists by actively recommending ISIS videos to users via its algorithms.

Plaintiffs allege that ISIS relied on YouTube to ramp up recruitment before the terrorist group claimed responsibility for killing 130 people and injuring more than 350 others in six coordinated attacks in Paris in November 2015. The lawsuit now headed to the Supreme Court focuses on the killing of an American woman, Nohemi Gonzalez, who was dining in a Paris bistro when ISIS militants attacked.

On Monday, the Supreme Court announced it would review the case—raised by Nohemi’s father, Reynaldo Gonzalez—which argues that YouTube helped spread ISIS’ messaging. Although plaintiffs say in court documents that YouTube did not play a direct role in Gonzalez’s death, they allege that YouTube “knowingly permitted ISIS to post on YouTube hundreds of radicalizing videos inciting violence,” NBC News reported.

Google did not provide Ars with comment before deadline, but in a court filing in July, it asked the Supreme Court to decline to review the case. At the time, Google argued that the complaint itself was unclear.

“It remains unclear which specific aspects of YouTube’s technology are even at issue,” Google said in the court filing. The tech company also suggested that an overhaul of YouTube’s terrorism-related policies since Gonzalez’s death meant “the relevance of this case to the YouTube of 2022 is far from clear.”

Some of the changes that YouTube has made since 2017 include devoting more engineering resources to automatically identify and remove terrorism-related content, nearly doubling the number of human experts working with YouTube to more accurately flag content for removal, and preventing monetization and comments on flagged content that doesn’t violate policies but warrants a warning.

“Together, we can build lasting solutions that address the threats to our security and our freedoms,” wrote Kent Walker, Google’s general counsel, in 2017. “It is a sweeping and complex challenge. We are committed to playing our part.”

Displaying versus recommending content

Google’s stated position is that Gonzalez’s case is doomed, partly because of “the ambiguity of petitioners’ pleadings as to what, specifically, makes YouTube’s technology distinct from other widely used technologies,” which are covered by Section 230. The company defended its algorithms, saying that its recommendation engine works based on user inputs like most other Internet technologies—such as search engines—and claiming that a Supreme Court decision to hold YouTube accountable for recommending ISIS videos “would threaten the basic organizational decisions of the modern Internet.”

Ars reached out to Gonzalez’s lawyer, Eric Schnapper, who did not immediately respond to a request for comment. However, in a court filing responding to Google, Schnapper wrote that Google cherry-picks partial quotes from prior court opinions and twists the semantics of the complaint to frame the dispute as being about how YouTube “displays” rather than “recommends” content. If the complaint were only about what YouTube displayed, Schnapper wrote, Section 230 immunity would indeed shield Google, but that’s not what Gonzalez alleges.

At the heart of Gonzalez’s complaint, and the question the court must now decide, is whether YouTube actually contributed to ISIS recruitment by recommending videos, not just displaying them. Schnapper wrote that because Google relied on such a “highly dubious interpretation of Section 230,” it’s now time for the Supreme Court to step in and clarify how the law should work.

As Google indicated, the court’s ultimate decision could impact how other social media companies like Facebook or Twitter recommend content in the future. There’s political pressure from both Democrats and Republicans to diminish the strength of Section 230, NBC News reported, “with conservatives claiming that companies are inappropriately censoring content and liberals saying that social media companies are spreading dangerous right-wing rhetoric.”

The Supreme Court will also review a related appeal: a lawsuit seeking to hold Twitter accountable for hosting terrorist content on its platform in violation of the Anti-Terrorism Act. As in Gonzalez’s case, the question of Twitter’s Section 230 immunity has not yet been decided. Twitter did not immediately respond to Ars’ request for comment. Supreme Court Justice Clarence Thomas has previously criticized Section 230, but otherwise there’s little indication of which way the court will rule in either case.

Google wants lawmakers, not the Supreme Court, to be responsible for amending Section 230. In the company’s court filing, Google lawyers wrote that Congress is reviewing multiple laws connected to Section 230 and suggested that “if Congress thought Section 230’s application in this context was problematic, Congress could act.”

In his response, Schnapper pointed out a seeming flaw in that argument: “This Court has repeatedly warned against attaching meaning to congressional inaction.” He also argued that delaying until Congress acts imperils online users in the meantime, noting that even Google has acknowledged in the past that combating increasingly widespread online extremism is a matter of “great public importance.”

https://arstechnica.com/?p=1886430