The Kids Online Safety Act isn’t all right, critics say
Debate continues to rage over the federal Kids Online Safety Act (KOSA), which seeks to hold platforms liable for feeding harmful content to minors. KOSA is lawmakers’ answer to whistleblower Frances Haugen’s shocking revelations to Congress. In 2021, Haugen leaked documents and provided testimony alleging that Facebook knew that its platform was addictive and was harming teens—but blinded by its pursuit of profits, it chose to ignore the harms.
Sen. Richard Blumenthal (D-Conn.), who sponsored KOSA, was among the lawmakers stunned by Haugen’s testimony. He said in 2021 that Haugen had shown that “Facebook exploited teens using powerful algorithms that amplified their insecurities.” Haugen’s testimony, Blumenthal claimed, provided “powerful proof that Facebook knew its products were harming teenagers.”
But when Blumenthal introduced KOSA last year, the bill faced immediate and massive blowback from more than 90 organizations—including tech groups, digital rights advocates, legal experts, child safety organizations, and civil rights groups. These critics warned lawmakers of KOSA’s many flaws, but they were most concerned that the bill imposed a vague “duty of care” on platforms that was “effectively an instruction to employ broad content filtering to limit minors’ access to certain online content.” The fear was that the duty of care provision would likely lead platforms to over-moderate and imprecisely filter content deemed controversial—things like information on LGBTQ+ issues, drug addiction, eating disorders, mental health issues, or escape from abusive situations.
So, lawmakers took a red pen to KOSA, which was reintroduced in May 2023 and amended this July, striking out certain sections and adding new provisions. KOSA supporters claim that the changes adequately address critics’ feedback. These supporters, including tech groups that helped draft the bill, told Ars that they’re pushing for the amended bill to pass this year.
And they might just get their way. Some former critics seem satisfied with the most recent KOSA amendments. LGBTQ+ groups like GLAAD and the Human Rights Campaign removed their opposition, Vice reported. And in the Senate, the bill gained more bipartisan support, attracting a whopping 43 co-sponsors from both sides of the aisle. With that momentum, it appears increasingly likely that the bill could pass soon.
But should it?
Not all critics agree that the recent changes to the bill go far enough to fix its biggest flaws. In fact, the bill’s staunchest critics told Ars that the legislation is incurably flawed—due to the barely changed duty of care provision—and that it still risks creating more harm than good for kids.
These critics also warn that all Internet users could be harmed, as platforms would likely start to censor a wide range of protected speech and limit user privacy by age-gating the Internet.
“Duty of care” fatally flawed?
To address some of the criticisms, the bill was amended so that platforms wouldn’t be expected to block kids from accessing “resources for the prevention or mitigation of” any harms described under the duty of care provision. That theoretically means that if KOSA passes, kids should still be able to access online resources to deal with:
- Mental health disorders, including anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.
- Patterns of use that indicate or encourage addiction-like behaviors.
- Physical violence, online bullying, and harassment of the minor.
- Sexual exploitation and abuse.
- Promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol.
- Predatory, unfair, or deceptive marketing practices, or other financial harms.
Previously, this section describing harmful content was slightly more vague and only covered minors’ access to resources that would help them prevent and mitigate “suicidal behaviors, substance use, and other harms.”
Irene Ly—who serves as counsel on tech policy for Common Sense Media, a nonprofit that provides age ratings for tech products—helped advise KOSA authors on the bill’s amendments. Ly told Ars that these changes to the bill have narrowed the duty of care enough and reduced “the potential for unintended consequences.” Because of this, “support for KOSA from LGBTQ+ groups and policymakers, including openly gay members of Congress” was “significantly boosted,” Ly said.
But the duty of care section didn’t really change that much, KOSA critics told Ars, and Vice reported that many of KOSA’s supporters are far-right, anti-LGBTQ organizations seemingly hoping for a chance to censor a broad swath of LGBTQ+ content online.
The only other KOSA change was to strengthen the “knowledge standard” so that platforms can only be held liable for violating KOSA if they don’t make “reasonable efforts” to mitigate harms when they know that kids are using their platforms. Previously, platforms could be found liable if a court ruled that they “reasonably should know” that there are minors on the platform.
Joe Mullin, a policy analyst for the Electronic Frontier Foundation (EFF)—a leading digital rights nonprofit that opposes KOSA—told Ars that the knowledge standard change is “actually positive” because it’s “tightened up a bit.” Still, the revised KOSA “doesn’t really solve the problem” that many critics still have with the legislation.
“I think the duty of care section is fatally flawed,” Mullin told Ars. “They really didn’t change it at all.”
In a letter to lawmakers last month, legal experts with the think tank TechFreedom similarly described the duty of care section as “incurably flawed” and cautioned that it seemingly violates the First Amendment. The letter urged lawmakers to reconsider KOSA’s approach:
The unconstitutionality of KOSA’s duty of care is highlighted by its vague and unmeetable nature. Platforms cannot ‘prevent and mitigate’ the complex psychological issues that arise from circumstances across an individual’s entire life, which may manifest in their online activity. These circumstances mean that material harmful to one minor may be helpful or even lifesaving to another, particularly when it concerns eating disorders, self-harm, drug use, and bullying. Minors are individuals, with differing needs, emotions, and predispositions. Yet KOSA would require platforms to undertake an unworkable one-size-fits-all approach to deeply personal issues, thus ultimately serving the best interests of no minors.
ACLU’s senior policy counsel Cody Venzke agrees with Mullin and TechFreedom that the duty of care section is still the bill’s biggest flaw. Like TechFreedom, Venzke remains unconvinced that the changes are significant enough to ensure that kids won’t be cut off from some online resources if KOSA passes in its current form.
“If you take the broad, vague provisions of the duty of care, platforms are going to end up taking down the bad as well as the good, because content moderation is biased,” Venzke told Ars. “And it doesn’t do youths any good to know that they can still search for helpful resources, but those helpful resources have been swept up in the broad takedowns of content.”