Texas will require parental consent for kids to use social media

Teens will likely soon lose online privileges in Texas, which this week became the third state to require parental consent for minors under 18 to access social media. Utah passed a similar law in March, and Louisiana followed suit this month.

Texas Governor Greg Abbott signed HB 18 into law on Wednesday. It takes effect on September 1, after which platforms must verify the ages of all minors and secure parental consent before registering a minor as a user, or risk legal action from the state attorney general as well as private suits from parents who report violations.

“Online platforms have been collecting data and manipulating our children’s online behavior,” the Texas House Republican Caucus tweeted Thursday after the bill, sponsored by Representative Shelby Slawson (R), was signed into law. Slawson also tweeted, thanking Abbott and Texas House Speaker Dade Phelan for “prioritizing this issue.”

“Texas is leading to empower parents to protect our kids online,” Slawson tweeted.

Slawson did not immediately respond to Ars’ request for comment.

The broad law comes with heavy burdens for online platforms. It requires essentially any digital services provider that collects an email address at sign-up to conduct age verification to identify all minors, to verify the parents or guardians connected to those minors, and to secure parental consent for a wide range of account activity.

The expectation is that online platforms must go above and beyond to protect minors from harmful, deceptive, or unfair trade practices.

In addition to the burden of verifying minors and their parents, guardians, or caregivers, the law makes online platforms responsible for creating new parental controls, building a portal to communicate with parents about minors' activity, and making it easier for parents to monitor and control what minors do on their platforms.

Platforms must also take steps to restrict minors from accessing harmful content that “promotes, glorifies, or facilitates” suicide, self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child sexual abuse materials, or other sexual exploitation or abuse. Part of that effort includes developing a strategy to maintain “a comprehensive list of harmful material” to “block from display to a known minor” and hiring actual people to review and verify that filters are working—not just relying on automated content moderation.

Any missteps could lead to additional requirements, including platforms being subjected to periodic independent audits to ensure content filters are functioning optimally to protect kids.

On top of all of that, the Texas law requires online platforms to make their algorithms more transparent to users, clearly disclosing in terms of service or privacy policies precisely how algorithms organize and filter content.
