Patreon denies child sex trafficking claims in viral TikTok “conspiracy” theory
After a TikTok accusing Patreon of ignoring reports and knowingly profiting off accounts posting child sexual abuse materials (CSAM) attracted hundreds of thousands of views, more TikTokers piled on, generating even more interest. Patreon quickly responded by branding the TikToks as disinformation. In a blog post, Patreon denied the allegations, accusing TikTokers of spreading a “conspiracy that Patreon knowingly hosts illegal and child-exploitative material.”
According to Patreon, the conspiracy theory sprang from a fake post on a job-posting site; Vice later reported that the site was Glassdoor. The Glassdoor review was posted in August and claimed that Patreon refused to respond to reports of accounts suspected of “selling lewd photographs” of children. Around the same time that TikTokers were describing their failed attempts to report these accounts, Patreon laid off members of its security team, and, Patreon said, “onlookers inaccurately linked” the allegations to the “small-scale staffing changes we made last week to our security organization.” The TikTokers claimed that Patreon laid off its staff specifically for not complying with orders to allow CSAM to stay on the platform.
“Dangerous and conspiratorial disinformation began circulating on social media recently,” Patreon said. “We want to let all of our creators and patrons know that these claims are unequivocally false and set the record straight.”
The “lewd photos” cited in the TikTok videos, Vice’s reporting found, came from a mixture of family accounts and accounts featuring adult models engaging in “age play” and posing as children. The creator of the age-play account, Vice reported, was eventually banned from Patreon and other platforms. Patreon US Policy Head Ellen Satterwhite told Vice that “Patreon has zero tolerance for the sexualization of children or teenagers and will remove any user and all associated content found to be in violation of this policy.” Patreon told Ars it has no further comment at this time.
Developing tech to block all CSAM
Based on 2020 data, the UK-based Internet Watch Foundation reported that more than a fifth of global CSAM is hosted in the US. Worldwide, the problem has increased by 15,000 percent over the last 15 years, according to Thorn, an organization founded by Ashton Kutcher and Demi Moore that develops new technologies to combat online child sexual abuse.
Patreon said in its statement that it works with Thorn and other organizations, like the National Center for Missing and Exploited Children (NCMEC) and INHOPE, to “keep the Internet safer.” Organizations like these, dedicated to ending child sexual abuse, are tasked with raising awareness of CSAM online while also contending with a potential drain on their resources driven by hysteria over fraudulent sex trafficking claims, which, Vice reported, have become common fodder for viral TikToks.
A Thorn spokesperson told Ars that the organization can’t comment on the TikTok case but could explain more about its partnership with Patreon.
Because the issue of “CSAM on technology platforms is pervasive and continues to grow exponentially,” a Thorn spokesperson said that “any platform with an upload button can be—and likely is—used to distribute CSAM.” Today, Thorn said that “predators have more access to children and the ability to distribute CSAM than ever before,” and that’s why Thorn works with tech companies like Patreon, which have widely used platforms, to try to push for “systems-level change” to a “ubiquitous problem.”
As part of its partnership with Thorn, Patreon uses a Thorn tool called Safer to “identify, detect, and report CSAM at scale,” in part by using “Safer’s hashing and matching technology” to block content from being shared that’s already flagged in a CSAM database, such as NCMEC’s.
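Conceptually, hash-and-match systems of this kind compare a fingerprint of each uploaded file against a database of fingerprints for previously identified CSAM. The sketch below is only a minimal illustration of that general idea, not Safer’s actual API: the function names and the placeholder hash set are hypothetical, and real systems typically also rely on perceptual hashing so that re-encoded or slightly altered copies still match.

```python
import hashlib

# Hypothetical set of hashes for known, previously flagged material, standing in
# for a database such as NCMEC's. The value below is only a placeholder.
KNOWN_FLAGGED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def file_sha256(path: str) -> str:
    """Compute the SHA-256 hash of a file, reading in chunks to limit memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def should_block_upload(path: str) -> bool:
    """Return True if the file's hash matches a known flagged hash."""
    return file_sha256(path) in KNOWN_FLAGGED_HASHES
```

In a production pipeline, a match would do more than return a boolean: the upload would be blocked and a report filed with an organization like NCMEC, which is the kind of detect-and-report workflow Patreon describes.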
Thorn is one organization acting as a bridge between law enforcement and tech companies to produce more effective solutions, but the IWF data suggests that the US has a long way to go to put an end to CSAM online.
Because Patreon is a widely used platform, the potential for CSAM to be posted there remains, Thorn’s spokesperson said. “The Internet has evolved in a way that makes it easy for abusers to share child sexual abuse material within communities that can flourish on the same platforms we all use every day,” Thorn said.