Facebook and Twitter struggle with online fury from Trump supporters
Facebook and Twitter are struggling to contain rising anger from Donald Trump’s supporters online, as a wave of momentum builds around the US president’s claims that the election is being stolen from him.
In the past 24 hours, Facebook has invoked emergency measures to make it harder for users to share posts that contain misleading information, to remove such posts from people’s newsfeeds, and to restrict the circulation of poll-related livestreams.
The social media company has also expanded its use of warning labels beyond posts by political candidates to a wider web of rightwing influencers, many of whom were echoing Mr. Trump’s messages, and said it was flagging individuals based on whether their posts were going viral.
On Thursday afternoon, Facebook moved to shut down a “Stop the Steal” group, which had amassed more than 360,000 followers in just a day, saying that it was concerned about calls for violence from some of the group’s members.
The move prompted outcry from the group’s founders, the pro-Trump Women for America First, who also complained that their account on MailChimp, the email marketing service, had been suspended. “Is this how social media treated Black Lives Matter protesters?” Chris Barron, a spokesperson for the group, wrote on Twitter.
Another 79,000-strong private group called Stand Up Michigan to Unlock Michigan was also shut down by Facebook, but only after it had successfully called for its members to storm a Detroit counting center.
Alex Stamos, Facebook’s former chief information security officer and a professor at Stanford University, said that while “nothing really stuck on election day,” online narratives had since “consolidated” around several themes, particularly “Stop the Steal,” a catch-all slogan for all accusations of voter fraud against Democrats and election officials.
In one sign of the growing discontent and tension on the platform, an internal metric at Facebook that tracks “violence and incitement trends” rose 45 percent over the past five days, according to a report by BuzzFeed that Facebook did not deny.
Paul Barrett, deputy director of the Center for Business and Human Rights at NYU Stern, acknowledged that Facebook and other platforms had been “much more aggressive” in this election in labelling posts and removing potentially dangerous online groups.
But he said that in the coming days Facebook would need “to do a lot more and a lot sooner,” and called for automated systems to “catch these things.”
Twitter, meanwhile, permanently banned the account of Steve Bannon, Mr. Trump’s former chief strategist, after he called for the execution of Dr Anthony Fauci and FBI Director Christopher Wray.
The moves angered Mr. Trump, who took to Twitter to accuse the platform of being “out of control, made possible through the government gift of Section 230,” the law that grants social media groups immunity for the user-generated content they host.
A Twitter spokesperson also said the company had been “proactively monitoring #StopTheSteal and related Tweets since Tuesday morning,” adding warning labels to a number of tweets from high-profile Republicans pushing the narrative.
The approach contrasted with that of Google-owned YouTube, which has come under fire during this election for failing to respond to misinformation.
In one example, the company refused to take down a video posted by rightwing media group One America News Network that included false claims that President Trump won the election and allegations of voter fraud conducted by Democrats.
YouTube said that the video, entitled “Trump won,” contained “demonstrably false content that undermines trust in the democratic process”—but added that this still did not breach its policies in any way.
Additional reporting by Kiran Stacey in Washington, DC.
© 2020 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.