The Trump effect
Reactions across the industry tie Meta’s policy change directly to Trump’s election victory.
Journalist, podcast host, and political commentator Saagar Enjeti said of Zuckerberg’s video, “I highly recommend that you watch all of it, as tonally, it is one of the biggest indications of ‘elections have consequences’ I have ever seen.”
“The move will elate conservatives, who have often criticized Meta for censoring speech, but it will spook many liberals and advertisers, showing just how far Zuckerberg is willing to go to win Trump’s approval,” Emarketer principal analyst Jasmine Enberg said in an email.
The shift began nearly a full year before the vote, when Meta reversed its ban on ads questioning the legitimacy of the 2020 U.S. presidential election, in which current President Joe Biden held off Trump's bid for a second term.
Kaplan debuted Zuckerberg’s video in an appearance on Fox News’ Fox & Friends, The New York Times reported, adding that officials in the incoming administration were alerted about the changes before Tuesday’s announcement.
In the weeks since Trump’s victory, Zuckerberg has met with Trump and potential secretary of state appointee Marco Rubio at the president-elect’s Mar-a-Lago resort in Palm Beach, Fla.; donated $1 million to Trump’s inauguration fund; promoted Republican Party ally Kaplan; and added Trump ally Dana White, CEO of UFC, to the company’s board of directors.
The Real Facebook Oversight Board, an accountability organization not affiliated with the company, said the changes announced Tuesday represent Meta going “full MAGA” and “political pandering,” adding in a statement, “Meta’s announcement today is a retreat from any sane and safe approach to content moderation.”
Too many mistakes
However, Kaplan maintained that the move was motivated by a desire to cut back on moderation mistakes: cases where content is wrongly removed, or where users find themselves in "Facebook jail" with little recourse and slow response times. Kaplan noted that while Meta removed "millions of pieces of content every day" in December 2024, representing under 1% of daily posts to its surfaces, it now believes "one to two out of every 10" of those removals were mistakes.
Kaplan said then-Facebook launched independent third-party fact-checking in 2016 to avoid becoming "the arbiters of truth," and that it was the best solution available at the time. But the experts doing the fact-checking "have their own biases and perspectives," he said, which led to too much legitimate political speech and debate being fact-checked.