Facebook “Supreme Court” overrules company in 4 of its first 5 decisions

Facebook CEO Mark Zuckerberg in 2017.

The Oversight Board, an independent organization set up to review Facebook’s content moderation decisions, handed down its first batch of rulings on Thursday. The rulings didn’t go well for Facebook’s moderators. Out of five substantive rulings, the board overturned four and upheld one.

Most technology platforms have unfettered discretion to remove content. By contrast, Facebook decided in 2019 to create an independent organization, the Oversight Board, to review its content moderation decisions, much as the US Supreme Court reviews decisions by lower courts. The board has an independent funding stream, and its members can’t be fired by Facebook. Facebook has promised to follow the board’s decisions unless doing so would be against the law.

The board’s first batch of five substantive decisions (a sixth case became moot after the user deleted their post) illustrates the difficult challenge facing Facebook’s moderators:

  • A user in Myanmar posted to Facebook that “Muslims have something wrong in their mindset” (or maybe “something is wrong with Muslims psychologically”—translations differ), arguing that Muslims should be more concerned about the genocide of Uighurs in China and less focused on hot-button issues like French cartoons mocking the Prophet Muhammad. Facebook removed the post as anti-Muslim hate speech. The Oversight Board reinstated it, finding that it was best understood as “commentary on the apparent inconsistency between Muslims’ reactions to events in France and in China.”
  • A user wrote that (in the board’s paraphrase) Azerbaijanis “are nomads and have no history compared to Armenians” and called for “an end to Azerbaijani aggression and vandalism.” The post was made during an armed conflict between Armenia and Azerbaijan. The board upheld Facebook’s decision to remove it as hate speech.
  • A user in Brazil uploaded an image to raise awareness about breast cancer that featured eight pictures of female breasts—five of them with visible nipples. Facebook’s software automatically removed the post because it contained nudity. Human Facebook moderators later restored the images, but Facebook’s Oversight Board still issued a ruling criticizing the original removal and calling for greater clarity and transparency about Facebook’s processes in this area.
  • A user in the United States shared a quote apocryphally attributed to Joseph Goebbels that “stated that truth does not matter and is subordinate to tactics and psychology.” Facebook removed the post because it promoted a “dangerous individual”—Goebbels. The user objected, arguing that his intention wasn’t to promote Goebbels but to criticize Donald Trump by comparing him to a Nazi. The board overturned Facebook’s decision, finding that Facebook hadn’t provided enough transparency about its dangerous individuals rule.
  • A French user criticized the French government for refusing to allow the use of hydroxychloroquine for treating COVID-19. The post described the drug as “harmless.” Facebook removed the post for violating its policy against misinformation that can cause imminent harm. The board overturned the removal, concluding that the post was commenting on an important policy debate, not giving people personal medical advice.

As you can see, Facebook has to make decisions on a wide range of topics, from ethnic conflict to health information. Often, Facebook is forced to choose sides between deeply antagonistic groups—Democrats and Republicans, Armenians and Azerbaijanis, public health advocates and anti-vaxxers. One benefit of creating the Oversight Board is to give Facebook an external scapegoat for controversial decisions. Facebook likely referred its suspension of Donald Trump to the Oversight Board for exactly this reason.

The big challenge is that Facebook is far too large—it has 2.7 billion users and 15,000 moderators—for the Oversight Board to review more than a tiny fraction of its moderation decisions. In order for the Oversight Board to have a meaningful impact on Facebook as a whole, the small number of cases it does review must create precedents that are followed by moderators in millions of other cases. This is how the Supreme Court supervises other courts in the US: the high court only reviews a fraction of cases, but lower courts treat those rulings as binding precedent for all future decisions.

Making this work for US courts isn’t easy. Parties in US court cases frequently hire lawyers who know how to find relevant Supreme Court precedents and present them to the judge. Judges and juries often spend hours or even days hearing testimony and deliberating before they reach a result. Appeals go to intermediate courts before they reach the Supreme Court.

It’s almost certainly not realistic for Facebook to replicate these complex procedures for moderation decisions. Moderators need to make decisions in minutes, not hours or days. They might not have time to research past Oversight Board precedents and make sure their decisions are consistent with them.

So what matters more than Thursday’s Oversight Board rulings is what Facebook does with them. As the Oversight Board hands down more rulings in the coming months, Facebook needs a process to make them understandable and searchable for its thousands of moderators.

Even if Facebook can’t bring a court-like level of consistency to its moderation decisions, the existence of the Oversight Board could still improve Facebook’s moderation process at the margins. In addition to ruling on individual cases, the Oversight Board also makes broader recommendations to Facebook to clarify its rules or improve its procedures. Those recommendations won’t be binding the way its rulings on individual cases are. But it could still be healthy to have an external entity pressuring Facebook to make its policies clearer and more consistent.

https://arstechnica.com/?p=1737901