Facebook released its first report today detailing which content it says is widely viewed on the site and Instagram. The report comes as research and news stories have highlighted how misleading posts and outright disinformation can draw intense engagement on the company’s platforms.
Much of the scrutiny has focused on far-right accounts, which according to Facebook’s own tool, CrowdTangle, receive the most engagement—likes, shares, and comments. For example, Kevin Roose, a reporter at The New York Times, uses CrowdTangle to tweet out a list of the “10 top-performing link posts by US Facebook pages every day, ranked by total interactions.” His experiment has revealed that, day after day, far-right accounts and pages from the likes of Ben Shapiro, Dan Bongino, and Newsmax appear on the list, sometimes occupying multiple spots. Facebook’s critics have pointed to the list as evidence that the platform has become a right-wing media machine.
The Twitter account, Roose said, “drove executives crazy” at Facebook. They felt it was making Facebook look like it favored right-wing accounts. All of that brings us to today.
Facebook released the first of what will be a quarterly “Widely Viewed Content Report.” The report will appear alongside the company’s Community Standards Enforcement Report, an existing release that includes data on hate speech and child endangerment. The newest report is Facebook’s attempt to provide more “transparency and context,” said Anna Stepanov, the company’s director of product management.
The top-performing link posts by U.S. Facebook pages in the last 24 hours are from:
1. Dan Bongino
2. Ben Shapiro
3. Ben Shapiro
4. Ben Shapiro
5. Dan Bongino
6. Dan Bongino
7. Dan Bongino
8. Ben Shapiro
9. Ben Shapiro
10. Ben Shapiro
— Facebook’s Top 10 (@FacebooksTop10) May 14, 2021
The new report shows a slew of information, including which domains, links, pages, and public posts cracked the top 20, along with how many people saw that content. In a blog post, Stepanov pointed out that just under 13 percent of content views were of posts containing links—post types that may send users to other sites with disinformation—and that 57 percent of the posts users viewed were from family and friends. Notably, the report does not include data on ads.
Scattered among the top posts are the expected meme questions (e.g., “What is something you will never eat no matter how hungry you get?”), a Joe Biden post, and a guide to building a DIY swimming pool from old pallets.
Domains that received the most views are another significant part of the report, and here, Facebook could use more transparency and context of its own. Among the top 10 domains are sites that have struggled to tame disinformation themselves, including YouTube and Twitter. Yet the report doesn’t include any additional detail, such as which channels or accounts the links pointed to.
Partial picture
In focusing on views over engagement, Facebook is providing only part of the picture. As anyone who has used Facebook knows, there’s a significant difference between content appearing in a user’s news feed and the user engaging with it. Many posts are displayed but quickly scrolled past. Those that draw engagement, though, are literally worth more: advertisers will pay 10 times more for a single engagement than they will for a single view. By withholding information about engagement, the “Widely Viewed Content Report” obscures the impact of disinformation.
The fact that family and friends generate most of the content people see doesn’t help counter the idea that Facebook might be reinforcing partisan echo chambers. One study showed that when a headline agreed with a person’s political views, that person was 20 percent more likely to share the link with friends, whether or not they believed the information was true.
The way that Facebook calculates “widely viewed” is complicated, too. Only posts that are created or shared publicly are counted in the report. All links, though, are included whether they were shared publicly or privately, creating a mismatch in how the two post types are measured.
The report also omits what might be the most important number: the denominator that underlies these percentages, or the total amount of content that is viewed on Facebook every day. Saying that the 20 most widely viewed posts represent less than 0.1 percent of all US views is like saying only 0.03 percent of Americans got COVID yesterday. That may not sound like much, but 0.03 percent of roughly 330 million people is about 100,000 new cases a day; multiply the percentage by the population, weigh it against the number of available hospital beds, and the scope of the problem comes into sharper focus. Ars asked Facebook how much content was seen by users in the US, but the company refused to provide that information. Until Facebook discloses more data, no one will know the true extent of the company’s disinformation problem.
Still, it’s easy to make a quick-and-dirty estimate. Facebook has 1.88 billion daily active users, and many post several times per day, so there are likely tens of billions of content views per day. To pressure-test that theory, I tallied the views of the report’s top 20 posts, which the report says represent about 0.1 percent of all content viewed in the US, and spread that total over the 91 days of the second quarter. The math suggests that at least 10 billion posts are viewed in the US every day. Even that estimate is probably extremely conservative, since the report’s figures cover only links and public posts. If just a tiny fraction of that number is disinformation, the problem could be small as a percentage, as the report suggests, and simultaneously enormous in absolute terms.
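For the curious, here’s a minimal sketch of that back-of-the-envelope arithmetic in Python. The combined view count for the top 20 posts is a hypothetical stand-in, since the exact sum isn’t reproduced here; plug in the real tally from the report to refine the estimate.

```python
# Back-of-the-envelope estimate of daily US content views on Facebook,
# following the logic above. TOP_20_VIEWS is a hypothetical placeholder,
# not the actual figure from the report.

TOP_20_VIEWS = 1_000_000_000    # assumed: ~1 billion combined views for the top 20 posts
SHARE_OF_ALL_VIEWS = 0.001      # per the report: top 20 posts are ~0.1% of all US views
DAYS_IN_Q2 = 91                 # April 1 through June 30, 2021

total_q2_views = TOP_20_VIEWS / SHARE_OF_ALL_VIEWS  # ~1 trillion views for the quarter
daily_views = total_q2_views / DAYS_IN_Q2           # ~11 billion views per day

print(f"Estimated daily US content views: {daily_views:,.0f}")
```

With a top-20 tally on the order of a billion views, the arithmetic lands right around the 10-billion-views-per-day floor described above.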