Tracking Facebook connections between parent groups and vaccine misinfo

Misinformation about the pandemic and the health measures that are effective against SARS-CoV-2 has been a significant problem in the US. It’s led to organized resistance against everything from mask use to vaccines and has undoubtedly ended up killing people.

Plenty of factors have contributed to this surge of misinformation, but social media clearly helps enable its spread. While the companies behind major networks have taken some actions to limit the spread of misinformation, internal documents indicate that a lot more could be done.

Taking more effective action, however, would benefit from a clearer picture of what the problems actually are. To that end, a recent analysis of the network of vaccine misinformation on Facebook provides some helpful information. It finds that most of the worst misinformation sources are probably too small to stand out as needing moderation. The analysis also shows that the pandemic has drawn mainstream parenting groups noticeably closer to groups devoted to conspiracy theories.

Network analysis

The researchers involved have an incredibly useful resource for performing this analysis. Shortly before the pandemic, they explored the Facebook networks that helped spread vaccine misinformation. They subsequently repeated their analysis after pandemic misinformation had spread widely.

Their analysis was fairly simple: check whether one group had hit “like” for the landing page of another group. This is a relatively minor action, but it has a significant consequence: any post by the liked group has a chance of showing up in the timeline of the group doing the liking. How often a post actually appears will vary based on undisclosed features of Facebook’s algorithms, but without that like, none of the posts would ever appear.
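
To make the construction concrete, here's a minimal sketch in Python (using the networkx library) of how such a like-network could be assembled. The group names and like pairs are purely hypothetical, and this is an illustration of the general approach rather than the researchers' actual pipeline.

import networkx as nx

# Hypothetical input: (liker, liked) pairs harvested from each
# group's public list of liked pages. All names are made up.
likes = [
    ("parenting_tips", "natural_wellness"),
    ("natural_wellness", "vaccine_choice"),
    ("vaccine_facts", "cdc_updates"),
    ("parenting_tips", "vaccine_facts"),
]

# A directed edge A -> B records that group A liked group B's page,
# meaning B's posts can surface in A's feed (but not vice versa).
G = nx.DiGraph()
G.add_edges_from(likes)

print(G.number_of_nodes(), "groups,", G.number_of_edges(), "like-edges")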

To get a clearer picture of the network formed by these likes, the researchers categorized the groups involved as pro-vaccine, anti-vaccine, or neither. They then labeled the interests of the “neither” groups, which could be focused on things like sports or diet. Finally, they used software called ForceAtlas2 to provide a visual display of the network. The software shows the network's nodes (the individual Facebook groups) with the distances between them based on the strength of their connections: any two groups that are connected sit closer to each other, and groups that share connections to the same third group sit closer still.
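
Here's a hedged sketch of the labeling and layout step, continuing the toy network from above. The paper used ForceAtlas2 (available in Gephi and as a Python package); networkx's built-in spring_layout serves below as a comparable force-directed stand-in, and the stance labels are again hypothetical.

import networkx as nx

# The same toy like-network as above, treated as undirected for layout.
G = nx.Graph([
    ("parenting_tips", "natural_wellness"),
    ("natural_wellness", "vaccine_choice"),
    ("vaccine_facts", "cdc_updates"),
    ("parenting_tips", "vaccine_facts"),
])

# Hand-assigned stances, mirroring the study's pro/anti/neither labels.
stance = {
    "parenting_tips": "neither",
    "natural_wellness": "neither",
    "vaccine_choice": "anti",
    "vaccine_facts": "pro",
    "cdc_updates": "pro",
}
nx.set_node_attributes(G, stance, "stance")

# Force-directed layout: connected nodes attract, others repel, so
# densely linked groups end up near each other on the 2D plane.
pos = nx.spring_layout(G, seed=42)
for node, (x, y) in pos.items():
    print(f"{node:>16} [{stance[node]:>7}] at ({x:+.2f}, {y:+.2f})")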

As you might expect, this algorithm caused groups with shared interests to be pulled together into clusters. The resulting network diagram, for example, shows clear clusters of pro- and anti-vaccine groups.
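
The researchers read these clusters off the visual layout; a programmatic analogue (a substitution on my part, not the paper's method) is community detection, sketched here with networkx's modularity-based routine on a toy network.

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy like-network: two densely linked interest clusters joined by a
# single cross-tie. All group names are hypothetical.
G = nx.Graph([
    ("vaccine_facts", "cdc_updates"),
    ("vaccine_facts", "pro_vax_parents"),
    ("cdc_updates", "pro_vax_parents"),
    ("vaccine_choice", "natural_wellness"),
    ("vaccine_choice", "informed_moms"),
    ("natural_wellness", "informed_moms"),
    ("pro_vax_parents", "informed_moms"),  # lone bridge between clusters
])

# Modularity maximization recovers the two tightly knit clusters that a
# force-directed layout would pull into separate visual blobs.
for i, community in enumerate(greedy_modularity_communities(G)):
    print(f"cluster {i}:", sorted(community))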

In the nodes

Even in the pre-pandemic analysis, there were a couple of problematic trends. The first is that the pro-vaccine groups, despite having nearly twice as many members as the anti-vaccine ones, tend to be off on their own. They're tightly clustered, indicating numerous connections among themselves, but there aren't a lot of strong connections to any other groups. This suggests that, to a large extent, vaccine advocates are talking with each other.

Anti-vaccine groups also clustered together. But these groups were heavily connected with others, most notably groups focused on parenting issues. In fact, the border between parenting groups and the anti-vax community is very difficult to distinguish.

So what’s changed with the pandemic? Many of the anti-vaccine groups expanded their advocacy into general misinformation about the pandemic. That’s not a huge surprise, given how closely the pandemic and vaccines have been connected. But their relationship with parenting groups didn’t change much over the course of the pandemic.

What did change was that parenting groups moved closer to groups interested in alternatives to medicine, like homeopathy and spiritual healing. These groups weren't explicitly anti-vaccine, although at best they had an awkward relationship with modern medicine. What concerned the researchers was that the alternative medicine groups had many connections to groups devoted to more typical conspiracy theories and the misinformation that accompanies them, including misinformation about climate change, the safety of water fluoridation, and the safety of 5G cellular service.

A key mediator of this connection appears to be anti-GMO groups, which have ties to alternative health and often frame their concerns in conspiratorial terms. On the network graph, these connections draw parenting communities closer to conspiracy theorists. And the comments on many of the alternative medicine posts shared with the parenting community devolved into arguments over various conspiracies.

Moderation’s limits

Facebook has attempted to moderate some of the misinformation it hosts. This typically takes the form of tagging posts with a message about where to find accurate information. The researchers did find such tags attached to posts made by some of the larger anti-vaccine groups. But many of the smaller groups managed to evade Facebook's attention, perhaps because they simply weren't large enough for the company's algorithms to rate them as a significant threat. Despite their size, however, these groups often had significant connections to groups with unrelated interests.
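
The paper doesn't spell out here how such groups could be caught, but one plausible heuristic (an assumption on my part, not the researchers' model) is to rank groups by how much they bridge otherwise separate communities rather than by raw size. A sketch using betweenness centrality:

import networkx as nx

# Toy network: two tight clusters plus one tiny group bridging them.
# All names and member counts are hypothetical.
G = nx.Graph([
    ("parenting_a", "parenting_b"), ("parenting_a", "parenting_c"),
    ("parenting_b", "parenting_c"),
    ("conspiracy_a", "conspiracy_b"), ("conspiracy_a", "conspiracy_c"),
    ("conspiracy_b", "conspiracy_c"),
    ("small_bridge", "parenting_a"), ("small_bridge", "conspiracy_a"),
])
members = {n: 50_000 for n in G}   # large mainstream groups...
members["small_bridge"] = 800      # ...and one small connector

# A raw size threshold never flags the bridge group...
flagged_by_size = [n for n in G if members[n] > 10_000]

# ...but betweenness centrality ranks it highest: every shortest path
# between the two clusters has to pass through it.
centrality = nx.betweenness_centrality(G)
top = max(centrality, key=centrality.get)
print("highest betweenness:", top, round(centrality[top], 3))
print("caught by size filter?", top in flagged_by_size)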

Meanwhile, a look at the pages of some of the smaller groups shows that they have backup plans in case Facebook ever gets serious about moderating them. The researchers found instances of groups directing their users to shift discussions to platforms such as Parler, Gab, and Telegram.

In an attempt to make moderation somewhat easier, the researchers go on to develop a mathematical model that describes the behavior they've seen. Their hope is that it can help spot the groups that pose the largest misinformation threat. With only one data set to test it against, however, its usefulness remains unclear.

In any case, the results are concerning. “Mainstream parenting communities on Facebook were subject to a powerful, two-pronged misinformation machinery during the pandemic,” the researchers conclude. One prong involved the parenting groups' preexisting connections to anti-vaccine groups; as those expanded their focus to pandemic misinformation, parents ended up exposed to it as well. And, thanks to a growing interest in alternative health, parents were pulled into a world rife with conspiracy theories.

Obviously, exposure to occasional posts on Facebook won't instantly change anyone's beliefs. But regular exposure to misinformation can have a cumulative impact, especially when it comes with the impression that your peers believe it. And, so far at least, Facebook's moderation doesn't appear capable of disrupting that.

IEEE Access, 2021. DOI: 10.1109/ACCESS.2021.3138982

https://arstechnica.com/?p=1823742