Anti-Vaxxers' Countermeasures Show Why It's Not So Simple To Just 'Delete' Anti-Vax Misinfo On Social Media
from the countermeasures-happen dept
It's nothing new for those without any experience in content moderation to assume that it's somehow "easy" to just find and delete misinformation and disinformation online -- but it's often stunning how little they've thought through how any of this actually plays out. The White House has stupidly been using its bully pulpit to pressure Facebook into deleting anti-vax misinformation, and elected officials are threatening legislation they must know is unconstitutional, yet none of them seem to recognize that it's just not that easy.
Anyone who has done any work related to content moderation knows this. They know that the vast majority of misinformation is not easy to spot. First of all, it's often unclear what even counts as misinformation. Someone might get something inadvertently wrong, or simply misread or misunderstand something. Is that misinformation that needs to be deleted? There's also sarcasm, and criticism that repeats the misinformation in order to respond to it. Then there are plenty of posts that may look like misinformation but are technically true, just stripped of necessary context. Do those need to be deleted too? Misinformation comes in many degrees, and figuring out what should stay up and what should come down is nowhere near as easy as many commentators make it out to be.
But, on top of that, there's the simple fact that those spreading misinformation know that they may face consequences for it, and thus they adapt their techniques. Ben Collins & Brandy Zadrozny, NBC News' two excellent reporters who focus on misinformation, are noting that anti-vax groups on Facebook are effectively trying to cover their tracks in advance of any possible crackdown on the nonsense and propaganda they spew:
Some anti-vaccination groups on Facebook are changing their names to euphemisms like “Dance Party” or “Dinner Party,” and using code words to fit those themes in order to skirt bans from Facebook, as the company attempts to crack down on misinformation about Covid-19 vaccines.
The groups, which are largely private and unsearchable but retain large user bases accrued during the years Facebook permitted anti-vaccination content, also swap out language to fit the new themes and provide code legends, according to screenshots provided to NBC News by multiple members of the groups.
They also note that the groups have already set up secret "backup groups," so that if their primary groups get shut down, members can immediately switch over to the other group. And if you think that, now that NBC News has reported on this, it will be easy for Facebook to find these groups, that's silly as well. It assumes that no further countermeasures will be taken.
Beating Facebook’s moderation system “feels like a badge of honor,” the administrator wrote, followed by a crying-laughing emoji. At the end of the post, the administrator reminded users to stay away from “unapproved words,” and pointed them to a code legend on the side of the page.
Using code words to evade bans is not new among the anti-vaccine community, and it borrows from a playbook used for years by extremists on Facebook and elsewhere. The practice leans heavily on “leetspeak,” or modified language used by coders and gamers that frequently replaced letters in words for numbers or symbols during online discussions.
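To see why this kind of evasion matters in practice, here's a minimal, purely hypothetical sketch -- not Facebook's actual system, with a blocklist and substitution map invented solely for illustration -- of how an exact-match keyword filter misses leetspeak, and why normalization only helps when the specific swaps have already been anticipated:

```python
# Purely hypothetical example -- not Facebook's system. The blocklist and
# leetspeak map are invented to show why exact keyword matching fails.
BLOCKED_TERMS = {"vaccine", "vax"}

# Common letter-for-symbol swaps ("leetspeak"); any swap not listed here
# still slips through untouched.
LEET_MAP = str.maketrans({"4": "a", "@": "a", "3": "e", "1": "i", "0": "o", "$": "s"})

def naive_flag(post: str) -> bool:
    """Exact substring match against the blocklist."""
    text = post.lower()
    return any(term in text for term in BLOCKED_TERMS)

def normalized_flag(post: str) -> bool:
    """Undo known letter-for-symbol swaps before matching."""
    text = post.lower().translate(LEET_MAP)
    return any(term in text for term in BLOCKED_TERMS)

post = "don't get the v4x, it's dangerous"
print(naive_flag(post))       # False -- the leetspeak slips right past
print(normalized_flag(post))  # True -- but only because this swap was anticipated
```

Every new spelling the filter learns just prompts another variant, which is exactly the countermeasures dynamic at work here.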
And, the groups seem effective at finding words that will make it more difficult to search them out:
Group members have incorporated a range of coded language to mask their discussions, many of which perpetuate debunked theories about the vaccines. “Danced” or “drank beer” mean “got the vaccine.” References to “Pfizer” generally use the terms “pizza” or “Pizza King,” and Moderna is referred to as “Moana.” Users generally play around with unofficial language about dancing to create more coded language.
For example, one group member said her husband had become sick after going on a “cross country trip where we spent 2 nights with dancers,” referring to two people who had just been vaccinated.
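Code words are even harder, because no amount of character-level normalization recovers "vaccine" from "danced" or "Pfizer" from "pizza" -- and blocking those ordinary words would sweep up untold numbers of legitimate posts. A hypothetical continuation of the sketch above (again, an illustration, not any real moderation system):

```python
# Hypothetical continuation: code words defeat normalization entirely, because
# the replacements ("danced," "pizza," "Moana") are ordinary words. The word
# mappings come from the NBC News report; the filter itself is invented.
BLOCKED_TERMS = {"vaccine", "vax", "pfizer", "moderna"}

post = "husband got sick after a cross country trip where we spent 2 nights with dancers"

# No character-level trick turns "dancers" back into "vaccinated people":
print(any(term in post.lower() for term in BLOCKED_TERMS))  # False

# And adding "dance," "pizza," or "Moana" to the blocklist would flag every
# genuine post about parties, dinner, or Disney movies -- a flood of false positives.
```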
None of this is to say that Facebook should just throw up its hands and do nothing. But it remains stunning to me how people who just don't understand the challenges of content moderation always seem to think that (1) these things are easy to find and (2) if Facebook just took down a few accounts, these people would magically go away and the disinformation would stop spreading.
It's not that simple!
There are important questions to ask about how Facebook should handle this stuff, but anyone coming up with simple solutions that don't take reality into account -- both the difficulty of identifying what is truly problematic and the kinds of countermeasures people will take -- isn't helping at all.
Filed Under: anti-vax, code words, content moderation, countermeasures, disinformation, misinformation, vaccines
Companies: facebook