Reason Shows How To Properly Respond To A Questionable Social Media Takedown: By Calling It Out
from the speak-up dept
Content moderation at scale is impossible to do well. I will keep repeating this point forever if I must. Now, I recognize that when you're on the receiving end of a content moderation decision you disagree with, it's natural to feel (1) angry and (2) that it's a personal affront to you or a personal attack on your view of the world. That reaction is understandable. It's also almost certainly wrong. The trust and safety teams working on content moderation are not targeting you. They have policies they are trying to follow, and they need to make a lot of subjective calls. Sometimes they're wrong. Or sometimes you just have a different view of what happened.
The publication Reason recently had a video pulled down from YouTube, and rather than freaking out and talking about how YouTube is "out to get" it, Reason instead wrote an article clearly stating that it supports YouTube's right to make whatever content moderation decisions it wants, while also calmly explaining why it thinks this particular decision was probably a mistake. As the article notes:
It remains essential to defend YouTube's right to make poorly reasoned and executed content moderation decisions; any government regulation of speech on social media is likely to backfire and hamper the free exchange of ideas. But it's also essential to recognize and critique censorious overreach if we want the market to respond to such errors. And a healthy market response is exactly what we need when the boundaries of acceptable discourse are being hemmed in by large companies susceptible to political pressure.
And, frankly, it's not that difficult to make some educated guesses about how the video ended up being moderated. It was a video from early in the pandemic about self-described DIY biohackers looking to see if they could create their own vaccines for COVID. Given what was known about COVID-19 at the time, and the speculative/experimental nature of DIY biohacking, some of the thoughts and ideas were probably a bit out there. The video described people who were trying to create their own "knockoff" versions of the mRNA vaccines (which have since proven massively successful), in part because of the (certainly at the time) reasonable belief that the FDA would be impossibly slow in approving such vaccines. In retrospect, that fear didn't really pan out (though there are arguments about how the FDA could have moved even faster).
So, you can easily understand how a content moderation review of such a video might flag it as potential medical misinformation -- or even as potentially dangerous. After all, it's talking about injecting a non-FDA-approved "vaccine" (and one that, at the time, was highly experimental and hadn't gone through rigorous clinical trials). But, in context (when it was made, what was being said, how it was framed), there's a strong argument that it should have been left up (and, indeed, that it has real historical relevance as a record of the various approaches people were considering early in the pandemic).
But, this is the very nature of content moderation, and why we consider it so impossible to do well at scale. Context is always important, and that can even include temporal context. A video that made sense when it went up can look far more questionable a year and a half later if you don't account for when it was posted. Or not. It's all pretty subjective.
But Reason's response is the correct one. It's not blaming YouTube. It's not taking the decision personally, or acting as if its viewpoints were systematically targeted. It recognizes that opinions may differ, that YouTube has every right to manage its platform how it wants, and that Reason can use other means to push back with a response and counter-argument. If only others who felt similarly wronged were willing to do the same.
Filed Under: content moderation, more speech, section 230, takedowns
Companies: reason, youtube