Content Moderation Case Study: Twitter Freezes Accounts Trying To Fact Check Misinformation (2020)
from the misinformation-vs-fact-checking dept
Summary: President Trump appeared on Fox News’ “Fox & Friends” and made comments about the COVID-19 pandemic that many experts considered misinformation. One quote that particularly stood out was: "If you look at children, children are almost -- and I would almost say definitely -- but almost immune from this disease. They don't have a problem. They just don't have a problem." This is false. While it has been shown that children are less likely to get seriously ill or die from the disease, that is very different from being “immune.”
In response, both Twitter and Facebook decided to remove clips of the video, including those posted by the Trump campaign. Given both platforms’ aggressive policies regarding COVID-19 disinformation (and the criticism both have received for being too slow to act), this was not all that surprising. For additional context, just a week and a half earlier there was tremendous controversy over the decision to remove a video of doctors giving speeches in front of the Supreme Court that also presented misleading information about COVID-19. While the major platforms all blocked that video, they received criticism from both sides: some argued the video should not have been taken down at all, while others argued the platforms took too long to do so.
Thus it was not surprising that Facebook and Twitter reacted quickly to this video, even though the statements came from the President of the United States. However, more controversy arose because, in taking down those video clips, Twitter also ended up freezing the accounts of reporters, such as Aaron Rupar, who were fact-checking the claims, and of activists, like Bobby Lewis, who were highlighting the absurdity of the clip.
Decisions to be made by Twitter:
- How aggressively should content moderation rules be applied to statements from the President of the United States?
- How important is it to remove potentially harmful information regarding health and immunity to a disease like COVID-19?
- Is it better to have such videos taken down too quickly, or too slowly?
- How do you determine who is fact-checking or debunking a video and who is spreading misinformation?
- How do you handle situations where different people are sharing the same video for divergent purposes (some to spread misinformation, some to debunk it)?
Questions and policy implications to consider:
- Should the President’s speech receive special consideration?
- The same content can be used by different users for different reasons. Should content moderation take into account how the content is being used?
- Counterspeech can often be useful in responding to disinformation. What role should content moderation play in promoting or allowing counterspeech?
Filed Under: content moderation, fact checking, journalism, misinformation, reporting
Companies: twitter