Content Moderation Case Studies: Facebook Removes Militia Event Following A Shooting (August 2020)
from the moderating-groups dept
Summary: Following the shooting of Jacob Blake, a Black man, by Kenosha police officers, protests erupted in the Wisconsin city.
As law enforcement attempted to rein in the damage, citizens aligning themselves with private "militias" discussed taking action during the civil unrest.
Some of this organizing began on Facebook. A Facebook "event" created by the Kenosha Guard account (and promoted by the far-right conspiracy website Infowars) possibly caught the eye of 17-year-old Kyle Rittenhouse. Rittenhouse traveled with his weapons from his home in Antioch, Illinois to the protest/riot occurring less than 30 minutes away in Kenosha, Wisconsin. Before the night was through, Rittenhouse had killed two people and wounded a third.
Facebook finally removed the "event" posted by the Kenosha Guard account -- one the account referred to as a "call to arms." Posts by the group asked "patriots" to "take up arms" against "evil thugs." The event was deemed a violation of Facebook's policy regarding "Dangerous Individuals and Organizations." Facebook also said it could find no link between Kyle Rittenhouse and either the account or its event.
Some viewed this response by Facebook as too little, too late. Someone had already apparently heeded the call to "take up arms" and had taken people's lives. According to a report by BuzzFeed, the event had been reported 455 times before Facebook removed it. Four moderators had responded to multiple flaggings with a determination that the event (and the account behind it) did not violate Facebook's rules. During an internal meeting with moderators, CEO Mark Zuckerberg admitted the company should have reacted sooner to reports about the event.
Decisions to be made by Facebook:
- Should moderators be given more leeway to remove events/accounts/pages (at least temporarily) that have generated hundreds of complaints, even if they don't immediately appear to violate policies?
- Would a better, more transparent appeals process allow moderators to make more judgment calls and address issues like this more expediently, since mistaken removals could be undone if no violation actually occurred?
- How does the addition of more forms of content to the "unwanted" list complicate moderation efforts?
- Does the seemingly constant addition of new forms of content to "banned" lists invite closer government inspection or regulation?
- Is a perceived failure to react quickly enough an impetus for change within the company?
- Are policies in place to allow for judgment calls by moderators? If so, do they encourage erring on the side of caution, even at the risk of overblocking?
- Does taking credit for actions not actually performed by Facebook make it appear more focused on serving its own interests, rather than its users or public safety in general?
Filed Under: content moderation, kenosha, militia, shootings, wisconsin
Companies: facebook