Content Moderation Case Study: Facebook Targets Misinformation Spread By The Philippines Government (2020)
from the misinformation-challenges dept
Summary: Philippines president Rodrigo Duterte's rise to power was greatly aided by Facebook and its overwhelming popularity within the country. An estimated 97% of Filipinos with internet access use Facebook, and the company itself co-sponsored a Q&A session with local journalists that was broadcast on 200 radio and television stations and livestreamed on the platform. Questions were crowdsourced from Facebook users, helping propel the then-mayor of Davao to the highest office in the country.
Duterte's run for office was also directly assisted by Facebook, which flew in a team of representatives to help the candidate's campaign staff maximize the platform's potential. As his campaign gained momentum, he and his team began weaponizing the tools Facebook had handed them, spreading misinformation about other candidates and directly targeting opponents and their supporters with harassment and threats of violence.
Not much has changed since Duterte took office in 2016. Facebook remains his preferred social media outlet. But Facebook's latest attempt to tackle the spread of misinformation on its platform may prompt Duterte to find another outlet to weaponize. In September 2020, Facebook's moderation team announced it had removed a "network" linked to the Philippines government for violating the platform's rules against "coordinated inauthentic behavior."
We also removed 64 Facebook accounts, 32 Pages and 33 Instagram accounts for violating our policy against foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This network originated in the Philippines and focused on domestic audiences. (Updated on October 12, 2020 at 6:35PM PT to reflect the latest enforcement numbers.)
Facebook's removal of this content drew an immediate response from President Duterte, who reminded the company that he has the power to shut the platform down in his country if he believes he's being treated unfairly.
“I allow you to operate here,” Mr. Duterte said. “You cannot bar or prevent me from espousing the objectives of government. Is there life after Facebook? I don’t know. But we need to talk.”
Questions and policy implications to consider:
- Does targeting official government accounts increase the risk of the platform being banned or blocked in targeted countries?
- Does the possible loss of market share affect moderation decisions targeting governments?
- Should Facebook be directly involved in setting up social media campaigns for political figures/government entities?
- Does Facebook have contingency plans in place to mitigate collateral damage to citizens in countries where governments retaliate against the platform after their content or accounts are removed?
Originally posted to the Trust & Safety Foundation website.
Filed Under: content moderation, coordinated inauthentic behavior, philippines, rodrigo duterte, trolls
Companies: facebook