Digital Technology As Accelerant: Growth And Genocide In Myanmar
from the broader-collaborative-view dept
Every person in Myanmar above the age of 10 has lived part, if not most, of their life under a military dictatorship characterized by an obsession with achieving autonomy from international influences. Before the economic and political reforms of the past decade, Myanmar was one of the most isolated nations in the world. The digital revolution that has reshaped nearly every aspect of human life over the past half-century was something the average Myanmar person had no personal experience with.
Recent reforms brought an explosion of high hopes and technological access, and Myanmar underwent a digital leapfrog, with internet access jumping from nearly zero percent in 2015 to over 40 percent in 2020. At 27 years old, I remember living in a Yangon where having a refrigerator was considered high tech, and now there are 10-year-olds making videos on TikTok.
Everyone was excited for Myanmar's digital revolution to spur the economic and social changes needed to transform the country from a pariah state into the next economic frontier. Tourists, development aid, and economic investment poured into the country. The cost of SIM cards dropped from around 1,000 US dollars in 2013 to a little over 1 dollar today.
This dramatic price drop was paired with a glut of relatively affordable smartphones and phone carriers whose data packages made social media platforms like Facebook free, or nearly free, to use. This led to the current situation where about 21 million of the 22 million people using the internet are on Facebook. Facebook became the main conduit through which people accessed the internet, and it is now used for nearly every online activity, from selling livestock to watching porn, reading the news, and discussing politics.
Then, following the exodus of over 700,000 Rohingya people from Myanmar’s war-torn Rakhine State, Facebook was accused of enabling a genocide.
The ongoing civil wars in the country and the state violence against the Rohingya, characterized by the UN as ethnic cleansing with genocidal intent, put a spotlight on the potential for harm brought on by digital connectivity. Given its market dominance, Facebook has faced great scrutiny in Myanmar for the role social media has played in normalizing, promoting, and facilitating violence against minority groups.
Facebook was, and continues to be, the favored tool for disseminating hate speech and misinformation against the Rohingya people, Muslims in general, and other marginalized communities. Despite repeated warnings from civil society organizations in the country, Facebook failed to address the new challenges with the urgency and level of resources needed during the Rohingya crisis, and failed to even enforce its own community standards in many cases.
To be sure, there have been improvements in recent years, with the social media giant appointing a Myanmar-focused team, expanding its number of Myanmar-language content reviewers, adding minority-language content reviewers, establishing more regular contact with civil society, and devoting resources and tools to limiting disinformation during Myanmar's upcoming election. The company also removed the accounts of Myanmar military officials and dozens of pages on Facebook and Instagram linked to the military for engaging in "coordinated inauthentic behavior." The company defines "inauthentic behavior" as "engag[ing] in behaviors designed to enable other violations under our Community Standards," through tactics such as the use of fake accounts and bots.
Recognizing the seriousness of this issue, everyone from the EU to telecommunications companies to civil society organizations has poured resources into digital literacy programs, anti-hate-speech campaigns, social media monitoring, and advocacy to try to address it. Overall, much of this programming focuses on what Myanmar and its people lack: rule of law, laws protecting free speech, digital literacy, knowledge of what constitutes hate speech, and resources to fund and execute the programming that is needed.
In the frenzy of the desperate firefighting by organizations on the ground, less attention has been given to larger systemic issues that are contributing to the fire.
There is a need to pay greater attention to those coordinated groups that are working to spread conspiracy theories, false information, and hatred to understand who they are, who is funding them, and how their work can be disrupted—and, if necessary, penalized.
There is a need to reevaluate how social media platforms are designed in a way that incentivizes and rewards bad behavior.
There is also a need to question how much blame we want to assign to social media companies, and whether it is to the overall good to give them the responsibility, and therefore power, to determine what is and isn't acceptable speech.
Finally, there is a need to ask ourselves about alternatives we can build, when many governments have proven themselves more than willing to surveil and prosecute netizens under the guise of health, security, and penalizing hate speech.
It is dangerous to give private, profit-driven multinational corporations the power to draw the line between hate speech and free speech, just as it is dangerous to give that same power to governments, especially in this time of rising ethno-nationalist sentiment around the globe and the increasing willingness of governments to overtly and covertly gather as much data as possible to use against those they govern. We can see from the ongoing legal proceedings against Myanmar in international courts regarding the Rohingya and other ethnic minorities, and from statements by UN investigative bodies on Myanmar that Facebook has failed to release to them evidence of serious international crimes, that neither company policies nor national laws are enough to ensure safety, justice, and dignity for vulnerable populations.
The solution to all this, as unsexy as it sounds, is a multifaceted, multi-stakeholder, long-term effort to build strong legal and cultural institutions that disperses the power, and the responsibility, to create and maintain safe and inclusive online spaces among governments, individuals, the private sector, and civil society.
Aye Min Thant is the Tech for Peace Manager at Phandeeyar, an innovation lab which promotes safer and more inclusive digital spaces in Myanmar. Formerly, she was a Pulitzer Prize winning journalist who covered business, politics, and ethno-religious conflicts in Myanmar for Reuters. You can follow her on Twitter @ma_ayeminthant.
This article was developed as part of a series of papers by the Wikimedia/Yale Law School Initiative on Intermediaries and Information to capture perspectives on the global impacts of online platforms’ content moderation decisions. You can read all of the articles in the series here, or on their Twitter feed @YaleISP_WIII.
Filed Under: content moderation, myanmar
Companies: facebook
Reader Comments
Rwandan Radio
Obviously there is a difference in responsibility, but radio in Rwanda played a similar role in fomenting genocide, and arguably in the rise of Nazi Germany.
The two aren't entirely comparable, of course - radio operates on scarce frequency bands administered by the government, and sellers of radio equipment can't stop a buyer after the fact if it turns out they've turned their purchase toward promoting genocide.
While Facebook could have done more in retrospect, I wonder if it would have made a difference - the moral imperative remains regardless for anyone aware of and capable of changing things, but capability matters when it comes to actually preventing harm.
I am not sure that dispersed power would help in this circumstance - in effect it is more a dispersed lack of power. Institutional norms and mentalities could prevent such ideas from gaining popularity, but that takes influence rather than force. It is the difference between it being illegal to go around in bell bottoms and it being heavily looked down upon as unfashionable.