The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Moderate Globally, Impact Locally

from the monumental-balancing-act dept

Every minute, more than 500 hours of video are uploaded to YouTube, 350,000 tweets are sent, and 510,000 comments are posted on Facebook.

Managing and curating this fire hose of content is an immense task, and one which grants the platforms enormous power over the contours of online speech. This includes not just decisions around whether a particular post should be deleted, but also more minute and subtle interventions that determine its virality. From deciding how far to allow quack ideas about COVID-19 to take root, to the degree of flexibility that is granted to the President of the United States to break the rules, content moderation raises difficult challenges that lie at the core of debates around freedom of expression.

But while plenty of ink has been spilled on the impact of social media on America’s democracy, these decisions can have an even greater impact around the world. This is particularly true in places where access to traditional media is limited, giving the platforms a virtual monopoly in shaping the public discourse. A platform which fails to take action against hate speech might find itself instrumental in triggering a local pogrom, or even genocide. A platform which acts too aggressively to remove suspected “terrorist propaganda” may find itself destroying evidence of war crimes.

Platforms’ power over the public discourse is partly the result of a conscious decision by global governments to outsource online moderation functions to these private sector actors. Around the world, governments are making increasingly aggressive demands for platforms to police content which they find objectionable. The targeted material can range from risqué photos of the King of Thailand, to material deemed to insult Turkey’s founding president. In some instances, these requests are grounded in local legal standards, placing platforms in the difficult position of having to decide how to enforce a law from Pakistan, for example, which would be manifestly unconstitutional in the United States.

In most instances, however, moderation decisions are not based on any legal standard at all, but on the platforms’ own privately drafted community guidelines, which are notoriously vague and difficult to understand. All of this leads to a critical lack of accountability in the mechanisms which govern freedom of expression online. And while the perceived opacity, inconsistency and hypocrisy of online content moderation structures may seem frustrating to Americans, for users in the developing world it is vastly worse.

Nearly all of the biggest platforms are based in the United States. This means not only that their decision-makers are more accessible and receptive to their American user-base than they are to frustrated netizens in Myanmar or Uganda, but also that their global policies are still heavily influenced by American cultural norms, particularly the First Amendment.

Even though the biggest platforms have made efforts to globalize their operations, there is still a massive imbalance in the ability of journalists, human rights activists, and other vulnerable communities to get through to the U.S.-based staff who decide what they can and cannot say. When platforms do branch out globally, they tend to recruit staff who are connected to existing power structures, rather than those who depend on the platforms as a lifeline away from repressive restrictions on speech.

For example, the pressure to crack down on “terrorist content” inevitably leads to collateral damage against journalism or legitimate political speech, particularly in the Arab world. In striking this balance, governments and ex-government officials are vastly more likely to have a seat at the table than journalists or human rights activists. Likewise, the Israeli government has an easier time communicating its wants and needs to Facebook than, say, Palestinian journalists and NGOs.

None of this is meant to minimize the scope and scale of the challenge that the platforms face. It is not easy to develop and enforce content policies which account for the wildly different needs of their global user base. Platforms generally aim to provide everyone with an approximately identical experience, including similar expectations with regard to the boundaries of permitted speech. There is a clear tension between this goal and the conflicting legal, cultural and moral standards in force across the many countries where they operate.

But the importance and weight of these decisions demand that platforms get this balance right, and develop and enforce policies which adequately reflect their role at the heart of political debates from Russia to South Africa. Even as the platforms have grown and spread around the world, the center of gravity of these debates remains in D.C. and San Francisco.

This is the first in a series of articles developed by the Wikimedia/Yale Law School Initiative on Intermediaries and Information, appearing here at Techdirt Policy Greenhouse and elsewhere around the internet, intended to bridge the divide between the ongoing policy debates around content moderation and the people who are most impacted by them, particularly across the global south. The authors are academics, civil society activists and journalists whose work lies on the sharp edge of content decisions. In asking for their contributions, we offered them a relatively free hand to prioritize the issues they saw as the most serious and important with regard to content moderation, and asked them to point to areas where improvement was needed, particularly with regard to the moderation process, community engagement, and transparency.

The issues that they flag include a common frustration with the distant and opaque nature of platforms’ decision-making processes, a desire for platforms to work towards a better understanding of the local socio-cultural dynamics underlying online discourse, and a feeling that platforms’ approach to moderation often does not reflect the importance of their role in facilitating the exercise of core human rights. Although the different voices each offer a unique perspective, they paint a common picture of how platforms’ decision-making impacts their lives, and of the need to do better, in line with the power that platforms have in defining the contours of global speech.

Ultimately, our hope with this project is to shed light on the impacts of platforms’ decisions around the world, and provide guidance on how social media platforms might do a better job of developing and applying moderation structures which reflect the needs and values of their diverse global users.

Michael Karanicolas is a Resident Fellow at Yale Law School, where he leads the Wikimedia Initiative on Intermediaries and Information as part of the Information Society Project. You can find him on Twitter at @M_Karanicolas.



Filed Under: content moderation, global, greenhouse, impact, local


Reader Comments



  1. Anonymous Coward, 5 Aug 2020 @ 2:50pm

    If the owner of a platform can determine which speech will be permitted and what gets the big delete button, how are these platforms going to be politically unbiased since that seems to be the tipping point of the balance?

  2. Stephen T. Stone (profile), 5 Aug 2020 @ 3:20pm

    That’s the trick: They can’t be “unbiased” unless they host all legally protected speech, regardless of whether they want to refuse hosting certain kinds of content.

  3. Anonymous Anonymous Coward (profile), 5 Aug 2020 @ 6:01pm

    Re:

    Where does it say that platforms must be politically unbiased? Please don't quote anything Trump, or anyone in his political sphere.

    Now whether a platform should be honest about its political bent is up to the platform, and the users who accept that bent, or not (when they should stop being users), or whether the platform actually elucidates its political bent.

    No matter which, bent expressed or not, it is none of the government's business, unless they decide to use the platform for their own political reasons. Then they can only say that they have different views, rather than doing anything about taking down any view expressed on the platform, which tend to come from users.

    Expression of a political view via moderation choices is the opinion of the observer, rather than an expression of the platform. One needs to look at the reasons for moderation choices, rather than the fact of moderation. Violations of terms of service are not political decisions, no matter how much someone else maintains they are.

  4. Anonymous Coward, 5 Aug 2020 @ 6:48pm

    Re: Re:

    If speech is to be censored then, but the goal is an attempt to be balanced, how can a platform not be political? Because everything political seems to be what causes the imbalance to speech, and platforms censoring speech do so to create the bias. How can a platform remain unpolitical without judgement?

  5. Stephen T. Stone (profile), 5 Aug 2020 @ 7:23pm

    Platforms don’t censor speech unless they actively try to stop certain speech from being posted anywhere.

  6. Anonymous Coward, 5 Aug 2020 @ 7:28pm

    Re:

    I think the world should take note of what Techdirt does. This model should be the rule, not the exception, to protecting free speech without exploiting political agendas. It's like a very tasty salad with a great meal. It just feels good!

  7. Stephen T. Stone (profile), 5 Aug 2020 @ 7:43pm

    Your insincerity is palpable.

  8. Anonymous Anonymous Coward (profile), 5 Aug 2020 @ 7:51pm

    Re: Re: Re:

    Where does it say that platforms must be balanced? Cite actual laws including the relevant Title, Chapter, Section and Sub Section. Quoting Trump or any of his minions, or for that matter any other politician who feels slighted, does not count. They are the government, and by definition are not allowed to 'censor'. Or argue censorship, though they often cannot help themselves as they don't actually seem to know the current law.

    But platforms are allowed to (and whether I agree that TOS's are legally binding contracts that have not been negotiated is a different argument) apply their TOS's. If any contributor to a platform violates the TOS, then they may be sanctioned; whether that is a temporary, single post, or permanent ban is up to the platform. Whether or not the poster agrees with the decision is (unfortunately, simply because the systems to object are currently unworkable) irrelevant. It is the platform's decision and the poster does not actually have anything to say about it (again, unfortunately, because the objection systems don't work as they should).

    And to go a bit further, read you some more Techdirt. Those decisions don't appear to be political, at least to anyone who uses logic and reason rather than ideology as a cornerstone. Look at all the trash political speech that is allowed (see most of Trump's tweets). Those decisions are based upon the platform's reading of the TOS, and since they wrote the TOS (without any user input, which may or may not be a problem) they get to interpret them. Users don't, but courts might. In that case the TOS would simply be changed, again, and without user input (possibly, in the long run, to the platform's ruin).

  9. Anonymous Coward, 5 Aug 2020 @ 8:00pm

    Re: Re: Re: Re:

    Where does it say it is legal to throw the first amendment before the bus or sweep it under a rug? There is no law that says a platform must be balanced, but everything that is political is already corrupted. It deserves a great bit of admiration for those sites hosting platforms of discussion to remain open to debate.

  10. Anonymous Coward, 5 Aug 2020 @ 8:01pm

    Re:

    Why do you say I am insincere?

  11. Stephen T. Stone (profile), 5 Aug 2020 @ 8:24pm

    Where does it say it is legal to throw the first amendment before the bus or sweep it under a rug?

    The First Amendment applies to government attempts at regulating speech because the Founding Fathers didn't want the government telling people what they could and couldn’t say. It doesn’t apply to privately-owned institutions — open to the public or otherwise — for the same reason.

  12. Stephen T. Stone (profile), 5 Aug 2020 @ 8:25pm

    I think the world should take note of what Techdirt does.

    That alone is enough.

  13. MathFox, 6 Aug 2020 @ 1:55am

    Newspapers

    In the good old days, when news was printed on mashed dead trees, editors decided what to print and what to discard. Those decisions on what to print were influenced by the political slant of the newspaper.

    This was the generally accepted situation. Why should an Internet publisher be politically unbiased? I don't think that "Internet publishing is tree-friendly" counts as an argument here.

  14. Anonymous Coward, 6 Aug 2020 @ 7:54am

    Re:

    Well you are too easy to suspect evil then. I am absolutely being serious.. ok maybe not the whole world, but our back yard of it.

  15. Anonymous Coward, 6 Aug 2020 @ 8:21am

    Re:

    But all that said, it is still a very chilling feeling to have anyone tell me what I cannot say. That is what I mean.

  16. Anonymous Anonymous Coward (profile), 6 Aug 2020 @ 9:06am

    Re: Re:

    You can say anything you want. You don't have an absolute right to use other people's property to do so. Platforms are owned by other people (including this one, though they choose to use a different moderation method than Facebook or Twitter or YouTube).

    Start your own blog, say what you want and no one can tell you you can't, though if what you have to say isn't very interesting, don't expect much of an audience. Your right to say what you want does not include forcing people to listen, and could backfire as other people might ridicule your statements. Then again, you could become a new hero, to some.

  17. Stephen T. Stone (profile), 6 Aug 2020 @ 9:14am

    it is still a very chilling feeling to have anyone tell me what I cannot say.

    Then go somewhere else. You have the right to speak your mind. But you don’t have the right to do it on someone else’s private property if they say “we don’t do that here”.

  18. Samuel Abram (profile), 6 Aug 2020 @ 12:43pm

    Re: name of the blog

    Apropos of the name of the blog to which you linked, I'm just glad it's a Far Side reference (considering that Gary Larson has finally caught up with everyone else and started making new cartoons!)

  19. Anonymous Coward, 6 Aug 2020 @ 4:18pm

    Re:

    You misconstrue. I am not talking about Techdirt specifically, but the censoring of speech anywhere or everywhere.

    Stone, I know you hang out here like a moray eel and bite anyone for anything. You remind me of a dog being kept on the corner of a backyard with just enough chain to bite anyone who ventures into Mike's world. It's unfortunately a nasty world out there. It just sucks to see how much you enjoy being a barking fish while pretending to be human. See you in the funnies.

  20. Stephen T. Stone (profile), 6 Aug 2020 @ 5:24pm

    I am not talking about Techdirt specifically, but the censoring of speech anywhere or everywhere.

    Moderation is not censorship. Twitter telling you to fuck off doesn’t stop you from going to Facebook, and vice versa.

  21. Anonymous Coward, 6 Aug 2020 @ 5:45pm

    Re: Moderation is not censorship

    Tell that to the politicians who are driving so hard to force moderation on independent platforms as a way to control speech they don't like.

  22. Anonymous Coward, 6 Aug 2020 @ 5:53pm

    Re: Newspapers

    Well, how can anyone who is politically biased be trusted to write the news?

  23. Anonymous Coward, 6 Aug 2020 @ 5:55pm

    Re:

    Twitter being Twitter and Facebook being Facebook stops me from going to either!

  24. Stephen T. Stone (profile), 6 Aug 2020 @ 6:04pm

    All journalism has biases. Someone must decide what to publish, what to distill out of the mass of available data, and what facts to check. The best journalists try to keep their biases in check; the worst journalists make their biases clear and peddle in both mis- and disinformation.

    A good journalist tells truth to power, consequences be damned. A bad journalist seeks power instead of truth, facts be damned.

  25. Anonymous Coward, 6 Aug 2020 @ 6:37pm

    Re:

    I give you some points for that. Everything is so messed up today in the news industry. Who has the best unbiased unslanted scoop today? What news corp actually keeps politics at bay since the centralization of the news in the 90s? It is extremely agitating not being able to know what is truly happening when it all seems so devided.

  26. Anonymous Coward, 6 Aug 2020 @ 6:41pm

    Re: Re:

    Sorry to all you english majors.. divided!

  27. Stephen T. Stone (profile), 6 Aug 2020 @ 7:32pm

    Who has the best unbiased unslanted scoop today?

    Nobody. Nobody is “unbiased” or “unslanted”. Did you really not read a word I wrote in that post?

  28. Anonymous Coward, 6 Aug 2020 @ 9:37pm

    Too bad there's no way to give the control directly to the user. You know, give us all little applications that filter out what WE, personally do not like - on our own machines - regardless of what is on and not touching anything that is on the website at any time. Like, you know. We could like say No Nazis, No Titties, No, etc., etc. in a personal Censor Assistant File that appends with a click to grab and record the "type", or "flavour" of stuff you want to prevent ever being displayed on your machine again, ever. A text based config file, so you can quickly add URLs you don't like manually. Or we could add a YN to tell the app to ask you first.

    Joe the Nazi gets his daily racist reinforcement propaganda, as desired, and Joanie the housewife gets nothing but romance/sexless - from the same, un-moderated website.

    In this way, no moderation is necessary... or did I miss something?

    :)

  29. Stephen T. Stone (profile), 7 Aug 2020 @ 4:20am

    No Nazis

    Good luck reading about World War II or the Wolfenstein video game series, then. Computer filters lack the ability to discern context; a mention of a Nazi on a Wikipedia article is the same as a mention of a Nazi on a White supremacist forum. Hell, this comment itself would probably cause this page to be filtered under a “no Nazis” rule.

    You can’t set up filters like the ones you’re suggesting without doing one of two things: filtering a bunch of content you didn’t mean to filter, or spending more time than you’d like setting up those filters in ways that avoid “collateral damage”.

  30. Anonymous Coward, 7 Aug 2020 @ 10:58am

    Re:

    Maybe I read it but I want proof not third party lecture.

  31. Anonymous Coward, 9 Aug 2020 @ 9:06am

    Re:

    You know, give us all little applications that filter out what WE, personally do not like - on our own machines

    That would work with word filters only, but filtering images requires more computing power and data storage than most people can afford. Besides which, that involves people storing that which they find offensive, so that they can hide it from themselves.
