Why The History Of Content Moderation Matters

from the it's-not-a-grand-plan dept

On February 2nd, Santa Clara University is hosting a gathering of tech platform companies to discuss how they actually handle content moderation questions. Many of the participants have written short essays about the questions that will be discussed at this event -- and over the next few weeks we'll be publishing many of those essays, including this one.

The first few years of the 21st century saw the start of a number of companies whose model of making user-generated content easy to amplify and distribute continues to resonate today. Facebook was founded in 2004, YouTube began in 2005 and Twitter became an overnight sensation in 2006. In these companies' short histories, countless books (and movies and plays) have been devoted to their rapid rise; their impact on global commerce, politics and culture; and their financial structure and corporate governance. But as Eric Goldman points out in his essay for this conference, surprisingly little has been revealed about how these sites manage and moderate the user-generated content that is the foundation for their success.

Transparency around the mechanics of content moderation is one part of understanding what exactly is happening when sites decide to keep up or take down certain types of content in keeping with their community standards or terms of service. How does material get flagged? What happens to it once it's reported? How is content reviewed, and who reviews it? What does takedown look like? Who supervises the moderators?
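
To make those questions concrete, here is a minimal sketch of the lifecycle they trace. Everything in it is hypothetical -- invented state names for illustration, not any platform's published pipeline:

    from enum import Enum, auto

    class ModerationState(Enum):
        # The lifecycle the questions above trace, stage by stage.
        FLAGGED = auto()     # a user or an automated filter reports the content
        QUEUED = auto()      # the report waits for review
        IN_REVIEW = auto()   # a moderator applies the written rules
        REMOVED = auto()     # takedown: the content comes down
        RESTORED = auto()    # an appeal or a supervisor reverses the call
        KEPT_UP = auto()     # the reviewer finds no rule violation

    # Allowed transitions. The REMOVED -> RESTORED edge is where the
    # "who supervises the moderators?" question enters the picture.
    TRANSITIONS = {
        ModerationState.FLAGGED: {ModerationState.QUEUED},
        ModerationState.QUEUED: {ModerationState.IN_REVIEW},
        ModerationState.IN_REVIEW: {ModerationState.REMOVED, ModerationState.KEPT_UP},
        ModerationState.REMOVED: {ModerationState.RESTORED},
    }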

But more important than understanding the intricacies of the system is understanding the history of how it was developed. That history gives us not only important context for the mechanics of content moderation, but also a more comprehensive idea of how policy was created in the first place, and thus how best to change it in the future.

At each company, certain leaders were charged with developing the site's content moderation policies. At YouTube (Google) this was Nicole Wong. At Facebook, it was Jud Hoffman and Dave and Charlotte Willner. Though it seems basic now, the development of content moderation policies was not a foregone conclusion. Early on, many new Internet corporations thought of themselves as software companies—they did not think about "the lingering effects of speech as part of what they were doing."

As Jeff Rosen wrote in one of the first accounts of content moderation's history, "the Web might seem like a free-speech panacea: it has given anyone with Internet access the potential to reach a global audience. But though technology enthusiasts often celebrate the raucous explosion of Web speech, there is less focus on how the Internet is actually regulated, and by whom. As more and more speech migrates online, to blogs and social-networking sites and the like, the ultimate power to decide who has an opportunity to be heard, and what we may say, lies increasingly with Internet service providers, search engines and other Internet companies like Google, Yahoo, AOL, Facebook and even eBay."

Wong, Hoffman and the Willners all provide histories of the hard speech questions each corporation dealt with. For instance, many problems existed simply because flagged content lacked the context necessary to apply a given rule. This was often the case with online bullying. As Hoffman described, "There is a traditional definition of bullying—a difference in social power between two people, a history of contact—there are elements. But when you get a report of bullying, you just don't know. You have no access to those things. So you have to decide whether you're going to assume the existence of some of those things or assume away the existence of some of those things. Ultimately what we generally decided on was, 'if you tell us that this is about you and you don't like it, and you're a private individual not a public figure, we'll take it down.' Because we can't know whether all these other things happened, and we still have to make those calls. But I'm positive that people were using that function to game the system. . . I just don't know if we made the right call or the wrong call or at what time."
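
Hoffman's rule reads like a small decision procedure, and writing it out makes clear how much it assumes away. A minimal sketch, with all field and function names hypothetical rather than anything Facebook actually ran:

    from dataclasses import dataclass

    @dataclass
    class BullyingReport:
        # Hypothetical fields: the only signals a reviewer could actually
        # observe, per Hoffman -- no power dynamics, no history of contact.
        reporter_is_subject: bool        # "this is about you"
        reporter_objects: bool           # "and you don't like it"
        subject_is_public_figure: bool   # private individuals get the takedown

    def should_take_down(report: BullyingReport) -> bool:
        # Hoffman's heuristic: take it down if the person the content is
        # about reports it themselves, objects to it, and is a private
        # individual. The defining elements of bullying never appear here,
        # because the reviewer cannot observe them.
        return (report.reporter_is_subject
                and report.reporter_objects
                and not report.subject_is_public_figure)

The sketch also shows why the rule was gameable, as Hoffman suspected: it keys entirely on the reporter's own claims, so anyone willing to say the content is about them and that they don't like it gets the takedown.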

Wong came up against similar problems at Google. In June 2009, a video of a dying Iranian Green Movement protestor, shot in the chest and bleeding from the eyes, was removed from YouTube as overly graphic and then reposted because of its political significance. YouTube's policies and internal guidelines on violence were altered to allow for the exception. Similarly, in 2007, a YouTube video of a man being brutally beaten by four men in a cell was removed for violence, but restored by Wong and her team after journalists contacted Google to explain that the video had been posted by Egyptian human rights activist Wael Abbas to inform the international community of human rights violations by the police in Egypt.

What the stories of Wong and Hoffman reveal is that much of the policy, and the enforcement of that policy, developed in an ad hoc way at each company. Taking down breastfeeding photos was a fine rule, until it wasn't. Removing a historic photo of a young girl running naked in Vietnam following a napalm attack was acceptable for years, until it was a mistake. A rule worked until it didn't.

Much of the frustration expressed toward Facebook, Twitter, and YouTube builds on a fundamentally flawed premise: that online speech platforms had one seminal moment in their history when they established a fundamental set of values to guide their platform. Instead, most of these content moderation policies were the product of a series of long, hard, piecemeal deliberations. There was no "Constitutional Convention" moment at these companies; decisions were made reactively, in response to signals that reached the companies through media pressure, civil society groups, government, or individual users. Without a signal, these platforms couldn't develop, change or "fix" their policy.

Of course, it's necessary to point out that even when these platforms have been made aware of a problematic content moderation policy, they don't always modify it, even when they say they will. That's a huge problem -- especially as these sites become an increasingly essential part of our modern public square. But learning the history of these policies, alongside the systems that enforce them, is a crucial part of advocating effectively for change. At least for now, and for the foreseeable future, online speech is in the hands of private corporations. Understanding how to communicate the right signals amid the noise will continue to be incredibly useful.

Kate Klonick is a PhD in Law candidate and a Resident Fellow at the Information Society Project at Yale.


Filed Under: content moderation, filtering, history
Companies: facebook, google, twitter, youtube


Reader Comments

  1. Anonymous Coward, 30 Jan 2018 @ 12:17pm (flagged by the community)

    WHY are corporations developing this not just from scratch, but in vacuo?

    It's NOT different because "on teh internets".

    Common law, made by "natural" persons, not corporate persons. Learn it. With that basis, we kicked a King out of the US of A.

    This is not rocket science, it's PLAIN EVERYDAY DECENCY THAT ALMOST EVERY BAR DOES A GOOD JOB OF ENFORCING. -- But you WEENIES know nothing of it! That's the problem! You're always holding "meetings" with other IGNORANT WEENIES and learn nothing!

    Nearly all websites EXCEPT Techdirt have WRITTEN RULES on words and attitudes allowed. But Techdirt tries to do it the sneaky way, first with the "hiding" which is falsely claimed to be just "community standard" and not to involve a moderator who makes the decision to hide comments. Then there's the fanboy / dissenter distinction: NOT ONE fanboy has ever had a comment hidden here, ONLY those who dissent, and for NO articulable reason. Then there's the un-admitted blocking of home IP address, which was done to me.

  2. Stephen T. Stone (profile), 30 Jan 2018 @ 12:35pm

    Re: WHY are corporations developing this not just from scratch, but in vacuo?

    Common law, made by "natural" persons, not corporate persons. Learn it.

    Mr. SovCit, why do you think language like this is some magic argument that instantly negates any kind of rebuttal?

  3. Christenson, 30 Jan 2018 @ 12:58pm

    Context...

    Dear Me... let me point out how one of the hard, reversed moderation cases above is, shall I say, highly ambiguous:

    Let's think about that video of the dying Iranian Green protestor from 2009. How I embed that within my own website (which may well be out of the reach of the main platform) really sets *arbitrary* context to it. Did I mock the silly, weak greens? or did I mention the faceless "other" that did the shooting? or did I ("wait! the children!") set it up to traumatize small children? And what happens when OOTB takes over the comments, and the attention Techdirt pays to its comments isn't available?

    We are all familiar with twisted quotes forcibly transplanted from one context and twisted to mean something else entirely. That's a fundamental problem with moderation.

  4. Richard (profile), 30 Jan 2018 @ 1:25pm

    Re: WHY are corporations developing this not just from scratch, but in vacuo?

    NOT ONE fanboy has ever had a comment hidden here, ONLY those who dissent, and for NO articulable reason.

    If you dissent on EVERY issue - then it does begin to look like your reason for being here is just to be contrary.

    Dissenting comments coming from users who don't make a point of ALWAYS dissenting don't get hidden.

  5. TKnarr (profile), 30 Jan 2018 @ 1:56pm

    We might want to go back even further, to look at moderation in bulletin-board networks and services like CompuServe and Genie all the way back into the 80s. The insights garnered from those are still applicable today, although unfortunately most of them lead to depressing conclusions about the feasibility of successful moderation at current scales.

  6. Anonymous Coward, 30 Jan 2018 @ 3:06pm

    Re: WHY are corporations developing this not just from scratch, but in vacuo?

    Ah OOTB, I thought I smelled failure.

  7. Anonymous Coward, 30 Jan 2018 @ 3:09pm

    I spent several years moderating topical forums which were designed to bring together people with extremely divergent and contradictory goals and ideals.

    It's not easy. The temptation to fling feces is strong, on all sides, even on people who are trying to communicate. And there are people who are not trying to communicate, who are bitterly opposed to any kind of communication happening--the equivalent of talk-radio shock jocks or carnie chicken-head-biting geeks--who just want to make people angry, to give an excuse for their own inner rage.

    Fortunately, most of the chicken-biting people are stupider than yeast; they get a 20-word vocabulary list from some other insanely angry idiot and repeat it ad nauseam. But, as a group gets larger, it attracts a more suave class of drooling hate-mongers. With a bit of study, they find hot-buttons that seem innocent out of context, but can reduce a discussion to feces-flinging.

    (You see the same thing in elementary school. In first grade, the bullies are obvious. By middle school, some of them have gotten really good at seeming innocent while using verbal triggers to incite other people to bullying actions.)

    In the forums I moderated, some hot-button "religious" topics, like politics, culture, and, well, religion, were out of place, which considerably simplified things.

    I think Mr. SovCit's real problem is that he has never actually worked with a group of people on any useful goal--no experience with volunteer civic associations, churches, hospitals, schoolteaching, or even for-profit corporations. And so he doesn't understand that "natural" people do not lose all their human rights just because they're working together. In Hitler's Germany, or Stalin's Russia, people were allowed to go to church--but only the church approved by the dictatorial "leader". People were allowed to work in factories, or on farms, or join labor unions--but only the factory or farm or union that the "leader" assigned them to. Hospitals, schools, etc.--the same: all were agencies of the state.

    In a free country, it is not so. The state may create some agencies--such as schools--but is forbidden to create others--such as churches. The people have the fundamental right to form their own agencies, and exercise their fundamental human rights through those agencies.

    As a matter of historical fact: in the United States, the same man who freed the enslaved "natural" persons to pursue their human rights also argued the Supreme Court case that allowed "natural" persons to exercise their rights in an incorporated organization.

  8. Rich Kulawiec, 30 Jan 2018 @ 3:28pm

    Re:

    There are lessons to be learned there, but -- once again -- I'd point everyone at Usenet, because of its scale and because of the (relatively) anarchic nature of its topology, propagation, and moderation.

    We faced a lot of the same problems that are present today. Some of them we solved rather well. Some of them we screwed up pretty badly. But the chronicles of this, and our debates about them, are all archived for anyone who wants to read them.

    And about this:

    "about the feasibility of successful moderation at current scales"

    One of the maxims that I pound into the heads of fledgling architects and sysadmins and netops is "never build something you don't know how to run". That sometimes refers to function, but it also refers to scale. Nobody forced Google or Twitter or anyone else to get that large that fast: they CHOSE to. And then, in an act of stunning irresponsibility, they also chose not to learn how to run it. Instead, they wring their hands and lament how terribly, terribly hard it is - thus excusing themselves from culpability for their own actions.

    I'm not having any. My message to them is if you can't run it properly -- and all evidence indicates that you can't -- then shut it down and leave it down until you can.

  9. Anonymous Coward, 30 Jan 2018 @ 4:40pm

    If Google has all this fancy AI,

    Why is it not in Chrome?

    Businesses prefer to externalize expenses whenever possible. Moderation is an expense. It makes no money. It could be externalized by pushing it to the edges. But they don't do that.

    Why?

    And the reason is, of course, that if you demonstrate that you can do that, the fed will make it mandatory. So the only reasonable solution is to do it as an open source plugin, so that end users can train their browsers themselves.

    But they all killed plugins too didn't they?

    This could have been 90% solved decades ago with the tuning of a few RFCs. But that would have created some anchor points the fed might understand, and they would probably have leveraged that to napalm the Constitution. So nobody did it, probably out of consideration for civil rights.

    But I think we are pretty much past the point where we can assume that ignorance is keeping the Internet free. At this point the only thing that is going to restore freedom on the Internet is full end to end crypto at layer 4, a peer to peer DNS architecture, and a transmission architecture that prunes analytics.

    And once we've got that, we can go back to speaking freely and enjoying digital interpersonal communications. Instead of what we have now, which is to suffer the same bullshit appeal to humanity's basest behaviors that constitutes the bulk of big corp media content.

    Funny how in the 90's we didn't need Congress or the FCC to save us from ourselves. Back then moderation meant that you would at least _try_ and stay sober enough to drive yourself home.

  10. Anonymous Coward, 30 Jan 2018 @ 5:58pm

    Re: WHY are corporations developing this not just from scratch, but in vacuo?

    > It's NOT different because "on teh internets"

    Half the patents you so vigorously defend are precisely because of claims that it's different because "on teh internets", you blue jackass.

  11. TKnarr (profile), 30 Jan 2018 @ 6:22pm

    Re: Re:

    I'd argue that they did know how to run their services. They chose to build them without moderation, and they knew exactly how to run them that way. You're asking them to run them a different way, so it shouldn't be a surprise that they don't know how to run them in a way they didn't build them for. Whether they should've picked a different way of running things is another matter, but I'm not sure they're under any obligation to make their services useful to us (as opposed to useful to them). The traditional BBS sysop/netop answer to the issue is "If you don't like our rules and how we operate, feel free to go somewhere else. Push it and we'll help you along.".

    I'd also argue that you can't do moderation successfully on Internet scales. Every successful moderation system I know of depended on aspects of the system that services like Twitter and Google don't have any control over.

  12. PaulT (profile), 31 Jan 2018 @ 1:00am

    Re: If Google has all this fancy AI,

    "If Google has all this fancy AI,

    Why is it not in Chrome? "

    Because Chrome is a piece of general purpose client software, and the AI that's generally discussed is specialised and server-based? Because it's a hell of a lot more complicated than your mind seems to allow for?

    "But they all killed plugins too didn't they? "

    For very good reason. I'd look into the reasons for that, especially with regard to how your perfect little scheme could be abused by the same bad actors who caused other things to be removed.

    "Funny how in the 90's we didn't need Congress or the FCC to save us from ourselves. Back then moderation meant that you would at least _try_ and stay sober enough to drive yourself home."

    You might wish to look up the reality of user numbers and use cases and see what's changed in the meantime. You might be shocked to see that some fundamental things you have based your assumptions on have changed somewhat.

  13. Richard (profile), 31 Jan 2018 @ 2:08am

    Re: WHY are corporations developing this not just from scratch, but in vacuo?

    Even Donald Trump doesn't disagree with techdirt as consistently as you do!

  14. Richard (profile), 31 Jan 2018 @ 2:18am

    Re: Re: Re:

    _I'd argue that they did know how to run their services. They chose to build them without moderation,_

    The reason for that choice - and the root of the problem is that these services deliberately blur the boundaries between private conversations and public ones. That is a key feature of the business models that have been so successful.

    No one outside a totalitarian state would want to censor the contents of 1 to 1 telephone calls (except where one party is in prison).

    Most people would expect broadcast radio to have some editorial control.

    Use the same technology for both purposes (and introduce a grey area between the two) and you have a problem.

    Everyone would agree that

  15. Richard (profile), 31 Jan 2018 @ 2:22am

    "If Google has all this fancy AI,

    It doesn't.

    Google AI - like all AI - is 90% propaganda and 10% carefully tuned systems that do (very) specific and well defined tasks.

  16. Richard (profile), 31 Jan 2018 @ 2:25am

    Re:

    carnie chicken-head-biting geeks

    YEA!! Someone else who knows the true definition of the word "geek".

  17. Richard (profile), 31 Jan 2018 @ 2:34am

    Re:

    _In Hitler's Germany, or Stalin's Russia, people were allowed to go to church--but only the church approved by the dictatorial "leader"._

    Actually, Stalin didn't even allow people to go to the church that was (semi) state-approved without consequences. Only old women could get away with it, because they fitted the state's narrative of a dying institution. The only "church" that was approved was the "church" of dialectical materialism.

  18. Anonymous Coward, 31 Jan 2018 @ 6:33am

    Re: Re: If Google has all this fancy AI,

    "and the AI that's generally discussed is specialised and server-based"

    Note that this problem is an expense to anyone interfacing with it. My experience has been that most of the solutions are ad hoc.

    I acknowledged in my previous post that doing this client side has serious challenges in terms of the inevitable encroachment of regulatory malfeasance. I think your suggestion that this can't be done client side for technical reasons is worth reconsideration on your end. If you think about the technical challenges involved, there are plenty of examples where those problems have been solved client side.

    If it is cheaper to do it server side, the only reason I can think of for that, is litigation costs. Problems of scale are almost always solved by pushing CPU load to the edges of the system. It is cheaper to process there because the hardware isn't yours.

    "You might wish to look up the reality of user numbers and use cases and see what's changed in the meantime. You might be shocked to see that some fundamental things you have based your assumptions on have changed somewhat."

    That is an offline discussion. I would love nothing better than to be enlightened on that subject, but this isn't the place to talk about that.

    Suffice it to say that if you can refer me to some RFCs or related white papers, I would be grateful.

  19. Wendy Cockcroft, 1 Feb 2018 @ 2:28am

    Re: Re: WHY are corporations developing this not just from scratch, but in vacuo?

    Confirmed correct; I once had a barney with Mike over UBI; comments not hidden and we get along fine.

    I often argue with the other commenters but in a reasonable, articulate way that addresses their arguments. I don't resort to name-calling and am willing to admit when I am wrong. Try doing that and can the histrionics, wild accusations, and strawman arguments you're famous for, Blue.

