The Scale Of Content Moderation Is Unfathomable

from the it's-way-more-than-you-think dept

Sometimes it's difficult to get across to people the "scale" part when we talk about the impossibility of content moderation at scale. It's massive. And this is why, whenever there's a content moderation decision you dislike or disagree with, you have to realize that it's not personal. It wasn't done because someone doesn't like your politics. It wasn't done because of some crazy agenda. It was done because a combination of thousands of people around the globe and still-sketchy artificial intelligence is making an insane number of decisions every day. And they just keep piling up and piling up and piling up.

Evelyn Douek recently gave a (virtual) talk at Stanford on The Administrative State of Content Moderation, which is worth watching in its entirety. However, right at the beginning of her talk, she presented some stats that highlight the scale of the decision making here. Based on the companies' published transparency reports, in just the 30 minutes allotted for her talk, Facebook would take down 615,417 pieces of content, YouTube would take down 271,440 videos, channels, and comments, and TikTok would take down 18,870 videos. And the Oversight Board would receive 48 petitions to review a Facebook takedown decision.
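
To put concrete numbers on that, here's a quick back-of-the-envelope extrapolation (a sketch in Python, using only the per-30-minute figures above; the hourly and daily totals assume the rate holds around the clock, which is an approximation):

    # Scale Douek's per-30-minute takedown figures to hourly and daily rates.
    # Assumes a constant rate around the clock (an approximation).
    takedowns_per_30_min = {
        "Facebook": 615_417,
        "YouTube": 271_440,
        "TikTok": 18_870,
    }

    for platform, per_half_hour in takedowns_per_30_min.items():
        per_hour = per_half_hour * 2
        per_day = per_hour * 24
        print(f"{platform}: ~{per_hour:,} removals/hour, ~{per_day:,}/day")

    # Facebook: ~1,230,834 removals/hour, ~29,540,016/day
    # YouTube: ~542,880 removals/hour, ~13,029,120/day
    # TikTok: ~37,740 removals/hour, ~905,760/day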

And, as she notes, that's only the take down decisions. It does not count the "leave up" decisions, which are also made quite frequently. Facebook is not targeting you personally. It is not Mark Zuckerberg sitting there saying "take this down." The company is taking down over a million pieces of content every freaking hour. It's going to make mistakes. And some of the decisions are ones that you're going to disagree with.

And, to put that in perspective, she notes that in its entire history, the US Supreme Court has decided a grand total of approximately 246 First Amendment cases, or somewhere around one per year. And, of course, those cases often involve years of debate, argument, briefing, and multiple levels of appeal. And sometimes the Supreme Court still gets it totally wrong. Yet we expect Facebook -- making over a million decisions to take content down every hour -- to somehow magically get it all right?
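
The same kind of rough arithmetic (again a sketch, reusing the figures above) shows just how lopsided that comparison is:

    # Compare Facebook's hourly takedown volume with the Supreme Court's
    # entire First Amendment docket (~246 cases, per Douek).
    facebook_takedowns_per_hour = 615_417 * 2  # from the 30-minute figure
    scotus_first_amendment_cases = 246

    ratio = facebook_takedowns_per_hour / scotus_first_amendment_cases
    per_second = facebook_takedowns_per_hour / 3600
    seconds_to_match = scotus_first_amendment_cases / per_second

    print(f"~{ratio:,.0f}x the Court's lifetime 1A docket, every hour")
    print(f"Facebook matches it in about {seconds_to_match:.1f} seconds")
    # ~5,003x the Court's lifetime 1A docket, every hour
    # Facebook matches it in about 0.7 seconds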

Anyway, there's a lot more good stuff in the talk and I suggest you watch the whole thing to get a better understanding of the way content moderation actually works. It would be helpful for anyone who wants to opine on content moderation to not just understand what Douek is saying, but to really internalize it.


Filed Under: content moderation, due process, evelyn douek, scale, takedowns
Companies: facebook, tiktok, youtube


Reader Comments



  • Anonymous Coward, 2 Nov 2021 @ 12:54pm

    "Content moderation at scale is TOO EXPENSIVE." There.


    • nasch (profile), 2 Nov 2021 @ 4:22pm

      Re:

      "Content moderation at scale is TOO EXPENSIVE." There.

      How do you suppose spending more money would solve the problem?


      • Anonymous Coward, 2 Nov 2021 @ 9:09pm

        Re: Re:

        You build the cost of moderation into the cost of broadcasting over the internet, just like you build the cost of safety into an automobile. Even now, moderation can be done just fine, with the few aberrations dealt with as they arise.

        Not that we will do this anytime soon, but moderation at scale is certainly possible, just not at current prices.


        • Strawb (profile), 4 Nov 2021 @ 1:00am

          Re: Re: Re:

          Even now moderation can be done just fine, with the few aberrations dealt with as they arise.

          "Tell me you're not a regular reader of Techdirt without telling me you're not a regular reader of Techdirt."


        • Scary Devil Monastery (profile), 4 Nov 2021 @ 4:00am

          Re: Re: Re:

          "but moderation at scale is certainly possible"

          It really isn't. This has been hashed out a few hundred or thousand times by now. No, sensible moderation at scale isn't possible. Every attempt just results in collateral effects ranging from the hilarious to the horrifying, or both.

          Like art museums getting continually blocked over artworks, churches and law institutions blocked over imagery of cherubs or Lady Justice, or rock bands having their greatest hits yanked from online outlets because the cover image includes an insufficiently clad baby.

          "You build the cost of moderation into the cost of broadcasting over the internet, just like you build the cost of safety into an automobile."

          An extraordinarily efficient way of showing people you haven't a clue about what online moderation is and how it works, or about how automobile safety works.

          If automobile safety cost a hundred times more than the rest of the car and involved the buyer paying a persistent retainer to "security personnel" included in the car, I'm pretty sure we'd have the same discussion about that as well.

          Your analogy is bad, and you should feel bad about that.

          "...just not at current prices."

          The assumption that this will ever change relies on either the invention of functional, genuine AI or a radical drop in the cost of skilled labor.
          Either case would have to involve near-slavery standards of salary. It might be possible to achieve by using imprisoned convicts not bound by minimum-wage standards for labor...but the use of convicted felons as the default moderators of all social services does raise other issues.

          This is not an issue where doubling down on an ill-conceived notion will magically provide a different answer than you got on the initial one.


      • nerdrage (profile), 3 Nov 2021 @ 9:04am

        Re: Re:

        Content moderation at scale is too expensive to be worthwhile in an ad-based business model (or a more-lucrative subscription model, to the extent that model is even being used).

        Advertisers do sometimes care about their ads being placed next to crazy Nazi screeds; YouTube got in trouble over that. A subscription model would put the users in charge of what content they consider acceptable. LinkedIn, I guess, would be a (rare) example of that, and content seems to be a lot less unsavory on that site.


        • nasch (profile), 3 Nov 2021 @ 9:59am

          Re: Re: Re:

          What do you mean by a subscription model? If it's just revenue from subscriptions instead of ads, I don't see how that solves the moderation issue, unless it's by drastically reducing the number of users. Which kind of misses the point of the issue of moderation at scale.


    • That One Guy (profile), 2 Nov 2021 @ 7:10pm

      Talk about a time-saver

      And here I was just about to post a comment about how the response from many to the numbers would be some variant of 'Nerd Harder!'...


    • Mike Masnick (profile), 2 Nov 2021 @ 11:40pm

      Re:

      "Content moderation at scale is TOO EXPENSIVE." There.

      Nope. I mean, Facebook spends more on content moderation than Twitter makes in revenue each year. It's not like they're not spending on it. Saying that it's too expensive shows you are ignorant of the scale here, just as the point of this post was trying to show. Thanks for confirming your ignorance.


    • Scary Devil Monastery (profile), 3 Nov 2021 @ 2:25am

      Re:

      ""Content moderation at scale is TOO EXPENSIVE." There."

      Not really, no. TOO EXPENSIVE implies there's someone out there with the money and resources to accomplish what you want accomplished.

      The issue with content moderation at scale is the same as that of, say, relocating the Pacific Ocean. Yeah, in theory you could do it, given more manpower than exists on Earth. But the realistic assumption will always be "impossible".

      By the time Facebook, for instance, manages to moderate content sensibly they will have employed a significant proportion of the citizens in the US and of those living in every nation Facebook is represented in.

      But I realize the alt-right has problems with math, so let's just dumb it down to the easier and just as accurate explanation: if you insist on moderation which costs ten times more than your business brings in, then you are no longer in business at all. THAT is PART of why moderation at scale is impossible.


      • Anonymous Coward, 3 Nov 2021 @ 6:04am

        Re: Re:

        By the time Facebook, for instance, manages to moderate content sensibly they will have employed a significant proportion of the citizens in the US and of those living in every nation Facebook is represented in.

        That is a non sequitur, as a significant portion of the population will not be able to agree on what sensible moderation is.


        • nerdrage (profile), 3 Nov 2021 @ 9:06am

          Re: Re: Re:

          Facebook is ad-based, so the advertisers would decide what sensible moderation is. Sometimes they push back hard enough that the platforms do something about the crazy stuff. Advertisers don't like their nice ads being associated with toxic drivel. At least that happened to YouTube, but I guess FB advertisers are fine with the insanity.

          If the advertisers aren't motivated enough to care about solving the problem, the problem will not be solved. Congress can huff and puff but they aren't the ones footing the bill.


          • nasch (profile), 3 Nov 2021 @ 10:00am

            Re: Re: Re: Re:

            If the advertisers aren't motivated enough to care about solving the problem, the problem will not be solved.

            This implies that the only thing needed to solve the problem is motivation, which is not correct.


          • Anonymous Coward, 3 Nov 2021 @ 2:41pm

            Re: Re: Re: Re:

            Didn't Infowars already disprove the assertion that advertisers don't want to be associated with toxic drivel?


          • Scary Devil Monastery (profile), 4 Nov 2021 @ 6:19am

            Re: Re: Re: Re:

            "Facebook is ad-based so the advertisers would decide what sensible moderations is."

            Well, in the end they'll be the ones setting the trend for the triggers that set off the banhammer. However, as nasch and an AC below note, a lot of advertisers are just fine and dandy with toxic drivel (their target demographic being a lot of toxic people), and their motivation is ultimately irrelevant, since the platforms involved don't have the resources to moderate sensibly at scale, even to meet their own requirements.

            So it really doesn't matter how motivated the advertisers are, because they aren't a unified front. The target website has to be a true wonder of malicious rot indeed to drive off the ones who already know their ad targets are the people wearing tinfoil headgear or scouting for the chemtrails leading to the HQ of the liberal cannibal cult conspiracy helmed by the Kenyan Muslim and his sidekick Killary.


            • nasch (profile), 4 Nov 2021 @ 7:21am

              Re: Re: Re: Re: Re:

              So it really doesn't matter how motivated the advertisers are, because they aren't a unified front.

              That still understates the problem. Even if every advertiser were in complete agreement about what they wanted to appear on a platform, and not appear, it would still not be possible to moderate a platform like Facebook perfectly.


        • Scary Devil Monastery (profile), 4 Nov 2021 @ 6:10am

          Re: Re: Re:

          "That is a non sequitur, as a significant portion of the population will not be able to agree on what sensible moderation is."

          Well, no. I mean, yeah, it's even more of an impossibility than my far more conservative assertion.

          But that's not a "non sequitur". A non sequitur would be something irrelevant to either the ongoing argument or to any of the topics discussed.


    • Anonymous Coward, 3 Nov 2021 @ 5:01am

      Re:

      Sure. If by "too expensive" you mean "to do it effectively would cost more than the entire GDP of the planet", then yeah, that would be the problem.


  • Anonymous Coward, 2 Nov 2021 @ 1:20pm (flagged by the community)

    Zuckerberg

    Do you think that Mark was called Mark "Suck a turd" as a kid?


  • Bruce C., 2 Nov 2021 @ 3:40pm

    Content moderation at scale...

    is possible for old-line media. They only publish stuff after they moderate it. And they reject or ignore 99.9% (to several more decimal places) of the stuff that comes in.

    If Section 230's exemption from liability for platform/publisher/moderators is removed, social media will just become... media. Stuff won't get posted until after moderation. A lot of stuff will never get moderated in the first place and will never reach public view. The internet will become a lot less interesting and dynamic.

    There just doesn't seem to be a middle ground here. Both models leave it up to the platform/publisher to decide who the unworthy trolls are, but apart from the extreme solutions of "publish everything, then do the best we can to clean up the mess" and "only publish what we validate as publicly acceptable", content moderation at scale truly is impossible.


    • christenson, 2 Nov 2021 @ 6:47pm

      Re: Content moderation at scale...

      Hey, @mmasnick, what is the approximate scale and effort of moderation on Techdirt?

      Moderation at scale might be possible, but it requires some things:
      a) Smallness, so context is possible. Techdirt moderation isn't difficult, because it doesn't try to be all things to all people; it's just a narrow thing for some of us wonks. Nobody loses sleep when our favorite miscreants' posts get blocked or voted hidden.

      Also, the automatic context supplied by Techdirt may make AI/ML/pattern recognition possible, because my favorite trick for reversing the moral valence of bad content, quoting it with "this is bad!", can be broken.

      b) Crowdsourcing. Techdirt has this nice best-comment-of-the-week contest, reddit has its subreddits, etc., where some people can feel seen and helpful contributing moderation.

      Communities, true communities, are the only known answer.


      • nasch (profile), 2 Nov 2021 @ 8:54pm

        Re: Re: Content moderation at scale...

        Moderation at scale might be possible, but it requires some things:
        a) Smallness

        I think you have misunderstood the term "moderation at scale". It means moderating vast amounts of content, on the scale of a large social media platform.


        • Anonymous Coward, 3 Nov 2021 @ 6:12am

          Re: Re: Re: Content moderation at scale...

          I think you also misunderstand the scale: it's the scale of the human race, not the size of any individual platform. It does not matter if there is one social media site or one million, as it's the total number of posts to be moderated that counts.


      • Scary Devil Monastery (profile), 3 Nov 2021 @ 2:37am

        Re: Re: Content moderation at scale...

        "Communities, true communities, are the only known answer."

        Do note that your own argument presents the solution of "Don't try to do it at scale".

        Which is also what helps FB out. Users set up silos of their own, to which they invite and friend participants selectively. There's your community in the making: thousands of individuals pitching tents on a large field and inviting the likeminded to gather.

        The trolls would, in that metaphor, be ambulatory port-a-potties entering random tents uninvited and leaking until they are carted out by the irate group in residence.


    • Scary Devil Monastery (profile), 3 Nov 2021 @ 2:32am

      Re: Content moderation at scale...

      "...is possible for old line media. They only publish stuff after they moderate it. And reject or ignore 99.9% (to several more decimal places) of the stuff that comes in."

      As your argument clearly demonstrates, old media can't moderate at scale either. A publisher rejects >99.99% of everything coming in before it even crosses an editor's desk.

      "...There just doesn't seem to be a middle ground here. Both models leave it up to the platform/publisher to decide who the unworthy trolls are..."

      There isn't, because the two models are apples and oranges. Completely unrelated.

      A publisher chooses, by luck or commission, to publish on their platform a piece written and approved in advance by an editor. This piece is usually written by the staff of said platform with very rare exceptions.

      Social media works instead like a bar. Everyone is free to enter and there's a set of rules by the door detailing expected behavior for that privilege. When patrons abuse said privilege the bartender has them escorted off the premises.

      And learning that everyone who tries to blend those two models in an argument is a troll goes a long way towards explaining why the issue is still considered contentious.


  • Anonymous Coward, 2 Nov 2021 @ 4:46pm

    Content moderation when you have millions of users is difficult. The good thing is that different services can decide what they want, what they will block, and which users they will ban. Like on UK TV, where violent or 18-rated films are only shown on broadcast or cable TV after 9 pm, it's up to parents to control what children watch.
    If Section 230 goes, most websites will offer bland content, or maybe there'll be a paywall to read everything that's more controversial.
    I think the tools mods have are getting better; they can search for keywords in content or certain words used by extremists (a naive keyword filter is sketched below).
    And the UK has no Section 230. New laws are coming in to regulate online services and ban content that is legal but may be deemed harmful to younger users.
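
    For reference, here is what the most naive version of that keyword approach looks like (a hypothetical Python sketch; the blocklist terms and the flag() helper are made up for illustration, and real moderation pipelines layer ML classifiers, user reports, and human review on top of anything this simple):

        import re

        # Naive keyword-based filter (hypothetical sketch, not any
        # platform's actual system). Flags a post if any whole word
        # appears in the blocklist.
        BLOCKLIST = {"scam", "hate"}  # example terms only

        def flag(post: str) -> bool:
            words = set(re.findall(r"[a-z']+", post.lower()))
            return not BLOCKLIST.isdisjoint(words)

        print(flag("This giveaway is a scam"))  # True
        print(flag("PSA: how to spot a scam"))  # True -- context-blind
        print(flag("I hate Mondays"))           # True -- harmless, flagged anyway
        print(flag("sc4m alert"))               # False -- trivially evaded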


  • ECA (profile), 2 Nov 2021 @ 5:50pm

    And all the data

    And all the personal data lost from servers in the last 10 years is HUGE. And it's not just the big guys losing it.

    The biggest money maker in the USA for over 40 years has been advertising.
    Then in the background have been personal ID and facial recognition.
    So who is taking all this data?

    The big thing about content is "WHAT'S ALLOWED", and who made those decisions. Or: who do we piss off first, most, always.
    I don't think the gov. gives a darn, but the corps have been yapping like paranoid miniature dogs.


  • Anonymous Coward, 2 Nov 2021 @ 6:15pm

    Unfathomable?

    Inconceivable!


    • Samuel Abram (profile), 2 Nov 2021 @ 7:32pm

      Re: Unfathomable?

      You keep using that word; it does not mean what you think it means.


      • ECA (profile), 3 Nov 2021 @ 2:53pm

        Re: Re: Unfathomable?

        https://www.lexico.com/en/definition/unfathomable

        Incapable of being fully explored or understood.

        ‘her gray eyes were dark with some unfathomable emotion’

        (of water or a natural feature) impossible to measure the extent of.

        ‘Chewing my lips, I looked over the pool edge again; the deep unfathomable blue sped up my pulse rate and made my head spin.’


  • Anonymous Coward, 2 Nov 2021 @ 8:12pm

    I think it means that when you have millions of users posting content (text, audio, video, images) every day, it's impossible to do everything right. There will be someone complaining that you are removing too much content and censoring users, or else leaving bad content online, or favouring liberal or conservative content.
    You cannot please everyone all the time when it comes to content.


  • DougHolland (profile), 3 Nov 2021 @ 2:12am

    Focus the takedown efforts

    Yeah, but nobody needs to be concerned when two or six or ten people are chattering about the lizard people putting microchips in SpaghettiOs. It’s only a problem worth worrying about when 50,000 people, or millions, are following the lizard people. Focus the takedown efforts on the big fish filling the tank with poop.


    • Scary Devil Monastery (profile), 3 Nov 2021 @ 2:40am

      Re: Focus the takedown efforts

      "Focus the takedown efforts on the big fish filling the tank with poop."

      That's the first thing the platforms did. As a result of which the modern conspiracy theorist observes opsec worthy of a John Grisham novel.

      It's a source of unending interest to me that people so off their rocker about observable evidence can still muster the brain cells to carry on exchanging deranged statements in code.


    • Anonymous Coward, 3 Nov 2021 @ 9:07am

      Re: Focus the takedown efforts

      Censoring conspiracy theorists only makes them seem more credible to those susceptible to that kind of thing.


      • Toom1275 (profile), 3 Nov 2021 @ 11:33am

        Re: Re: Focus the takedown efforts

        When you lie and omit everything else, you can dishonestly make that appear as the bad outcome.


        • Anonymous Coward, 3 Nov 2021 @ 2:46pm

          Re: Re: Re: Focus the takedown efforts

          I thought the Capitol insurrection already demonstrated that you don't need everybody to believe it, just a critical mass of crazies. Also, it is a metacommunication on the part of the censor. Look at China's clumsy censorship of oblique Tiananmen Square references.


          • Scary Devil Monastery (profile), 4 Nov 2021 @ 6:36am

            Re: Re: Re: Re: Focus the takedown efforts

            "Also it is a metacommunication on the part of the censor. Look at China's clumsy censorship of oblique Tiananmen Square references."

            The issue in China's case isn't that they've been successful in eliminating the information, but that they've managed to muddle it sufficiently that it won't, within China, present a rallying cry for the disaffected. By now, I'm sure, half of the Chinese citizenry that has heard of it at all believes it's a fabrication out of US troll farms.


  • Anonymous Coward, 3 Nov 2021 @ 9:05am

    Can't speak for others but I don't expect them to get it right every time. I just expect them to err on the side of "leave up" when in doubt.


    • Anonymous Coward, 3 Nov 2021 @ 9:24am

      Re:

      But but but ....

      Someone may see a nipple!!!! Or use copyrighted content!!!!!!! How would we live with ourselves should such horrors occur?



Follow Techdirt
Essential Reading
Techdirt Deals
Report this ad  |  Hide Techdirt ads
Techdirt Insider Discord

The latest chatter on the Techdirt Insider Discord channel...

Loading...
Recent Stories

This site, like most other sites on the web, uses cookies. For more information, see our privacy policy. Got it
Close

Email This

This feature is only available to registered users. Register or sign in to use it.