Top Myths About Content Moderation

from the so-many-myths-to-debunk dept

How Internet companies decide which user-submitted content to keep and which to remove—a process called “content moderation”—is getting lots of attention lately, for good reason. Under-moderation can lead to major social problems, like foreign agents manipulating our elections. Over-moderation can suppress socially beneficial content, like negative but true reviews by consumers.

Due to these high stakes, regulators across the globe increasingly seek to tell Internet companies how to moderate content. European regulators are requiring Internet services to remove extremist content within an hour and to install upload filters to prospectively block copyright infringement; and U.S. legislators have proposed to ban Internet services from moderating content at all.

Unfortunately, many of these regulatory efforts are predicated on myths about content moderation, such as:

Myth: Content moderation can be done perfectly.

Reality: Regulators routinely assume Internet services can remove all bad content without suppressing any good content. Unfortunately, they can’t. First, mistakes occur when the service lacks key contextual information about the content—such as details about the author’s identity, other online and offline activities, and cultural references. Second, any line-drawing exercise creates mistake-prone border cases because users routinely submit “edgy” content. Third, a high-volume service will make many mistakes, even if it’s highly accurate—1 billion submissions a day at 99.9% accuracy still yields a million mistakes a day.
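To make the scale problem concrete, here is a minimal back-of-the-envelope sketch in Python (the one-billion and 99.9% figures come from the paragraph above; everything else is illustrative):

```python
# Back-of-the-envelope: even a highly accurate moderation system
# makes an enormous number of mistakes at scale.
daily_submissions = 1_000_000_000  # 1 billion submissions per day
accuracy = 0.999                   # 99.9% of decisions are correct

mistakes_per_day = daily_submissions * (1 - accuracy)
print(f"{mistakes_per_day:,.0f} mistakes per day")  # 1,000,000 mistakes per day
```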

Myth: Bad content is easy to find and remove.

Reality: Regulators often assume every item of bad content has an impossible-to-miss flashing neon sign saying “REMOVE THIS CONTENT,” but that’s rare. Content is often obviously bad only in hindsight or with context unavailable to the service. Regulators’ cherry-picked anecdotes don’t prove otherwise.

Myth: Technologists just need to “nerd harder.”

Reality: Filtering and artificial intelligence play important roles in content moderation. However, technology alone cannot magically solve the problem. “Edgy” and contextless content vexes the machines, too.

Myth: Internet services should hire more humans to review content.

Reality: Humans have biases and make mistakes too, so adding human reviewers won’t lead to perfection. Furthermore, human reviewers sometimes experience an unrelenting onslaught of horrible content to protect the rest of us.

Myth: Internet companies have no incentive to moderate content.

Reality: In 1996, Congress passed 47 U.S.C. 230, which says Internet services generally aren’t liable for third-party content. Due to this legal protection, critics often assume Internet services won’t invest in content moderation; and some companies have stoked that perception by publicly positioning themselves as “neutral” technology platforms. Yet, virtually every Internet service moderates content, and major services like Facebook and YouTube employ many thousands of content reviewers. Why? The services have their own reputation to manage, and they care about how content can affect their users (e.g., Pinterest combats content that promotes eating disorders). Furthermore, advertisers won’t let their ads appear on bad content, which provides additional financial incentives to moderate.

Myth: Content moderation, if done right, will make everyone happy.

Reality: By definition, content moderation is a zero-sum game. Someone gets their desired outcome, and someone else doesn’t—and those folks won’t be happy with the result.

Myth: There is a one-size-fits-all approach to content moderation.

Reality: Internet services cater to diverse audiences that have different moderation needs. For example, an online crowdsourced encyclopedia like Wikipedia, an open-source software repository like GitHub, and a payment service for content publishers like Patreon all solve different problems for their communities. These services shouldn’t have identical content moderation rules.

Myth: Imposing content moderation requirements will stick it to Google and Facebook.

Reality: Google and Facebook have enough money to handle virtually any requirement imposed by regulators. Startup enterprises do not. Increased content moderation burdens are more likely to block new entrants than to punish Google and Facebook.

Myth: Poor content moderation causes anti-social behavior.

Reality: Poorly executed content moderation can accelerate bad behavior, but often the Internet simply mirrors existing anti-social behavior or tendencies. Better content moderation can’t fix problems that are endemic in the human condition.

Regulators are right to identify content moderation as a critically important topic. However, until regulators overcome these myths, regulatory interventions will cause more problems than they solve.

Reposted from Eric Goldman's Technology & Marketing Law Blog.


Filed Under: cda 230, congress, content moderation, section 230


Reader Comments



  • Anonymous Anonymous Coward (profile), 16 Oct 2019 @ 7:09am

    Nerds might impact, but don't create societies

    "Myth: Technologists just need to “nerd harder.”

    Reality: Filtering and artificial intelligence play important roles in content moderation. However, technology alone cannot magically solve the problem. “Edgy” and contextless content vexes the machines, too."

    The issue isn't technical, the issue is social, and to that end, while there might be 'one' society, it is made up of many sub-societies. Those sub-societies might have similarities, but they are in fact often different. Society is also impacted by the form of governing happening around those sub-societies, and within each government there are likely several to many sub-sub-societies.

    In authoritarian regimes there are probably supporters (those who endeavor to become authoritarian themselves) and opposers (those who wish for a more democratic form of oppression).

    In democratic regimes there are those who wish for, and work to impose, a more authoritarian government, while those who actually enjoy liberty and freedom work to impede more rigid government control. Or are just complacent.

    If the world wants to prevent bad, then it has to come to agreement about what bad is, and letting governments (or religions, or for that matter any factions) decide for the populace what is or isn't actually bad hasn't gotten better with time, as theoretically 'good' societies seem to have tendencies to turn bad, and theoretically 'bad' societies seem to have tendencies to get worse. And just to keep things confused, sometimes the populace gets out of its complacent mode and vehemently opposes whatever regime is designing its current oppressions.

    So whatever answers anyone comes up with, there will be opposers to those answers, and the subjective determination about whether the answer, or the content, is good or bad is merely a point of view. If people are concerned about influencing children with bad content, how about teaching 'parenting' without imposing ideologies? If I want my kids to play outside, unattended by adults (as I did as a child), then it is my business, not anyone else's. Keeping children from viewing shocking videos on the Internet is about me, as a parent, and how I control my children's Internet usage, as well as what I tell them when they stumble across something I consider bad, and not about society's ability to censor. There is no way to entirely keep them from seeing things I don't want them to see (on the Internet or in the real world), but there are ways for me to help them understand and cope with those things when they run across them.

    • ECA (profile), 16 Oct 2019 @ 12:39pm

      Re: Nerds might impact, but don't create societies

      I love those that THINK they can control thoughts/ideas...

      To those that think you can moderate the net... LET THEM SPEND 1 day doing the job.

      Let's look at this in a WIDE fashion.
      Those countries that are asking the net to moderate? That say they are for free speech?
      I don't think we need to think hard about this. They are asking the net NOT to give free speech.

      Basic moderation is great; monitoring everything is stupid. Ask the FBI/CIA/others how well it's going, trying to track everything on the net...
      From private chat, public chat, in-game chats, forums and all the rest. The amount of data created is so large... it's like counting to the highest number created: there is always 1 more.

  • Stephen T. Stone (profile), 16 Oct 2019 @ 7:10am

    Filtering and artificial intelligence play important roles in content moderation. However, technology alone cannot magically solve the problem. “Edgy” and contextless content vexes the machines, too.

    Even contextual content can vex a machine. To wit: The word “retard” can be used as a verb and an ableist slur — and basic content filters generally can’t tell the difference between the two.
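    (To illustrate the commenter's point, here is a minimal sketch of a naive word-match filter; the blocklist and sentences are made up for illustration, not any platform's actual system:)

    ```python
    # A naive keyword filter matches words, not meaning: it cannot tell
    # the innocuous verb sense from the slur, so it flags both sentences.
    BLOCKLIST = {"retard"}

    def flagged(text: str) -> bool:
        words = (w.strip(".,!?").lower() for w in text.split())
        return any(w in BLOCKLIST for w in words)

    print(flagged("Cold temperatures retard bacterial growth."))  # True (innocuous verb)
    print(flagged("You absolute retard."))                        # True (slur)
    ```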

    • Anonymous Coward, 16 Oct 2019 @ 7:18am

      Re:

      I suspect that even humans have a relatively low accuracy for words like that.

    • Thad (profile), 16 Oct 2019 @ 9:53am

      Re:

      In one of the versions of Carlin's "Seven Words" routine, he notes that "you can prick your finger, but you can't finger your prick."

    • ECA (profile), 18 Oct 2019 @ 10:46am

      Re:

      Can we do it this way?

      Every time the computer can't figure it out, it ships it to a human...
      and that human has the ability to send it to others to figure it out also.

      How convoluted can written speech be?
      The word 'sink' has 29 definitions.
      Even if we go to concepts and poems... how many people have a hard time NOT seeing porn and drugs in rock and roll songs?
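      (The escalate-to-a-human flow suggested above could look roughly like this; a minimal sketch with hypothetical names and thresholds, not any platform's real pipeline:)

      ```python
      # Sketch of the escalation flow the commenter describes: the machine
      # only decides when it is confident; everything else goes to humans,
      # who can pass the item on to more reviewers if needed.
      CONFIDENCE_THRESHOLD = 0.95  # hypothetical cutoff

      def route(item: str, machine_confidence: float) -> str:
          if machine_confidence >= CONFIDENCE_THRESHOLD:
              return "machine decision"
          # Low confidence: hand the item to a human reviewer.
          return "human review queue"

      print(route("The word 'sink' in an ambiguous sentence", 0.62))  # human review queue
      ```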

  • Anonymous Coward, 16 Oct 2019 @ 7:36am

    "Under-moderation can lead to major social problems, like foreign agents manipulating our elections. "

    The underlying problem is a gullible population, but that will not be solved anytime soon.

    • Anonymous Coward, 16 Oct 2019 @ 8:10am

      Re:

      "Moderation" in that case should earn them a Nobel prize /and/ a Sainthood regardless of religion or other comduct as they would have educated not only one nation but many in critical thinking without the time commitment of schools or the staffing , funding, or power involved.

      In short that would be a miracle going far above the call of duty akin to your high school drop out hotel receptionist inventing and giving you for free a side effect pancaea in the form of a delicious cake. While it would be very nice to have expecting it would be insane.

    • Anonymous Coward, 16 Oct 2019 @ 8:15am

      Re: gullible population

      cc "The underlying problem is a gullible population"

      but that's the whole point of any government regulation, even if done imperfectly.

      the public must be protected from its own ignorance, even if only partial protection is achievable currently.

      regulation is everywhere in our economic and social lives. Though it always has flaws, society would be much worse off without it. Consider the FAA, FDA, FCC, FTC, SEC CPSC, DEA, ATF, Copyright Office etc.

      • Anonymous Coward, 16 Oct 2019 @ 9:17am

        Re: Re: gullible population

        "the public must be protected from its own ignorance"

        Relative to consumer goods, there are things that the product user needs to be made aware of. Like, ummm, the required voltage or gasoline type, although it is a bit silly to put notices on hammers about striking one's own thumb.
        An ignorant populace is easier to "govern", or so I've heard. It does make one wonder why the present administration is hell-bent on killing public education, the place where non-rich kids learn.

      • seedeevee (profile), 16 Oct 2019 @ 10:51am

        Re: Re: gullible population

        Police Agencies (DEA/ATF) are not here to protect us from ignorance.

        • Anonymous Coward, 16 Oct 2019 @ 10:57am

          Re: Re: Re: gullible population

          Agreed, they are not. (As if I had made such a claim.)
          Police are also not here to moderate the internet.

          • Anonymous Coward, 16 Oct 2019 @ 9:25pm

            Re: Re: Re: Re: gullible population

            Oh, yes they are. I thought you read Techdirt?!

        • Anonymous Coward, 16 Oct 2019 @ 9:27pm

          Re: Re: Re: gullible population

          Which explains you at least bro.

      • ECA (profile), 16 Oct 2019 @ 12:42pm

        Re: Re: gullible population

        "the public must be protected from its own ignorance, even if only partial protection is achievable currently."

        NOPE..
        We must be educated, taught how to reason and decide..
        But that's not what school does.

        WE must learn from mistakes. Which is a hard taskmaster.

        • Anonymous Coward, 16 Oct 2019 @ 2:54pm

          Re: Re: Re: gullible population

          "But thats not what school does."
          Some classes / professors do attempt and sometimes accomplish the above.

  • Anonymous Coward, 16 Oct 2019 @ 8:00am

    Context

    First, mistakes occur when the service lacks key contextual information about the content—such as details about the author’s identity, other online and offline activities, and cultural references.

    Does anyone else see where this is heading? We just need to give the machines more data about ourselves and our interlocutors—all the data—and everything will be fine.

  • Comboman, 16 Oct 2019 @ 8:01am

    Myth: Ads are the same as user-created content

    Reality: Ads are not "at scale" and are already handled manually by actual humans (called salespeople). Requiring these people to review the ads to ensure they meet ethical requirements (the way they are already reviewed for trademark and other rules) is not just technically possible but relatively easy. TV, radio and print publishers do it all the time.

    • Gary (profile), 16 Oct 2019 @ 9:23am

      Ads are the same as user-created content

      Just as wrong as when you said it yesterday.

      • Comboman, 18 Oct 2019 @ 11:02am

        Re: Ads are the same as user-created content

        ...and your "rebuttal" is as unconvincing as it was yesterday.

    • Wyrm (profile), 16 Oct 2019 @ 10:01am

      Re: Myth: Ads are the same as user-created content

      That might have been the case before ads were automated like most other content.
      Newspapers had to make editorial choices on just about everything, including ads. Nowadays, things are not so simple. Websites just subscribe to an ad provider, who themselves have content submitted by advertisers. Neither the website nor the provider has full control over the content, although filtering (read "moderation") tools are provided.

      • Gary (profile), 16 Oct 2019 @ 7:56pm

        Re: Re: Ads are the same as user-created content

        And the TV networks gleefully accepted all these ads without blinking....

  • Anonymous Coward, 16 Oct 2019 @ 8:36am

    Different countries have different rules: is content extreme, defaming someone, supporting terrorism, false and fake news, or harmful or upsetting to teens and young people? Is it parody or political comment, which is legal, or just rude or ignorant?
    There's no AI or automatic filter that can block all content that may fall in those categories.
    When social media websites have millions of users uploading images or comments, it will take human moderators to block content that might be illegal or extreme.
    Do Western societies want to become like China, where all content is screened and filtered?
    We have seen thousands of videos on YouTube that should be fair use or parody removed by DMCA notices.
    And those categories do not even include images or video that may infringe on IP holders under the new laws in Europe, where all user uploads will have to be screened by filters.

  • This comment has been flagged by the community.
    seedeevee (profile), 16 Oct 2019 @ 9:20am

    Xenophobic trash peddlers like Eric Goldman

    Under-moderation can lead to major social problems, like foreign agents manipulating our elections.

    How about moderating this web-site better to keep out the xenophobic trash peddlers like Eric Goldman?

    • lucidrenegade (profile), 16 Oct 2019 @ 10:26am

      Re: Xenophobic trash peddlers like Eric Goldman

      Nah, the users themselves do a pretty good job of moderation here. For example, your comment will be hidden shortly.

      • This comment has been flagged by the community.
        seedeevee (profile), 16 Oct 2019 @ 10:53am

        Re: Re: Xenophobic trash peddlers like Eric Goldman

        You seem to have a lot of faith in your circle jerk.

        • Stephen T. Stone (profile), 16 Oct 2019 @ 11:32am

          Your lack of faith is…unfortunate.

          • Anonymous Coward, 16 Oct 2019 @ 9:30pm

            Re:

            Somehow I feel you didn't clean your hands when you were done, either.

        • Anonymous Coward, 16 Oct 2019 @ 11:38am

          Re: Xenophobic trash peddlers like you bro

          Sorry bro I had to unhide your comment to see what you were on about. And soon enough someone will have to unhide the comment I’m replying to to see what I’m on about. Less a circle jerk and more you losing the soggy biscuit game.

          • Anonymous Coward, 16 Oct 2019 @ 9:31pm

            Re: Re: Xenophobic trash peddlers like you bro

            How is that - in any way - not a circle jerk?

            • Anonymous Coward, 17 Oct 2019 @ 8:18pm

              Re: Re: Re: Xenophobic trash peddlers like you bro

              Because unhiding a comment to see what was written is not, in itself, remotely close to "circle jerk" behavior. It's to see what the comment was and decide for oneself if the comment was offensive, pointless, inciting or dumb enough to merit such treatment. Maybe I might respond by saying "I don't think that message was offensive". Or maybe if I'm in the mood I'll reply with "I agree, that was a stupid post". The latter would probably fit your definition of circle jerk.

              Either way, what does it matter? What did you expect was the point accomplished by wading into a comment thread and calling everyone xenophobic?

    • Toom1275 (profile), 16 Oct 2019 @ 1:14pm

      Re: Xenophobic trash peddlers like Eric Goldman

      xenophobic trash peddlers like Eric Goldman

      [Asserts facts not in evidence]

      • Anonymous Coward, 16 Oct 2019 @ 9:29pm

        Re: Re: Xenophobic trash peddlers like Eric Goldman

        Under-moderation can lead to major social problems, like foreign agents manipulating our elections.

        Evidence found and presented

        • Toom1275 (profile), 16 Oct 2019 @ 10:22pm

          Re: Re: Re: Xenophobic trash peddlers like Eric Goldman

          Evidence found and presented

          .. of something wholly unrelated to the original claim.

          Well done, useful(?) idiot.

          • Toom1275 (profile), 17 Oct 2019 @ 9:07am

            Re: Re: Re: Re: Xenophobic trash peddlers like Eric Goldman

            It's like they didn't learn that "A therefore 7" didn't work when they claimed some neonazi being banned proved their "anti-conservative bias" delusion.

        • bhull242 (profile), 17 Oct 2019 @ 12:56pm

          Re: Re: Re: Xenophobic trash peddlers like Eric Goldman

          That’s not in any way xenophobia. We elect people to represent our interests. Outsiders are perfectly fine, and I’m even okay with them moving here, but unless and until they become U.S. citizens and aren’t working on behalf of a foreign government, I don’t want them to be involved in our elections any more than they’d want me to interfere in their elections.

      • This comment has been flagged by the community.
        seedeevee (profile), 16 Oct 2019 @ 9:45pm

        Re: Re: Xenophobic trash peddlers like Eric Goldman

        Under-moderation can lead to major social problems, like foreign agents manipulating our elections.

        I'd say that is xenophobic trashtalk.

    • bhull242 (profile), 17 Oct 2019 @ 12:53pm

      Re: Xenophobic trash peddlers like Eric Goldman

      How is Eric Goldman xenophobic? Also, there’s a difference between moderating content and moderating users (or, in this case, writers). Even if Goldman is xenophobic, this article isn’t, so the former doesn’t necessarily mean one should remove the article.

  • Anonymous Coward, 16 Oct 2019 @ 9:27am

    Poor content moderation causes anti-social behavior.

    Two points about this:

    First, poor content moderation can certainly attract anti-social behaviour. There's a reason why Gab, 8chan, YouTube comments, etc. are marked with "HERE BE MONSTERS" on the maps of the internet, while even Reddit and Twitter, themselves not exactly paragons of content moderation, aren't regarded with the same disdain.

    Second, specifically regarding YouTube videos: a lot of complaints I've heard haven't been about content moderation so much as the recommendation algorithm. And while lax content moderation isn't going to cause anything, recommending content absolutely can, and I can certainly believe a steadily escalating diet of extremism and conspiracy theory can send someone down the hole of antisocial behaviour.

  • Wyrm (profile), 16 Oct 2019 @ 9:57am

    Summary

    All these myths are based on a single misconception: that content can be evaluated objectively and on its own.
    However, nearly all content is evaluated subjectively and requires context (including in-service context, poster-profile context, overall social and cultural context...). Denying this fundamental problem leaves you blind to all the aspects you mentioned. Virtually anything depends on context.

    • Violence is bad in real life, but is fundamental to lots of entertainment products.
    • Sincere hate speech is bad, but can be quoted or parodied for criticism.
    • Criticizing someone is allowed, as long as you avoid libel/slander, but will make the target feel bad. (Particularly when they are thin-skinned, even more so when orange-skinned.) They might lash out, claim victim status, pretend the critic is lying... or claim copyright violation.

    Nothing is easy to judge, as the spin given to reports of an incident can sway public opinion regardless of the report's merit. Something is often presented as an "obvious" case despite not being objectively obvious at all. This is done by several means, such as slightly misquoting the content, ignoring context or, inversely, adding false context, etc.

    This makes for lots of cases presented as "black-and-white" issues, while the immense majority of edge cases are ignored because they are harder to spin as "obvious". This issue, which is pretty common in the media landscape, leads to the biased myths above: in short, that "moderation is easy".

  • Anonymous Coward, 16 Oct 2019 @ 10:31am

    Myth: Hiding anti-social behavior from the public internet will somehow make people stop being anti-social.

    Reality: They will continue to be just as anti-social as before if not more, only somewhere else.

    • JoeCool (profile), 16 Oct 2019 @ 10:55am

      Re:

      Hiding anti-social behavior from the public internet will somehow make people stop being anti-social.

      Heh - that's the truth. Certain infamous commentators here are regularly down-voted enough to be hidden on virtually every post, but they constantly return, worse than ever.

      But that's why the posts are just hidden, not removed. Hiding them by popular consent is a very simple and straightforward message that most of the people don't like what they're saying, but they're still allowed to say it. Removing the posts would send the message that since most people don't like it, they're not ALLOWED to say it, which would be against Freedom of Speech. All content moderation should be like here: enough down-votes just hide the content, but it's still there. And if it just happens to be illegal speech, it's still there for the police as evidence.
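      (The hide-don't-delete mechanism described above could be sketched like this; a minimal illustration with hypothetical names and thresholds, not Techdirt's actual code:)

      ```python
      # Threshold-based hiding: enough community flags collapse a comment
      # behind a click-through notice, but the comment is never deleted.
      HIDE_THRESHOLD = 5  # hypothetical number of flags needed to hide

      def render(comment_text: str, flag_count: int) -> str:
          if flag_count >= HIDE_THRESHOLD:
              # Hidden, not removed: readers can still opt in to see it.
              return "This comment has been flagged by the community. Click here to show it."
          return comment_text

      print(render("An unpopular opinion.", flag_count=7))
      ```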

      • Wyrm (profile), 16 Oct 2019 @ 11:18am

        Re: Re:

        Removing the posts would send the message that since most people don't like it,

        ... they're showing you the door.

      • Wyrm (profile), 16 Oct 2019 @ 11:21am

        Re: Re:

        Removing the posts would send the message that since most people don't like it, they're not ALLOWED to say it, which would be against Freedom of Speech.

        More seriously, this wouldn't be against freedom of speech. It would be if it were legally mandated, but as long as it's voluntary moderation by the platform and/or its users, that's not a free speech issue.
        You could frame that as a snowflake issue, of people trying to make themselves a utopian "safe space" that doesn't exist in the real world, but that would not be a free speech issue.

      • Stephen T. Stone (profile), 16 Oct 2019 @ 11:34am

        Removing the posts would send the message that since most people don't like it, they're not ALLOWED to say it, which would be against Freedom of Speech.

        Maybe the spirit of that principle, but not the legality of it. Your usage of a third party platform for speech is a privilege, not a right — and that privilege can always be revoked.

      • This comment has been flagged by the community.
        Dark Shops Spotty Phones Rotting Fish in SF, 16 Oct 2019 @ 1:01pm

        Exulting in your little bit of fanboy power

        Sheesh! You are exulting in your little bit of fanboy power without grasping the actual effect. "Hiding" infuriates and makes people leave the site, never to return. "Hiding" of reasonable on-topic comments is even worse, shows what normal people can expect here!

        I don't have to argue or show Alexa numbers, as that's easily visible and you know it. Masnick's little blog doesn't pay for itself, has almost no rational discussion just ad hominem attacks. It's not a "platform" getting his views to a wide audience, it's just a couple dozen ultra-partisan fanboys echoing.

        All that's evident to the few new readers. I may have missed some of late, but are definitely NOT MANY new accounts: last I have listed appeared Sep 23rd, 2019!

        • This comment has been flagged by the community.
          Dark Shops Spotty Phones Rotting Fish in SF, 16 Oct 2019 @ 1:01pm

          Re: Exulting in your little bit of fanboy power

          Once Techdirt became a club and not a forum, it's been shrinking and is guaranteed to shrink.

          All as I predicted ten years ago. Oh, it's still going but only because of Millionaire Masnick's vanity. He's not validated anywhere else!

          Now, HIDE this, kids! It serves my purpose that you do!

          • Anonymous Coward, 16 Oct 2019 @ 2:37pm

            Re: putting the ignorant in ignorant motherfucker for a decade

            “All as I predicted ten years ago”

            And yet you’re still here bro

            If that don’t make you an ignorant motherfucker. I don’t know what does.

        • Stephen T. Stone (profile), 16 Oct 2019 @ 1:49pm

          "Hiding" infuriates and makes people leave the site, never to return.

          That isn’t true at all. We hide your comments all the time and you keep coming back because you wanna hatefuck us.

          • This comment has been flagged by the community.
            Anonymous Coward, 16 Oct 2019 @ 1:58pm

            Re:

            LOL.... I wonder if this will get hidden by Masnick. You can't see when your own comments are hidden.

            • Anonymous Coward, 16 Oct 2019 @ 2:39pm

              Re: Re: I don’t really

              LOL. I wonder how many tears you cry at night when you realise that you have wasted a decade of your life here and accomplished nothing but earning two derogatory nicknames.

            • Stephen T. Stone (profile), 16 Oct 2019 @ 3:28pm

              You can't see when your own comments are hidden.

              Yes, you can. I should know — I’ve had a few comments hidden and I can see they were hidden. Don’t lie if the truth is that easy to dig up.

              • Anonymous Coward, 16 Oct 2019 @ 3:35pm

                Re:

                That is strange.

                When I viewed my own hidden comment from tor I could not see it but when I viewed it from my usual IP address I could.

                I can't explain our different experiences.

                • Anonymous Coward, 16 Oct 2019 @ 6:51pm

                  Re: Re:

                  Why would you need to use TOR? According to blue, only pirates and lawbreakers would need to use TOR just to see if people hid their comments. Are you a pirate?

        • Anonymous Coward, 16 Oct 2019 @ 2:35pm

          Re: Exulting in kicking blue in the balls

          "Hiding" infuriates and makes people leave the site, never to return.

          And yet. You’re still here. Despite the fact you promised to leave forever.

        • This comment has been flagged by the community.
          seedeevee (profile), 16 Oct 2019 @ 9:39pm

          Re: Exulting in your little bit of fanboy power

          Yup, fanboys sounds much nicer than when I called them a circle-jerk.

          • Anonymous Coward, 17 Oct 2019 @ 8:39am

            Re: Re:

            What is this, complaining to the teacher that the rest of the kids in kindergarten wouldn't let you play insult fighting?

        • bhull242 (profile), 17 Oct 2019 @ 1:21pm

          Re: Exulting in your little bit of fanboy power

          Hiding" infuriates and makes people leave the site, never to return.

          Outside of people who spam links, I can’t think of a single person whose comments were hidden and then actually left and never came back, though many have claimed that they would do so. In fact, not only do they all still read Techdirt, they keep writing stuff in the comments. Zof even still uses his account. And even if that’s true, how is that any different from removing comments? Also, if your comments keep getting hidden, you probably won’t be missed if you leave.

          "Hiding" of reasonable on-topic comments is even worse, shows what normal people can expect here!

          I might agree with that, but it’s actually pretty rare, and quite a few visible comments (i.e. not hidden) are actually not in favor of Techdirt’s views.

          In particular, your comments are almost never both reasonable and on-topic, so you of all people don’t have any justification for complaining about that.

          I don't have to argue or show Alexa numbers, as that's easily visible and you know it.

          If you can’t be bothered to do the research, why should I? I for one know nothing about Alexa numbers at all. If it’s so easy, you should have no problems showing us them. As the one making the positive claim, you have the burden of proof.

          Masnick's little blog doesn't pay for itself,

          Most things don’t.

          has almost no rational discussion just ad hominem attacks.

          That hasn’t been my experience, but at any rate not everyone comes to a blog for the comments.

          It's not a "platform" getting his views to a wide audience, it's just a couple dozen ultra-partisan fanboys echoing.

          And you were criticizing us for ad hominem attacks? And actually, we often criticize both parties for a lot of stuff, so I don’t think “ultra-partisan” really applies here.

          All that's evident to the few new readers.

          [Asserts facts not in evidence]

          I may have missed some of late, but are definitely NOT MANY new accounts: last I have listed appeared Sep 23rd, 2019!

          That’s less than a month ago. Also, most new readers don’t get accounts even if they stick around. You haven’t proven that new readers are few or don’t stick around.

        • Toom1275 (profile), 17 Oct 2019 @ 4:13pm

          Re: Exulting in your little bit of fanboy power

          Hiding" of reasonable on-topic comments

          [Asserts facts not in evidence]

          Masnick's little blog doesn't pay for itself, has almost no rational discussion just ad hominem attacks

          If those bother you so much, then don't unhide any comments and you'll never see any.

      • This comment has been flagged by the community.
        seedeevee (profile), 16 Oct 2019 @ 9:36pm

        Re: Re:

        It is far far far away from "most of the people". Just how many accounts (not necessarily people) do you think it takes to hide a thought here?

        • Anonymous Coward, 16 Oct 2019 @ 10:14pm

          Re: Re: Re:

          You could always make 50 accounts and try that out yourself?

  • Anonymous Coward, 16 Oct 2019 @ 11:32am

    Myth: Content moderation can be done perfectly.

    Not only can it not be done perfectly, but the error rate will increase with time as the bad guys map the behavior of the moderator/filter so that their content, which should be taken down, stays up.

  • Bruce C., 16 Oct 2019 @ 11:46am

    Mythbusters...

    It's great to see some level-setting in the back and forth debates on this issue.

    It's a shame a lot of the big platforms for a long time took their section 230 immunity to mean that there was no need for moderation, and are still playing catch-up now that the tide has turned.

    It's also a shame that the public discourse seems to be centered around a very narrow definition of acceptable speech. Advertiser/shareholder-censored content is something we already have plenty of on the airwaves and cable TV. When all of the platforms are run by publicly held companies that earn their revenue through ads, the internet becomes indistinguishable from old media.

    • Anonymous Coward, 16 Oct 2019 @ 11:58am

      Re: Mythbusters...

      It's a shame a lot of the big platforms for a long time took their section 230 immunity to mean that there was no need for moderation

      Eh, they had it right; you don't. Section 230 does not create a need to moderate; it only offers a "please do" provision.

      • Anonymous Coward, 16 Oct 2019 @ 12:09pm

        Re: Re: Mythbusters...

        There's a difference between a "need" and a "legal obligation."

        They were right that there was no legal obligation to moderate. I'd argue that they were wrong if they thought there was no need.

        • Anonymous Coward, 16 Oct 2019 @ 2:52pm

          Re: Re: Re: Mythbusters...

          Nothing about Section 230 indicates a need for moderation. Nobody (until you) said anything about obligation, we were discussing need. Neither has anything to do with S230.

          • Anonymous Coward, 16 Oct 2019 @ 5:14pm

            Re: Re: Re: Re: Mythbusters...

            There's a difference between...

            Nothing about Section 230 indicates a need for moderation.

            and...

            It's a shame a lot of the big platforms for a long time took their section 230 immunity to mean that there was no need for moderation

            Namely, the difference between, "Section 230 doesn't indicate a need for moderation," and "Section 230 indicates that there isn't a need for moderation."

            Not indicating a need =/= indicating something isn't needed.

    • James Burkhardt (profile), 16 Oct 2019 @ 12:08pm

      Re: Mythbusters...

      Commenter has asserted facts not in evidence.

    • Stephen T. Stone (profile), 16 Oct 2019 @ 12:42pm

      a lot of the big platforms for a long time took their section 230 immunity to mean that there was no need for moderation

      [citation needed]

      • Wendy Cockcroft (profile), 17 Oct 2019 @ 7:37am

        Re:

        Stephen is right; the market forced the platforms to moderate, then they ran into trouble for moderating, then we got Section 230 to tell them they could moderate without being taken to court for it and weren't responsible for 3rd party posts.

  • ECA (profile), 16 Oct 2019 @ 12:54pm

    A comment...

    I love it sometimes when others look at my comments and decide to correct me.
    I have to suggest to them that the English language is a conglomerate of many languages consolidated into a menagerie of crap. We bring rules into English from other languages that have no use except on certain words from that one language, and we expect kids and even adults to use rules that apply to nothing except those few words. Instead of converting the word to an English/Anglo spelling, we just throw words into the mess and add more rules to cover them... "I before E" has been changed, as an equal number of words don't follow it. And so others may know: we have removed letters from the English alphabet, because we found other ways to use a shorter alphabet.

    http://mentalfloss.com/article/31904/12-letters-didnt-make-alphabet

    And probably a few others... let alone the pronunciation and inclusion of words from German, Russian, Italian, Spanish and other languages.
    We have words with so many meanings that unless you KNOW the language, you will be mistaken to even use the correct ones.
    I'm not super educated, but my teacher gave me a dictionary because I like playing with words and meanings, even though I do have a few handicaps that restricted me when I was younger.

    Good luck, folks. Have fun trying to get a computer to understand all the connotations and convolutions of English.

  • tp (profile), 17 Oct 2019 @ 12:27am

    Stackexchange shows proper content moderation is possible

    It seems there's a myth that content moderation is so difficult that no one can do it. But StackExchange has clearly succeeded at content moderation; there's only a very small amount of trolling or bad behaviour on their platform, even with users evaluating each other's work.

    Of course, StackExchange has spent years perfecting their system to get content moderation working properly.

    • Anonymous Coward, 17 Oct 2019 @ 8:19pm

      Re:

      Maybe try advertising on four busses in London this time.

    • ECA (profile), 18 Oct 2019 @ 10:41am

      Re: Stackexchange shows proper content moderation is possible

      Moderation can be cheap and easy, if you use SIMPLE rules..

      But we have 2 groups:
      1 that understands this idea.
      1 that says TROLLS have rights, even if they are calling everyone by every name in the book, and the subject is lost to rabble-rousing discussion.

      The first group keeps asking the second for proof of what they are saying, and the second seems to think something they heard from the 18th century, handed down from their GREAT-grandfather, has any factual meaning.

    • bhull242 (profile), 21 Oct 2019 @ 2:33pm

      Re: Stackexchange shows proper content moderation is possible

      Of course, that could just be a coincidence. StackExchange is pretty niche, and there’s not much reason to troll people on it.

      Plus, StackExchange isn’t even close to being as large as, say, YouTube or Facebook. StackExchange gets thousands of uploads per year; others get millions per day. The scale is completely different.

  • Peter (profile), 17 Oct 2019 @ 1:43pm

    How does China do it

    Totally not saying I want the US or any country to be China, but it seems that "China controls content on its internet" and "content moderation can't scale, and here are some myths related to it" are at odds. Is it a matter of the liability being on the platform side? Liability in this sense meaning fear of the government... Just wondering, really; this may not be a unique thought at all, but the clash between these two beliefs just occurred to me.

    • That One Guy (profile), 17 Oct 2019 @ 3:20pm

      'Collateral damage? Eh, we don't care.'

      To the extent that China 'controls' the internet within its borders, its approach appears to be two-fold: complete indifference to collateral damage, and having companies act as censors on its behalf.

      So long as you don't care about collateral damage, and you're willing to impact ever-increasing amounts of 'good' content in your scramble to squash the 'bad' content, moderation scales just fine. It's only when you aren't willing to have someone else pay those prices that it fails to scale.

      • Anonymous Coward, 21 Oct 2019 @ 6:00pm

        Re: 'Collateral damage? Eh, we don't care.'

        Having a large portion of the populace be, for the most part, unquestioning sheep also helps...

  • spenrider8 (profile), 22 Oct 2019 @ 11:34pm

    Content Moderation

    Content moderation is the practice of monitoring user-generated submissions and applying pre-determined guidelines and a code of behavior to determine whether a particular comment, post, or piece of feedback is publicly permissible.


