Before You Talk About How Easy Content Moderation Is, You Should Listen To This

from the radiolab-explains dept

For quite some time now, we've been trying to demonstrate just how unrealistic it is to expect internet platforms to do a consistent or error-free job of moderating content. At the scale they operate, it's an impossible request, not least because so many content moderation decisions rest on entirely subjective judgments about what's good and what's bad, and not everyone agrees on that. It's why I've been advocating for moving controls out to the end users, rather than expecting platforms to be the final arbiters. It's also part of the reason why we ran that content moderation game at a conference a few months ago, in which no one could fully agree on what to do about the content examples we presented (for every single one there were at least some people who argued for keeping the content up or taking it down).

On Twitter, I recently joked that anyone with opinions on content moderation should first have to read Professor Kate Klonick's recent Harvard Law Review paper on The New Governors: The People, Rules and Processes Governing Online Speech, as it's one of the most thorough and comprehensive explanations of the realities and history of content moderation. But, if reading a 73-page law review article isn't your cup of tea, my next recommendation is to spend an hour listening to the new Radiolab podcast, entitled Post No Evil.

I think it provides the best representation yet of just how impossible it is to moderate this kind of content at scale. It covers the history of content moderation, but also deftly shows how every attempt to do it consistently creates new problems. I won't ruin it for you entirely, but it does a brilliant job highlighting how, as the scale increases, the only reasonable way to deal with things is to create a set of rules that everyone can follow. And then you suddenly realize that the rules don't work. You have thousands of people who need to follow those rules, and they each have a few seconds to decide before moving on. As such, there's not only no time to understand context, but little time to recognize that (1) content has a funny way of not falling neatly within the rules and (2) no matter what you do, you'll end up with horrible results (one of the examples in the podcast is one we talked about last year, explaining the ridiculous results, but logical reasons, for why Facebook had a rule that you couldn't say mean things about white men, but could about black boys).

The most telling part of the podcast is the comparison between two situations in which the content moderation team at Facebook struggled over what to do. One was a photo that went viral during the Boston Marathon bombings a few years ago, showing some of the carnage created by the bombs. The Facebook rulebook had a rule against "gore" that basically said you couldn't show a person's "insides on the outside." And yet, these photos did exactly that. The moderation team said they should take the photo down to follow the rules (though there was vigorous debate), but they were overruled by execs who said "that's newsworthy."

But this was then contrasted with another viral video in Mexico of a woman being beheaded. Many people in Mexico wanted it shown, in order to document and alert the world of the brutality and violence that was happening there, which the government and media were mostly hiding. But... immediately people around the world freaked out about the possibility that "children" might accidentally come across such a video and be scarred for life. The Facebook content moderation team said leave it up, because it's newsworthy... and the press crushed Facebook for being so callous in pushing gore and violence... so top execs stepped in again to say that video could no longer be shown.

As the podcast does a nice job showing, these are basically impossible situations, in part because there are all sorts of reasons why some people may want to see certain content while others should not see it. And we already have enough trouble understanding the context of the content, let alone the context of the viewer in relation to the content.

I've been seeing a microcosm of this myself over the last few days. After my last post about platforms and content moderation around the Alex Jones question, Twitter's Jack Dorsey was kind enough to tweet about it (even though I questioned his response to the whole mess). So for the past week or so I've been notified of every response to that tweet, and the responses seem pretty evenly divided between people who hate Alex Jones, screaming about how Jack is an idiot for not banning Jones and how he's enabling hate mongers, and people who love Alex Jones, screaming about how Jack is silencing dissent and how he's a liberal asshole silencing conservatives.

And no matter where on the spectrum of responses you may fall (or even totally outside of that spectrum), it should come down to this: we shouldn't be leaving these decisions up to Jack. Or Mark. Yes, those companies can and must do a better job, but what people fail to realize is that the job we're asking them to do is literally an impossible one. And that's why we really should be looking to move away from the situation in which they even need to be doing it. My solution is to move the controls outwards to the ends, allowing individuals and third parties to make their own calls. But there may be other solutions as well.
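To make that last idea a little more concrete, here's a minimal sketch of what "moving the controls outwards" could look like. This is not any platform's actual API; the Post and UserFilter types, the rule names, and the third-party blocklist are all made up purely for illustration. The point is where the decision lives: the platform delivers the raw feed, and the filtering rules belong to the user or to whichever third parties the user chooses to trust.

    # Hypothetical sketch only: the platform hands over the raw feed, and the
    # rules that decide what gets displayed belong to the user (or to third-party
    # lists the user subscribes to), not to Jack or Mark.
    import re
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        author: str
        text: str

    @dataclass
    class UserFilter:
        blocked_authors: set = field(default_factory=set)
        blocked_patterns: list = field(default_factory=list)  # regexes chosen by the user

        def allows(self, post):
            if post.author in self.blocked_authors:
                return False
            return not any(re.search(p, post.text, re.I) for p in self.blocked_patterns)

    def render_feed(raw_feed, filters):
        # A post is shown only if every filter the user has opted into allows it.
        return [p for p in raw_feed if all(f.allows(p) for f in filters)]

    feed = [Post("alice", "New post on content moderation"),
            Post("troll99", "some abusive rant")]
    my_rules = UserFilter(blocked_patterns=[r"\babusive\b"])
    shared_list = UserFilter(blocked_authors={"troll99"})  # maintained by a third party I trust
    for post in render_feed(feed, [my_rules, shared_list]):
        print(post.author, "-", post.text)

The specific code doesn't matter; what matters is that the platform ships everything and the blocking happens at the edge, under the control of the user or a filter provider the user picked.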

But something that is not a solution is merely expecting that these platforms can magically "get it right."


Filed Under: content moderation, radiolab
Companies: facebook


Reader Comments



  • This comment has been flagged by the community.
    Most Wanted Poster, 21 Aug 2018 @ 3:57pm

    YET you tacitly state that Google, Facebook, Twitter colluded

    rightly to "de-platform" Alex Jones.

    Again, this is just solely having everything your way, not least consistent.

    Anyone, state and link to something out-of-bounds from Alex Jones. -- And which isn't easily matched by my finding a piece on Techdirt repeating utterly unfounded "conspiracy" of Trump-Russia collusion. Comparatively, Jones is WELL within both visible evidence and likelihoods. It's ONLY that you neo-liberal partisan and netwits want to believe.


    It's also part of the reason why we ran that content moderation game at a conference a few months ago, in which no one could fully agree on what to do about the content examples we presented (for every single one there were at least some people who argued for keeping the content up or taking it down).

    Yeah, as I predicted: irresolute weenies.

    Look. What's acceptable is KNOWN to prosecutors and when in a court, else there'd be vastly more cases brought. Doesn't take any real acumen to study cases and state a simple policy aligned with common law. -- Dang near every web-site EXCEPT Techdirt has such stated clearly, AND no other web-site that I know of resorts to the hidden cheating of claiming that "the community" with a "voting system" is responsible. YOU won't even state the numbers of alleged clicks, or whether an Administrator has final approval. -- Though we all know one does, as proven by the sudden change on MY comments in August-September. -- The way YOU are doing "moderation" isn't acceptable under common law.

    • Christenson, 21 Aug 2018 @ 4:06pm

      Re: YET you tacitly state that Google, Facebook, Twitter colluded

      Useless name-calling crap, flagged. Techdirt moderation is *excellent*.

      I think Mike is pointing out that at scale, inconsistency is *inevitable*.

    • Stephen T. Stone (profile), 21 Aug 2018 @ 5:01pm

      Anyone, state and link to something out-of-bounds from Alex Jones.

      I absolutely refuse to give Alex Jones traffic via a link, but his whole “the Sandy Hook shooting was a false flag operation with crisis actors playing the roles of dead kids and grieving parents” schtick seems pretty fucking out-of-bounds to me.

      utterly unfounded "conspiracy" of Trump-Russia collusion

      More evidence exists to corroborate that alleged collusion than will ever exist to corroborate Jones’s claims about the Sandy Hook massacre.

      Jones is WELL within both visible evidence and likelihoods

      What specific evidence proves true, beyond a reasonable doubt, Jones’s claim that the Sandy Hook massacre was a false flag operation carried out by crisis actors?

      What's acceptable is KNOWN to prosecutors and when in a court

      “Legally acceptable” is not always the same thing as “morally acceptable”. To wit: Alex Jones’s claims about the Sandy Hook shooting. (And the “legally acceptable” status of those claims is in question.)

      no other web-site that I know of resorts to the hidden cheating of claiming that "the community" with a "voting system" is responsible

      Reddit immediately and prominently comes to mind.

      YOU won't even state the numbers of alleged clicks, or whether an Administrator has final approval.

      Why should they?

      The way YOU are doing "moderation" isn't acceptable under common law.

      If you really believed Techdirt’s moderation is illegal, you would have already filed a lawsuit over it.

    • Anonymous Coward, 21 Aug 2018 @ 5:06pm

      Weren’t you people just saying collusion’s not a crime?

    • Anonymous Coward, 21 Aug 2018 @ 5:08pm

      Re: YET you tacitly state that Google, Facebook, Twitter colluded

      Sue. Do it you pussy. I want to watch you get literally laughed out of court.

    • Anonymous Coward, 21 Aug 2018 @ 5:12pm

      Re: YET you tacitly state that Google, Facebook, Twitter colluded

      The podcast is 71 minutes.

      So...you didn't listen to it.

      Is this a corollary of your inability to read?

    • Anonymous Coward, 21 Aug 2018 @ 5:13pm

      Re: YET you tacitly state that Google, Facebook, Twitter colluded

      I'll assume you mean common law for the US, as this is a US based site and is thus under US law.

      Common law is derived from judicial precedent. Given this knowledge... You're contradicting your own argument, as case law regarding comments often brings up CDA 230 (a statutory law).

      CDA 230 does not require a voting system. Logically, this also means it does not require the site to say how the "voting" (in this case by means of flagging) system works.

    • This comment has been flagged by the community.
      Sayonara Felicia-San (profile), 21 Aug 2018 @ 8:58pm

      Re: YET you tacitly state that Google, Facebook, Twitter colluded

      Yes, that is exactly what happened. This is verbatim quote from a noted researcher and investigative journalist, me:

      Newly leaked confidential Media Matters / Soros policy memos reveal that the recent de-platforming of Alex Jones and others are just a small part of a larger effort to destroy free speech and replace it with a bastardized corporate controlled shit show:

      "In the next four years, Media Matters will continue its core mission of disarming right-wing misinformation...


      "....Internet and social media platforms, like Google and Facebook, will no longer uncritically and without consequence host and enrich fake news sites and propagandists..."


      https://www.scribd.com/document/337535680/Full-David-Brock-Confidential-Memo-On-Fighting-Trump#from_embed

      • Stephen T. Stone (profile), 21 Aug 2018 @ 9:19pm

        a noted researcher and investigative journalist, me

        Oh good, now I know I can add you to the list of regular trolls.

        • This comment has been flagged by the community.
          Sayonara Felicia-San (profile), 21 Aug 2018 @ 10:05pm

          Re:

          Please do, it will save me the chore of dealing with mouth breathers such as yourself with your tiresome array of cookie cutter retorts.

          • Anonymous Coward, 22 Aug 2018 @ 1:54am

            Re: Re:

            Yeah we are all kinda tired of watching Stephen hand you your ass every single thread.

          • Anonymous Coward, 22 Aug 2018 @ 9:36am

            Re: Re:

            Complains about cookie-cutter retorts.

            Uses Soros bogeyman without any sense of irony.

      • PaulT (profile), 22 Aug 2018 @ 6:56am

        Re: Re: YET you tacitly state that Google, Facebook, Twitter colluded

        "a noted researcher and investigative journalist, me:"

        Lol. A researcher who uses themselves as the primary source of information is always going to be a very poor one.

        "Media Matters "

        Huh, I've never seen those on the usual lists of boogeymen you people use. Did they uncover something embarrassing about someone recently?

    • PaulT (profile), 22 Aug 2018 @ 6:54am

      Re:

      "ightly to "de-platform" Alex Jones."

      He has a platform all of his own - Infowars (among others). He does not have any inherent right to use anybody else's space.

      "Anyone, state and link to something out-of-bounds from Alex Jones"

      As mentioned before, the Sandy Hook false flag theory was way, way out of bounds, and that one has led to the parents of victims being harassed in real life. Popular platforms do not want or need this kind of toxic lunatic infecting them, and they do not have to accept him if they don't want.

      "no other web-site that I know of resorts to the hidden cheating of claiming that "the community" with a "voting system" is responsible"

      Off the top of my head: Slashdot, Ars Technica and Reddit all do this, although there are certainly others. However, I've never seen anybody so consistently moan and bring up stupid conspiracy theories about those systems; it's only here I see such a thing.

      You definitely need to brush up on your reading. Maybe try something outside of right-wing nutjob sites known to lie to their readership? You might accidentally catch a glimpse of the reality the rest of us live in.

      Although, ironically, such sites never seem to have a problem outright deleting comments they disagree with, so this site is definitely better than the ones you've admitted to reading even if your dumb theory was close to reality.

  • Christenson, 21 Aug 2018 @ 4:11pm

    Unicorns...or not

    I stated earlier that the perfect solution may be a unicorn.

    To my mind, we need more competition...so Twitter and Facebook don't have quite as much influence. That way Jack and Mark can be as megalomaniac as they like on their personal platforms.

    To get the diversity competition brings, twitter and Facebook (hell, craigslist too??) should not have any copyright in the user-generated content. Maybe some of the other barriers to entry can also be removed.

    • Thad (profile), 21 Aug 2018 @ 4:45pm

      Re: Unicorns...or not

      I think small communities inherently work better. Only by being a member of a community can a moderator understand context.

      Remember the story a few months back about Twitter suspending That Anonymous Coward for using an offensive term in reference to himself? A moderator who knew him would have understood the context of his comment and would not have punished him for it.

      Even in small communities, there's going to be controversy over moderation action, or a lack thereof. No community is ever going to agree 100% on everything, unless it's a "community" of just one person. There's no perfect system. But community moderation is a better system than moderation by faceless employees of some megacorporation.

      • Mike Masnick (profile), 21 Aug 2018 @ 4:53pm

        Re: Re: Unicorns...or not

        I think small communities inherently work better. Only by being a member of a community can a moderator understand context.

        I agree with this in general... but, on the flip side, I also see the massive benefits of enabling large communities to communicate with one another. The number of people I've met via Twitter has been incredibly useful. So I struggle with how to put those two issues together.

        I know that Mastodon is one attempt to do so, where you could have multiple smaller communities that "federate" and allow inter-instance communication, but that has some other issues as well.

        • Stephen T. Stone (profile), 21 Aug 2018 @ 5:02pm

          And those issues are not lost on both developers and users of Mastodon/the Fediverse, given how some Masto instances are forking the software and adding features they believe will improve it.

        • Thad (profile), 21 Aug 2018 @ 5:04pm

          Re: Re: Re: Unicorns...or not

          Yeah, there's no perfect answer to that dilemma, but Mastodon's method might be the best one.

      • Anonymous Coward, 22 Aug 2018 @ 7:09am

        Re: Re: Unicorns...or not

        Large, global communities come into their own when dealing with natural or man made disasters. Look at how people round the world have contributed indirectly but significantly to disaster relief by watching #tags for requests for help that they could provide, or coordinating efforts to update maps from satellite images.

        Throwing a request for help into the Twitter ocean can get a response from a combination of people to solve the problem, or give the answer, far faster than any bureaucracy can route it through all its chains of command.

        It is the ability for people to retweet that makes Twitter a powerful tool for both good and evil. Because of this I agree with Mike: give users the tools to tailor what they see, and the APIs, so that a group of people can route through a common filter if that is what they desire.

        • Thad (profile), 22 Aug 2018 @ 12:11pm

          Re: Re: Re: Unicorns...or not

          Twitter's great for communicating short, simple, informational posts.

          It's godawful for saying anything complex or nuanced.

          • Anonymous Coward, 22 Aug 2018 @ 1:12pm

            Re: Re: Re: Re: Unicorns...or not

            A lot of people, in a lot of circumstances use Twitter as a means of quick questions and responses, along with using it to notify and provide links to longer posts on other sites.

            The way Twitter works makes it a good tool for top level notification, responses, and providing an overview of what is happening.

      • John Smith, 22 Aug 2018 @ 11:15am

        Re: Re: Unicorns...or not

        From 1994-1997, AOL had a near-monopoly on a large chunk of internet speech, and exercised that power with an iron fist. You couldn't even send e-mail outside of AOL without it taking hours or days. It was like Studio 54 with the velvet rope on steroids.

        The problem always fixes itself when hungry competition exploits the censorship. Slowly, the web caught up to AOL, and by 2002, AOL was irrelevant. Best thing to do here would be to short the stocks of the censors as they will eventually crash as twitter did from the 50s to the teens.

        It's not backlash but irrelevance that harms the censors. The discussions become bland, and people seek the truth. In one controversial self-help area, AOL censored everyone, as did Prodigy, and everyone landed on USENET, their ideas caught fire, and literally took over the world.

        USENET still exists, btw. If people wanted free speech they'd be using that. YouTube is actually quite permissive, much more so than Twitter or Facebook, which is why I believe Google will ultimately win the FANG battle.

    • Anonymous Coward, 21 Aug 2018 @ 5:20pm

      Re: Unicorns...or not

      Competition may improve things, yet this brings a major problem. Users don't want to use a platform when the people they want to communicate with are not also there.

      This is likely why the current major social media platforms are doing well in spite of their flaws.

      • Christenson, 22 Aug 2018 @ 8:03am

        Re: Re: Unicorns...or not

        And this is why I claim copyright for "social media" platforms is the thing that needs to go.

        There's a twitter ocean out there, and the only barrier to sharing, say, the "techdirt mastodon feed" (which would also let you search the ocean, because twitter has no legal barriers to just that) would be whether Mike Masnick felt like getting it programmed and provisioned.

        • John Smith, 22 Aug 2018 @ 10:30pm

          Re: Re: Re: Unicorns...or not

          Twitter definitely shadowbans conservatives, to where they won't show up in searches or trends, and only in the timeline of their followers (who they won't get through searches or trends), or those to whom they reply directly.

          Eventually another service will siphon their audience by allowing true free speech. Google does this pretty much with YouTube, perhaps because they know if they chase people off, while they can afford to lose the revenue, they can NOT afford to have a competitor grow strong enough to challenge them.

          Google also protects the intellectual property of its creators so well that it's the only place one can count on not being pirated, though "copycat" videos are more of an issue (but that's not a copyright law issue).

          Amazon has hosted files for major piracy sites and destroyed the market for e-books with their "royalty" structure that pays 70 percent if you charge between $2.99-9.99 a copy. This is almost price-fixing, and it forces authors to conform to a vertical constraint which is generally not tolerated, like with what broke up the movie studios and theaters. I'd rather put a book on video on YouTube for people to watch as I scroll through the pages than sell it on Amazon. Another thing Google does is eliminate advertising conflicts of interest. Videos on the law are not always sponsored by law firms, for example, so there's no holding back on the content the way it would be if free material is used to market premium material.

          To divert to copyright for a moment, I agree that middlemen stink, that corporations exploit artists, and that the audience has a right to demand free samples, but I do not agree that anyone should use any of the above (or lack thereof) as a rationale to justify piracy or weakening protection, unless you want to just put everything in the public domain and let the best techie win.

          Content moderation is a nonstarter. Every time there has been censorship someone has risen up from it. Even Twitter had a no-holds-barred approach when it was building its audience, and something like Gab could easily catch fire.

          I've been tempted to pitch a defamation-free search engine that also does not allow links to infringing material even if the engine is immune, as an alternative to what we have now. How about an opt-in search engine, or a link-based portal that is constructed by humans, creating more jobs and spreading the wealth?

          The internet is too decentralized for censorship to be an issue. Too many ways around it, even if they seem inconvenient. Anyone who wants an audience will certainly find a way to build one, and any platform that wants money will find a way to accommodate them. The system really does work. This website is a good example of that even if the guy who runs it uses slanted language, something frowned upon in journalism.

          • Anonymous Coward, 23 Aug 2018 @ 1:56am

            Re: Re: Re: Re: Unicorns...or not

            I agree that middlemen stink, that corporations exploit artists, and that the audience has a right to demand free samples

            That's a first.

            but I do not agree that anyone should use any of the above (or lack thereof) as a rationale to justify piracy or weakening protection

            Sure, because the alternative of throwing more money at the corporations you claim to loathe so much, as well as letting invasive software ruin our machines as a penalty for purchasing legal products like you wanted, has clearly proven to be the constructive solution to solving the issues that you purportedly hate.

            Oh, wait, no it doesn't. It's just far easier for you to demand a pound of flesh from the rest of the planet because your corporate masters ripped you off. And after you sucked their cocks and everything! My heart bleeds, truly.

  • Anonymous Coward, 21 Aug 2018 @ 4:19pm

    Er...

    The most telling part of the podcast is the comparison between two situations (...) As the podcast does a nice job showing, these are basically impossible situations

    Your two examples, at least as stated, seem fairly possible to resolve. In both instances I'd say the newsworthy "verdict" is quite accurate. The only distinction seems to be the press losing their minds (my words, not yours and, yes, hyperbolic) over the second example, prompting a change in response to their pressure. That is my take on that, at least with the information you provided.

    • Mike Masnick (profile), 21 Aug 2018 @ 4:54pm

      Re: Er...

      Your two examples, at least as stated, seem fairly possible to resolve.

      Now resolve a million of those a day, with 5 seconds to decide. And don't make any mistakes or you get slammed in the press. Or don't make any that upset Republicans, or Democrats, or minority groups, or majority groups. Good luck.
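      To put rough numbers on that, here is a back-of-envelope sketch. Only the one-million-a-day and five-second figures come from this discussion; the longer review times and the eight-hour shift are illustrative assumptions, and they ignore breaks, appeals, multiple languages, and the psychological toll on reviewers.

          # Back-of-envelope sketch of moderation staffing. Only the 1,000,000/day
          # and 5-second figures come from this thread; everything else is assumed.
          REPORTS_PER_DAY = 1_000_000
          SECONDS_PER_SHIFT = 8 * 60 * 60  # one reviewer working one 8-hour shift

          for seconds_per_item in (5, 30, 120):  # today's pace vs. more careful review
              shifts = REPORTS_PER_DAY * seconds_per_item / SECONDS_PER_SHIFT
              print(f"{seconds_per_item:>3}s per item -> ~{shifts:,.0f} reviewer-shifts per day")
          # 5s -> ~174 shifts; 30s -> ~1,042; 120s -> ~4,167

      Even at today's pace that is a few hundred full-time reviewers for this one slice of reports; giving each item two minutes of genuine, context-aware consideration pushes it into the thousands.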

      • the_wanderer, 21 Aug 2018 @ 5:54pm

        Re: Re: Er...

        Why only 5 seconds to decide?

        • Stephen T. Stone (profile), 21 Aug 2018 @ 5:58pm

          Because taking any longer would leave many other reports sitting in the inbox. Inefficiency gets people fired, after all.

          • the_wanderer, 21 Aug 2018 @ 9:51pm

            Re:

            Allocate more workers so each one can take longer to decide.

            • That One Guy (profile), 22 Aug 2018 @ 4:15am

              Re: Re:

              Two questions:

              1) How much time (on average) would be 'enough' to fairly judge a particular piece, consider the history of the poster for stuff like implied sarcasm/parody vs seriousness, consider the context of the piece within that account and in general, decide how 'newsworthy' it might be, and weigh other factors that could result in an otherwise 'obvious' violation being allowed?

              2) Assuming doing the above one million times a day, how many people would you estimate would be needed to properly vet said content in a timely manner?

        • Mike Masnick (profile), 22 Aug 2018 @ 12:56am

          Re: Re: Re: Er...

          Why only 5 seconds to decide?

          Listen to the podcast... That's all the time that people have to review stuff because there are so many pieces of reported information. In short: the content keeps flowing and flowing. And, no, the answer isn't just "hire more people." They're doing that. But the content and report clicks are coming faster.

  • the_wanderer, 21 Aug 2018 @ 4:30pm

    In both those cases that content should have stayed up. Rules against "gore" are fucking retarded, and "think of the children" has been dismantled thoroughly and at length long ago (yet still keeps being brought up and paraded about by closet Puritans).

  • Christenson, 21 Aug 2018 @ 4:31pm

    News Flash: Facebook creates "trustworthiness" score

    https://www.usatoday.com/story/tech/news/2018/08/21/facebook-trust-reputation-score/1052839002/

    That seems like a good force multiplier for Facebook moderation.

  • Hairy Drumroll (profile), 21 Aug 2018 @ 5:24pm

    Children, not boys

    The line:

    "you couldn't say mean things about white men, but could about black boys"

    should read

    "you couldn't say mean things about white men, but could about black children".

    The linked article talks about children, not boys. You could never say mean things about black boys on Facebook (two protected descriptors), although the identifier of "children" wasn't protected.

    And I think that I recall that in the Radiolab story, it was mentioned that this rule was changed, as it should have been, so that you can no longer say mean things about black children.

    • Anonymous Coward, 22 Aug 2018 @ 11:54am

      Re: Children, not boys

      +1 on this. You need to edit the article because the mistake undermines your point.

  • Anonymous Coward, 21 Aug 2018 @ 6:07pm

    What needs to happen...

    ...is that, outside of statutory infractions, the users should have the ability to block/ignore any poster who "triggers" them. If seeing/reading/hearing something offends you, just never go back. You have no reason to needlessly engage anyone. Don't like their opinion? Just block/ignore them.

    Amazing that people need to keep going back to see what else can offend them just so they can biatch about it.

    No one NEEDS social media. If all it does is upset you then you're doing it wrong

    • Toom1275 (profile), 21 Aug 2018 @ 7:24pm

      Re: What needs to happen...

      Yeah, and when punks keep filling your mail slot with dogshit, you can just as easily ignore that, too!

      • Anonymous Coward, 22 Aug 2018 @ 9:16am

        Re: Re: What needs to happen...

        You obviously don't grasp the concept of the whole internet thing. You don't have to look at anything you don't want to look at.

        • Toom1275 (profile), 22 Aug 2018 @ 10:00am

          Re: Re: Re: What needs to happen...

          Says the projecting moron who can't grasp that targeted harassment often follows victims outside of the main source, to their private accounts, emails, phone numbers, homes, etc.

  • This comment has been flagged by the community.
    John Smith, 21 Aug 2018 @ 6:22pm

    Straw-man argument: it's impossible for BIG INTERNET to monitor content by AUTOMATED means. Humans could do the job easily, but that would require silly things like respecting people's rights, which in turn would lead to spreading the wealth, full employment (as robots do our drudge work), and intelligent websites run by actual humans rather than bots.

    The modern "information superhighway" is built on a house of cards created by "tolltakers" who declared themselves essential to its existence (it's not), and set up toll bootsh which enabled them o steal hundreds of billions of dollars.

    As they say here about copyright, if a business model is unsustainable, it deserves to perish. That goes for Big Internet (and the monopolistic practice of law) as much as it does for copyright.

    • Anonymous Coward, 21 Aug 2018 @ 6:29pm

      Re:

      This is literally a discussion about humans being unable to do the job easily.

  • This comment has been flagged by the community.
    Another Reasonable Question, 21 Aug 2018 @ 6:58pm

    HEY, why weren't "platforms" controlling speech from the start?

    Answer: because they deliberately kept away from controversy until they were big and influential enough that they believe they can make their move to gain total power.

    However, legally, that puts them in the wrong through deliberate non-feasance, breaking the explicit deal, and accepting without least caution previously so that now is a huge and apparently causeless change.

    Alex Jones is indeed a good example, has been ranting about Sandy Hook for 5 years. SO WHY NOW are these globalist corporations cracking down?

    In every way, these corporations have proved themselves lurking evils, just waiting for right time to attack. -- What else are they planning?

    • Anonymous Coward, 21 Aug 2018 @ 7:07pm

      Re: HEY, why weren't "platforms" controlling speech from the start?

      They’re planning on using mind control waves emitted from power outlets. So you better get rid of all electrical devices and cover up all the outlets at your apartment.

    • Anonymous Coward, 21 Aug 2018 @ 7:09pm

      Re: HEY, why weren't "platforms" controlling speech from the start?

      “deliberate non-feasance, breaking the explicit deal, and accepting without least caution previously so that now is a huge and apparently causeless change.”

      Do...do you smell toast?

    • Anonymous Coward, 21 Aug 2018 @ 7:26pm

      Re: HEY, why weren't "platforms" controlling speech from the start?

      Your latest conspiracy makes even less sense than the zombie hordes.

      • This comment has been flagged by the community.
        Sayonara Felicia-San (profile), 21 Aug 2018 @ 9:10pm

        Re: Re: HEY, why weren't "platforms" controlling speech from the start?

        That sounds like a logical conclusion to me, and would be the basis for a solid lawsuit by Alex Jones.

        The fact that they don't enforce their terms equally AND the fact that they did not enforce their terms AT ALL for YEARS means that their current de-platforming event could be illegal.

        • Stephen T. Stone (profile), 21 Aug 2018 @ 9:34pm

          Show me the law that Google, Facebook, Twitter, etc. broke by kicking Alex Jones off of their respective platforms.

        • Anonymous Coward, 21 Aug 2018 @ 10:18pm

          Re: Re: Re: HEY, why weren't "platforms" controlling speech from the start?

          Sue on his behalf. Do it you pussy.

        • Anonymous Coward, 22 Aug 2018 @ 9:37am

          Re: Re: Re: HEY, why weren't "platforms" controlling speech from the start?

          You have no idea what you're talking about.

          It's embarrassing.

    • Stephen T. Stone (profile), 21 Aug 2018 @ 9:35pm

      globalist

      We get it, you’re an anti-Semite.

  • This comment has been flagged by the community.
    Sayonara Felicia-San (profile), 21 Aug 2018 @ 8:56pm

    I agree. But let's talk about what really happened

    Great article Mike! However, that theoretical third party you speak of, well, that's only going to make it easier to hijack and control speech.

    Now I'm not jacking your topic, no pun intended, but let's talk about the big obese elephant in the room first, and that's the recent coordinated de-platforming of Alex Jones.

    Newly leaked confidential Media Matters / Soros policy memos reveal that the recent de-platforming of Alex Jones and others are just a small part of a larger effort to destroy free speech and replace it with a bastardized corporate controlled shit show:

    "In the next four years, Media Matters will continue its core mission of disarming right-wing misinformation...


    "....Internet and social media platforms, like Google and Facebook, will no longer uncritically and without consequence host and enrich fake news sites and propagandists..."


    https://www.scribd.com/document/337535680/Full-David-Brock-Confidential-Memo-On-Fighting-Trump#from_embed

    So basically we already have a conspiracy led by a corrupt un-elected billionaire to control and direct speech in this country through coordinated effort of several of the largest internet companies.


    My Point?!?!?

    The amount of pressure, effort, internal apparatchiks, and political gymnastics involved in pulling this off is again nothing short of brilliant. Again, I have nothing but admiration for George Soros and his "open" foundation organizations ability to subvert and manipulate society to his own twisted will.

    HOWEVER, what you propose is basically, consolidating and centralizing a system which could be a shared objective resource which these companies would defer to in the future when it comes to content moderation. I admire you Mike, and your idealistic disregard of reality. Unfortunately, intentional or not, all you are going to achieve is to make it easier to destroy practical access to speech and information.

    Sorry to rain on your gay parade, but history shows that centralization of power never increases freedom.

    • Stephen T. Stone (profile), 21 Aug 2018 @ 9:33pm

      we already have a conspiracy led by a corrupt un-elected billionaire to control and direct speech in this country through coordinated effort of several of the largest internet companies

      Please provide proof of your claim and the necessary citations required for verifying your evidence. FYI: That Scribd link is not proof; nothing in that document shows anything about any sort of plan to censor or outright control the speech of others. (Saying “we want to fight disinformation” is not the same thing as saying “we want to shut up Fox News and Donald Trump forever”.)

      what you propose is basically, consolidating and centralizing a system which could be a shared objective resource which these companies would defer to in the future when it comes to content moderation

      How is the idea to “move the controls outwards to the ends, allowing individuals and third parties to make their own calls” anything close to the idea of “consolidating and centralizing a system which […] these companies would defer to in the future when it comes to content moderation”?

      history shows that centralization of power never increases freedom

      …which is likely one reason why Mike advocated for the exact opposite of giving too much power to the social media companies.

      • This comment has been flagged by the community.
        Sayonara Felicia-San (profile), 21 Aug 2018 @ 10:03pm

        Re:

        "Please provide proof..."

        I provided proof. Your bizarre retort only fools the most stupid and ignorant.

        The so called "scribd link" is actually a link to a pdf file from Media Matters. Scribd is a document hosting company. Nice try though.

        You've lost all credibility by attacking 'the link' and there is no point in continuing any further discourse.

        • This comment has been flagged by the community.
          Anonymous Coward, 21 Aug 2018 @ 10:50pm

          Re: Re:

          Yes, everyone knows that Stephen lost any shred of credibility years ago. At least he is not throwing his “sex dolls” retorts at you, as he does with me.

          I like your writing, and hope you will continue. If you get ruffled by Stephen, you are only playing into his (slimy) hands.

          Continue to share your opinions, please. You are an interesting writer and seem to have skin thick enough to hang out here, such individuals are few and far between.

          Publius

          • Anonymous Coward, 22 Aug 2018 @ 1:58am

            Re: Re: Re:

            Sounds like a bromance made in heaven. One question though. Who’s wearing the Ayn Rand mask? And who’s wearing the Rand Paul mask?

            • cattress (profile), 22 Aug 2018 @ 8:28pm

              Re: Re: Re: Re:

              Hey hey now! We Libertarians may be weird, but not that kind of weird. I'm pretty sure Mike Masnick identifies as libertarian- or at least libertarian leaning.

              Those nutjobs don't represent anything remotely Rand-ian...

        • Stephen T. Stone (profile), 22 Aug 2018 @ 8:28am

          I provided proof.

          Where does that document say, explicitly and unambiguously, that Media Matters is trying to outright control or censor the speech of others?

          You've lost all credibility by attacking 'the link'

          I did not attack “the link” (or the website it leads to), I attacked the idea that the document sitting behind that link says what you claim it says. If you were confident enough in your claims to back them up without bullshitting your way into “credibility” (or offering an anti-Semitic conspiracy theory about George Soros), you could give me a straight answer to that question I just posed.

    • ktetch, 21 Aug 2018 @ 10:14pm

      Re: I agree. But let's talk about what really happened

      yeah, you're right, it's a massive conspiracy.

      We call it 'civilization'. It's a whole group of us that get together, and decide what's acceptable and what's not, based on social norms.

      quite 'surprisingly', a self-admitted fantasist, creating items he's stated under oath to be 'fictional' and 'invented for the purposes of entertainment', doesn't get the privileges of factual reporting. How shocking that this 'civilization' conspiracy has foiled his efforts to dump that skanky shit-peddler where his too-shit-for-spam-email pills deserve to be, nowhere.

      • This comment has been flagged by the community.
        Sayonara Felicia-San (profile), 21 Aug 2018 @ 10:20pm

        Re: Re: I agree. But let's talk about what really happened

        Perhaps google translate could help translate whatever it is you are trying to say?

    • Anonymous Coward, 21 Aug 2018 @ 10:16pm

      Soros! drink!

    • Mike Masnick (profile), 22 Aug 2018 @ 1:01am

      Re: I agree. But let's talk about what really happened

      Newly leaked confidential Media Matters / Soros policy memos reveal that the recent de-platforming of Alex Jones and others are just a small part of a larger effort to destroy free speech and replace it with a bastardized corporate controlled shit show:

      1. Why bring up Soros?

      2. Last I checked, Media Matters controls neither Google nor Facebook (and, is more or less considered a partisan joke).

      3. So I'm unclear on why a document that clearly states what one small partisan organization's goal is... somehow means that Google and Facebook are in on this plan.

      So basically we already have a conspiracy led by a corrupt un-elected billionaire to control and direct speech in this country through coordinated effort of several of the largest internet companies.

      No. You have a document put together by a small partisan operation expressing its own goals, which have exactly as much impact on the operation of large platforms as the demands of, say, Breitbart.

      HOWEVER, what you propose is basically, consolidating and centralizing a system which could be a shared objective resource which these companies would defer to in the future when it comes to content moderation.

      What? I proposed literally exactly the opposite. I have proposed pushing everything out to the ends in a more distributed system. I advocate for the end users getting control over all of their own data and content, so that the platforms don't control it. I've advocated for the platforms taking a hands off approach on content moderation and instead providing tools and APIs so that others can either provide their own filters or interfaces, or that end users can design their own.

      So, why would you claim I'm advocating for literally the exact opposite of what I advocate for, other than being a total and utter troll?

      history shows that centralization of power never increases freedom

      I agree. Which is why I've long advocated for more decentralization, entirely contrary to your claims.

  • oliver (profile), 21 Aug 2018 @ 10:42pm

    CDA 230 ?

    Hi Mike
    On the one hand you happily dismiss any lawsuit that threatens platforms by pointing to CDA230, otoh now you eschew the burden behind keeping the CDA230 loophole open, by being required to at least moderate user content?

    I smell a little bit of hypocrisy here.

    Cheers, Oliver

    • cpt kangarooski, 21 Aug 2018 @ 10:58pm

      Re: CDA 230 ?

      47 USC § 230 doesn’t require anyone to moderate anything. It merely protects providers and users in the event that they moderate. Indeed, if the safe harbor went away, all that would happen is that there would be no moderation in order to continue avoiding liability. It’s not only not a loophole, it’s the opposite of one.

      But what do you know, you’re almost certainly a troll.

    • Mike Masnick (profile), 22 Aug 2018 @ 1:02am

      Re: CDA 230 ?

      On the one hand you happily dismiss any lawsuit that threatens platforms by pointing to CDA230, otoh now you eschew the burden behind keeping the CDA230 loophole open, by being required to at least moderate user content?

      CDA 230 does not require moderation of content. So I'm not sure where the hypocrisy is. CDA 230 does enable moderation of content by saying a service provider is not liable for the moderation choices. But it does not "require" it.

      So... what hypocrisy are you talking about? My position is entirely consistent.

  • Anonymous Coward, 22 Aug 2018 @ 4:06am

    It's not impossible -- or even hard

    It's just beyond the capacity of ignorant newbies running operations like Facebook and Twitter. They lack both the experience to understand the problem and the humility to learn from those who are vastly superior to them.

    And overlaying all of that is their myopic, simplistic view that more is better. It's not. Twitter would be vastly more useful at 10% of its current size, a goal easily achieved by permanently blacklisting the people, domains, and networks responsible for abuse.

    This isn't difficult. It's a lot of hard work, but it's not difficult. It's much easier for snivelling coward Jack Dorsey to whine and whine than it is to actually roll up his sleeves and do what needs to be done.

    • That One Guy (profile), 22 Aug 2018 @ 4:26am

      If you say so

      Well damn, in that case sounds like there's a grand opportunity just waiting for some brilliant go-getter to jump on it, and as someone claiming that it's an easy thing to do sounds like you're just the kind of person to do it.

      As such I'm sure many will be waiting with eager anticipation for you to roll out the competing platform that will utterly demolish both Facebook and Twitter and show them how they should have been doing it.

    • Anonymous Coward, 22 Aug 2018 @ 5:49am

      Re: It's not impossible -- or even hard

      Twitter would be vastly more useful at 10% of its current size, a goal easily achieved by permanently blacklisting the people, domains, and networks responsible for abuse.

      They did. Then you pissed and moaned after Jared Taylor got his ass kicked off Twitter. There's no fucking pleasing you, Hamilton.

      • That One Guy (profile), 22 Aug 2018 @ 6:08am

        "Well MY standard says you're off, and since I run the place..."

        I don't think that's Hamilton, but whoever they are you do bring up a good point and one I wish I'd caught. They claim that the services could be vastly improved by simply giving the boot to those abusing it. Great, abusing the platform according to who? Maybe that's exactly what they're starting to do, and in that case would they still be for it?

  • Gary (profile), 22 Aug 2018 @ 6:35am

    Obvious

    I think the comments on this article really drive home the difficulty of moderating.
    On either side are absolutists. Any absolute position such as "You can't moderate anything because it's a violation of my natural rights" is obviously unworkable. The answer has to sit somewhere in the middle.
    The best comments always start with "It's easy, just..." because I know damn well none of those people actually have a working solution to this, or they would be shouting the praises of their site instead of sniping from their anonymous bunkers.
    Or "Mike is doing it all wrong..." while screaming about some ideology divorced from actual law. (Again, show me how your service works with absolute lack of moderation before you criticize the light hand of TD.)

    If there was an easy way this wouldn't generate so much heated debate.

    • Christenson, 22 Aug 2018 @ 8:25am

      Re: Obvious

      I think the correct answer is that consistent moderation at scale is simply impossible. There's too much content! We also have examples (usenet news) showing it is absolutely necessary.

      Not only that, but the problem is complex, so complex in fact that it finds itself having to constantly balance on two or more conflicting horns of the dilemma: newsworthy? too gory? too spammy? encourages fistfights? off topic? false? too un-funny? threats?

      I take Mike Masnick's position on this: decentralizing and diversifying the decisionmaking is best. The question is whether legal changes over CDA230 (such as not allowing copyright on user-generated content) make sense or not.

  • MonkeyFracasJr (profile), 22 Aug 2018 @ 10:18am

    "the job we're asking them to do"

    The job we're asking them to do is to be our conscience for us, to do our parenting for us, because we, especially in the US, have completely abdicated any personal responsibility.

    "We" want our content moderated because "we" cannot handle the idea of doing it for ourselves, it's too fccking hard. And if "we" did it ourselves we'd have no-one to put the blame on because it simply cannot be "my" fault.

    And that is just sad.

    If you don't like something, don't look at / or read it. If you don't feel it is appropriate for your children then take the time to BE A PARENT and curate what they have access to and take the time to EDUCATE THEM.

    And it's none of your fccking business how others do their parenting, as long as they are being parents.

    • Christenson, 22 Aug 2018 @ 1:51pm

      Re: "the job we're asking them to do"

      Actually, it’s not ourselves or our Children that are of primary concern here.

      It is more about a certain fringe of others that get taken in by the honey-tongued devils and do evil of various degrees. For example, the poor soul that believed in Pizzagate and is now rotting in prison. For example #gamergate. For example doxxing and harassing the Sandy Hook survivors.

      Or how about the Slenderman girl, the one that tried to murder her best friend?

      • Anonymous Coward, 22 Aug 2018 @ 2:49pm

        Re: Re: "the job we're asking them to do"

        >It is more about a certain fringe of others that get taken in by the honey-tongued devils and do evil of various degrees.

        That is the same reasoning that has led to the security theater following 9/11. In other words it is well into the realm of good intentions that pave the road to hell. Those on the fringe need a safety net, and are not helped by a clamp down on the rest of society to try and protect them.

        • Christenson, 22 Aug 2018 @ 9:14pm

          Re: Re: Re: "the job we're asking them to do"

          I think it is axiomatic that central direction of moderation is *the wrong* way to go. German anti-semitism laws from before WWII provide an excellent object lesson.

          The problem is, the fringe is all kinds of shades of gray, and what are we supposed to do about my acquaintance, otherwise functioning, who believes in this #QANON crap and wants to act on it?

          I claim that decentralization is a reasonable direction. No, it won't address the panic in many that they are losing their position in the hierarchy as normalcy is upended and the purity of their childhood is lost (what are all these different-looking people, why can't I beat them up, and how can two men get married???)

          I don't know how to address the panic...sorry.

          • Anonymous Coward, 23 Aug 2018 @ 2:23am

            Re: Re: Re: Re: "the job we're asking them to do"

            Decentralization works for some things, while centralization works for others. Pushing fringe and extremist groups into their own little enclaves is the best way of allowing them to become more extreme and weird in their views as you remove any moderating influences and increase their sense of isolation.

            Also a problem with federated systems is that they do not scale particularly well, which is one of the reasons that RSS, for example, is not as useful as it should be. A popular node can end up with excessive traffic, especially when compared to its direct user base who are paying for the resources. Also, it is more difficult to make connections in a federated system, as they lack the reach of a centralized system's untargeted open messages.

            A federated system has advantages when it comes to communications that should only reach a limited audience, such as family and smaller social groups.

            Therefore, in many ways federated and centralized are complementary, and have different strengths and weaknesses. I.e. Federated would be a good choice for an extended family to stay in touch, while a centralized system is better when a creator wants to announce new content, or inform fans of a delay etc. Federated is better for local politics and issues, but centralized for larger scale politics.

            what are we supposed to do about my acquaintance, otherwise functioning, who believes in this #QANON crap and wants to act on it?

            That should be: what are you and your friends and acquaintances going to do about it, as there is nothing I as a stranger can do about the problem. However, if you are looking for something to debunk that conspiracy theory, a centralized system is going to be more useful to you, as it gives the widest coverage of debunking efforts, and a better chance of finding someone with a counter argument that resonates with your acquaintance.

            • Christenson, 24 Aug 2018 @ 7:35am

              Re: Re: Re: Re: Re: "the job we're asking them to do"

              what are we supposed to do about my acquaintance, otherwise functioning, who believes in this #QANON crap and wants to act on it?

              Hmmm...seems I'm fresh out of ideas and having a *moral panic* (TM Techdirt!)

              That's right, twitter and facebook are guilty by association....just like in those ridiculous "material support of terrorism" lawsuits that keep getting dismissed.

              It's not like this is the first time in history that desperate, panicked people like my friend have reacted irrationally and cruelly!

  • Anonymous Cowherd, 23 Aug 2018 @ 3:10am

    Why must "the platforms" do a better job? We don't want the phone company listening in on our calls and bleeping out controversial topics. We don't want the post office to open our mail and toss out any letters that say things they don't approve of, or sent by people they don't approve of. Why is it so different on the internet?

    • Anonymous Coward, 23 Aug 2018 @ 3:20am

      Re:

      The driving force is that phone calls and letters are basically private, while the Internet is public, coupled to the fact that a few very noisy people have appointed themselves the keepers of public morals. Those people will not accept solutions where individuals take responsibility for what they see, because they can easily find things to be outraged about, and its visible existence is what annoys them, no matter how easy it would be for them to avoid it, or filter it out of their lives.

      Couple that to politicians, who do not accept that doing nothing is often the best solution at their level, and more paving slabs get laid on the road to hell.

  • This comment has been flagged by the community.
    Anderson Nascimento Nunes (profile), 23 Aug 2018 @ 4:33am

    Convert HTML to Feed

    What I do is convert html pages to atom feeds.

    Google Search? Facebook? Twitter? I convert all this and much more. Forums? Search pages on e-commerce sites? Yes, yes, yes!

    This way I can use my feed reader to filter everything with a blacklist of regular expressions, then send the updates as e-mail messages. Doing this I don't waste time with content I already know I don't want to see, as it is all filtered automatically. Plus some keywords are highlighted to make it easier to eyeball some categories of content.

    I can easily check 1000+ pages multiple times per day and not see a single advertisement.

    These companies don't need to know what I like and what I don't like. I do my own computing and don't need their black box algorithms.
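    For anyone curious what that kind of do-it-yourself filtering looks like, here is a minimal sketch of the same idea. The feed URL and blacklist patterns below are placeholders, it assumes the third-party feedparser library, and it skips the commenter's HTML-to-Atom conversion and e-mail delivery steps entirely.

        # Rough sketch of reader-side filtering: fetch a feed, drop entries that
        # match a personal regex blacklist, and show whatever survives.
        # feedparser is a third-party library (pip install feedparser); the URL
        # and patterns are placeholders, not the commenter's actual setup.
        import re
        import feedparser

        BLACKLIST = [r"\bsponsored\b", r"\bgiveaway\b"]  # content I never want to see
        patterns = [re.compile(p, re.I) for p in BLACKLIST]

        feed = feedparser.parse("https://www.techdirt.com/techdirt_rss.xml")
        for entry in feed.entries:
            text = entry.get("title", "") + " " + entry.get("summary", "")
            if any(p.search(text) for p in patterns):
                continue  # filtered out automatically, never seen
            print(entry.get("title", ""), "->", entry.get("link", ""))

    The same blacklist idea scales to however many feeds you care to follow, and the filtering happens entirely on the reader's side, with no platform in the loop.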
