We Shouldn't Want Internet Giants Deciding Who To Silence; But They Should Let Users Decide Who To Hear

from the rethinking-moderation dept

A few weeks back I wrote a big piece on internet platforms and their controversial content moderation efforts. As I've pointed out more recently, part of the reason what they do is so bad is that it is effectively impossible to do this well at the scale they operate at. That is, even if they can reach 99% accuracy, given the amount of content on these sites, they're still going to take down a ton of legitimate stuff while leaving up an awful lot of awful stuff. This doesn't mean they shouldn't do anything -- but my own proposal is for them to shift the way they think about this issue entirely, and move the moderation out from the center to the ends. Let third parties create their own filters/rules and allow anyone else to not just use them, but to adjust, modify and reshare them as well. Then let users not just "opt in" to the kind of experience they want, but tweak it further to their own liking as well.
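
To make that concrete, here's a rough sketch in Python of what a shareable, user-tweakable filter could look like -- the names and structure are purely illustrative, not any platform's actual API. A third party publishes a bundle of rules; an individual copies it, drops the rules they don't want, and applies the result to their own view of the network:

    from dataclasses import dataclass, field, replace
    from typing import Callable, List

    @dataclass
    class Post:
        # Stand-in for a real post; actual platforms expose far richer metadata.
        author: str
        text: str

    @dataclass
    class Rule:
        # A named predicate: True means "hide this post for me."
        name: str
        hides: Callable[[Post], bool]

    @dataclass
    class Filter:
        # A bundle of rules that a third party can publish and anyone can reuse.
        name: str
        rules: List[Rule] = field(default_factory=list)

        def hides(self, post: Post) -> bool:
            return any(rule.hides(post) for rule in self.rules)

        def without(self, rule_name: str) -> "Filter":
            """The 'tweak' step: a personal variant with one rule dropped."""
            kept = [r for r in self.rules if r.name != rule_name]
            return replace(self, name=f"{self.name} (modified)", rules=kept)

    # A third party publishes a filter...
    strict = Filter("no-spam-no-shouting", [
        Rule("spam", lambda p: "buy now" in p.text.lower()),
        Rule("all-caps-rants", lambda p: p.text.isupper()),
    ])

    # ...and a user opts in, but relaxes one rule to taste before resharing.
    mine = strict.without("all-caps-rants")

    timeline = [Post("alice", "Interesting article"), Post("bob", "BUY NOW!!! buy now")]
    print([p.author for p in timeline if not mine.hides(p)])  # ['alice']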

I've seen some pushback on this idea, but it seems much more viable than the alternatives of "do nothing at all" (which just leads to platforms overwhelmed with spam, trolls and hatred) or continuing to focus on a centralized moderation system. There have been a number of articles recently that have done a nice job highlighting the problems of having Silicon Valley companies decide who shall speak and who shall not. EFF's Jillian York highlights the problems that occur when there's no accountability, even if platforms have every legal right to kick people off:

This is one major reason why, historically, so many have fought for freedom of expression: The idea that a given authority could ever be neutral or fair in creating or applying rules about speech is one that gives many pause. In Europe’s democracies, we nevertheless accept that there will be some restrictions – acceptable within the framework of the Universal Declaration of Human Rights and intended to prevent real harm. And, most importantly, decided upon by democratically-elected representatives.

When it comes to private censorship, of course, that isn’t the case. Policies are created by executives, sometimes with additional consultations with external experts, but are nonetheless top-down and authoritarian in nature. And so, when Twitter makes a decision about what constitutes ‘healthy public conversation’ or a ‘bad-faith actor,’ we should question those definitions and how those decisions are made, even when we agree with them.

We should push them to be transparent about how their policies are created, how they moderate content using machines or human labor, and we should ensure that users have a path for recourse when decisions are made that contradict a given set of rules (a problem which happens all too often).

Jillian's colleague at EFF, David Greene, also had an excellent piece in the Washington Post about how having just a few giant companies decide these things should worry us:

We should be extremely careful before rushing to embrace an Internet that is moderated by a few private companies by default, one where the platforms that control so much public discourse routinely remove posts and deactivate accounts because of objections to the content. Once systems like content moderation become the norm, those in power inevitably exploit them. Time and time again, platforms have capitulated to censorship demands from authoritarian regimes, and powerful actors have manipulated flagging procedures to effectively censor their political opponents. Given this practical reality, and the sad history of political censorship in the United States, let's not cheer one decision that we might agree with.

Even beyond content moderation's vulnerability to censorship, the moderating process itself, whether undertaken by humans or, increasingly, by software using machine-learning algorithms, is extremely difficult. Awful mistakes are commonplace, and rules are applied unevenly. Company executives regularly reshape their rules in response to governmental and other pressure, and they do so without significant input from the public. Ambiguous "community standards" result in the removal of some content deemed to have violated the rules, while content that seems equally offensive is okay.

Vera Eidelman of the ACLU similarly warns that the pressures increasingly being put on tech companies will inevitably lead to the silencing of marginalized voices:

Given the enormous amount of speech uploaded every day to Facebook’s platform, attempting to filter out “bad” speech is a nearly impossible task. The use of algorithms and other artificial intelligence to try to deal with the volume is only likely to exacerbate the problem.

If Facebook gives itself broader censorship powers, it will inevitably take down important speech and silence already marginalized voices. We’ve seen this before. Last year, when activists of color shared their experiences of police violence, Facebook chose to shut down their livestreams. The ACLU’s own Facebook post about censorship of a public statue was also inappropriately censored by Facebook.

Facebook has shown us that it does a bad job of moderating “hateful” or “offensive” posts, even when its intentions are good. Facebook will do no better at serving as the arbiter of truth versus misinformation, and we should remain wary of its power to deprioritize certain posts or to moderate content in other ways that fall short of censorship.

Finally, over at Rolling Stone, Matt Taibbi makes a similar point. What starts out as kicking off people we generally all agree are awful leads to places we probably won't like in the end:

Now that we’ve opened the door for ordinary users, politicians, ex-security-state creeps, foreign governments and companies like Raytheon to influence the removal of content, the future is obvious: an endless merry-go-round of political tattling, in which each tribe will push for bans of political enemies.

In about 10 minutes, someone will start arguing that Alex Jones is not so different from, say, millennial conservative Ben Shapiro, and demand his removal. That will be followed by calls from furious conservatives to wipe out the Torch Network or Anti-Fascist News, with Jacobin on the way.

We’ve already seen Facebook overcompensate when faced with complaints of anti-conservative bias. Assuming this continues, “community standards” will turn into a ceaseless parody of Cold War spy trades: one of ours for one of yours.

This is the nuance people are missing. It’s not that people like Jones shouldn’t be punished; it’s the means of punishment that has changed radically.

This is why I think it's so important that the framework be shifted. People have long pointed out that "just because you have free speech doesn't mean I need to listen," but the way social media networks are constructed, it's not always so easy not to listen. The very limited "block / mute" toolset that Twitter provides is not nearly enough. The more platforms can push the moderation decision making out to the ends of the network, including by allowing third parties to create different "views" into those networks, the better off we are: the internet giants are no longer the ones making these decisions. In fact, this approach increases "competition" on the moderation side itself, while also increasing the transparency with which such systems operate.

So, really, it's time we stopped focusing on who the platforms should silence and started giving end users more power to decide who they wish to hear.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: censorship, centralization, decentralization, filters, free speech, human rights, intermediary liability, platforms, silence, social media
Companies: facebook, google, twitter


Reader Comments


  • icon
    Gary (profile), 4 Sep 2018 @ 9:44am

    The easy fix

    The only easy fix is to shut off the interwebs. Repeal 230 and shut down all this user generated content. Fortunately, that isn't going to happen in the immediate future.
    Other than a complete shutdown, light handed moderation is necessary to keep things working. No Moderation hasn't been a workable answer since the "Greencard Lottery" advertisements went out over usenet.
    If anyone really wants an unmoderated experience as a matter of principle, they really don't understand what spam is.
    Mass blocking astroturfers, spammers, bots and nazis is fine, but it's hard to stop there.
    TD puts some of that control in the users, and seems to do an acceptable job balancing that. So thanks for the article Mike, and the forum to discuss it.

    link to this | view in chronology ]

    • identicon
      i have no name, 8 Sep 2018 @ 6:01am

      Re: The easy fix

      We just need to go back to the good old days when having an internet presence meant learning html, asp/php, and db management or being relegated to geocities.com/something/else/blue/green/19839389/mypage.html

      link to this | view in chronology ]

  • identicon
    Anonymous Coward, 4 Sep 2018 @ 9:45am

    That sounds like a 'good idea'.
    However, I think it's going to get pushback from the people at social media giants who enjoy having the power they have, AND from the people who want one (or a small handful of) place to lay all the blame/accountability. I suspect that those two groups make up a major portion of the people best able to influence the situation.

    link to this | view in chronology ]

  • This comment has been flagged by the community. Click here to show it
    identicon
    Tired Old Ruse, 4 Sep 2018 @ 9:54am

    Just your ongoing way to avoid responsibility, lying that

    it's "the community" with a "voting system".

    As evidenced here, that alleged "vote" always works the way you want to suppress criticism and you give no details such as whether an Administrator has final approval.

    And now, censor this and give all the evidence needed that Techdirt simply advocates CENSORING.

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 9:57am

      Re: Just your ongoing way to avoid responsibility, lying that

      TD advocates allowing the community to vote your garbage out of view. Maybe you need to repair your tinfoil hat. Your conspiracy theories are getting pretty wild and you seem unable to accept data from outside your own cranium.

      link to this | view in chronology ]

      • This comment has been flagged by the community. Click here to show it
        identicon
        Tired Old Ruse, 4 Sep 2018 @ 10:01am

        Re: Re: Just your ongoing way to avoid responsibility, lying that

        TD advocates allowing the community to vote your garbage out of view.

        That's what I've been saying, and you approve, just don't call it CENSORSHIP (with the word trick that it's not gov't).

        But disadvantage of viewpoints IS CENSORSHIP enough for a "free speech" site that claims to want the margins kept broad. FACT IS that you kids can't stand my mild-mannered dissent, but prattle about how virtuous you are.

        link to this | view in chronology ]

        • identicon
          Baron von Robber, 4 Sep 2018 @ 10:07am

          Re: Re: Re: Just your ongoing way to avoid responsibility, lying that

          Gee, I have no idea what "This comment has been flagged by the community. Click here to show it" means.

          link to this | view in chronology ]

          • identicon
            Anonymous Coward, 4 Sep 2018 @ 11:24am

            Re: Re: Re: Re: Just your ongoing way to avoid responsibility, lying that

            And a lot of readers must use that to actually view his comments because there are almost always replies to "flagged" comments

            link to this | view in chronology ]

          • identicon
            Anonymous Coward, 4 Sep 2018 @ 12:55pm

            Re: Re: Re: Re: Just your ongoing way to avoid responsibility, lying that

            The "click here" part doesn't actually work. It's not a link. I have to disable stylesheets to see the hidden comments.

            link to this | view in chronology ]

            • identicon
              Baron von Robber, 4 Sep 2018 @ 1:13pm

              Re: Re: Re: Re: Re: Just your ongoing way to avoid responsibility, lying that

              Works for me and I have scriptsafe and ABP running.

              link to this | view in chronology ]

        • identicon
          Anonymous Coward, 4 Sep 2018 @ 10:09am

          Re: Re: Re: Just your ongoing way to avoid responsibility, lying that

          Referring to the readers/posters as "kids", along with your myriad other insults, doesn't qualify as "mild-mannered".

          And again, community moderation as supported on this site is not censorship. Your posts are still visible to the world. They're merely hidden behind a "show" link. If we could vote you off the island then you'd have censorship.

          The government isn't the only entity that can block something and have it be called censorship. The community can do that, too. The difference is that when the government does it it's usually unconstitutional. When the community does it it's culling the herd and totally legal.

          link to this | view in chronology ]

        • identicon
          I.T. Guy, 4 Sep 2018 @ 10:09am

          Re: Re: Re: Just your ongoing way to avoid responsibility, lying that

          You're not censored here. At any time any user can click unhide and see your drivel.

          Sorry wait...
          FACT you are NOT CENSORED HERE. At any time EVERY USER can click UNHIDE and see your DRIVEL. ;)

          link to this | view in chronology ]

        • icon
          Stephen T. Stone (profile), 4 Sep 2018 @ 11:30am

          Your “dissent”, such as it is, relies on calling Techdirt names and bashing the commenter community for flagging your posts. You do not dissent on the basis of a fundamental disagreement with a central point being made. You do not refute the central point with which you disagree. All you do is whine and complain like a spoiled brat who keeps getting told “no” when he really wants to hear “yes”. If we thought you were offering anything of substance and nothing that sounded like you having a years-long grudge over something as trivial and meaningless as a perceived slight on a tech blog comments section, we would stop flagging your comments.

          Feel free to give us actual dissent and discussion, if you can do so. We welcome the opportunity to disagree—and to discuss our points of disagreement. But right now, all you deliver are the baseless complaints of a man-child who thinks he commands respect but only ever demands attention. Respect is earned, and you have a lot of earning to do, so either get to work or stop masturbating to the idea that you are some radical counter-culture hero of society because you act like an attention whoring asshole on a tech blog’s comment section.

          link to this | view in chronology ]

          • icon
            That One Guy (profile), 4 Sep 2018 @ 11:37am

            Re:

            But right now, all you deliver are the baseless complaints of a man-child who thinks he commands respect but only ever demands attention.

            "I am Rip Va- Anonymous Coward, and I command your respect!"

            "No, you demand my attention."

            link to this | view in chronology ]

            • identicon
              Wendy Cockcroft, 5 Sep 2018 @ 2:33am

              Re: Re:

              And you're getting neither, troll. There's a right to speak, not to be heard. I've yet to see a comment flagged by the community that didn't deserve it - even those of mine that went off topic.

              link to this | view in chronology ]

          • This comment has been flagged by the community. Click here to show it
            identicon
            Anonymous Coward, 5 Sep 2018 @ 5:56am

            Re:

            "Feel free to give us actual dissent and discussion, if you can do so. We welcome the opportunity to disagree—and to discuss our points of disagreement."

            Basically put: "Nothing you say will dissuade me from censoring your words."

            A conversation I do not wish to repeat with you hypocrites.

            I've said it before, and I'll say it again:
            You can censor those you do not like, but you do not have the right to censor them for me.

            This is "pitchfork" mentality, where Techdirt advocates putting censorship into the hands of the public.

            That'll work out. /sarcasm.

            Until Techdirt repairs the broken system by giving you hypocrites the power to block my content, I will not support this site financially despite the good its authors do.

            I don't give a damn if 99.99999% of you flag the content. None of you has the right to block it from me.

            CENSORSHIP is CENSORSHIP no matter who's in control.

            Why can't you hypocrites understand this.

            link to this | view in chronology ]

            • icon
              Killercool (profile), 5 Sep 2018 @ 6:29am

              Re: Re:

              COPYRIGHT is the law-enshrined CENSORSHIP of all NATURAL PERSONS' rights to copy everything they see. Because copying is how we learn. We copy our parents to learn to speak. We copy our teachers to learn to write. We must learn/copy everything that came before, before we can expand the bounds of current knowledge.

              CENSORSHIP is CENSORSHIP no matter who's in control.

              link to this | view in chronology ]

              • icon
                Killercool (profile), 5 Sep 2018 @ 6:39am

                Re: Re: Re:

                Also, I like how you call everyone here a hypocrite, then flat-out say you refuse to allow the site to make money until they agree with you.

                My, my. You're consuming content without paying for it.

                Dirty pirate.

                link to this | view in chronology ]

            • icon
              PaulT (profile), 5 Sep 2018 @ 6:38am

              Re: Re:

              "Basically put: "Nothing you say will dissuade me from censoring your words.""

              No, it means "provide reasonable words, and we will treat them reasonably". But, even simply stating that is enough to trigger your persecution complex, it seems. I don't believe you've ever attempted to be a reasonable person, however, so I doubt this will ever be tested.

              "You can censor those you do not like, but you do not have the right to censor them for me."

              We absolutely do. I've said this before, but when you have a drunk asshole in the room, the fact that he gets kicked out after annoying too many people is not infringing on his rights. Stop being that drunk asshole if you don't like being told to shut up. We don't even kick you out of the room here, we just sit you in the corner and warn people that you might be an obnoxious drunken asshole.

              I wonder if you're so dead set on whining about these things on sites that actually do kick you out for dissent, or if you're just too stupid to understand the freedom you have on this site that never blocks anyone?

               Also, I wonder what your alternative is. You steadfastly refuse to create a handle, let alone an account, so you provide no method by which TD can know what you wish to see, even if the option is provided. They therefore have to go for a default that appeases most users, which is always going to be "we'd rather not see spam and trolls".

              "I don't give a damn if 99.99999% of you flag the content"

              Obviously. But, the fact that literally everyone in the room is telling you to shut up would give an intelligent person at least pause to consider that maybe it's him who's the problem. If you meet 100 people in a day and 99 tell you you're an asshole, perhaps the problem is not that you happened to meet 99 other assholes?

              "None of you has the right to block it from me."

              Yes, we actually do. Part of freedom of speech is freedom of association and we are telling you we do not wish to associate with you. That is our right, and people choose to exercise it.

              The right that does NOT exist, except in the minds of delusional idiots such as yourself, is the right to demand to use privately owned space to say whatever you wish without recourse. If you go into a vegan restaurant and start shouting about the benefits of a good steak, your rights are not being infringed upon when everyone tells you to shut up or leave.

              link to this | view in chronology ]

              • icon
                Killercool (profile), 5 Sep 2018 @ 6:42am

                Re: Re: Re:

                Or just to leave, period.

                No one deserves a second chance, but most times a kind proprietor will give them one anyways.

                Or several hundred, if Techdirt is any example.

                link to this | view in chronology ]

                • icon
                  PaulT (profile), 5 Sep 2018 @ 7:02am

                  Re: Re: Re: Re:

                  The funny thing is, I never see this whining anywhere else. There's plenty of sites that hide posts depending on user votes. Some even do moderate, edit or even delete posts based on poor feedback. But, I never see this amount of prolonged whining about being censored on other sites - even when I do catch posts before they are edited/removed.

                  That's part of the reason I answer back - it's a strange specimen that I don't come across all that often, and does demand a little prodding to see how it reacts.

                  link to this | view in chronology ]

                  • identicon
                    Anonymous Coward, 5 Sep 2018 @ 7:38am

                    Re: Re: Re: Re: Re:

                    It's all coming from one or two people, maybe three, with a massive chip on their shoulder.

                    Were I to make some guesses, there are a few contributing factors:

                    A) First and foremost, Techdirt is a fairly small community of active participants. I've no way to measure readership, but there is what I would call a "handful" of regular commenters.

                    What this does is give the bitter boys targets. They have somewhere to aim their vitriol and expect it to reach someone.

                    The smaller amount of comments also means everyone looking in the comment section is going to see what they have to say - or at least, will see that they have said something. Despite the claims of censorship, even when flagged by the community, these posts are highly visible. In fact, by hiding the post, it actually draws the eye and makes you wonder what kind of nonsense was posted that it got flagged.

                    B) In line with the above, the article writers on occasion respond to comments - or mostly, I've observed Mike doing so. So again, despite all the claims of censorship and etc., they are getting responses. They are being heard. This is all validating in some fashion, even if everybody is telling them that what they are saying is insane bullshit.

                    I'd say it's partly because of the attention paid to them. They think they can actually accomplish something, or at least they can go about their day feeling self-righteous in having once again told those TD people what's what, even if they never actually listen.

                    Other sites don't have quite the same style of community engagement, or if they do, the community is a lot less welcoming to the vocal assholes, and is less welcoming in a way that gets them to leave.

                    This is all of course a bunch of guessing, but because I came up with it, I like it, and will treat it as true.

                    link to this | view in chronology ]

              • icon
                The Wanderer (profile), 5 Sep 2018 @ 8:02am

                Re: Re: Re:

                I'm not sure, but I think you may have missed a nuance.

                I don't think this is (necessarily) the same person, repeating the same "I'm being censored!" complaint.

                I think this is a second person, complaining that people flagging the first person blocks the second person from seeing the first person's comments (without having to allow scripts and click through the "flagged by the community" link).

                Parts of your reply would still apply equally well to that, but other parts read to me as if you're responding to "I'm being censored!" rather than to "the content I might want to read is being censored!", so I'm not sure whether I'm parsing things correctly.

                link to this | view in chronology ]

    • icon
      lucidrenegade (profile), 4 Sep 2018 @ 10:58am

      Re: Just your ongoing way to avoid responsibility, lying that

      I think I shall build one of these, and give your comments the attention they deserve.

      link to this | view in chronology ]

  • identicon
    Anonymous Coward, 4 Sep 2018 @ 9:55am

    Crowdsourced moderation can work and work well. Some platforms have opted to allow users to "tag" content and content originators with labels such as "conservative", "abortion" or "left-wing" and then allow those same users to set labels for which content they don't wish to see. New labels require a certain number/percentage of users to corroborate in order to become affixed to the content or originator. This allows every individual to customize the content they view without affecting the availability of that content to any other user.
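
    As a rough sketch of that corroboration scheme (the threshold, names and structure here are made up for illustration, not anything a specific platform actually does):

    from collections import defaultdict

    CORROBORATION_THRESHOLD = 5  # made-up value; a real platform would tune this

    class TaggedContent:
        def __init__(self, content_id):
            self.content_id = content_id
            self._votes = defaultdict(set)  # tag -> set of users proposing it

        def propose_tag(self, user_id, tag):
            self._votes[tag].add(user_id)

        def affixed_tags(self):
            # Tags only "stick" once enough independent users corroborate them.
            return {t for t, voters in self._votes.items()
                    if len(voters) >= CORROBORATION_THRESHOLD}

    def visible_to(item, muted_tags):
        # Hiding is per-user; the content stays available to everyone else.
        return not (item.affixed_tags() & muted_tags)

    post = TaggedContent("post-123")
    for uid in ("u1", "u2", "u3", "u4", "u5"):
        post.propose_tag(uid, "left-wing")
    post.propose_tag("u6", "abortion")        # one vote only, so it never affixes

    print(post.affixed_tags())                # {'left-wing'}
    print(visible_to(post, {"left-wing"}))    # False for this user...
    print(visible_to(post, {"conservative"})) # ...True for another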

    It also requires that every user create an account in order to post so that their content may be labeled, too. I know anonymous posting is in very strong favor here at TD but really, an "Anonymous Coward" isn't really that much more anonymous than any given profile created here. Requiring a profile in order to post really doesn't make one less anonymous.

    link to this | view in chronology ]

    • icon
      Thad (profile), 4 Sep 2018 @ 11:40am

      Re:

      Crowdsourced moderation can work and work well. Some platforms have opted to allow users to "tag" content and content originators with labels such as "conservative", "abortion" or "left-wing" and then allow those same users to set labels for which content they don't wish to see. New labels require a certain number/percentage of users to corroborate in order to become affixed to the content or originator. This allows every individual to customize the content they view without affecting the availability of that content to any other user.

      That's fine on a system where you can trust the community to moderate.

      On Twitter, I think it would last about five minutes before somebody set up a botnet to affix fake tags to various tweets.

      link to this | view in chronology ]

      • identicon
        Anonymous Coward, 4 Sep 2018 @ 2:08pm

        Re: Re:

        On Twitter, I think it would last about five minutes before somebody set up a botnet to affix fake tags to various tweets.

        You need some weighting system. Ex.: if some people mark as "left-wing", show it as a tentative tag and let people vote against the tag. People whose tags you agree with should have more effect on your own filters. If the bots only ever agree with each other their tags can be ignored by other users.
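
        A toy version of that weighting might look like the following (the trust-update factors are arbitrary; the point is just that voters you never agree with end up carrying no weight in your view):

        from collections import defaultdict

        class PersonalTagView:
            """One user's personal weighting of other users' tag votes."""

            def __init__(self):
                self.trust = defaultdict(lambda: 1.0)  # my trust in each voter

            def score(self, votes):
                # votes: tag -> set of (voter, +1 or -1); returns my weighted score per tag.
                return {tag: sum(sign * self.trust[voter] for voter, sign in ballots)
                        for tag, ballots in votes.items()}

            def record_agreement(self, voter, agreed):
                # Confirming a voter's past tag raises their weight for me; rejecting
                # it lowers it. Bots that only agree with each other never earn trust
                # from real users, so they stay near-weightless.
                self.trust[voter] *= 2.0 if agreed else 0.5

        me = PersonalTagView()
        votes = {"left-wing": {("bot1", +1), ("bot2", +1), ("carol", -1)}}
        print(me.score(votes))              # {'left-wing': 1.0} -- tentative at best

        me.record_agreement("carol", True)  # Carol's past tags matched my judgment
        me.record_agreement("bot1", False)
        me.record_agreement("bot2", False)
        print(me.score(votes))              # {'left-wing': -1.0}: carol outweighs the bots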

        link to this | view in chronology ]

        • icon
          Thad (profile), 4 Sep 2018 @ 3:02pm

          Re: Re: Re:

          That's a good system -- similar to the old meta-moderation system that Slashdot used to use -- but it's also complex, and might be difficult for a typical end user to understand. See Tarleton Gillespie's story today on claims of search engine bias and how they're fueled by people simply not understanding how search algorithms work.

          link to this | view in chronology ]

          • identicon
            Anonymous Coward, 4 Sep 2018 @ 5:37pm

            Re: Re: Re: Re:

            I'm not suggesting a fully discrete step, as /. meta-moderation was. Rather, I might see a comment with some tags and +/- buttons for each, and a box to add one that's not shown. So I describe how I think it should be tagged, and then machine learning happens...somehow...and it will end up being somewhat inscrutable, as ML does, because we can't practically release the dataset (privacy concerns, plus spammers would game it). Source code could be released, though of course most people can't read source code or understand the math well enough to verify it.

            I'd be sure to add some randomness too—maybe 1% of the time, a comment will get a random tag added or removed on its way to the user, or get displayed when normally filtered, to see if people reach the same conclusions.
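
            For what it's worth, the random-audit part could be as simple as this sketch (rates and structure invented purely for illustration):

            import random

            AUDIT_RATE = 0.01  # roughly the 1% suggested above

            def deliver(comment_tags, muted_tags, rng=random):
                """Decide what one user sees for one comment (toy sketch).

                Usually the user's tag filter applies as-is. Occasionally a tag is
                dropped, or a filtered comment is shown anyway, so the system can
                check whether independent readers converge on the same labels.
                """
                tags = set(comment_tags)
                visible = not (tags & set(muted_tags))
                audited = rng.random() < AUDIT_RATE
                if audited:
                    if tags and rng.random() < 0.5:
                        tags.discard(rng.choice(sorted(tags)))  # perturb a tag on the way out
                    else:
                        visible = True                          # show despite the filter
                return tags, visible, audited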

            link to this | view in chronology ]

      • icon
        PaulT (profile), 5 Sep 2018 @ 3:25am

        Re: Re:

        I remember fark.com tried such a thing years ago; it was quickly abandoned for the obvious reasons.

        Interestingly, IIRC it was also implemented for the same reasons - right wingers whining endlessly about too many articles being approved that didn't conform to what they wanted to hear.

        link to this | view in chronology ]

  • This comment has been flagged by the community. Click here to show it
    identicon
    Tired Old Ruse, 4 Sep 2018 @ 9:57am

    How will The Public know it's themselves rather than them?

    Since corporations control the code and policies without least accountability?

    Sheesh. Only point transparent about you is how feeble your tricks are.

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 10:10am

      Re: How will The Public know it's themselves rather than them?

      Here on TD the distinction isn't important. Even if the editors are the ones deciding which content stays visible and which does not it makes no difference. It's not illegal/unconstitutional either way.

      link to this | view in chronology ]

  • identicon
    I.T. Guy, 4 Sep 2018 @ 10:03am

    Matt still gets it wrong.
    "It’s not that people like Jones shouldn’t be punished;"
    Punished for what? Being stupid? If only. I say let the village idiot speak. If we end up with a few more idiots in the village, then so be it. Chances are we won't.

    "So, really, it's time we stopped focusing on who the platforms should silence, and give more power to help the end users decide who they wish to hear."

    It's amazing to me how everyone lets FB, Twitter, etc. shape their online experience. I don't do any social media so I couldn't say, but it appears that way.

    link to this | view in chronology ]

    • icon
      Thad (profile), 4 Sep 2018 @ 11:41am

      Re:

      Punished for what? Being stupid? If only. I say let the village idiot speak. If we end up with a few more idiots in the village, then so be it. Chances are we won't.

      You mean besides the ones harassing grieving parents and shooting up pizza parlors, right?

      link to this | view in chronology ]

  • This comment has been flagged by the community. Click here to show it
    identicon
    Tired Old Ruse, 4 Sep 2018 @ 10:06am

    "We Shouldn't Want Internet Giants Deciding" - WE don't; YOU DO!

    Masnick is for corporations CONTROLLING the speech and outlets of "natural" persons. He repeats it often, can't be mistaken. From last year:

    "And, I think it's fairly important to state that these platforms have their own First Amendment rights, which allow them to deny service to anyone."

    https://www.techdirt.com/articles/20170825/01300738081/nazis-internet-policing-content-free-speech.shtml

    Masnick is not hedging "lawyers say and I don't entirely agree", or "that isn't what I call serving The Public", but STATES FLATLY. Masnick wants a few corporations to have complete and arbitrary control of ALL MAJOR outlets for The Public! He claims that YOUR Constitutional First Amendment Right in Public Forums are over-arched by what MERE STATUTE lays out!

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 10:16am

      Re: "We Shouldn't Want Internet Giants Deciding" - WE don't; YOU DO!

      If my "MERE STATUTE" you mean "The US Constitution" then sure, you might have a point.

      Your elected representatives have seen fit to give corporations certain constitutional rights. That includes the 1st Amendment and gives them the right to moderate/censor what appears on their platforms. It even includes you if you were to put up a blog that allowed comments; You can decide which comments appear and which do not. All perfectly legal.

      Just because FB, Twitter, Instagram and others have become super popular places for people to post and communicate doesn't make them government owned and/or controlled and thus required to allow everyone on the planet a voice to say whatever they like. Those platforms are still non-government and may do as they please with their platforms, just as you would with your blog.

      Why is this so hard for you?

      link to this | view in chronology ]

      • icon
        Killercool (profile), 4 Sep 2018 @ 10:42am

        Re: Re: "We Shouldn't Want Internet Giants Deciding" - WE don't; YOU DO!

        He can't seem to grasp the fact that corporations are made of people, and that they, collectively, are guaranteed many of the same rights that are recognized individually, by virtue of the First Amendment right of free assembly, and its recognized (COMMON LAW!) corollary right of association.

        Which, in turn, has a recognized (COMMON LAW!) corollary right - to not associate. It is by that right that private individuals and businesses can tell him to fuck off completely, and why making his statements merely hidden is a very kind way of dealing with him.

        link to this | view in chronology ]

        • icon
          Stephen T. Stone (profile), 4 Sep 2018 @ 11:33am

          I keep saying that he confuses private with privately-owned; he has yet to refute my assumption.

          link to this | view in chronology ]

          • identicon
            Anonymous Coward, 4 Sep 2018 @ 12:38pm

            Re:

            He’s have to admit he’s wrong and we both know he’d rather gouge out an eye with a rusty spoon than do that.

            link to this | view in chronology ]

      • identicon
        Rich, 4 Sep 2018 @ 11:22am

        Re: Re: "We Shouldn't Want Internet Giants Deciding" - WE don't; YOU DO!

        Whether or not corporations have constitutional rights is irrelevant to their ability to moderate/censor their platforms. They have this right because it's their private site. The constitution defines the limits of government. What a site does to moderate content is their own business.

        link to this | view in chronology ]

        • identicon
          Anonymous Coward, 4 Sep 2018 @ 4:32pm

          Re: Re: Re: "We Shouldn't Want Internet Giants Deciding" - WE don't; YOU DO!

          True, but they have the constitution to back them up. If you're trying to make the strongest argument possible, bring out the whole argument.

          link to this | view in chronology ]

  • identicon
    Christenson, 4 Sep 2018 @ 10:09am

    Personal filter setups

    I, personally, could flag every previous comment on this thread as either trolling or feeding the trolls.

    Now, let's take Techdirt as an example of where moderating is failing... every time *moderation* comes up as a topic, the trolls roll out and there are a few hundred comments, which is more than the roughly 50 I personally would like to read.

    Let's assume Stephen P Stone likes my approach to TD comments...how would I share that with him in an automagic way???

    Note that we aren't really solving the accountability problem here, just distributing responsibility more widely. And even though I agree with TD's current moderation practice, it's not *terribly* accountable.

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 11:40am

      Re: Personal filter setups

      Let's assume Stephen P Stone likes my approach to TD comments...how would I share that with him in an automagic way???

      I can think of a few ways off-hand, including a "meta-comment" section which includes all comments made by specific people that you "follow" regardless of the article it was made in, or a sort function allowing you to sort comments by a similar list (or perhaps, a list tracking which people you have upvoted in the past) etc.
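
      For the sort-function version, a minimal sketch (field names invented; this is not anything TD actually implements):

      def sort_by_affinity(comments, followed, previously_upvoted):
          # People I follow float to the top, then people whose comments I've
          # upvoted before, then everyone else. sorted() is stable, so thread
          # order is preserved within each group.
          def priority(comment):
              author = comment["author"]
              if author in followed:
                  return 0
              if author in previously_upvoted:
                  return 1
              return 2
          return sorted(comments, key=priority)

      thread = [{"author": "Anonymous Coward", "text": "..."},
                {"author": "Christenson", "text": "..."},
                {"author": "Thad", "text": "..."}]
      print([c["author"] for c in sort_by_affinity(thread, {"Christenson"}, {"Thad"})])
      # ['Christenson', 'Thad', 'Anonymous Coward']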

      Note that we aren't really solving the accountability problem here, just distributing responsibility more widely.

      And once you have distributed the problem far enough, the "accountability problem" is solved (at least, as much as any accountability problem can be solved when humans would rather not be accountable for themselves). Or in other words, once the problem has been distributed, it ceases to be a problem related to any real application, and simply starts being a part of the human condition, joining the "accountability problems" in all other aspects of human society, which are caused not by those aspects, but by the fact that it is humans that are using those aspects.

      link to this | view in chronology ]

    • icon
      Stephen T. Stone (profile), 4 Sep 2018 @ 11:40am

      even though I agree with TD's current moderation practice, it's not terribly accountable

      Why should it be more accountable? If the community believes a comment should be flagged, no matter the reason why, the community is the one whose votes are counted. You can appeal to the community for an unflagging if you so wish; we can then decide whether you deserve one.

      I do not flag on the basis of ideological dissent or disagreement with a central point of discussion. I flag on the basis of a comment being an outright troll, a spam comment, or offering only insults or bullshit in lieu of a discussion (or a good joke). I even flag my own comments sometimes, simply because I know I make bullshit comments that deserve a flagging. (And you have no idea how many comments I have drafted but deleted because I wanted to unload on a troll but thought better of it before I submitted those comments.)

      Besides, you are forgetting about someone who should also be held accountable for the flagging: the person who made the comment. They are responsible for posting soon-to-be-flagged bullshit; they are accountable, at a minimum, for the reception it gets.

      link to this | view in chronology ]

      • identicon
        Anonymous Coward, 4 Sep 2018 @ 1:05pm

        Re:

        If the community believes a comment should be flagged, no matter the reason why, the community is the one whose votes are counted. You can appeal to the community for an unflagging if you so wish

        Is that a real thing? I see the "flag", lightbulb, and LOL buttons. Are the latter two counted as "unflag" votes, or do you mean something else?

        Part of being "unaccountable" is that it's not described how this works. How many people have to click that flag button before the comment disappears? If it gets hidden, is that for everyone—in which case few people will ever see it to give it a chance at unflagging—or does someone have to "validate" it? Is there any indication why a comment is hidden?

        Sometimes when I click "post" my comments are held for moderation. I don't know why. They appear hours or days later.

        link to this | view in chronology ]

        • icon
          Thad (profile), 4 Sep 2018 @ 1:18pm

          Re: Re:

          Is that a real thing? I see the "flag", lightbulb, and LOL buttons. Are the latter two counted as "unflag" votes, or do you mean something else?

          If you flag a comment, you can change your mind and unflag it later. It's the same button. The flag button.

          Part of being "unaccountable" is that it's not described how this works. How many people have to click that flag button before the comment disappears?

          We counted once. I think it's five.

          If it gets hidden, is that for everyone—in which case few people will ever see it to give it a chance at unflagging

          The only people who can remove their flag vote are people who have already flagged the post.

          Generally speaking, they have probably read the post.

          —or does someone have to "validate" it?

          No. When it hits the set number of flag votes (which, again, I believe is 5), it gets hidden.

          Is there any indication why a comment is hidden?

          It's usually pretty clear in context. It's usually spam, or one of the regular trolls getting downvoted for spouting the same old talking points they always spout.

          Sometimes when I click "post" my comments are held for moderation. I don't know why. They appear hours or days later.

          That's a spam filter. Typically if a post gets caught in the spam filter, it's because it has too many links in it.

          The mods check the filter; if the comment isn't spam, they let it through.

          All this has been discussed in the comments at some considerable length; Mike, Leigh, et al. have been very forthcoming with how it works. But I understand that if you haven't been around for those conversations, it all might seem a little opaque to you. I can see how putting up a moderation FAQ might be useful for newcomers.

          Then again, I think a lot of this stuff is pretty obvious or easy to figure out just by observing.
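
          (None of this is Techdirt's actual code, just a toy sketch of the behavior as I understand it, using the threshold of five I recall:)

          FLAG_THRESHOLD = 5  # the count I recall; the real number may differ

          class Comment:
              def __init__(self, text):
                  self.text = text
                  self._flaggers = set()  # users who currently have a flag on this

              def toggle_flag(self, user_id):
                  # The same button flags and unflags, so only someone who already
                  # flagged a comment can withdraw that flag.
                  if user_id in self._flaggers:
                      self._flaggers.discard(user_id)
                  else:
                      self._flaggers.add(user_id)

              @property
              def hidden(self):
                  return len(self._flaggers) >= FLAG_THRESHOLD

          c = Comment("same old talking points")
          for uid in ("u1", "u2", "u3", "u4", "u5"):
              c.toggle_flag(uid)
          print(c.hidden)      # True: it collapses behind "click here to show it"
          c.toggle_flag("u3")  # one flagger changes their mind
          print(c.hidden)      # False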

          link to this | view in chronology ]

          • identicon
            Anonymous Coward, 4 Sep 2018 @ 2:02pm

            Re: Re: Re:

            I can see how putting up a moderation FAQ might be useful for newcomers.

            That would be great. I've complained, too, about how Mike sometimes says his writings are in the public domain, but hasn't written it anywhere "official". So of course we shouldn't be surprised that people, in general, don't know that.

            A fixed flag count of 5 like you describe could easily be abused. Get 4 friends (or 4 other accounts) and flag the messages of people you don't like. I don't believe we've seen serious abuse but can see why people call the system unaccountable. You'd have no idea which 5 people flagged it or what they considered objectionable. Sure the reason's usually going to be clear, but you never know how people will (or did) misread stuff.

            link to this | view in chronology ]

            • icon
              Thad (profile), 4 Sep 2018 @ 3:07pm

              Re: Re: Re: Re:

              A fixed flag count of 5 like you describe could easily be abused. Get 4 friends (or 4 other accounts) and flag the messages of people you don't like. I don't believe we've seen serious abuse but can see why people call the system unaccountable.

              But I think that distinction is inherent to any conversation about online moderation -- because appropriate moderation will depend on the platform.

              Yes, someone could abuse the flag feature as a bullying tactic (and indeed the Report mechanism is often misused on large platforms like Twitter). But, as you say, that doesn't appear to be an issue here. Techdirt's a somewhat popular site but has a relatively small and manageable community; the honor system has worked so far, and if it ever breaks down, the admins can reevaluate the flag system then.

              link to this | view in chronology ]

          • identicon
            Anonymous Coward, 4 Sep 2018 @ 2:21pm

            Re: Re: Re:

            Typically if a post gets caught in the spam filter, it's because it has too many links in it.

            It seems related to Tor. I get it without any links, or with internal Techdirt links. I could switch IPs and repost, and it will usually be posted immediately. But I don't know who the mods are or whether they're online, and it might cause duplicate messages, so I rarely do it.

            link to this | view in chronology ]

            • icon
              That One Guy (profile), 4 Sep 2018 @ 3:47pm

              Re: Re: Re: Re:

              It seems related to Tor.

              Using Tor will significantly increase the odds of a comment being caught by the spam filter, yes.

              link to this | view in chronology ]

    • icon
      Ninja (profile), 4 Sep 2018 @ 11:45am

      Re: Personal filter setups

      Actually, the TD comment section is pretty good compared to some examples out there, considering they allow *anonymous* comments without the need for an account. Every place has their pet trolls and here is no different. How is it failing?

      link to this | view in chronology ]

      • icon
        Thad (profile), 4 Sep 2018 @ 12:01pm

        Re: Re: Personal filter setups

        Every place has their pet trolls and here is no different.

        Well, no; many places have moderators who ban the trolls outright, or at least have tools for blocking them.

        How is it failing?

        I'm with Christenson: the trolls, and people feeding them, have a tendency to drown out productive conversation. I don't read Techdirt because I want to see the same damn "Blue whines about how he's being censored; other people explain why he's wrong; this goes on for 50 replies" cycle a half-dozen times a day. But apparently some people do.

        This place has a great community and can be a great place to come for insightful, intelligent, nuanced conversation on a variety of technical and political issues. When they aren't spending all day getting caught up in Blue's schtick for the seven thousandth time instead.

        link to this | view in chronology ]

        • icon
          Stephen T. Stone (profile), 4 Sep 2018 @ 1:48pm

          This place has a great community and can be a great place to come for insightful, intelligent, nuanced conversation on a variety of technical and political issues. When they aren't spending all day getting caught up in Blue's schtick for the seven thousandth time instead.

          Good discussions can come from replies to Blue, Hamilton, and the rest of the troll brigade. And besides: When a good point can be made by replying to one of those schmucks, why pass up that opportunity?

          link to this | view in chronology ]

          • identicon
            Anonymous Coward, 4 Sep 2018 @ 2:12pm

            Re:

            When a good point can be made by replying to one of those schmucks, why pass up that opportunity?

            That's fine. More commonly, I see pointless sarcastic replies along the lines of "hi, I'm (NAME) and I completely misunderstand Mike's views on copyrights". The original comment gets flagged and hidden while the replies stick around to clutter things up. Let's not do that.

            link to this | view in chronology ]

            • identicon
              Anonymous Coward, 4 Sep 2018 @ 4:35pm

              Re: Re:

              It would be pretty cool if the replies to a hidden post were also hidden and could be unhidden as a group with that link.

              link to this | view in chronology ]

              • icon
                Thad (profile), 4 Sep 2018 @ 4:45pm

                Re: Re: Re:

                That's a feature in my script (click my name). If you set hideReplies = true, then replies to hidden posts will be hidden.

                It's not the most user-friendly thing; you need to install Greasemonkey, Tampermonkey, or similar. And it doesn't have the "click to show" feature you're suggesting; it just removes the replies entirely. But I've provided the source and you're welcome to tweak it however you like.

                link to this | view in chronology ]

              • identicon
                Anonymous Coward, 4 Sep 2018 @ 5:26pm

                Re: Re: Re:

                Maybe let people vote those replies back out of oblivion, for the rare case when a useful discussion develops.

                link to this | view in chronology ]

          • icon
            Thad (profile), 4 Sep 2018 @ 3:15pm

            Re:

            Good discussions can come from replies to Blue, Hamilton, and the rest of the troll brigade.

            Can, but, in my opinion, seldom are.

            Blue doesn't know what the First Amendment is; Chip doesn't know what a natural monopoly is; Hamilton doesn't know what common law is; MyNameHere just hates it when due process is enforced; over and over again, ad infinitum. There's not a whole lot of new ground to be tread, just the same points over and over again.

            And besides: When a good point can be made by replying to one of those schmucks, why pass up that opportunity?

            Because the point's already been made a hundred times and there's no value in repeating it for the hundred-and-first.

            Because you could be having another, better conversation with somebody else instead.

            Because there are other, better ways to spend your time than saying the same things to the same trolls, post after post, day after day, and year after year.

            Because there's more to life than scoring points in online arguments.

            link to this | view in chronology ]

    • icon
      Thad (profile), 4 Sep 2018 @ 11:54am

      Re: Personal filter setups

      Seems to me that most people here are pretty happy with the moderation system as-is, and aren't interested in anything more granular.

      Or at any rate I've been sharing my code for blocking abusive posters for over a year and, as far as I know, nobody else but me has ever used it.

      We discussed this in another thread, but I think it's an interesting idea so I'll mention it again: a simple Bayesian analysis of every comment on Techdirt that's ever been hidden due to flagging would probably produce a pretty accurate filter for abusive posters -- at least, the regular ones.
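
      As a sketch of what that could look like, assuming you had an export of comments labeled by whether they ended up hidden (scikit-learn here is just my choice of tooling, and the training data is obviously made up):

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline

      # Hypothetical training data: past comment text plus whether the community
      # flagged it into hiding (1) or left it visible (0).
      texts = [
          "CENSORSHIP is CENSORSHIP no matter who's in control",
          "Interesting point about intermediary liability",
          "you hypocrites can't stand my mild-mannered dissent",
          "The spam filter seems related to posting over Tor",
      ]
      was_hidden = [1, 0, 1, 0]

      model = make_pipeline(CountVectorizer(), MultinomialNB())
      model.fit(texts, was_hidden)

      # Estimated probability that a new comment would end up community-hidden.
      print(model.predict_proba(["why do you kids keep censoring my dissent"])[0][1])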

      link to this | view in chronology ]

      • identicon
        Anonymous Coward, 4 Sep 2018 @ 1:16pm

        Re: Re: Personal filter setups

        Seems to me that most people here are pretty happy with the moderation system as-is, and aren't interested in anything more granular.

        That only means it's not so bad that people will leave or start hacking up alternate systems. Not a very high bar, particularly when Mike's vaguely proposing an alternate system. Facebook and Twitter have bigger budgets to figure out things like "allowing third parties to create different 'views' into those networks", but I'd be interested to see what it might look like on Techdirt.

        link to this | view in chronology ]

        • icon
          Thad (profile), 4 Sep 2018 @ 1:21pm

          Re: Re: Re: Personal filter setups

          That only means it's not so bad that people will leave or start hacking up alternate systems.

          I think it's more than that; I think most people think the flag system works fine and no additional moderation tools are desired.

          A lot of people seem to actually enjoy talking in circles with the same two or three trolls day in and day out.

          link to this | view in chronology ]

          • identicon
            Anonymous Coward, 4 Sep 2018 @ 2:17pm

            Re: Re: Re: Re: Personal filter setups

            The comments here seem fine to me for the most part, but I can't tell whether that's because of the flagging system, in spite of it, or unrelated to it. I feel like you're trying to draw a conclusion about what "most people" think, without much hard data to support it. I'd say there's no crisis, but disagree with "no additional moderation tools are desired"—Mike has, right here, stated a desire for a new moderation tool (for other platforms, but it would apply just as well here).

            link to this | view in chronology ]

            • icon
              Thad (profile), 4 Sep 2018 @ 3:25pm

              Re: Re: Re: Re: Re: Personal filter setups

              I feel like you're trying to draw a conclusion about what "most people" think, without much hard data to support it.

              Respectfully, I suggest you read the comments on the other articles under the content moderation tag.

              link to this | view in chronology ]

              • identicon
                Anonymous Coward, 4 Sep 2018 @ 4:49pm

                Re: Re: Re: Re: Re: Re: Personal filter setups

                That shows a lot of imperfect solutions — few specific to Techdirt — and several statements that "proper" moderation is "impossible". What are you getting at?

                In general I see a lot of people bitching about the Techdirt moderation system and a lot of people supporting it. I would not presume to know which represents the majority.

                link to this | view in chronology ]

                • icon
                  Thad (profile), 4 Sep 2018 @ 5:19pm

                  Re: Re: Re: Re: Re: Re: Re: Personal filter setups

                  In general I see a lot of people bitching about the Techdirt moderation system

                  I think if you look closer, you'll find that you don't see a lot of people bitching about the Techdirt moderation system, you see one person bitching about the Techdirt moderation system a lot.

                  link to this | view in chronology ]

  • identicon
    Anonymous Coward, 4 Sep 2018 @ 10:18am

    I, personally, could flag every previous comment on this thread as either trolling or feeding the trolls.

    Including your own.

    link to this | view in chronology ]

    • identicon
      Christenson, 4 Sep 2018 @ 11:48am

      Re:

      Yes, you might dislike everything I post...but it still begs the question... let's suppose you have the world's *best* taste in moderation, according to exactly half of Techdirt's audience. The other half just wants to check in every now and then.

      Now, how do you share that with them? (Techdirt says push moderation out to the edge, that means you and me, not TD management)

      link to this | view in chronology ]

  • identicon
    ProfitFirstSpeakMaybeLater, 4 Sep 2018 @ 10:36am

    ShareholdersDefineSpeech

    I don't see anything in any amendment to the constitution that says a business must provide a platform for others' speech.

    In fact the opposite could be said with the right-to-work laws in several states that state unions do not represent the voices of all employees... or go back to the George W Bush era where they deployed chain link fences and created free speech zones away from main events.

    We're in an era where laws and accompanying regulations haven't caught up with technology.

    A technology platform that drops a user because of their speech isn't the same as arresting citizens who go out into a public space and yell their opinions.

    A corporation's liability is to its shareholders, who are affected by the position/stance/brand of the corporation, and that enables the corporation to control speech -- any speech that impacts its bottom line.

    A citizen's liability, unlike a corporation's, means speech can and often does lead to jail.

    It's not about marginalizing some speech.

    People can still go into the public square and shout at the world.

    The issue is society's reliance on technology to speak - which is much more frightening than being locked up in jail or put behind a chain-link free speech zone.

    __Shareholders__ in a very real sense __define__ what __speech__ is acceptable or not, and until we can regulate the profit motive, speech will be censored.

    link to this | view in chronology ]

    • icon
      Stephen T. Stone (profile), 4 Sep 2018 @ 11:46am

      The issue is society's reliance on technology to speak - which is much more frightening than being locked up in jail or put behind a chain-link free speech zone.

      This is the broader—and much more pertinent—issue that a lot of the “MODERATION IS CENSORSHIP” crowd could bring up if they were less focused on starting flame wars about Twitter and more focused on starting a coherent dialogue about the role of technology in our daily lives.

      link to this | view in chronology ]

  • identicon
    Anonymous Coward, 4 Sep 2018 @ 10:55am

    Glad things are a bit more balanced again. *sigh* Riding this tribalism train is good for clicks but bad for karma...

    link to this | view in chronology ]

  • icon
    tom (profile), 4 Sep 2018 @ 11:41am

    If Facebook wants to ban a person or group, it is their right, as it is their toy. Where it gets problematic is when they alter the news feeds to favor certain types of news and don't tell anyone or provide an easy way to revert to treating all sources equally. The result is that Facebook is basically misrepresenting what is going on by gaming the news feeds.

    Google is notorious for altering search results based on whatever their goals are this week. Yet the end user has little way to know that the reason they are getting so few results this week for X isn't that X is any less popular, but that Google changed the search algorithm. The only way the end user can discover this is by using several search engines and comparing the results. IMO, when Google does this, they are lying to the end user about what results are popular or pertain to the user's search request.

    link to this | view in chronology ]

    • icon
      Ninja (profile), 4 Sep 2018 @ 11:59am

      Re:

      "Google is notorious for altering search results based on whatever their goals are this week."

      Really? I wouldn't rule it out, but "notorious" would mean they'd be head-deep in a shitstorm with regulators. So far I haven't seen anything remotely resembling a shitstorm. Grandstanding, yes, but nothing of substance.

      link to this | view in chronology ]

    • icon
      Steve R. (profile), 4 Sep 2018 @ 12:00pm

      Relative Pricing

      I don't think I can blame this on Google, but I was surprised one day when the same product from the same vendor on the "same" website had two different prices!!!

      I had multiple tabs open to follow different leads in searching for the product. Evidently each of the leads I was following must have accessed the product's webpage differently.

      This of course raises significant public concerns about product pricing and even the use of your personal information. The company knows your zip code and personal wealth and modifies the product price accordingly.
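      For what it's worth, differential pricing of this kind is trivial to build. The sketch below is a purely hypothetical illustration (not any specific retailer's logic; the referrer values and zip codes are invented) of why two tabs reached through different leads can show two prices for the same product.

        # Hypothetical sketch of personalized pricing -- invented logic, for illustration only.
        BASE_PRICE = 49.99

        def displayed_price(referrer: str, zip_code: str) -> float:
            """Vary the shown price based on how the shopper arrived and where they live."""
            price = BASE_PRICE
            if referrer == "price_comparison_site":
                price *= 0.95                        # discount when the shopper is clearly comparing
            if zip_code in {"94301", "10007"}:       # zip codes inferred to be wealthier
                price *= 1.10
            return round(price, 2)

        # The same product, reached through two different leads:
        print(displayed_price("price_comparison_site", "73301"))  # 47.49
        print(displayed_price("newsletter_link", "94301"))        # 54.99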

      link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 12:41pm

      Re:

      "Google is notorious for altering search results based on whatever their goals are this week."

      What are our goals this week Google?
      Same as every week, more ads!

      link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 1:04pm

      Re:

      Define “equal” in a way everyone can agree on. And if you mean chronological order, anyone with more than a couple dozen active friends will find the site unusable very quickly.

      link to this | view in chronology ]

  • icon
    Steve R. (profile), 4 Sep 2018 @ 11:43am

    Are the Search Algorithms Biased Against Conservatives?

    The discussion on social media platforms raises the issue of "content moderation efforts". In turn, that has raised concerns over the algorithms used to identify so-called "hate speech" that needs to be taken down.

    The social media industry tends to be very "left" leaning and those on the "right" have claimed that their conservative viewpoints are being repressed, because those "leftists" write the algorithms. Seems to be a reasonable explanation on the surface. But I suspect that the apparent suppression of conservative viewpoints may be more complex than simply pointing a finger of blame at an abstract algorithm. I would like to see more research into how the apparent suppression of conservative comments is occurring.

    For an example, the apparent suppression of conservative comments may not be due to a biased algorithm but could, in reality, be a side-effect of how conservatives use social media when compared to liberals.

    link to this | view in chronology ]

    • icon
      Stephen T. Stone (profile), 4 Sep 2018 @ 11:49am

      The social media industry tends to be very "left" leaning

      If anything, they stay in the centre with occasional dips into “left” or “right” that depend on whatever flaming bag of dog crap lands on their doorstep on a given day.

      link to this | view in chronology ]

      • icon
        Ninja (profile), 4 Sep 2018 @ 11:57am

        Re:

        I think "liberal leaning" fits better in this case. The "left" Steve dropped in that sentence actually makes his argument look weaker at a glance than it really is. I wish people would stop using these labels without knowing what they mean. Most people I know, even die-hard "right-wingers", agree there is no left in the major US parties. The only thing closer to left I've seen there is one or two independents.

        link to this | view in chronology ]

        • icon
          Steve R. (profile), 4 Sep 2018 @ 12:04pm

          Re: Re:

          Thanks. I was being sloppy with my terminology.

          link to this | view in chronology ]

          • identicon
            Wendy Cockcroft, 5 Sep 2018 @ 5:43am

            Re: Re: Re:

            I think you might be on to something with "how they use social media."

            How exactly do they use it? Generally speaking you have a timeline of all the posts from people you follow, with perhaps some promoted content (Twitter does this, I don't really use FB so I don't know about it). So you read through your timeline to see what's going on in the world (according to the people you follow). If that's how you use social media, the implicit bias is your own.

            Okay, let's talk about accounts being shut down. If you break the TOS, you get shadowbanned or kicked off. If this happens more on the right than on the left, what is the content generally being posted by those accounts? There's the clue. If I'm right that egregious douchebaggery is likely to get you kicked off a platform, then one of two things is true:

            1) more hateful douchebags on the right than on the left/liberal/progressive side, or
            2) some selectively censorious douchebag is kicking off right wing douchebags only.

            So... if people are using social media to behave badly and to foment hatred against [$group] and get kicked off for it, is this a -wing thing or a "don't be such a douchebag" thing? Serious question.

            link to this | view in chronology ]

            • icon
              PaulT (profile), 5 Sep 2018 @ 6:18am

              Re: Re: Re: Re:

              "(Twitter does this, I don't really use FB so I don't know about it)"

              Facebook too. Your wall has a feed of the people & groups you follow (not necessarily in the order they were posted, see below), peppered occasionally with promoted content.

              "So you read through your timeline to see what's going on in the world (according to the people you follow)"

              Well... yes and no. The biggest current annoyance with FB is that they try to guess which posts you're most interested in. You can affect this to some degree by marking people as "close friends" (thus always seeing their recent posts at the top of your feed) or by unfollowing users you disagree with (i.e., you're still "friends" for other purposes, but you don't automatically see what they post). But Facebook's algorithm still tries to guess for you.

              It can be a major annoyance for various reasons - for example, I know I've missed events I would have liked to attend because I hadn't communicated directly for a while with the friend who posted about them, so the posts didn't show in my feed until after the events were over. It's not hard to imagine that people are unwittingly being drawn into echo chambers because FB chose not to show them the dissenting voices in the argument.

              Every social media platform acts differently, but they all seem to behave in the same sort of way - they try to make decisions about what you're most likely to be interested in, and this tends to put blinders on less savvy readers. If people depend on them as their primary or even only source of news, they end up with skewed ideas about what other people as a whole think.
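              Facebook's actual ranking is proprietary, so the following is only a toy sketch of the general behaviour described above: posts get ordered by a guessed "interest" score rather than by recency, "close friends" get a boost, and unfollowed authors disappear entirely. The function and field names are made up for the example.

                # Toy sketch of interest-based feed ranking -- NOT Facebook's real algorithm.
                def rank_feed(posts, close_friends, unfollowed):
                    """Order posts by a guessed 'interest' score instead of pure recency."""
                    def score(post):
                        s = post["interactions"]          # platform's guess at how much you care
                        if post["author"] in close_friends:
                            s += 100                      # "close friend" boost: always near the top
                        return s
                    visible = [p for p in posts if p["author"] not in unfollowed]  # unfollowed = hidden
                    return sorted(visible, key=score, reverse=True)

                feed = rank_feed(
                    posts=[
                        {"author": "old_friend", "interactions": 2,  "text": "Event this Saturday!"},
                        {"author": "meme_page",  "interactions": 50, "text": "Yet another meme"},
                    ],
                    close_friends=set(),
                    unfollowed=set(),
                )
                # The event post sinks below the meme because you rarely interact with its author,
                # which is exactly how a post can go unseen until the event is over.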

              "So... if people are using social media to behave badly and to foment hatred against [$group] and get kicked off for it, is this a -wing thing or a "don't be such a douchebag" thing?"

              It's kind of both. There are douchebags everywhere, no matter which part of the political spectrum you reside upon. However, certain types of douchebaggery will tend to congregate on certain parts of the spectrum. When a certain rule is applied, therefore, it may seem to address one side more than the other, even though realistically it's only because that particular type of douchebag is simply represented more on one side. It just depends on the type of rule that is applied.

              For example, white supremacists will often be more right-wing, while militant vegans tend more toward the left. If the "don't be a hateful douchebag" standard is applied, then people promoting attacks on meat eaters will be banned just as readily as those promoting attacks on black people. However, if the standard applied is "don't be a racist", the vegans will be free to continue. To the racist, that might appear to be unequal treatment, but all it really means is that the site decided that the racism issue was more important to them than the meat eating one.

              A person may disagree with that choice, but the correct answer is to either stop being a racist douchebag or to go to a platform where racism is acceptable, not to whine that they should be able to be as racist as they want on a platform that finds it abhorrent.

              link to this | view in chronology ]

    • icon
      Ninja (profile), 4 Sep 2018 @ 11:54am

      Re: Are the Search Algorithms Biased Against Conservatives?

      "For an example, the apparent suppression of conservative comments may not be due to a biased algorithm but could, in reality, be a side-effect of how conservatives use social media when compared to liberals."

      There was a comment about that on Ars. Conservatives tend to be old people yelling at the cloud. The guy in question gave his grandparent as an example. The old man hadn't even heard about internet based conservative outlets. It seems the apparent bias towards liberal content is actually a result of liberal people being more fluent in tech and being more active on the internet.

      link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 12:45pm

      Re: Are the Search Algorithms Biased Against Conservatives?

      " those on the "right" have claimed that their conservative viewpoints are being repressed"

      I have heard this claim many times now but have not seen any examples. Have any of these individuals stated clearly, exactly what happened and why they think they are the victim of censorship?

      link to this | view in chronology ]

  • icon
    Steve R. (profile), 4 Sep 2018 @ 12:10pm

    Re: Are the Search Algorithms Biased Against Conservatives?

    I think there is a lot of validity behind ->"It seems the apparent bias towards liberal content is actually a result of liberal people being more fluent in tech and being more active on the internet."

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 12:46pm

      Re: Re: Are the Search Algorithms Biased Against Conservatives?

      Not only that, but there are many more on the left than on the right.

      link to this | view in chronology ]

  • identicon
    John Smith, 4 Sep 2018 @ 12:16pm

    If "bias" doesn't describe a service which allows death threats against a conservative, but terminates the conservative's account for literally no reason other than branding an anti-feminist post that attacks no one and uses no vulgarity as "misogyny," what word does describe it?

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 12:41pm

      Re:

      We will let you know when that has actually happened in the real world.

      link to this | view in chronology ]

      • identicon
        Anonymous Coward, 4 Sep 2018 @ 12:49pm

        Re: Re:

        It seems there are plenty of generalities in the complaints about the so-called suppression of the conservative voice, but few examples accompanied by sufficient detail such that one can get an accurate picture of what transpired. I think this is by design.

        link to this | view in chronology ]

    • icon
      Stephen T. Stone (profile), 4 Sep 2018 @ 1:52pm

      Re:

      That depends. During the height of GamerGate, how many Twitter accounts were suspended that belonged to Gaters making death threats and harassing certain “targets”, and how many were suspended that belonged to Gater targets shooting back insulting rhetoric at their harassers? (And yes, the second one happened multiple times.)

      Twitter does not care about specific views; they care about who is making the bigger tirefire by way of reportbombing.

      link to this | view in chronology ]

  • identicon
    John Smith, 4 Sep 2018 @ 12:27pm

    User-controlled moderation doesn't work because the people in favor of moderation already know how to do this. Their goal isn't for THEM not to see the content they don't like, but for YOU not to see it.

    Any woman who spoke out about #metoo before it became fashionable could have been banned as a "misogynist troll." One female anti-feminist lost her social media account for things far less than what feminists had done.

    I still think the market solves these problems naturally, since censorship has always proven to be bad for business. While totalitarianism is easy to enforce, doing so while pretending to be open and democratic is not.

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 12:42pm

      Re:

      Oh look, you somehow steered this subject to misogyny. At least you didn’t try to ban all “lawyers” again.

      link to this | view in chronology ]

      • identicon
        Anonymous Coward, 4 Sep 2018 @ 7:23pm

        Re: Re:

        You're confusing this with out_of_the_blue. Blue's the one who whines about lawyers (while relying on them to enforce his copyright).

        link to this | view in chronology ]

        • identicon
          Anonymous Coward, 4 Sep 2018 @ 10:25pm

          Re: Re: Re:

          Oh this one hates lawyers too. Funny thing though. He frequents this site which, to the extent I can understand his post-stroke style of writing, does exactly what he’s complaining about. User controlled moderation, that is.

          link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 12:50pm

      Re:

      How many John Smiths are there?
      Or are you using TOR again?

      link to this | view in chronology ]

  • identicon
    Anonymous Coward, 4 Sep 2018 @ 12:27pm

    The whole aim is to let the entertainment industries decide what sites we can go to, what can be done (i.e., upload, download, or both), and how much we have to pay. Almost every government, security service and court is doing whatever they can to help achieve this, and at the same time keep the rest of the world off of the internet. Just wait and see whether this happens and how long it takes. And it's all in the name of keeping a few people wealthy and in control of what they want, and stopping us from finding out what members of governments, security services, courts, the rich and the famous are up to!!

    link to this | view in chronology ]

    • identicon
      Anonymous Coward, 4 Sep 2018 @ 12:54pm

      Re:

      It sure seems that way.

      And the result will be the end of the internet. The powers that wanted this might be shocked when everyone stops using the POS after it has been screwed up.

      And the birth of internet 2.0 will hit mainstream. The big wig fat boys will have to start all over again - LOL.

      link to this | view in chronology ]

  • identicon
    Anonymous Coward, 4 Sep 2018 @ 12:35pm

    As far as it goes, but what about amplification

    Yeah, private censorship has plenty of issues, which Techdirt has discussed at length. But it's unrealistic to ignore the context -- social media giants are far from open platforms -- they sell targeting and amplification of messages. So the platforms amplify the effect of the wealthiest and most manipulative voices. This is what has had such a terrible effect on elections and political dialog. Allowing people to screen out the kind of stuff they don't want to hear is irrelevant -- political and ideological advertisers are not really trying to convince anyone who doesn't want to listen, but they are very effective at organizing hate and playing on fear. No amount of user filtering can actually affect this aspect. To the contrary, it will increase the echo chamber effect of social media communications, making it ever easier for users to hear only what they want to hear.

    link to this | view in chronology ]

  • icon
    ECA (profile), 4 Sep 2018 @ 12:49pm

    In the end..

    OPINION should NOT be a problem in any country..
    If your country is a fair and decent place... OPINION is only opinion, NOT an actionable thing. It is one person having a problem; that is the .1% that have a problem that the gov. has no reason to worry about.

    Once an opinion has an effect on the society, THINGS need to be looked at and adjusted to the major parts of society.. Not for one group, but for fairness to all..

    On the internet... IF someone wants to have a SAY, let them.. but NOT to spam other persons.. IF we want to see/listen to them, WE CAN GO TO THEIR CHANNEL and listen.
    It's the same with Trolls.. THEY CAN have a good reason/comment to say. You can even give them a Room of their OWN for their OWN opinions..

    There are many reasons to listen to EVERYONE.. because you ARE NEVER 100% accurate.. there are many sides to anything. Seeing opinions can help you see what you MAY HAVE MISSED..
    How many games are perfect? NONE (not even Windows 10).

    PS...THAT IS MY OPINION.

    link to this | view in chronology ]

    • identicon
      John Smith, 4 Sep 2018 @ 4:13pm

      Re: In the end..

      The problem with OPINIONS is that some people use them to justify death threats, defamation, harassment, and general bullying against those with whom they do not agree.

      Voting for Trump has brought this problem to the surface but it is not unique to politics.

      Our general tolerance for bullying renders free speech concerns moot. We have none.

      link to this | view in chronology ]

      • icon
        ECA (profile), 4 Sep 2018 @ 5:53pm

        Re: Re: In the end..

        Then comes justifiability.. and a bit of logic.
        ASK them to prove the foundation of what they are saying...
        It's not the idea to talk at random, it's the idea that someone HAS an opinion..
        Get them to explain their side.. And if you find them a threat, call the cops.
        That's called SELF POLICING.. something we SHOULD do..
        The only difficulty in all of this tends to be ANON, and finding someone you have little knowledge about.. I've even said it to my chat buddies.. It would take you 15-30 minutes to remember WHERE I LIVE, find the local police/emergency, and get them to my home to ASSIST ME..
        In 15 minutes, anything could have happened.

        link to this | view in chronology ]

  • icon
    sumgai (profile), 4 Sep 2018 @ 10:23pm

    The reason this won't work? It will do no more than reinforce one's opinions, which will stem from what one reads, and accepts, as fact(s).

    The fact that no post ever published, anywhere, at any time, on any medium, has stated "This is my opinion, you should make up your own mind.", or words to that effect, tells me that we internet users, all of us, are pretty much looking to be part of the herd, and not very willing to stand out for ourselves.

    Stated another way, this idea will lead to confirmation bias on a level never before dreamed of. I predict that somewhere along the line, the repercussions of "us versus them" are going to become very ugly.

    Please note that I am not saying that we should be forced to read other's opinions that we find to be repugnant, but simply that we should not wander willy-nilly down a path of "I don't like what 45 is saying, so I'm going to tune him out, just ignore him and all of his cronies." At some point in the future, when they're pretty sure that we're all ignoring them, they're going to come for us. And it won't be with pitchforks and firebrands, either. The survivors will be lamenting that we could've had prior warning, if we were not so close-minded, and had been paying at least a little attention to their rantings and ravings.

    Uncharacteristically, I have no "better solution" to the problem. Sorry, I'll keep thinking on it, but in the meantime....

    sumgai

    p.s. Sorry Mike, hope your Cheerios still taste OK!


    Disclaimer: See the quote in my second paragraph.

    link to this | view in chronology ]

    • icon
      Mike Masnick (profile), 4 Sep 2018 @ 11:49pm

      Re:

      Stated another way, this idea will lead to confirmation bias on a level never before dreamed of.

      A few people have suggested this, but I don't buy it for a variety of reasons.

      1. People have been complaining that Twitter/Facebook/YouTube are ALREADY a giant filter bubble. You already choose who you follow/subscribe to on those sites. So I don't see this as changing that.

      2. The whole concept of the filter bubble is exaggerated anyway. Most people don't care that much, but DO want to avoid assholes/trolls. That's not being in a filter bubble. It's just getting rid of assholes and trolls.

      3. I actually think this could lead to LESS of a filter bubble effect, because those who build the best filters -- meaning high quality, fewer assholes, but still diverse and intelligent -- can get a bigger following.
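      Purely as an illustration of point 3 (not anything Mike has actually specified), here is a minimal sketch of how a reader might subscribe to several independently published filters and layer a personal override on top, so the "best" filters compete for subscribers while the raw feed stays untouched. All names and structures below are invented for the example.

        # Hypothetical sketch of composable, user-controlled filters -- illustration only.
        def make_filter(blocked_users=(), blocked_words=()):
            """Build a simple predicate: True means 'show this comment'."""
            users, words = set(blocked_users), set(blocked_words)
            def allows(author, text):
                return author not in users and not any(w in text.lower() for w in words)
            return allows

        # Two third-party filters a reader might subscribe to, plus a personal allowlist.
        no_spam     = make_filter(blocked_words=["buy followers", "free crypto"])
        no_trolls   = make_filter(blocked_users=["known_troll"])
        always_show = {"mike_masnick"}            # personal override: never hide these authors

        def visible(author, text, subscriptions, allowlist):
            """Show a comment if it's on the allowlist or passes every subscribed filter."""
            return author in allowlist or all(f(author, text) for f in subscriptions)

        print(visible("known_troll", "hello", [no_spam, no_trolls], always_show))          # False
        print(visible("mike_masnick", "free crypto!", [no_spam, no_trolls], always_show))  # True

      A well-curated filter here is just data plus a predicate, so it can be published, forked, and tweaked by anyone who prefers a slightly different mix.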

      link to this | view in chronology ]

      • icon
        sumgai (profile), 5 Sep 2018 @ 11:11am

        Re: Re:

        Mike,

        Allow me to both agree and disagree on some of your points, please.

        1) Alleged filters on social media are something about which I know nothing - I refuse to give up my last shards of privacy, so I don't participate in any way. I limit my on-line presence to fora such as yours, where I usually find evidence of the median participant IQ to be somewhat higher than that of a raw carrot.

        2) The fact that "most people don't care that much" is concerning, at least to me. It says a whale of a lot about apathy, and opens a large diorama of reasons for that willingness to not participate in one's own government, even at a visceral level. Wanting to avoid assholes and/or trolls of various kinds is laudable, but in point of fact, if we avoid them, then we are just pretending to ourselves that they aren't there in reality.

        Compare this to the SESTA/FOSTA crap - "If we remove this (already criminal) activity from the internet, then it will go away entirely". I'm sure you can think of other examples near and dear to your heart.

        To quote Sgt Springer, from my Boot Camp days: "If you stick your head in the sand, then your ass is exposed, with a couple of nice big red rings painted on it." Better to know where they are, what they're doing, and how to keep them from causing ever greater harm.

        3) I must defer to my statements in 1) above. However, I do hold out hope that you're correct, that people will become less inured to the actions of their fellow citizens, particularly those in the business of governing others. I'm not looking for outrage and a desire for retribution; I'll be happy to see merely a high degree of concern and a willingness to express a thoughtful opinion. (Read that last as: not an emotional outpouring, devoid of any rationality.)

        Thanks Mike. ;)


        sumgai

        link to this | view in chronology ]

  • identicon
    Anonymous Coward, 5 Sep 2018 @ 11:34am

    BTW, there's a typo in the second embedded post:
    "of ...anchor markup--p-anchor" rel="nofollow noopener noopener" target="_blank">experiences of police violence"

    link to this | view in chronology ]

  • identicon
    Matthew A. Sawtell, 5 Sep 2018 @ 12:17pm

    So... the end game has been broached...

    ... will we lose the 'safe harbors' in order to police the platforms? If all things are equal, it appears that customers (i.e. end users) are already voting with their feet and leaving places like Twitter and Facebook. Other approaches are only making the powers that be in Beijing, Moscow, and elsewhere laugh.

    link to this | view in chronology ]

  • identicon
    Anonymous Cowherd, 6 Sep 2018 @ 4:23am

    Letting people hide "spam, trolls and hatred" (or whatever it is they don't want to read) by themselves is all well and good, certainly better than having such judgments dictated for everyone from on high by corporate pencil pushers accountable to no one.

    But it's unlikely to satisfy the "do something" people.

    Because they really do want to silence people they disagree with, not just avoid reading them themselves. Complaints about bias in censorship, real or imagined, are often directed not so much at the existence of a bias as at the fact that the bias isn't in their favor.

    link to this | view in chronology ]

