Censorship By Weaponizing Free Speech: Rethinking How The Marketplace Of Ideas Works

from the challenges-of-our-time dept

It should be no surprise that I'm an unabashed supporter of free speech. Usually essays that start that way are followed by a "but..." that undermines everything in that opening sentence. This is not such an essay. However, I am going to talk about some interesting challenges that have been facing our concepts of free speech over the past few years -- often with regard to how free speech and the internet interact. Back in 2015, at our Copia Summit, we had a panel that tried to lay out some of these challenges, acknowledging that our traditional concepts of free speech don't fully work in the internet age.

There are those who argue that internet platforms should never do any moderation at all, and should just let all content flow. And while that may be compelling at first pass, thinking it through shows it's unworkable for a very basic reason: spam. Almost everyone (outside of spammers, I guess) would agree that it makes sense to filter out/moderate/delete spam. It serves no useful purpose. It clutters inboxes/comments/forums with off-topic and annoying messages. So, as Dave Willner mentioned in that talk back in 2015, once you've admitted that spam can be filtered, you've admitted that some moderation is necessary for any functioning forum to exist. Then you get to the actual challenges of when and how that moderation should occur. And that's where things get really tricky. Because I think we can all agree that when platforms do try to moderate speech... they tend to be really bad at it. And that leads to all sorts of stories that we like to cover about social media companies banning people for dumb reasons. But sometimes it crosses over into the absurd or dangerous -- like YouTube deleting channels that were documenting war crimes, because it's difficult to distinguish documentation of war crimes from terrorist propaganda (and, sometimes, they can be one and the same).

An even worse situation, obviously, is when governments take it upon themselves to mandate moderation. Such regimes are almost exclusively used to censor speech that should be protected -- as Germany is now learning with its terrible and ridiculous new social media censorship law.

But it's not that difficult to understand why people have been increasingly clamoring for these kinds of solutions -- either having platforms moderate more aggressively or demanding regulations that require them to do so. And it's because there are a ton of really, really crappy things happening on these platforms. And, as you know, there's always the xkcd free speech point that the concept of free speech is about protecting people from government action, not requiring everyone to suffer through whatever nonsense someone wants to scream.

But, it is becoming clear that we need to think carefully about how we truly encourage free speech. Beyond the spam point above, another argument that has resonated with me over the years is that some platforms have enabled such levels of trolling (or, perhaps to be kinder, "vehement arguing") that they actually lead to less free speech, in that they scare off or silence those who also have valuable contributions to add to various discussions. And that, in turn, raises at least some questions about the "marketplace of ideas" model of understanding free speech. I've long been a supporter of this viewpoint -- that the best way to combat so-called "bad speech" is with "more speech." The theory is that the best/smartest/most important ideas then rise to the top and stomp out the bad ideas. But what if the good ideas don't even have a chance? What if, because of the way these systems are set up, they're silenced before they're even spoken? That, too, would be an unfortunate result for free speech and the "marketplace of ideas".

In the past couple of months, two very interesting pieces have been written on this that are pushing my thinking further as well. The first is a Yale Law Journal piece by Nabiha Syed entitled Real Talk About Fake News: Towards a Better Theory for Platform Governance. Next week, we'll have Syed on our podcast to talk about this paper, but in it she points out that there are limitations and problems with the idea of the "marketplace of ideas" working the way many of us have assumed it should work. She also notes that other frameworks for thinking about free speech appear to have similar deficiencies in an online world. In particular, the internet -- where the scale, speed, and ability to amplify a message are so incredibly different from basically any other time in history -- enables a sort of "weaponizing" of these concepts.

That is, those who wish to abuse the concept of the marketplace of ideas by aggressively pushing misleading or deliberately misguided concepts are able to do so in a manner that short-circuits our concept of the marketplace of ideas -- all while claiming to support it.

The second piece, which is absolutely worth reading and thinking about carefully, is Zeynep Tufekci's Wired piece entitled It's the (Democracy-Poisoning) Golden Age of Free Speech. I was worried -- from the title -- that this might be the standard rant about free speech somehow being "dangerous" that has become tragically popular over the past few years. But (and this is not surprising, given Tufekci's careful consideration of these issues over the years) it's a truly thought-provoking piece, in some ways building on the framework that Syed laid out, noting how some factions are, in effect, weaponizing the very concept of the "marketplace of ideas": insisting they support it while undermining the premise behind it (that "good" speech outweighs the bad).

In particular, she notes that while the previous scarcity was the ability to amplify speech, the current scarcity is attention -- and thus, the ability to flood the zone with bad/wrong/dangerous speech can literally act as a denial of service on the supposedly corrective "good speech." Censorship used to work by stifling the message: traditional censorship blocks the ability to get the message out. But modern censorship leverages the platforms of free speech themselves to drown out other messages.

The most effective forms of censorship today involve meddling with trust and attention, not muzzling speech itself. As a result, they don’t look much like the old forms of censorship at all. They look like viral or coordinated harassment campaigns, which harness the dynamics of viral outrage to impose an unbearable and disproportionate cost on the act of speaking out. They look like epidemics of disinformation, meant to undercut the credibility of valid information sources. They look like bot-fueled campaigns of trolling and distraction, or piecemeal leaks of hacked materials, meant to swamp the attention of traditional media.

These tactics usually don’t break any laws or set off any First Amendment alarm bells. But they all serve the same purpose that the old forms of censorship did: They are the best available tools to stop ideas from spreading and gaining purchase. They can also make the big platforms a terrible place to interact with other people.

There's a truth to that which needs to be reckoned with. As someone who has regularly talked about the marketplace of ideas and how "more speech" is the best way to respond to "bad speech," Tufekci highlights where those concepts break down:

Many more of the most noble old ideas about free speech simply don’t compute in the age of social media. John Stuart Mill’s notion that a “marketplace of ideas” will elevate the truth is flatly belied by the virality of fake news. And the famous American saying that “the best cure for bad speech is more speech”—a paraphrase of Supreme Court justice Louis Brandeis—loses all its meaning when speech is at once mass but also nonpublic. How do you respond to what you cannot see? How can you cure the effects of “bad” speech with more speech when you have no means to target the same audience that received the original message?

As she notes, this is "not a call for nostalgia." It is quite clear that these platforms also have tremendous and incredibly important benefits. They have given voice to the formerly voiceless. There are, certainly, areas where the marketplace of ideas functions, and the ability to debate and have discourse actually does work. Indeed, I'd argue that it probably happens much more often than people realize. But it's difficult to deny that some have weaponized these concepts in a manner designed to flood the marketplace of ideas and drown out the good ideas, or to strategically use the "more speech" response to actually amplify and reinforce the "bad speech" rather than correct it.

And that's something we need to reckon with.

It's also an area where I don't think there are necessarily easy solutions -- but having this discussion is important. I still think that companies will be bad at moderation. And I still think government mandates will make the problems significantly worse, not better. And I very much worry that solutions may actually do more harm than good in some cases -- especially in dragging down or silencing important, but marginalized, voices. I also think it's dangerous that many people immediately jump to the platforms as the obvious place to put all responsibility here. There needs to be responsibility on the part of end users as well -- to be more critical, to have more media literacy.

And, of course, I think that there is a space for technology to potentially help solve some of these issues as well. As I've discussed in the past, greater transparency can help, as would putting more control into the hands of end users, rather than relying on the platforms to make these decisions.

But it is an area that raises some very real -- and very different -- challenges, especially for those of us who find free speech and free expression to be an essential and core value. What do we do when that free speech is being weaponized against free speech itself? How do you respond? Do you need to weaponize in response and flood out the "bad speech," or does that just create an arms race? What other ways are there to deal with this?

This is a discussion that was started a while back, but is increasingly important -- and I expect that we'll be writing a lot more about it in the near future.


Filed Under: attention, censorship, denial of service, free expression, free speech, marketplace of ideas, nabiha syed, zeynep tufekci


Reader Comments



  • Anonymous Coward, 24 Jan 2018 @ 12:19pm (flagged by the community)

    And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

    "It is quite clear that these platforms also have tremendous and incredibly important benefits. They have given voice to the formerly voiceless."

    Again, "platforms" implies NEUTRAL. They even already got immunity for what others publish. So long as within common law, a "platform" has ZERO right to control speech, or the speech outlet.

    • Anonymous Coward, 24 Jan 2018 @ 12:20pm (flagged by the community)

      Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

      And "best way to combat so-called "bad speech" is with "more speech."" -- BUT in practical fact, YOU have banned the home IP addresses of dissenters here, including me, and YOU or other administrator have "hidden" thousands of my and others comments here at TD to disadvantage dissent.

      In short, you're so below hypocrite in this area that I had to coin the term "masnocrit".

      • Anonymous Coward, 24 Jan 2018 @ 12:21pm (flagged by the community)

        Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

        "I expect that we'll be writing a lot more about it in the near future" -- But not proposing any specific policy, let alone practicing it. -- Though I admit that after I complained for several years, you eventually responded, and that since THE CHANGE in September, I rarely get the "Held For Moderation" or my comments "hidden". But I don't think that's due to your changing, as your actions behind the scenes blatantly contradicted your public statements for YEARS.

        • Anonymous Coward, 24 Jan 2018 @ 1:44pm

          Re: Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

          LOL.

        • Anonymous Coward, 24 Jan 2018 @ 2:47pm

          Re: Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

          Your comments are hidden by us flagging them because you are abusive, spamming, and a troll. TD has nothing to do with it.

          Stop this behavior and debate us logically and politely and we'll stop flagging your comments into oblivion. Continue and well, nothing changes.

          BTW, flagged for abuse/spam/trolling.

      • Not an Electronic Rodent (profile), 24 Jan 2018 @ 12:39pm

        Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

        YOU or other administrator have "hidden" thousands of my and others comments here at TD to disadvantage dissent.

        Or possibly they get hidden because a large number of commenters hit the report button because they're sick of listening to your ad-hom whining and spamming?

        • Anonymous Coward, 24 Jan 2018 @ 1:20pm (flagged by the community)

          Re: Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

          Well, it's not a secret that it is easy to trigger those snowflakes too. It doesn't really matter; no matter where you go, someone wants to suppress freedom of speech somewhere.

          The only difference is whether you are being silenced or your enemy is being silenced.

          I rather enjoy when my enemy is being silenced... I am just too stupid to realize that it also sets the tone and mindset for me to be silenced eventually too. A lesson few ever learn. Reminds me of how America went nuts against communism, even secret spying by government against communist citizens... and now... the communists have an even greater foothold in America through socialism. And the goal of socialism is communism, though you won't find many able to understand that in today's ignorance.

          People often fail to realize that attempts to silence the enemy usually results in a backlash that not only undoes your efforts but also sets you back even further than if you had done nothing at all.

          The likes of Donald Trump are a good example of this phenomenon. The media's repeated scoffing and derision wound up giving Trump a platform to run on, where he could champion all the people who felt marginalized by bigger voices.

          It is best to let the idiot run their mouths and run out of steam. The more you plug them up, the more pressure builds, until shit hits the fan.

          • Anonymous Coward, 24 Jan 2018 @ 1:30pm

            Re: Re: Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

            No matter how much you beg, Donny’s not going to let you suck him off.

          • Anonymous Coward, 24 Jan 2018 @ 2:52pm

            Re: Re: Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

            I am just too stupid to realize that it also sets the tone and mindset for me to be silenced eventually too.

            I am just too stupid

            This made my day. Thank you for admitting that you are, in fact, a fool.

          • Stephen T. Stone (profile), 24 Jan 2018 @ 4:29pm

            Re: Re: Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

            I rather enjoy when my enemy is being silenced.

            The “enemy” whom you enjoy being silenced is another person, not some remorseless subterranean monster that is trying to eat your head. They deserve as much of a chance to speak their mind as you do. Silencing them gives them a chance to become a martyr for free speech—the rise in popularity and visibility of “alt-right” provocateurs like Richard Spencer, for example, is due in part to people trying to silence him one way or another.

            If you want to win a “war” like political debate, you cannot resort to silencing others. You have to outperform them. Unravel their arguments from the inside out and show those arguments for what they are. If you can make their arguments look like shit without resorting to personal attacks or “otherwording” (“I like dogs!” “In other words, you hate cats.”) or other logical fallacies, you stand a much better chance of defeating your “enemy”.

            Of course, that approach would rely on you both debating in good faith and recognizing that people with differing opinions than you are still people. You have shown no capability of doing either.

      • An Onymous Coward (profile), 24 Jan 2018 @ 1:18pm

        Re: Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

        Did you consider that yours is the "bad speech" we hope to drown out with "more speech"?

        • Thad, 24 Jan 2018 @ 4:42pm

          Re: Re: Re: And yet YOU frequently state that "these platforms" are made for walkin', and that's just what they'll do, and one of these days these boots are gonna walk all over you

          I think of it more as the bad speech that's drowning out good speech.

    • Anonymous Coward, 24 Jan 2018 @ 1:28pm

      Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

      Statutory laws trump common law.

    • Anonymous Coward, 24 Jan 2018 @ 5:32pm

      Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

      Every time you spam useless commentary on this website it makes me laugh to the point of stitches.

      For websites to apply the standards that you demand would see your blue ass blocked so hard, you'd be tasting the contents of your large intestine faster than you can say "only pirates support net neutrality".

      C'mon, blue. Shiva Ayyadurai lost. Hamilton got a clue and he doesn't dream about this site no more. If Techdirt is as insignificant and not taken seriously at all as you and your troll fuckbuddies say, why do you spend so much time here?

      Anyone with two brain cells to rub together can tell the only reason why you spend so much time on a site you loathe the guts of and proudly claim to have no significant effect whatsoever is that you're afraid.

    • Anonymous Coward, 24 Jan 2018 @ 6:03pm

      Re: And yet YOU frequently state that "these platforms" have their own "First Amendment" RIGHT to silence those voices at any time, for any reason.

      oh but here I and other anonymous commenters like me have the power; here ya go, here's your report vote...

  • Rich Kulawiec, 24 Jan 2018 @ 12:32pm

    Great piece; here are a few observations

    1. Something I've said for years is that spam (and other related forms of abuse) are not speech, just as a brick with an attached note thrown through a window is not publication.

    2. We figured out over thirty years ago that moderation was probably going to be necessary -- whether or not we liked it, and a lot of us didn't. But as soon as any forum reaches sufficient size and reach, it will likely include bad actors. The lessons we learned on Usenet in the early to mid 1980s are still directly applicable today.

    3. I've read Tufekci's piece. It's very good. I recommend it to everyone.

    4. The people who built "social media" operations built them to monetize private information, not to provide discussion forums. It's thus unsurprising that they're much better at the former than at the latter.

    5. Those same people made a catastrophic strategic blunder well before they ever plugged in the first server. They didn't plan for abuse at scale. They SHOULD have: it was a well-known problem long before any of these operations began. They SHOULD have known it was coming. They SHOULD have designed in mechanisms to deal with it from the time that their operations were ideas on a whiteboard. They SHOULD have budgeted and staffed for it. They SHOULD have had robust, scalable procedures in place.

    But they didn't do any of that. Instead they ignored the problem. Then they denied it. Only now, belatedly, are they starting to address it, and their efforts are -- as we can all see -- haphazard and ineffective. And they're still making mistakes that we all figured out were mistakes decades ago when we made them. (For example, they're still focused on algorithms. Wrong.)

    6. As a direct result of (5), these operations are largely not under the control of their ostensible owners. Not any more. They're available for weaponization by anyone with the requisite resources. This is a stunning level of negligence, incompetence, and irresponsibility -- and a reckoning for it is long overdue.

    • Anonymous Coward, 24 Jan 2018 @ 12:45pm

      Re: Great piece; here are a few observations

      A quick question: how do you build and scale an organization to moderate the speech of everybody who is alive, without building a totalitarian nightmare?

      • Rich Kulawiec, 24 Jan 2018 @ 1:05pm

        Re: Re: Great piece; here are a few observations

        Good question. I have a two-part answer, partly snarky, partly serious.

        1. You hire people who've dealt with this problem before and you get them involved early. When they say "you really don't want to do that," you don't do it. When they say "you need to do this," you do it. If the former requires leaving money on the table, and the latter requires spending money, you do them both anyway.

        2. You start by recognizing that "how do you build and scale an organization to moderate the speech of everybody who is alive" is not a correct statement of the problem you have to solve. I'm not being snarky (again) here: stating the problem correctly is key to solving it. You only have to moderate some of the speech of some of the people who use your operation, and that's a far smaller number than the population of the planet.

        And while moderation is a useful tool, it's reactive. Proactive measures are much better, because they stop the problem before it can gain a foothold AND they reduce the scope of the remaining problem so that other measures have a better chance of dealing with it gracefully. In the interest of brevity, I'll give you one example: bidirectionally firewall anything on the DROP list, and refuse signups from anyone with an email address in a domain whose A, NS, or MX records are in DROP-listed space.
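        To make that one example concrete, a minimal sketch of the kind of check described above -- purely illustrative, not any platform's actual code -- might look like this. It assumes the third-party dnspython and requests packages; the feed URL and the reject_signup() hook are stand-ins:

        import ipaddress

        import dns.exception
        import dns.resolver
        import requests

        # Assumed location of the plain-text DROP feed; adjust to taste.
        DROP_URL = "https://www.spamhaus.org/drop/drop.txt"

        def load_drop_networks(url=DROP_URL):
            """Parse the DROP feed into ip_network objects, skipping '; ...' comments."""
            nets = []
            for line in requests.get(url, timeout=10).text.splitlines():
                entry = line.split(";")[0].strip()
                if entry:
                    nets.append(ipaddress.ip_network(entry))
            return nets

        def addresses_behind(domain):
            """Collect the IPv4 addresses behind a domain's A, NS and MX records."""

            def a_records(name):
                try:
                    return {r.address for r in dns.resolver.resolve(name, "A")}
                except dns.exception.DNSException:
                    return set()

            addrs = a_records(domain)
            for rtype in ("NS", "MX"):
                try:
                    answers = dns.resolver.resolve(domain, rtype)
                except dns.exception.DNSException:
                    continue
                for rec in answers:
                    target = rec.exchange if rtype == "MX" else rec.target
                    addrs |= a_records(str(target))
            return addrs

        def domain_is_drop_listed(domain, nets):
            """True if any address behind the domain falls inside a DROP-listed network."""
            return any(
                ipaddress.ip_address(addr) in net
                for addr in addresses_behind(domain)
                for net in nets
            )

        # Hypothetical use at signup time (reject_signup is a stand-in for the
        # platform's own refusal path):
        #   nets = load_drop_networks()
        #   if domain_is_drop_listed(email_address.split("@", 1)[1], nets):
        #       reject_signup()

        (Again, that's one measure out of many, and clueful abusers can route around it; the point is only that the check is cheap and happens before an account ever exists.)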

        • Anonymous Coward, 24 Jan 2018 @ 1:25pm (flagged by the community)

          Re: Re: Re: Great piece; here are a few observations

          "I have a two-part answer, partly snarky, partly serious."

          You forgot to add "all stupid and ignorant too."

          You can't do it, no matter what you just can't do it. Someone somehow is going to use the machine you helped build to put you in your place for your own good. It's why the machine gets built in the first place, to do just exactly that under the guise of protecting you. In fact that is all government in a nutshell, to put society and its operatives in their place for their own good when deemed necessary, all under the guise of protecting you.

          If you want government, then you had better keep it well heeled, or it will turn on you and rip your throat out without a second thought.

          • Roger Strong (profile), 24 Jan 2018 @ 2:53pm

            Re: Re: Re: Re: Great piece; here are a few observations

            If you want government, then you had better keep it well heeled, or it will turn on you and rip your throat out without a second thought.

            Likewise without government, the biggest thug turns on you and rips your throat out without a second thought. And becomes the government.

            So you need to keep both the government and the thugs - and polluters, scammers, spammers, etc. - well heeled. There's a word for that. It starts with 'g.'

            Look; we did the experiment. Usenet and its tens of thousands of newsgroups had no moderation. The sci.space groups were wonderful - full of people actually working on real launch systems. There were newsgroups where Hollywood producers would hang out, tell anecdotes and answer questions.

            Then AOL connected to Usenet, and then the general public all got internet and Usenet access. The sci.space groups filled with infantile conspiracy theorists and trolls attacking the professionals. The same thing happened in other groups. The signal-to-noise level dropped and then disappeared, and the professionals left.

            Some moderation - some government - is necessary. Even if it's just the users themselves flagging abuse, trolling and spam to hide it.

            • john smith, 24 Jan 2018 @ 3:07pm

              Re: Re: Re: Re: Re: Great piece; here are a few observations

              USENET was a victim of its own success. Its total free speech allowed whistleblowers and dissenters to speak out without fear of being censored or even shouted down. Debates were won by truth, not by force.

              Once the communities were built around truth and free speech, the information was public domain, and whoever could move the audience to moderated websites would get rich. Sometimes the moderators were the ones harassing anyone who would try to contribute. The inability to handle USENET is a reflection on us as a species, not on speech. Section 230 and the lack of ability to get justice when people crossed lines took away the fear of God from malfeasors.

              Once the moderators hijacked the USENET audience, commercial versions of the groups, with no permanent archive, and controlled by a single person or company, used their control over the flow of information to make money. This happened several times.

              To bash USENET because of free speech would be like saying that someone's music must stink because everyone walked out of the concert, even if they did it to sabotage the musician. Fake reviews, fake news, all of it is premised on audience stupidity, and some "protector" saying we need them as a guide. We don't. Even the "noise" on USENET was no problem, but the free speech sure was to those who wanted to scam money from others without having to defend their product or service.

              The other problem is anonymous defamation, through remailers, which left any target defenseless thanks to Section 230. "Sue the original poster," they say, but if that's some homeless person in Bulgaria, paid by some lawyer in the southwest, nothing will happen. Lawyers have a financial interest in engineering litigation and that's why they love 230: just put the bait there, wait for someone to grab it, republish it as their own words, get sued, then the lawyer shows up to put out the fire the lawyer started.

              We need USENET more than ever right now but the audience doesn't care. The Running Man's depiction of the American people was correct.

              • Roger Strong (profile), 24 Jan 2018 @ 3:50pm

                Re: Re: Re: Re: Re: Re: Great piece; here are a few observations

                whoever could move the audience to moderated websites would get rich [...] Once the moderators hijacked the USENET audience

                There were no website moderators hijacking the USENET audience. It was the flood of abuse, trolling and noise that drove everyone to look elsewhere.

                To bash USENET because of free speech would be like saying that someone's music must stink because everyone walked out of the concert, even if they did it to sabotage the musician.

                I'm not bashing USENET. I'm not saying that someone's music stinks. I'm saying that when the musician is constantly drowned out by air horns, megaphones and screamed personal attacks, there's no point in them playing. They go somewhere else where air horns and megaphones aren't allowed, and the music lovers follow.

                Air horn, megaphone and screamed personal attack connoisseurs may call it censorship - it isn't - but it's vital that those places exist for the music lovers.

              • Anonymous Coward, 24 Jan 2018 @ 4:11pm

                Re: Re: Re: Re: Re: Re: Great piece; here are a few observations

                So you want to hold innocent people accountable for someone else's bad behavior, even though they had nothing to do with it? Yeah, GREAT idea. That will solve everything.

                One of the biggest reasons we have the robust and open internet we have today is in part BECAUSE of Section 230.

                And even if that were to happen it still wouldn't get 'justice' because the actual people responsible got off scot-free with no repercussions. All they have to do is create a new account from a different IP and they can do it all over again. Instead of going after the actual culprits, victims will just go after the bigger, easier target, never mind the fact that the actual perpetrators get away with it. We got our money, that's all we care about.

                Holding Twitter/Facebook/Youtube/etc... responsible for something their users did is not justice, that's abuse of justice and instant gratification. Just because you can't get at or find the person responsible doesn't mean that you shift that guilt onto someone who isn't responsible.

                That isn't a good idea, that's immoral, unfair, and just plain stupid.

              • Rich Kulawiec, 24 Jan 2018 @ 5:13pm

                Re: Re: Re: Re: Re: Re: Great piece; here are a few observations

                You know, I hate to burst your bubble, but Usenet had moderated newsgroups from nearly the beginning. I know, I ran some of them. (Usenet prior to the Great Renaming used two namespaces: net.* for most newsgroups, mod.* for those which were moderated. After the great renaming, newsgroup names, for the most part, did not reflect their moderated/unmoderated status.)

                Usenet had many issues, and lots of people, including me, have written about them at great length elsewhere. Some of those issues posed difficult problems, and an awful lot of time and energy was expended wrestling with those, nearly all of it by (a) volunteers (b) acting in good faith (c) to preserve a useful communications resource.

                But what really did it in was the September That Never Ended. (Usenet veterans knew that every September would bring an influx of newbies who were starting their freshman year in college and thus would have Usenet access for the first time. We had grown accustomed to this and were ready for it. But we weren't ready for what AOL did.)

                Despite all that, though, parts of Usenet remain vibrant. The C language newsgroup, for example, is doing quite well, and comp.misc has had a revival following some unpopular changes at Slashdot. And we learned -- bitter lessons, to be sure, but we learned. Thus my acidic critique of today's "social media" for not paying attention to things we figured out a quarter century ago.

                • btr1701 (profile), 25 Jan 2018 @ 12:15pm

                  Re: Re: Re: Re: Re: Re: Re: Great piece; here are a few observations

                  > Despite all that, though, parts of Usenet remain vibrant.

                  rec.arts.tv is still a functioning TV discussion group. As everywhere, there are trolls to ignore and spam to filter, but there's a core group of real people who have real discussions about TV.

                • carlb, 25 Jan 2018 @ 4:36pm

                  What we learned a quarter-century ago?

                  You kids these days... sheesh. These issues don't merely date to USENET and something "a quarter-century ago". The same nonsense was going on using spark gap radio transmitters and Morse code back in the days of the RMS Titanic. Pre-1912, there was relatively little control over radio as a shared medium; due to technical limitations, everyone from merchant shipping to the military to amateur radio "hams" building transmitters out of Ford motorcar ignition coils was forced to mostly share the same few frequencies -- and the result was cacophony. One could make this open new medium suddenly much less useful by sending garbage on the same channel -- everything from (expletive deleted) ad infinitum to false naval orders.

                  Eventually, the marine Morse code was left on 500kc and the "hams" moved way up the dial, with everything regulated to death by various governments. That didn't stop all the problems -- the use of shortwave transmitters to maliciously interfere with foreign signals was widespread in the Cold War-era Soviet bloc, as was the endless broadcast of propaganda. The wilful interference still goes on in places like Iran, which target BBC broadcasts in Persian and the satellite uplinks which carry them.

                  There's nothing magical about Usenet, HTTP, social media or anything else -- obstructing free speech by broadcasting co-channel interference predates the RMS Titanic, back in the Morse code era. It's just harder to direction-find the source of all the noise.

          • Dingledore the Previously Impervious, 29 Jan 2018 @ 4:14am

            Re: Re: Re: Re: Great piece; here are a few observations

            "If you want government, then you had better keep it well heeled, or it will turn on you and rip your throat out without a second thought."

            That's a flawed statement on so many levels. Government is inherent because it's representative of the people. How you organise your government is up to the people, whatever the size of the group.

            We are both social and vicious animals. Things fail if the balance between those two tips one way or the other. But it's not the government that dictates that, it's the people.

        • Anonymous Coward, 24 Jan 2018 @ 1:26pm

          Re: Re: Re: Great piece; here are a few observations

          >bidirectionally firewall anything on the DROP list, and refuse signups from anyone with an email address in a domain whose A, NS, or MX records are in DROP-listed space.

          That is the same as saying some people have the right to speak, and others do not, especially when that results in the dropping of an organization due to the intentional activities of one or two people. You have also cut off the primary means of the organization getting in touch with you to rectify any problems in that approach.

          It also has another failure mode: anybody can get an email address in a domain that you cannot realistically block, i.e., Google.

          Finally, you are effectively banning anonymous speech, or at least stopping people who do not have the knowledge to obtain an email address for an online handle.

          • Anonymous Coward, 24 Jan 2018 @ 1:39pm

            Re: Re: Re: Re: Great piece; here are a few observations

            First of all, NOBODY has a right to speak on any platform. These are private operations. So let's leave out "rights" because they don't apply.

            Second, this is not my first day on the job. I'm well aware of the mechanisms that can be used to bypass this one example, which is why I said -- please go back and read -- "one example". Not only did I cover this extremely briefly, but it's only one out of hundreds of examples. No, I'm not going to type in a book's worth of proven abuse control strategies and tactics learned (often painfully) over decades. If you want that knowledge, including an exhaustive analysis of the features/drawbacks of each one, hire me.

            Third, the only people/organizations in DROP-listed space are spammers, phishers, malware purveyors, and network hijackers. That's why it's DROP-listed. There is nobody there that any sane platform wants as a user.

            • John Smith, 24 Jan 2018 @ 3:09pm

              Re: Re: Re: Re: Re: Great piece; here are a few observations

              Correct. No one has the right to speak on any platform.

              No platform has the right to claim credibility if they censor even one speaker. This is especially true if they sell advertising, because we don't know if they are silencing critics of the sponsor.

              This is why USENET is the answer, but no one cares. At least with social media the postings can be traced and easily authenticated. This is actually much preferable. There's also the libel loophole that you can't defame someone on their own page, so if they don't moderate their comments, well (this also lets them delete it or respond).

              • Thad, 24 Jan 2018 @ 4:49pm

                Re: Re: Re: Re: Re: Re: Great piece; here are a few observations

                No platform has the right to claim credibility if they censor even one speaker.

                Yeah, sure. If a moderator deletes a post because it doxxed somebody, or bans a user who keeps dropping in to rant about how the government did 9/11 while faking the moon landing and taking away his paint chips, it's the platform that lacks credibility.

        • orbitalinsertion (profile), 24 Jan 2018 @ 3:06pm

          Re: Re: Re: Great piece; here are a few observations

          Something to consider when offering such a platform is _don't build an attractive nuisance_. And the platform shouldn't engage in spammy nuisance behavior itself. Frequently, that seems to be in conflict with max monetization, which is why IPOs are the arrows pointing to the next step down the drain.

          They all encourage the poor behavior or garbage content which they now seem to want to filter, or have been ordered to do so. They have done it through their own example. And yeah, most could have avoided a fair chunk of this by listening to people who have been dealing with digital fora since even before the damn internet. Although I have this idea that some of the people involved with creating such platforms were the trolls of older platforms.

          • Rich Kulawiec, 24 Jan 2018 @ 5:04pm

            Re: Re: Re: Re: Great piece; here are a few observations

            You're absolutely right. The term that's sometimes used to describe such platforms is "abuse magnet". With experience, we can see them coming from the time they're a vaporware announcement. Unfortunately, in almost all cases, the people involved are so invested in what they're doing -- no matter how awful it is -- that they won't listen. And then the inevitable happens, and then they say "...but nobody could have foreseen" despite the fact that we all foresaw it.

        • loquor, loqui, ergo sum, 24 Jan 2018 @ 9:59pm

          Re: Re: Re: Great piece; here are a few observations

          But who gets to decide the block-ees that go on the DROP list? You? The platform owners, as private entities, do have some rights to limit speech on their platforms, up until the point they become a de facto town square; they don't want to be seen as doing that, though, since only sycophants to their agenda will sign up, and then where will the headlines and attention come from? Without headlines, where will the $ flood in from?

          That's the problem with your theory of 'moderation'. It isn't actually moderation at all, it's enforced speech when it comes to these platforms. If you don't say the right things in precisely the right way for today's 'moderator' then you don't get to speak. It's just a different weaponized use of censorship.

          There is a very simple answer to this -- stop listening to what you don't want to hear. No one MAKES you log in 8277 times a day to see what's on Twitter. No one MAKES you go see Sally's Facebook feed with more pictures of her kitten and world-wide pronouncements of how she woke up this morning with frizzy hair. Just like no one MAKES you stop on the street to listen to the street preacher. You can CHOOSE to ignore everything you don't want to interact with. Exercise that choice rather than disabling another's ability to speak, even when it is just noise.

          There is no 'good speech' or 'bad speech', there is only speech. Good and bad are your perspectives. The Declaration of Independence was treason to the King and Parliament, it was the morning stars singing to a new dawn of freedom to the colonists. The same words on the same parchment. Good speech and bad speech simultaneously. Mike, if you're not 100% in support of free speech, you're not in support of it at all. You've said 'but' and maybe don't even realize it. I hope you tread carefully here. While I don't agree with much of what you say, you've been quite steadfast in your defense of people's right to speak, but when you go down this road, even one step, you're down a different road you likely don't want to travel.

          The real problem here is the utter lack of humility and an infinitely over-inflated sense of self worth mankind has developed. This idea that 'self' is all-in-all and that your sphere IS the universe, which you rule, so that anything entering your sphere uninvited is anathema and as the supreme ruler of your sphere you have the right to deem it bad and execute judgment against it; the self is your new god and anyone who blasphemes you must be silenced.

          Get over yourselves. Plug your ears, avert your eyes, stop looking at your damned phone and actually look a person in the eye once in awhile. Realize that your every trip to the coffee shop bathroom is not headline, above the fold news, that you are not remarkable because you excreted waste from your bowels. You're not remarkable because you drew breath. Like every other collection of atoms in the universe, you may not be remarkable at all. Accept that and be content with what you are, and stop worrying about what everyone else is or isn't. Stop telling people what they can or cannot say, and just say what you want to say. Your existence in the universe is not dependent upon everyone liking what you say. There is no plebiscite being held tomorrow regarding your existence, with 'likes' and 'upvotes' counted like absentee ballots.

          Expression is one of the things that separates humanity from the rest of the creation. Use it or lose it.

          • Mike Masnick (profile), 24 Jan 2018 @ 10:48pm

            Re: Re: Re: Re: Great piece; here are a few observations

            There is no 'good speech' or 'bad speech', there is only speech.

            That's a valueless perspective that I reject. I agree that it's all relative, but I think most sane people can accept that some speech can be problematic. THAT DOES NOT MEAN WE SHOULD CENSOR THAT SPEECH. Nowhere do I advocate for censorship. However, my point is that the simple belief that "more speech" can respond to problematic speech does not always work (many times it does -- but we're increasingly seeing cases where it does not). Which is what I'm raising here. What do we do about it that IS NOT censorship, while recognizing that doing nothing IS LEADING TO CENSORSHIP?

            My concern is the LOSS of free speech by people using these platforms as a way to censor those they dislike.

            Mike, if you're not 100% in support of free speech, you're not in support of it at all. You've said 'but' and maybe don't even realize it. I hope you tread carefully here. While I don't agree with much of what you say, you've been quite steadfast in your defense of people's right to speak, but when you go down this road, even one step, you're down a different road you likely don't want to travel.

            I believe you've misread the post. I do not advocate for censorship. I'm saying the opposite. I'm saying that we're actually ending up getting MORE CENSORSHIP and MORE ATTACKS on free speech by framing things incorrectly. My concern is how do we enable more speech -- and not create systems that can be gamed to drown out speech.

            I have not said 'but.' I have not advocated for censorship. I'm saying the opposite. You are reading something into what I wrote that I did not say.

            Get over yourselves. Plug your ears, avert your eyes, stop looking at your damned phone and actually look a person in the eye once in awhile.

            That's not the issue we're discussing. If that were all it took to solve these issues you'd be right. But it's not.

            Expression is one of the things that separates humanity from the rest of the creation. Use it or lose it.

            Yup. And that's why I'm concerned about the censorship I described in the post, in which voices are being silenced.

            • loquor, loqui, ergo sum, 26 Jan 2018 @ 7:43am

              Re: Re: Re: Re: Re: Great piece; here are a few observations

              I understand what you're saying, Mike, but people on an electronic platform don't actually have the ability to censor others (unless they own the platform) just via volume of speech.

              This isn't like in the physical world, where if you're holding a sign on a street corner and someone comes to intentionally block your sign with a tarp or a sheet or something, they're using their 'speech' to censor your 'speech'. Saying that idiots mass Tweeting in response to something makes people want to shut down their Twitter account or not engage isn't the same thing. I see where it's close, and I see the argument you're making, but at the end of the day the person 'censored' isn't actually prevented from continuing to place their position in the marketplace.

              It seems to me you're talking about 'heckler's veto' in an electronic age. But it doesn't work quite the same way as it does, again, in the physical realm. Punching someone in the nose to get them to shut up is clearly illegal, even if you find the speech repugnant. What's the analog in the digital realm?

              Platforms COULD potentially do something to make this work, but to do so requires them to actively be engaged in the defense of all speech, and almost no platform provider has that level of courage. Even Cloudflare failed on this one, and they're one of the most important ones out there because they keep the pipes open, not just the platform.

              Your assumption that sane people can agree on what is good or bad speech is assuming facts not, in the entire history of mankind, in evidence. Sanity itself is not an agreed upon construct. To say that all speech is sacrosanct IS the value statement. Everyone should get their turn when they want it, to say whatever it is they want to say. Any time you start applying a judgment parameter to the value of the speaker (sane or insane) or the speech (hate/crazy/liberal/conservative) you are inherently starting down the path to limit it in a way that will eventually lead to someone in control of your speech who sees you as crazy.

              In the physical world, it is actually very easy to ensure that everyone has the ability to say what they want -- you put opposing viewpoints on separate street corners and tell them to stay there. No one is able to stop any other from speaking or saying what they want, and the sidewalks (which are held in public trust for just this purpose) are allowed to be used by everyone. If, when you do that, a sole speaker of an unpopular view decides he can't take the overwhelming response on the other side of the street, that's his problem -- his convictions weren't strong enough in his position; it's not censorship.

              It's even easier in the digital realm to stand by your position (with the exception of DDoS, which is the equivalent of the sheet or tarp in the physical realm).
              Loss of conviction and willingness to stand by your position and articulate it at a keyboard is not the same thing as not being ABLE to do so because the ability has been taken away. If you don't have the fortitude to stand by your beliefs and articulate them (if you feel the need to) you don't get to blame other people for that because they did the same thing you did -- spoke up.

              I think the argument that somehow allowing all speech to happen causes us to lose speech is specious. It's simply not the same digitally as it is physically. If it's important to you to speak, then you should be able to, and since you can't stop me from typing, you can't censor me, unless you take the platform away.

              • Mike Masnick (profile), 26 Jan 2018 @ 5:18pm

                Re: Re: Re: Re: Re: Re: Great piece; here are a few observations

                You seem to keep insisting you know what I'm saying while 100% misrepresenting what I'm saying. That's... impressive.

                Your assumption that sane people can agree on what is good or bad speech is assuming facts not, in the entire history of mankind, in evidence.

                I am saying exactly the opposite of this. I do not think people can agree on what is good or bad speech -- and have spent literally YEARS speaking out against hate speech and other laws that purport to make "bad" speech illegal.

                Why do you keep putting false words in my mouth?

                Mike, but people on an electronic platform don't actually have the ability to censor others (unless they own the platform) just via volume of speech.

                It is not "volume" that is the concern -- it is, AS CLEARLY STATED IN THE ARTICLE, the scarcity of attention that is the problem, and the ability to "game" the system in a way that silences people and effectively censors them. That is the concern.

          • Rich Kulawiec, 25 Jan 2018 @ 12:50am

            Re: Re: Re: Re: Great piece; here are a few observations

            "But who gets to decide the block-ees that go on the DROP list? You?"

            I don't have, and have never had, any authority over the DROP list. It's run by Spamhaus, and if you could stop ranting for a few minutes, maybe you could head over to their web site and learn about it...and why you really, really want to be using it in your border routers or external firewalls.

            What I do get to decide is what measures I'll use to defend operations that I'm responsible for, and the DROP list is a no-brainer. So much so that if I wasn't using it, it could be argued that I wasn't doing my job adequately.

            Like I said: only one measure, out of many. They all have their positives/negatives, they all can be evaded by clueful abusers, they all have their implementation costs, they all (well, mostly) interact. I can't summarize decades of operational experience for you in a text box, nor am I going to try. But I can point out that judicious application of these is a known-working approach toward solving the problem. And I can point out that failure to deploy these is a known-failed approach that inevitably leads to the destruction of the platform in question.

        • Ishtiaq (profile), 28 Jan 2018 @ 9:25pm

          Re: Re: Re: Great piece; here are a few observations

          @Rich Kulawiec

          Quote: "Proactive measures are much better, because they stop the problem before it can gain a foothold AND they reduce the scope of the remaining problem so that other measures have a better chance of dealing with it gracefully." Unquote.

          Is that not what China and North Korea are doing? (And my government is trying to do the same). And they rightfully get slagged off for doing it.

          What is the difference between moderation and censoring?

          Regardless of whether it is forum users shutting people up by downvoting them into silence or moderators shutting them up, it is still censorship.

          And as earlier comments show, when the forum users do the censoring it verges on bullying.

          End rant… Cheers… Ishy

            Wendy Cockcroft, 29 Jan 2018 @ 2:31am

            Re: Re: Re: Re: Great piece; here are a few observations

            @Ishy

            You're assuming people have a right to be heard at all times. Not true.

            I haven't downvoted your comment, nor will I, because it's not abusive. If, however, you were calling names or otherwise trolling, I would.

            Why should I, as a reader, be obliged to wade through a ton of spam and trolling to get to the comments I'm interested in? And why should I have the speech of others foisted on me? I'm not interested in sunglasses, get rich quick schemes, or anything like that. And, per your logic, they are speech and shouldn't be moderated.

            I sometimes open the hidden comments — only to discover why they were hidden. Rule of thumb: if you disagree, do so politely, giving reasons why. As you did, actually.

              The Wanderer (profile), 29 Jan 2018 @ 4:48am

              Re: Re: Re: Re: Re: Great piece; here are a few observations

              I think the underlying argument is that "preventing a potential listener from hearing the speech, and therefore preventing that listener from being able to decide whether or not to listen" is functionally equivalent to "preventing the speaker from speaking".

              And I think there's a certain amount of merit to that argument, particularly in cases where the potential listener has no input on whether or not such filtering occurs or on what is filtered when it does occur.

              Where it falls down is when the potential listener does know that the filtering is taking place, and what (sort of thing) is being filtered, and has actively chosen or approved this.

              Most governmental filtering is the former type. Essentially all individual filtering is the latter type. The question is which type the platform-based filtering is - and I think the answer to that is much more varied.

                Wendy Cockcroft, 29 Jan 2018 @ 5:42am

                Re: Re: Re: Re: Re: Re: Great piece; here are a few observations

                Agreed. And we can all agree there is no perfect solution; people will always find ways of disrupting discussions, shouting other people down, spamming, and trolling.

                The question I believed was being asked is, "Should we try to manage this or not?" I believe we totally should. How to do so without merely silencing dissent will take a lot of discussion and I don't believe there's a perfect answer.

            Thad, 29 Jan 2018 @ 10:38am

            Re: Re: Re: Re: Great piece; here are a few observations

            I'd link you to the relevant xkcd strip, but it's already linked in the article.

      rou (profile), 24 Jan 2018 @ 5:17pm

      Re: Great piece; here are a few observations

      [quote]1. Something I've said for years is that spam (and other related forms of abuse) are not speech, just as a brick with an attached note thrown through a window is not publication.[/quote]

      I don't think that's really a useful way to see things. It just dodges the question of what reasonable limits we should accept on free speech by redefining whatever's outside those limits as "not speech."

        Rich Kulawiec, 24 Jan 2018 @ 5:41pm

        Re: Re: Great piece; here are a few observations

        It's not dodging the question. It's confronting it, and recognizing that spam is far more like vandalism than it is like speech.

        The brick thrown through the window remains vandalism even if someone scribbles Shakespeare on it.

          Thad, 25 Jan 2018 @ 10:13am

          Re: Re: Re: Great piece; here are a few observations

          Except that spam is not analogous to a rock thrown through a window (unless it contains malware). It's analogous to junk mail. That's why another name for it is "junk mail".

      Anonymous Coward, 25 Jan 2018 @ 6:19am

      Re: Great piece; here are a few observations

      "I've read Tufekci's piece. It's very good. I recommend it to everyone."

      Tufekci's piece is *well-written*, but it isn't very good.

      Tufekci has contracted Hillary's elite arrogance disease:

      "Today, even the most powerful elites often cannot effectively convene the right swath of the public to counter viral messages."

      Precisely so. And why is this bad? Who elected these elites to "counter viral messages"?

      "Creating a knowledgeable public requires at least some workable signals that distinguish truth from falsehood."

      Tufekci presumes that

      1) there are universal "truths" and "falsehoods"
      2) everyone must conform to an elite's definition of "truth"
      3) "truth" is everywhere decidable (see Goedel & Turing).

      The real world isn't Lake Wobegon; half of the people are *below average*, and in a democracy they still have the *right to vote*, as well as all their other rights, including the right to *free speech*.

      Get over it, elites.

  • This comment has been flagged by the community.
    Anonymous Coward, 24 Jan 2018 @ 12:39pm

    > It should be no surprise that I'm an unabashed supporter of free speech.

    ..."except when Facebook/Google/etc are the ones suppressing speech on behalf of themselves or as a favor for any number of governments"

      Anonymous Coward, 24 Jan 2018 @ 1:08pm

      Re:

      You seemed to have missed a) the article pointing out that platforms are bad at moderation (and the many, many articles TD has written about moderation failures on social media platforms), and more importantly b) that private platforms have a First Amendment right to editorial discretion -- see CBS v. Democratic National Committee for the scoop on that one (hint: the 2nd half of CDA 230 is not the Internet exceptionalism you think it is).

        Anonymous Coward, 24 Jan 2018 @ 1:49pm

        Re: Re:

        The design of Twitter heavily leans towards enhancing network effects with all of its pluses and minuses.

        A social media platform's design is critical to facilitating good discourse. Twitter is a losing game due to its free-for-all nature and minimal compartmentalization of discourse.

          Anonymous Coward, 24 Jan 2018 @ 1:56pm

          Re: Re: Re:

          Doesn't Twitter offer its users the tools to curate the content they get, by choosing which accounts to follow? Its users ought to exercise their own powers more often.

          John Smith, 24 Jan 2018 @ 3:12pm

          Re: Re: Re:

          Twitter succeeded where Myspace failed because it allowed for one-way following, whereas on Myspace celebrities had to "friend" their fans, putting them on equal terms. On Twitter, the celeb can send DMs to fans without them being able to respond. It reinforces our informal caste system.

    Sayonara Felicia-San (profile), 24 Jan 2018 @ 1:05pm

    Mike you keep saying:

    " our traditional concepts of free speech don't fully work in the internet age. "


    Mike you seem to lament the fact that the Internet has only exacerbated what has always been an issue, which is the proliferation of snake oil. Primarily, the proliferation of political snake oil, which used to be the monopoly of the government, and/or those wealthy and connected enough to influence it.

    The true purpose of free speech isn't so that I can get my jollies calling white people honkies; that's just an unintended but positive externality. The prime motive is not only to make sure truth and ideas make it through to quell corruption and power, but that this truth is acknowledged IN TIME for it to make a difference.

    If I told you 15 years ago, that George Bush and the CIA were full of shit, and that the Nigerians weren't selling yellow cake to Iraq, nobody would listen, but now it's well known and completely forgotten.

    What if back then, I could reach millions of people and prove that Bush was full of shit? Then I become a problem. But that's exactly what's happened, and yet instead of Bush being laughed out of the White House, we have a proliferation of 9-11 truth web sites. Still, the only effect seems to have been to the egos of the various crooks in charge.


    SO WHAT EXACTLY ARE YOU SO AFRAID OF? Why, exactly, do you think that a bunch of people having the right to say anything they want is so dangerous, when nothing they say actually changes anything?

    It's because you want to stop the end result and not the problem. The problem is, students aren't being taught critical thinking. They aren't being taught to question things for themselves. They aren't being taught that sometimes the media are also liars. They aren't being taught how to objectively analyze information...

    So instead of doing that, you want to stop these products of a failed educational system from believing whatever it is that they want, because, well, in your view they can't tell the difference.

      Bruce C., 24 Jan 2018 @ 1:36pm

      Re:

      The attention scarcity and the DDoS effects are not products of a failed education system, they are the products of the way the brain searches for patterns in the information it receives. In a stadium full of people booing, you can't hear the guy cheering unless he's right next to you. Similarly, the ability of a useful point of view to get the attention it deserves decreases dramatically if only one person reads it because it's buried under 16 pages of comments designed to encourage mindless acceptance of whatever agenda is being pushed by the propagandists. No human has time to find a needle in an internet haystack.

      Roger Strong (profile), 24 Jan 2018 @ 3:28pm

      Re:

      If I told you 15 years ago, that George Bush and the CIA were full of shit, and that the Nigerians weren't selling yellow cake to Iraq, nobody would listen, but now it's well known and completely forgotten.

      That was well-known within six months of the lie being told. Within three months of the invasion, if not before. Everyone was listening. Certainly many were questioning and disputing the WMD claims well before the invasion.

      we have a proliferation of 9-11 truth web sites

      That's the best way to deflect attention from a scandal. Promote a much bigger, more sensational, but ultimately meaningless scandal. Those with legitimate questions about what warnings Bush II ignored, were quickly grouped in with the holographic airplane/controlled demolition crowd.

      The problem is, students aren't being taught critical thinking.

      The problem is, "big lie" techniques work. When people hear the lie from 20 different sources, they believe it. With the right money and Fox, Breitbart and the blogs and social media outlets they feed, the liar has their 20 different sources. AND the ability to discredit the truth and those telling it.

      Understand, they don't have to maintain that lie for long. Just a few months until after the invasion/election. That the (yellow) cake was a lie was all over the news three months after the invasion. (Hence the Plame affair). Trump's popularity plummeted not long after the election. By then it didn't matter.

        Sayonara Felicia-San (profile), 25 Jan 2018 @ 11:40am

        Re: Re:

        > That was well-known within six months

        and yet made no difference whatsoever, but mike says we have a free speech problem...

    Wixr, 24 Jan 2018 @ 1:16pm

    But what if the good ideas don't even have a chance? What if they're silenced before they even are spoken by the way these things are set up?

    I feel you should emphasize the conclusion much more on this. The "drown out free speech" part.


    I also want to posit a lesser mentioned talking point - What happens when two outrage mobs go after each other (see: GamerGate)?

    Should the victor be the one to tell history? People who say things like "No bad tactics - only bad targets"? People who actively work to create the monster they want to be seen as slaying?

    As someone who saw what happened with "Ground Zero harassment mob" GamerGate, my contribution to this discussion is: we need actual data. We need definitions - harassment cannot remain a subjective concept unless we want social media platforms to enable the heckler's veto. We need to not take someone's word that their enemies are Sin incarnate and that anyone who talks with them is branded with a Scarlet Letter.

    If you truly want to limit harassment while minimizing harm on free speech - Find out what actually happens in harassment mobs. Don't let self-interested parties dictate the historical record.

    Anonymous Coward, 24 Jan 2018 @ 2:31pm

    overloading to induce trance

    All of this sounded familiar. It took a while, but I think I found the vocabulary to describe it.

    Search: "overloading to induce trance"

  • This comment has been flagged by the community.
    Anonymous Coward, 24 Jan 2018 @ 4:15pm

    This is the last straw. I'm finished with Techdirt. You think an 'I'm all for free speech. However...' is any better than an 'I'm all for free speech but...'?

    Free speech is not for newspapers and witty repartee in drawing rooms. Free speech is intended to protect precisely those thoughts and opinions that society finds 'dangerous' because it threatens to upend the social structure. For Techdirt to jump on the bandwagon arguing for exceptions to free speech because some feel cowed by the speech of others, is truly the end of the concept of free speech in 'polite' internet society. It's fascinating that a site that has defended the speech rights of actual Nazis and killers suddenly goes soft when faced with professional victims claiming any criticism directed at them is 'harassment'. It just goes to show that defending free speech for hypothetical/historical mass murderers is easy because the murderers' genuine victims are not in front of you, whereas defending the free speech of Larry across the hall who thinks feminists are going too far, is beyond the pale, because his pretend 'victims' are also across the hall, and demanding everyone who lets Larry talk be fired. That's too hard for Techdirt. That's too hard for almost anyone, and it really proves how completely fake and superficial all the talk has been from armchair internet activists about how much they 'value' free speech. They couldn't maintain their stance when it was hard. As this article shows, they can't maintain their stance even when it's slightly inconvenient. It's been amazing watching dotcom after dotcom 'complicate' i.e. discard its commitment to free speech over the most harmless shit imaginable, but as long as some snowflake claims it stopped them from talking, it's justified. I would say fully 99.999% of the pro-free speech talk on the internet has been proven in the last few years to be completely fake and incapable of standing up to even the slightest pressure, if that pressure comes from the 'correct' people.

    So you can stick a fork in Techdirt along with the whole idea that the West is founded on free speech. It was always a lie. The West is in favour of free speech for political dissidents in the East and that's as far as it goes. When it comes to their own backyards, Westerners' spines go soft at the first pleasant word directed for any reason at any member of the 'protected classes', and those Westerners will immediately swallow any justification given by any member of those protected classes for silencing that speech, without even subjecting said justification to the slightest smell test. That's why Techdirt couldn't bring themselves to oppose this new Orwellian redefinition of censorship as being pro-free-speech (of the classes protected from the slightest discomfort at the responses to their own exercise of speech).

    If even Techdirt doesn't have the nutsack to oppose these dishonest snowflake arguments, then no legacy 'new media' companies will. You can stick a fork in the idea of 'new media' too. The 'new media' are now the 'old new media' and the 'new new media' is not a website but the mass of common sense people who are willing to say 'No' and be labelled 'harassers' for it. All the old 'defenders' of free speech will now join the old attackers of it in an unholy alliance against the common man, and all because some snowflakes that people like Masnick can't socially avoid claim to experience a magical voodoo silencing curse whenever they hear anyone else say an inappropriate word.

      Stephen T. Stone (profile), 24 Jan 2018 @ 5:25pm

      Re:

      Free speech is not for newspapers and witty repartee in drawing rooms.

      It is for nearly every form of expression, in every potential mode of publication and expression, for anyone who wishes to express an opinion. It is for newspapers just as much as it is for you.

      Free speech is intended to protect precisely those thoughts and opinions that society finds 'dangerous' because it threatens to upend the social structure.

      Free speech laws protect all speech. Unpopular and “dangerous” speech needs more protection for the obvious reasons. Popular and “safe” speech still needs protection, though. Such speech can still be challenged and censored; the Banned Books Week campaign proves as much.

      For Techdirt to jump on the bandwagon arguing for exceptions to free speech because some feel cowed by the speech of others, is truly the end of the concept of free speech in 'polite' internet society.

      We already have exceptions in the concept of free speech. Granted, those exceptions are post-expression punishments of “illegal” speech such as an incitement to violence, but they do exist.

      It's fascinating that a site that has defended the speech rights of actual Nazis and killers suddenly goes soft when faced with professional victims claiming any criticism directed at them is 'harassment'.

      Techdirt writers tend to believe in the concepts and intent of free speech, but not in a dogmatic, “I can never examine why I believe in this” way.

      It just goes to show that defending free speech for hypothetical/historical mass murderers is easy because the murderers' genuine victims are not in front of you, whereas defending the free speech of Larry across the hall who thinks feminists are going too far, is beyond the pale, because his pretend 'victims' are also across the hall, and demanding everyone who lets Larry talk be fired.

      Larry has every right to say “feminists are going too far”. That said: If Larry is saying that as a representative of the company he works for, or other people feel that Larry’s comments are creating a hostile work environment for women/self-proclaimed feminists, Larry’s right to speak his mind will not save him from being pushed into the open hellmouth that is Human Resources.

      it really proves how completely fake and superficial all the talk has been from armchair internet activists about how much they 'value' free speech. They couldn't maintain their stance when it was hard.

      If anything, it proves how talk of protecting the right of free speech is not about protecting people from the consequences of their speech. The government cannot do a damn thing to Larry because he complained about feminists. Larry’s employers, on the other hand, have every right to pink-slip him if they believe his comments will hurt their company in some way.

      I would say fully 99.999% of the pro-free speech talk on the internet has been proven in the last few years to be completely fake and incapable of standing up to even the slightest pressure, if that pressure comes from the 'correct' people.

      Your belief in free speech seems rather dogmatic and absolute.

      The West is in favour of free speech for political dissidents in the East and that's as far as it goes. When it comes to their own backyards, Westerners' spines go soft at the first pleasant word directed for any reason at any member of the 'protected classes'

      The only people who seem to lose their spine over kind words directed at, say, queer people are those who express hatred of queer people.

      Incidentally, you might want to rephrase your statement for clarity; I think you are trying to make a specific point that you simply have not made.

      those Westerners will immediately swallow any justification given by any member of those protected classes for silencing that speech, without even subjecting said justification to the slightest smell test

      Hi, queer guy here. I abhor the “God Hates Fags” signs wielded by the Westboro Baptist Church. I still believe they should have every right to express their opinions in public and on any privately-owned platform that would have them. Their entirely legal expression of anti-queer bigotry is unpopular; it deserves more protection precisely because it is unpopular.

      Techdirt couldn't bring themselves to oppose this new Orwellian redefinition of censorship as being pro-free-speech

      In case you did not read that xkcd strip: The right to free speech protects you from government interference with your speech and expression. It says nothing about the moderation of privately owned/operated platforms. You can argue that being banned from Twitter is the same thing as censorship. If you are banned from Twitter, your argument could have more weight. If you post that argument on a site like Tumblr, however, you will have a much harder time of making the case that your speech has been silenced by The Hellbird.

      If even Techdirt doesn't have the nutsack to oppose these dishonest snowflake arguments, then no legacy 'new media' companies will.

      Two things.

      1. When you call someone a "snowflake", you are quoting Fight Club, a satire written by a gay man about how male fragility causes men to destroy themselves, resent society, and become radicalized. Tyler Durden is not a hero, but a personification of the main character’s mental illness; his "snowflake" speech is both a mockery and an example of how fascists use dehumanizing language to garner loyalty from insecure people. If you use “snowflake” as an insult, you are quoting a domestic terrorist who blows up skyscrapers because he is insecure about how good he is in bed.

      2. Techdirt does not, has not, and will never speak for all other “new media” sites and personalities.

      The 'new media' are now the 'old new media' and the 'new new media' is not a website but the mass of common sense people who are willing to say 'No' and be labelled 'harassers' for it.

      Even if “saying ‘no’ ” was all that groups like GamerGate did to the targets of their harassment campaigns, they would still be harassers for the fact that they would be flooding a target’s mentions with the intent of harassing that target into leaving a given service. That seems an awful lot like censorship, at least from where I sit.

      All the old 'defenders' of free speech will now join the old attackers of it in an unholy alliance against the common man

      This seems…unlikely, at best.

      all because some snowflakes that people like Masnick can't socially avoid claim to experience a magical voodoo silencing curse whenever they hear anyone else say an inappropriate word

      Funny, then, that you used “snowflake” as an insult multiple times, yet I feel no sort of curse, hex, jinx, whammy, or bewitchment of any kind trying to stop me from speaking or typing or writing. Maybe you should have tried “Candlejack” or someth

        JarHead, 24 Jan 2018 @ 6:53pm

        Re: Re:

        In case you did not read that xkcd strip: The right to free speech protects you from government interference with your speech and expression. It says nothing about the moderation of privately owned/operated platforms. You can argue that being banned from Twitter is the same thing as censorship. If you are banned from Twitter, your argument could have more weight. If you post that argument on a site like Tumblr, however, you will have a much harder time of making the case that your speech has been silenced by The Hellbird.

        By this I take it you are arguing that someone being banned from Twitter is not censorship.

        Even if “saying ‘no’ ” was all that groups like GamerGate did to the targets of their harassment campaigns, they would still be harassers for the fact that they would be flooding a target’s mentions with the intent of harassing that target into leaving a given service. That seems an awful lot like censorship, at least from where I sit.

        Please elaborate on how this does not contradict your statement about the Twitter ban. What is your definition of censorship?

          Wendy Cockcroft, 25 Jan 2018 @ 7:09am

          Re: Re: Re:

          https://www.thesun.co.uk/news/4119658/pug-nazi-salute-hitler/

          You're welcome.

          Not that I'm anti-Semitic but I'm not politically correct, either.

          Stephen T. Stone (profile), 25 Jan 2018 @ 9:32pm

          Re: Re: Re:

          If you are banned from The Hellbird, you have not been “censored” in the broadest possible interpretation of the word. Yes, you can argue that Twitter has censored you if it bans you, but you can only make that argument about Twitter itself. A Twitter ban does not affect your ability to express yourself on any platform other than the birdsite.

          On the other hand, a targeted campaign of harassment meant to drive someone away from not just a specific platform, but as many platforms as possible reeks of actual “you are not allowed to speak your mind ever” censorship. The GamerGate campaigns against Zoë Quinn, Anita Sarkeesian, and other women within the gaming industry/culture were designed to make those women “voluntarily” silence themselves, to make them afraid of expressing themselves out of the fear that doing so could lead to potentially violent consequences. You can argue that a Twitter ban is censorship, but when was the last time Twitter harassed a banned user to stop them from using, say, Tumblr to complain about being banned from Twitter?

          Toom1275 (profile), 28 Jan 2018 @ 11:07am

          Re: Re: Re:

          The dogmatic, absolute belief that one's right to "free speech" trumps everyone else's rights, including the rest of the First Amendment, is, in some places, labeled "freeze peach."

        Anonymous Coward, 24 Jan 2018 @ 8:22pm

        Re: Re:

        One key point -

        The exclusive purpose of freedom of speech is freedom from consequences. That's why it's called *freedom*.

        If all that's required is the physical ability to speak, Kim Jong Un could say he respects freedom of speech because he doesn't rip the vocal cords out of the people he sends to slave labor camps as a consequence of their speech.

          bob, 25 Jan 2018 @ 10:40am

          Re: Re: Re:

          Freedom never meant free from consequences, just that you can make the choice to do something. For example, I have the freedom to commit suicide; however, the consequence is that I'm dead. I can't change that. Others might take my freedom to commit suicide away by locking me down to a bed so I can't hurt myself, but regardless, a consequence ensues.

          Everything has a consequence; whether the result is good or bad for the individual is in the eye of the beholder. Also, the consequence may not happen immediately but instead a long time in the future. Yet you still have the freedom to make the choice to do, say, think, and react as you wish -- just not the freedom to choose what will happen to you as a result.

          Stephen T. Stone (profile), 25 Jan 2018 @ 9:35pm

          Re: Re: Re:

          The purpose of the First Amendment is to protect speech from government interference. That amendment says nothing about protection from being fired from your job, booted from a platform, or turned into a social pariah because you used a racial slur on Twitter.

      Anonymous Coward, 24 Jan 2018 @ 5:39pm

      Re:

      He is in no way advocating for censorship or the silencing of any speech. If you truly read the article and that is what you got out of it then you need to read it again and maybe PM Mike to get clarification.

      What he is stating is that people have found a way to 'game the system' of free speech using the internet. They have found a way to use free speech to SILENCE the free speech of others. They have an easily accessible propaganda machine that can reach the entire world. What would normally have been one man shouting in a crowd and being drowned out by the crowd, now that one man has a full blown concert sound system and he has turned the tables and is now drowning out the crowd.

      Yes the crowd has access to the same systems but they are all on equal footing now; even so, it's far easier for the one guy to drown out the crowd and far harder for the crowd to drown out the one guy. What's happened is that the one guy figured it out quicker than the crowd, and he is drowning them out by using it more effectively.

      What this article is about is figuring out how to level the playing field again without cutting power to that guy's sound system. It's not about silencing him, it's about making sure that he can't overpower everyone else. Because that is what is starting to happen, that one guy is using free speech as a weapon to SILENCE others. And that isn't ok, that's not free speech.

      It may just be that we have to get better and smarter at using our own sound systems. But whatever the solution is, and make no mistake, the solution should result in less censorship and more free speech, it is a discussion worth having. Because if we sit back and do nothing, we all lose.

        eol, 25 Jan 2018 @ 6:56am

        Re: Re:

        *"Yes the crowd has access to the same systems but they are all on equal footing now"*

        No, they aren't. Because gaming the system usually requires some serious resources inaccessible to a regular member of said crowd.

        Those resources are usually only accessible to big organisations, rich companies and governments.

        I should know: I live in Poland. Scientists say the amount of paid-troll and automated-bot activity we see on an average day is comparable to what the US saw before the last elections.

        Although, when I connected the facts that every third post about politics was made by a paid shill, and that posts containing hate speech and name-calling are usually automated (bots, because apparently writing offensive bullshit is so easy even robots can do it), I suddenly regained faith in humanity.

        After all, that means that most of the overwhelming flood of hate that is spilling onto me from my computer screen is not written by human beings. Ergo, there are a lot fewer mean people out there than I was led to believe.

        Which only proves the point that something has to be done about this situation.

          Anonymous Coward, 25 Jan 2018 @ 8:06am

          Re: Re: Re:

          Please tell me which resources used to spread all the junk on the internet are only accessible to big organizations, rich companies, or governments. Twitter? Facebook? Instagram? LinkedIn? Last I checked, anyone can sign up for a free account on any of those platforms.

          Maybe you meant actually making a website/forum and creating news articles? Hmm, I can buy a domain name for less than $10 a year and can get webhosting, sometimes free but usually for less than $100 per year. There are free website and forum templates I can use so I don't actually have to know how to code or design websites. All that's left is actually writing content, which anyone can do for free.

          I suppose if you suck at writing you might need to hire someone, but I'll bet if you are determined to spread that junk, you probably already know someone who would do it for free or for cheap. Heck, you could just copy and paste what someone else on the internet writes since you likely aren't worried about plagiarism either.

          Gaming the system doesn't require lots of resources. It just means using it in a way it wasn't originally intended to be used. That's all. You don't need lots of resources to do that. Con artists, liars, phishers, and hackers do it all the time, sometimes with $0 in resources and a dinky laptop and cellphone.

            Stephen T. Stone (profile), 25 Jan 2018 @ 11:41pm

            Re: Re: Re: Re:

            Please tell me which resources used to spread all the junk on the internet are only accessible to big organizations, rich companies, or governments.

            …child porn sites?

            eol, 26 Jan 2018 @ 12:16am

            Re: Re: Re: Re:

            You have a point here, but you misunderstood me a little. I was talking about massive, organised actions geared towards suppressing a certain idea. If a single person creates a fake profile or two on Facebook then starts posting shit about stuff he doesn't like, it wouldn't have much impact in the grand scheme of things. Unless there are a lot of like-minded assholes (that happens too), but to me this feels too close to how free speech should work, even if they're spreading a "bad" idea.

            The problem starts if 1000 people do it, deliberately, in an organized manner. Then the impact will be much greater - but you need to coordinate those 1000 people and teach them how it works. Suppressing an idea online isn't as simple a process as one may think; you have to know what to do and when, and simply flooding the discussion space with the opposite idea usually isn't enough. Not unfeasible these days, as you said, even if those 1000 people are not formally organised - the Internet has proven it can be done.

            Then, if you have connections and money, you hire people to do it for you. These people know how it should be done, they're efficient, and some of the work gets automated in a clever manner. That's what you do when you're not a hacker, and most people aren't. That's where "gaming the system" requires lots of resources. That's what is most dangerous, too. Because the idea that gets suppressed/promoted this way is based on a single entity's incentive, be it political or economic, and not on what a lot of people really think.

              Anonymous Coward, 26 Jan 2018 @ 7:05am

              Re: Re: Re: Re: Re:

              I'm sorry but you're still wrong. It doesn't take massive, organized actions to spread enough junk to suppress ideas.

              Look at all the people who are fooled by The Onion and repost it as true. And The Onion isn't even trying to suppress anything. The same holds true for any other site that is trying to push an agenda. All you have to do is have one guy who knows how people think and react and create a couple of social media accounts that people will trust and boom! Now you've harnessed the populace to do the work for you.

              You see that's the point. It isn't that there is a massive organization behind these suppressions, it is that they have found a way to direct and influence the masses to do their dirty work for them. Once you have enough people believing what you say is true, they will spread and suppress for you. They will organize themselves. And the internet is what makes that possible on a massive scale.

              No, it's not necessarily a simple process, but not because it takes massive amounts of resources. It's not a simple process because you have to have a deep understanding of how people think and react and how to manipulate that. (Or get really lucky.) I know several people who can manipulate others when playing board games into unknowingly making plays that benefit them. The other players think they are benefiting themselves, or at least getting an equitable deal, right up until two turns later when the manipulator swoops in and crushes everybody and wins the game. He doesn't do anything other than say very specific things, and all the other players eat right out of his hand.

              This all has a name too. It's called grassroots organizing. You don't think that every big grassroots effort started off with a massive, money-rich organization behind it, do you? Most start with one person speaking up. Charities are a perfect example. They don't start off big; they start small and grow. Someone sees a need, starts telling people, those people join and start telling other people, and the whole thing grows from there.

              Gaming the system doesn't require anything in the way of big resources. All it does require is an understanding of the rules in play, how humans think and react, your available tools, and using the combination of those three to get the desired result. Essentially, it's all a con. It's just some cons are used for good, and some for bad.

        bob, 25 Jan 2018 @ 6:37pm

        Re: Re:

        "... that one guy is using free speech as a weapon to SILENCE others. And that isn't ok, that's not free speech."

        That one guy, are you going to take that insult?

        :-p

      Mike Masnick (profile), 24 Jan 2018 @ 10:41pm

      Re:

      This is the last straw. I'm finished with Techdirt. You think an 'I'm all for free speech. However...' is any better than an 'I'm all for free speech but...'?

      No, you misread what I have said. I'm all for free speech, period. My concern here is not that free speech has gone too far, but that some are abusing a related concept -- "the marketplace of ideas" -- to actually censor and stifle free speech. My concern is about the LOSS of free speech.

      Free speech is intended to protect precisely those thoughts and opinions that society finds 'dangerous' because it threatens to upend the social structure. For Techdirt to jump on the bandwagon arguing for exceptions to free speech because some feel cowed by the speech of others, is truly the end of the concept of free speech in 'polite' internet society.

      I would suggest reading the post again -- because I do support free speech for dangerous ideas -- and I AM NOT advocating for any exceptions to free speech.

      Please read the article again.

      Anonymous Coward, 25 Jan 2018 @ 1:16pm

      Re:

      You sound old.

        Anonymous Coward, 25 Jan 2018 @ 1:55pm

        Re: Re:

        He's not old, he's just got an axe to grind for no good reason and isn't above misreading the article so he can complain about it.

    Anonymous Coward, 24 Jan 2018 @ 4:33pm

    Unsubscribed.

      Stephen T. Stone (profile), 24 Jan 2018 @ 5:32pm

      Re:

      In the words of The Mighty Gord: “Door’s to your left.”

      Anonymous Coward, 24 Jan 2018 @ 5:47pm

      Re:

      And unmissed.

      HegemonicDistortion (profile), 24 Jan 2018 @ 9:31pm

      Re:

      Pretty good as satire, if that's where you were going.

      Wow, if you can't even abide discussion about free speech, what kind of commitment to it did you ever actually have?

      "How dare you post a thoughtful discussion and set of questions about speech!"

      LOL, man.

    Koby (profile), 24 Jan 2018 @ 7:10pm

    My take on trust

    "The most effective forms of censorship today involve meddling with trust and attention, not muzzling speech itself."

    It seems to me that the problem right now is trust. Government isn't trusted; corporations aren't trusted. TV networks aren't trusted; Silicon Valley tech companies aren't trusted. We happen to live in a time when trust in a lot of important institutions seems to be very low. Until someone re-earns it (by being correct most of the time, instead of being rich), there will be a constant desire for people on the internet to route around the un-trustworthy, leading to possibly undesirable outcomes through trial and error until it is re-established.

    Christenson, 24 Jan 2018 @ 7:37pm

    Free speech and attention

    We take "free speech" as an article of religion, that is the idea that anyone should be able to say anything without the power of the government trying to shut it down, for example, by killing or jailing the speaker, or, in the current US situation, making the speaker's life impossible with the constant threat of the courts.

    The problem, though, is that our traditional methods of *choosing* what we hear have broken down. Make no mistake: Choices *have* to be made, and *are* made. I live in a smallish city of 100,000, and I *cannot* even read all 100 unique, good commenters on my favorite blog, Techdirt, every day. Choices are and have to be made. But how?

    How do we preserve "the still, small voices"? How do we ensure the 'right' voices get heard? Keep the snake oil salesmen out? Notice the religion creeping back in -- just as there are any number of charities trying to educate African children, but quite a bit of subterranean disagreement between the charities as to what that should mean beyond the 3 R's: does it mean memorizers who tend to become clones of their teachers and parents, or critical thinkers who change the world?

    I think we need to begin by asking ourselves what it is we want to accomplish with our free speech so that we can test the implementation or proposed implementation. A little historical perspective wouldn't hurt, either...why can't I advertise snake oil in the Washington Post? How did it get that way?

    How do we deal with master persuaders of various flavors in the system? (See "Win Bigly" by Scott Adams)

    DocGerbil100 (profile), 25 Jan 2018 @ 12:41am

    Now there's a thing...

    Hello, Mr Masnick. :)

    Good article. It's a sound point of view, IMO. It was also sound when I said much the same thing here in comments, three or four years ago - and in the comments section at TorrentFreak, years before that.

    Oddly, I don't seem to recall many people agreeing with me at the time. If memory serves, the overwhelming majority of cogent responses offered were all along the lines of "the only proper answer to bad speech is more speech".

    I suppose controversial opinions sell rather better when they come from young, attractive, female professors, rather than from random assholes in comment sections. Still, people are finally discussing the issue, rather than ignoring it completely, so that's good.

    - - - - - -

    As far as practical answers go, I agree it's a tricky one. I've little enough in the way of advice, but...

    Can we start by ridding ourselves of the word "troll"? I've no idea why anyone ever thought it was appropriate. Perhaps it made more sense when new, but now...?

    We have lobbying firms, religious groups and government agencies swarming over parts of the internet to sell pure bullshit, we have the alt-right and other noxious performers doing their best to harass, intimidate and in some cases literally terrorise individuals into suicide, because of their skin-colour, gender, or whatever...

    ... and we're going to call them by a name of a mythical monster best known from _The Three Billy-Goats Gruff_ and _The Hobbit_? For fuck's sake, who came up with this shit? What absolute fucking idiot thought it would be socially-beneficial to name these evil pigfuckers after something from a children's fairy-tale? Whichever stupid, dopey, leprechaun-molesting human horsecunt inflicted this upon the internet should be bitchslapped until their liquefied brains leak out of their twee and delicately-pointed fucking ears.

    - - - - - -

    Let's call trolls by what they actually all are: Liars. The poster who talks nonsense on a webpage isn't there to "entertain", or "play devil's advocate", or whatever excuse is used today. He or she is there to try and tell and sell Lies, whether it's by directly lying, by twisting the truth into a lie, by refusing to acknowledge a contradicting truth, or by giving credence to other Liars by responding with apparent sincerity, helping to drown out the truth with unending bullshit.

    If we can lose the word, then perhaps instead of talking about "trolls" and people "feeding the trolls", we can have "Liars" and "Assistant Liars". The distinction might seem trivial, but I think it would make a big difference, once people get used to seeing and thinking about them with harsh honesty, rather than through the soft-focus lens of dishonest euphemisms.

    - - - - - -

    One of the benefits is with how we think about contributors who feed the trolls - or Assistant Liars, as they should be known. We've all fed the trolls at one time or another. It's easy enough to do just by accident, since we can't really judge who we're talking to until after enough information's already been exchanged...

    ... but once that point's been reached, that's it. We're Assisting Liars in their goal of obstructing the truth. We become our own problem, Liars by proxy, silencing our own debate.

    I don't care how funny or insightful our replies are, we're still effectively doing their job for them. We're still Lying, along with the "troll". If we were all that insightful, we'd ignore them, flag them and focus on whatever we actually feel is true and important about a given topic.

    Free speech should encourage and enable us to collectively search for the truth. It should not be so poorly conceived that we feel obliged to ignore the truth and waste all our time debunking lies.

    Let the Liars be hidden. Let the replies of Assistant Liars be hidden, even if it damns us all on a good day. If anyone has something useful or amusing to say, let them do it separately, without serving to undermine free speech in its own perverted name.

    - - - - - -

    Last thought for the day...

    Speech is a tool. Free speech is a more desirable tool than many other kinds, but it's still just a tool. What we do with that tool is what matters most.

    I use it striving to find some truth in the world, with varying degrees of success - it's one of my favourite tools for that.

    What do you want to do with yours, Mr Masnick?
    What is your free speech actually for?

      Wendy Cockcroft, 25 Jan 2018 @ 7:13am

      Re: Now there's a thing...

      I'll bite. Mike mostly uses it to complain about government overreach and IPR abuse.

      The Wanderer (profile), 25 Jan 2018 @ 8:49am

      Re: Now there's a thing...

      A troll is a person who engages in the act of trolling.

      Trolling, in an online of context, has a meaning similar to that of the longer phrases "trolling for responses" or "trolling for a reaction" or "trolling for flames". In those phrases, "trolling" has the same basic meaning as it does in a fishing context, which has nothing to do with the mythical monsters (who, I'll note, appear in other myths way predating the children's tales you cite); that's a false cognate.

      In other words, a troll is someone who goes looking to provoke a particular type of response, and is willing to post whatever it takes to produce that type of response, regardless of whether what gets posted is true of even of whether he(?) agrees with it.

      The simplest definition of the act of trolling that I've ever encountered (that still seems accurate) is "posting with the intention of creating a furor".

      It's true that the label "troll" gets applied to many people who do not fit that above definition, nowadays. (It probably got misapplied that way early on, too, as yet another way to troll people.) That doesn't make it an invalid descriptor, however - it just means that we need to be careful about who we accept as deserving of that label.

        The Wanderer (profile), 25 Jan 2018 @ 8:28pm

        Re: Re: Now there's a thing...

        (Okay, I see how I could get "of even of" instead of "or even of", but how the #&*! did I get "an online of context" - particularly without noticing it??)

      Christenson, 25 Jan 2018 @ 10:29pm

      Re: Now there's a thing...

      I think we should talk persuaders and snake oil salesmen instead of liars. All nasty, just like trolls!

      As to what techdirt is up to with its free speech, well, it seems to be trying to improve the world with truthful stories about tech abuses and a good discussion about how speech and attention get regulated in practice whether we like it or not because of the scarcity of attention.

      I applaud the search for principled answers.

    cattress (profile), 25 Jan 2018 @ 12:54am

    viral fake news

    Can anyone cite an example of a viral fake news story that wasn't revealed as fake, or wasn't revealed as fake until the story had lost popularity and thus nobody really noticed? And I'm not talking about a story with incorrect information that was corrected almost immediately, or a story where sources were not verified, but it came to light that the story could not be confirmed and was no longer considered credible.
    Does anyone know a person or people who changed their mind or decided who to vote for because they read some propaganda in their social media news feed? I hear an awful lot of complaining about how fake news and Russian propaganda influenced the election, but no one complaining that they personally were influenced to vote in a manner that they were not already predisposed or inclined to vote. No one is shouting, "Help! I was tricked! All of my friends on social media inundated me with fake news and I didn't realize that I should have scrutinized the credibility of the source! These internet companies should have protected me!"
    Mike is right -- end users are responsible for their own media literacy. It's pretty damn insulting that folks on the left and the right suggest that the general population needs some sort of authority to protect us from thinking for ourselves.
    To be honest, we should try not to get too mad at the platforms for making missteps when moderating content -- they are trying to do it themselves to avoid more government intervention. We need to give them constructive criticism and ask for more user controls, so that those who want to wrap themselves in a bubble can do so without forcing everyone into the same bubble of conformity. I don't think it's a good thing to shield yourself from everything that might trigger you, but it's not my position to force anyone to engage in the marketplace of ideas who doesn't want to.
    It makes logical sense to me to determine a definition of harassment as it applies to speech that infringes on the rights of others (thinking along the lines of the phrase that the right to swing my fist ends where another person's jaw begins).

      Anonymous Coward, 25 Jan 2018 @ 5:06am

      Re: viral fake news

      " I hear an awful lot of complaining about how fake news and Russian propaganda influenced the election, but no one complaining that they personally were influenced to vote in a manner that they were not already predisposed or inclined to vote."

      You won't. Because if it were that obvious, it wouldn't be effective.

      When you are manipulated by people who've spent decades fine-tuning their craft, including making it hard for the targets of their efforts to detect them, then the probability that you will figure out what's happening is small. Unless you're extraordinarily smart, well-educated, aware, etc., you're probably not going to realize what's going on. And even then: you may not. You're up against very smart, very determined, very experienced people with enormous resources on their side. The game is completely rigged so that you'll lose.

        Christenson, 25 Jan 2018 @ 10:20pm

        Re: Re: viral fake news

        On the contrary: Remember the guy that believed Pizzagate and showed up at the basement-less DC pizza parlor to investigate with a weapon?

        And you should hear one of my coworkers...all he can think about in politics is Hillary...

        Neither of these folks recognize the manipulation, but it's pretty obvious.

        cattress (profile), 26 Jan 2018 @ 8:55pm

        Re: Re: viral fake news

        Fair points, there are some highly skilled propaganda artists. However, none of the FB memes/posts or tweets were a demonstration of those skills, and they didn't saturate the population. Russian activity was tied to users who associated with right-wing subjects or groups; Democrats and independents were not targeted. People didn't radically change their minds because of Russia or fake news. People who wanted to believe or were predisposed to believe in the propaganda bought into it and had no interest in fact-checking. Comparing the content and tone of RT and Breitbart, RT is more accurate and reliable. I would argue that American trolls had a greater influence mobilizing the nationalist-populist base that backs Trump.
        I see the hysterics on the left around Russian propaganda activity as nearly the same as the right's accusations of George Soros financing every movement they don't like. Americans are not so oppressed that they have to suffer the propaganda of any source or group of sources without the ability to find the truth for themselves; the biggest hurdle we face is willingness to question our own beliefs and values.

    Anonymous Coward, 25 Jan 2018 @ 6:51am

    Your information model is wrong.

    It is worth noting that the censorship problem largely goes away with kill files, which were a feature of Usenet but, by and large, aren't a feature of web forums -- and should be.

    This:

    "once you've admitted that spam can be filtered, you've admitted that some moderation is appropriate"

    is complete bullshit. It conflates a technical ability with a lawful or sociological concept. What this says is that because server-side filtering is hard in some circumstances, the server side should be granted a pass on doing more severe filtering. Of course this is wrong, because server-side filtering is not the only solution. There is also client-side filtering (among other solutions).

    The whole "fake news" meme, is simply the use of a problem that has already been solved, to justify the execution of an ego driven solution to the same problem. "Fake News" has never been about anything other than re-entrenching the old guard into a position that restores their ability to propagandize. IOW: also complete bullshit.

    What you're looking at is speech adapting to technology, and then the technical challenges thereof being used as an excuse NOT to adapt the technology to better parallel lawful behavior. What we are talking about is the market impetus pushing a greater expansion of unlawful and uncivil behavior. The Fake News meme is primarily functioning to entrench an acceptance of that behavior in the public, but only in such a way that it defends the oligarchy.

    Equivocating on the merits of Constitutional law because you don't want to write software is insane. There are ways to fix this. They may not currently be in Slash (what TD uses for its website). Abandoning your principles because you are too lazy to write a patch is batshit.

    Of course now, even client side filtering is deprecated. The information model of everything above OSI layer 4, and apparently now below layer 2, on the Internet is broken.

    There are actually MANY ways to fix that. And I've read at least a half dozen white papers about technologies that go a long way towards doing that over the long term. One or two even have good funding at this point.

    But you get paid for what you get paid for, and I get paid for what I get paid for. So you're going to have to find that information on your own.

    So you are going to have a grand debate built on the assumption that a pop-culture meme is true, when in fact what is true is that the web uses shitty, heavy, hard-to-maintain software that has less facility for managing this problem than a daemon written 20 years ago. And all during this debate, you will utterly ignore the possibility that this is a problem that has already been engineered out (and has been, previously, on many occasions) -- because why?

    I know why the trinity ignores the idea that clients should be responsible for their own information filtering. Why would TD?

    The sock puppets are the wedge, and you Mike, are what they are trying to divide.


    • identicon
      Anonymous Coward, 25 Jan 2018 @ 7:55am

      Re: Your information model is wrong.

      So if the answers are already out there and being developed, why don't you provide links to them so that we can in turn support them?

      So you're going to have to find that information on your own.

      Oh right, you don't actually have proof of any of them; you just think they exist, therefore they must.

      Right, got it. Moving on.


      • identicon
        Anonymous Coward, 25 Jan 2018 @ 3:23pm

        Re: Re: Your information model is wrong.

        Brilliant riposte. I am completely undone.

        Products need market space to grow. In terms of civil-rights-oriented products and services, that market space can only grow under a canopy of the public's ignorance. Once breached, that canopy gives way to avarice and an inevitable corrosion of reason, which is what we've seen over the years with the Internet.

        Right?

        Once it got to the point where it was easier to send dick pics than it was to actually convey real ideas, the market became burdened by our collective stupidity. Ignorance is its own market constraint.

        So no. I'm not going to talk about next-gen civil rights tech here. I'd like to, but there are too many enemies of the Constitution that follow this forum.

        Good luck.


        • identicon
          Anonymous Coward, 26 Jan 2018 @ 7:13am

          Re: Re: Re: Your information model is wrong.

          Wrong. Dead wrong.

          Positive social change can only take place in the light of day and public awareness. Take ANY social issue of the past few centuries: slavery, women's rights, segregation, child labor -- I could go on. The only reason ANY of those situations changed is because someone dared to bring it into the public consciousness.

          The same goes for technology. Yes, some technology was developed in secret, but it wasn't until it was made public knowledge that its true potential and abilities became apparent. Computers and the internet are a good example. ARPANET was a cool idea developed mostly by the government and universities. It wasn't until it became publicly accessible that it turned into the internet and completely revolutionized our world.

          This is exactly where you should be talking about next-gen civil rights. Keeping it in the dark only prolongs the implementation of the solution. And if, as you imply, they are publicly available to be found on the internet, then they aren't secret and there is no reason not to link to them.

          So yes, you are completely undone. If you do have proof, post the links so we can all be enlightened and humanity enriched. If not, well then what you claim doesn't actually exist.


    • icon
      Mike Masnick (profile), 25 Jan 2018 @ 9:34am

      Re: Your information model is wrong.

      Of course this is wrong because server side filtering is not the only solution. There is also: client side filtering. (among other solutions)

      I have long advocated for client-side filtering and I even mention it IN THIS PIECE as a possible solution (quoting from the article): "putting more control into the hands of end users, rather than relying on the platforms to make these decisions"

      But, it appears you were very interested in ranting rather than reading, so, you be you.


      • identicon
        Anonymous Coward, 25 Jan 2018 @ 3:41pm

        Re: Re: Your information model is wrong.

        Sorry Mike,

        Got a lot of respect for you. Didn't mean to come off like a dick. Though I imagine I do from time to time. Hazards of the Internet and all.

        So do registered users get kill files in their profiles?

        There has been some silliness going on in the forum, and I expect it is wasting a bit of your time dealing with it. I expect you have put some thought into how that workload is going to scale. If registered users have kill files within their user profiles, those files could be confederated, and readers could pick a confederation when reading the site.

        It would be a unique feature, and potentially a driver for moving anons over to subscribers. Just a thought.

        Again, sorry.
        Cheers!
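
        A rough sketch of that confederated-kill-file idea, purely as an assumption about how it could work rather than anything Techdirt actually implements; the UserProfile shape and usernames are made up for illustration:

        interface UserProfile {
          username: string;
          killFile: Set<string>; // authors this user has chosen to hide
        }

        // A "confederation" is just the union of the kill files a reader opts into.
        function confederateKillFiles(profiles: UserProfile[]): Set<string> {
          const merged = new Set<string>();
          for (const profile of profiles) {
            for (const author of profile.killFile) {
              merged.add(author);
            }
          }
          return merged;
        }

        // A reader picks which registered users' judgment to borrow; the merged set
        // is then applied client-side, as in the earlier kill-file sketch.
        const readerView = confederateKillFiles([
          { username: "exampleUser1", killFile: new Set(["sockpuppet1"]) },
          { username: "exampleUser2", killFile: new Set(["sockpuppet2"]) },
        ]);
        console.log(readerView);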


        • identicon
          Christenson, 25 Jan 2018 @ 10:14pm

          Re: Re: Re: Your information model is wrong.

          On the contrary:
          I find the server-side filtering of the discussion on Techdirt to be pretty good and elegant. There are enough like-minded folks here to flag what I consider garbage, and, if I get curious, I can ask to see it anyway, but, on an average day, I'm lazy.

          And I'm *way* above average, and *not* a casual visitor, so don't expect the average casual visitor to have the attention available to maintain a good filter of their own.
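
          For what it's worth, the flag-and-hide behavior being described could be sketched roughly like this; the threshold, field names, and wording are assumptions for illustration, not Techdirt's actual logic:

          interface FlaggedComment {
            author: string;
            body: string;
            flags: number; // community flags this comment has received
          }

          const HIDE_THRESHOLD = 3; // assumed value, purely illustrative

          // Flagged comments are collapsed rather than deleted, so a curious
          // reader can still choose to expand them.
          function renderComment(c: FlaggedComment): string {
            if (c.flags >= HIDE_THRESHOLD) {
              return "[Flagged by the community. Click to show it anyway.]";
            }
            return `${c.author}: ${c.body}`;
          }

          console.log(renderComment({ author: "troll", body: "noise", flags: 5 }));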


  • icon
    M. Alan Thomas II (profile), 25 Jan 2018 @ 12:10pm

    First, when it comes to the marketplace of ideas, it's worth remembering that real-life marketplaces require certain regulation in order to function correctly.

    For example, they improve with the quality of the information available, so false advertising is banned. They fail when monopoly power is used to shut out smaller competitors, so monopolies are heavily scrutinized and regulated. It's only the most ideologically rabid of free-marketers who think that they want a completely laissez-faire market system, because that inevitably leads to a conversion of the free market to a monopoly market. We can see how these concepts translate into the marketplace of ideas without simply discarding the concept.

    Second, when evaluating certain user behaviors, I do think that it's useful to momentarily set aside the xkcd strip and ask, "Would I be okay with this if the government did it?"

    For example, is the government engaging in a vigorous debate about the virtues of its policies? That's great! Is it flooding the debate space with propaganda bots? That's . . . not so great. Did a government spokesperson interpret the facts in a way that I disagree with? That happens in a society of diverse thought. Is a government spokesperson flat-out lying about the facts? That's a problem. Again, we can see how these concepts translate into instinctively grasping which behaviors by client-side actors we disagree with.

    Third and finally, let's not forget that the First Amendment is not absolute. Strict scrutiny is a hard but not impossible bar, particularly when the goal is to ensure that all voices are heard rather than to silence one of them.


  • identicon
    Matthew A. Sawtell, 25 Jan 2018 @ 3:23pm

    Finally decided to cross this Rubicon, Mike?

    I would suggest that you have a public interview with the likes of Tim Pool, because you are entering an arena that asks for no quarter and expects none in return.

    Lengthy op-eds are nice; a video talking to an actual human being would be better.


  • icon
    binra (profile), 26 Jan 2018 @ 7:20am

    Signal and noise

    I felt some sense in the article and came down to the comments and found a lot of noise. The troll is not only in the 'troll' but in the baiting or provocation to react.

    There are key guidelines that serve communication, but if made into rules, they become weapons and communication is not a weapon. Weaponised communication is a manipulative deceit.

    This is all about honesty of being. Awakening responsibility for your own experience and alignment in a worthiness of communication - that is you have to extend worth to join with another.

    Competitive and comparative 'worths' are appeals to specialness in victor or victim that invokes false saviours and further entanglement.


    While a 'broad spectrum dominance' weaponises and marketises everything, life is denied the conditions in which to be and know itself being. Mind capture runs 'identities' that are programmable or conditioned reaction. The whole thing is rigged up to Big Data for stress testing in real time so as to 'perfect' the system against loss of control (Chaos). Yet such a system itself is organized or managed chaos because it is OUT of communication. It is a model built upon an asserted and defended image of self, world or life. Its function has been to generate noise by which the signal of true communication is blocked, invalidated, demonised, ridiculed, shut down. False power runs over a sense of division.

    Individual responsibility can be seemingly escaped in blaming, so as to generate self-justifying narrative and believe it by acting as if it is true. What goes around comes around.

    Awakened responsibility owns its own experience and gives witness to its own choices instead of casting itself as the victim, victor or saviour to another.
    While I do not always agree with him, Jordan Peterson is exemplifying the use of communication as freedom from deceit. But for that he has to be living true to himself rather than asserting and defending a personal presentation under tyrannous or fear and guilt directed thinking.

    The breakdown of the masking narrative is the condition in which the desire for true communication stirs. Insanity is only an option for those who believe it is everyone else who is insane. Noticing our own reactions and the thoughts and beliefs beneath our emotional responses is an art of being.
    Communication begins within and extends.
    A loss of communication to systemic thinking runs a substitution reality or fantasy overlay. Whatever it was, it has become 'marketised and weaponised' in defence of an insanity. But recognizing conflicted thoughts is the awakening of perspective upon them and opens the freedom to release fixation and identity in them.

    Attempting to solve our inner conflicts upon the world is a disowning and dishonouring of life. But we know not what we do. Only as we uncover what lies beneath can we own it and thus change it. Blame persists the attempt to dump our stuff on others and 'solve it' by denying them. But from a true responsibility we can reflect a true witness and extend an invitation to align in the feeling of being rather than emotion backed thinking - which of course includes 'anti' emotional rationalisations.

    I write ideas as they find form from a willingness to listen in. This is also a learning by practice, because I notice what moves me and what inhibits, diverts or dilutes. Every time you write or speak, listen for where you are coming from, and feel for and find a channel of communication. Whenever reading or listening, also listen within for resonance with your being, so as to know what you need to know without judging another.

    Opening this quality of connection to a 'Field' quality is willingness to release the stories and dynamic of conflict so as to be grounded in and moved by that which you accept true of you, and so can stand in without taking from anyone.


  • identicon
    Anonymuz, 26 Jan 2018 @ 4:25pm

    Yes we do weaponize

    If things actually get bad enough, I'm sure we WILL weaponize our free speech against censorship. That's just how democratic civilizations work.


