Platforms, Speech And Truth: Policy, Policing And Impossible Choices
from the deplatforming-and-denialism dept
Warning 1: I'm about to talk about an issue that has a lot of nuance in it and no clear "good" answers -- and it's also one that many people have already made up their minds on one way or the other, and both sides will probably not really like at least part of what I have to say. That's cool. You get to live your own life. But, at the very least, I hope people can acknowledge that sometimes issues are more complex than they appear, and that having a nuanced discussion can be helpful.
Warning 2: This is a long post, so I'm going to provide a TLDR at the top (right under this, in fact), but as noted above, a part of the reason it's long is because it's a complex issue and there's a lot of nuance. So I strongly advise that if your initial response to my TLDR version is "fuck you, you're so wrong because..." maybe try reading the whole post first, and then when you go down to the comments to write out "fuck you, you're so wrong..." you can explain yourself clearly and thoroughly and address the actual points in the post. Thanks!
TLDR: Internet sites have every right in the world to kick people off their platforms, and there's no legal or ethical problem with that. No one's free speech is being censored. That said, we should be at least a bit concerned about the idea that giant internet platforms get to be some sort of arbiter of what speech is okay and what speech is not, and how that can impact society more generally. But there are possible solutions to this, even if none are perfect and some may be difficult to implement, and we should explore those more thoroughly, rather than getting into screaming fights over who should or shouldn't be allowed to use various internet platforms.
So, this post was originally going to be about the choices that Facebook and other internet platforms make concerning who is allowed on their platforms, specifically in response to an interview that Mark Zuckerberg gave back in July, in which he noted that he didn't think Facebook should remove Holocaust deniers from its platform, saying:
I’m Jewish, and there’s a set of people who deny that the Holocaust happened.
I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think... it’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do too, and I just don’t think that it is the right thing to say, “We’re going to take someone off the platform if they get things wrong, even multiple times.”
This created a huge furor of people talking about trolling, Holocaust denialism, Overton windows and a bunch of other things. But it's a complex, nuanced topic, and I was trying to write a complex, nuanced post. And just as I was getting somewhere with it... this week, a bunch of platforms, including Apple, YouTube and Facebook, removed at least some of Alex Jones' accounts or content. This created another furor in the other direction, with people talking about deplatforming, censorship, free speech, monopoly power, and policing truth. And then when Twitter chose not to follow the lead of those other platforms, we were right back to a big furor about keeping hateful wackjob conspiracy theory assholes on your platform, and whether or not you should want to do that.
Chances are that no matter what I say, it's going to piss off pretty much everyone, but let's do the stupid thing and try to address a complex and extremely nuanced topic on the internet, with unflagging optimism that maybe (just maybe) people on the internet will (for a moment at least) hold back their kneejerk reactions of "good" or "bad" and try to think through the issues.
Let's start with a few basic principles: no matter what crazy legal analysis you may have heard before, internet sites have every right to remove users for nearly any reason (there may be a few limited exceptions, but none of them apply here). Whether you like it or not (and you should actually like it), corporations do get rights, and that includes their First Amendment rights to have their sites appear how they want, along with deciding who not to associate with. On top of that, again, despite what you may have heard online about Section 230 of the CDA, platforms not only have the right to moderate what's on their platform without legal liability, they are actually encouraged to do so by that law.
Indeed, if anyone knows this, it's Alex Jones, since Infowars' own terms of service makes it clear that Infowars can boot anyone it wants:
From InfoWars' own terms of service (H/T @alexkasprak): pic.twitter.com/9qH4YLH79J
— Sky Palma (@DeadStateTweets) August 7, 2018
If you can't read that, there's a long list of rules and then it says:
If you violate these rules, your posts and/or user name will be deleted. Remember: you are a guest here. It is not censorship if you violate the rules and your post is deleted. All civilizations have rules and if you violate them you can expect to be ostracized from the tribe.
One of the rare cases where I can say that, hey, that Alex Jones guy is absolutely right about that (and we'll leave aside the hypocrisy of him now flipping out about other sites applying those same rules to him).
A separate point that also is important, and gets regularly ignored, is that "banning" someone from these platforms often has the opposite impact of what was intended. Depending on the situation, it might not quite be a "Streisand Effect" situation, but it does create a martyr, which supporters will automatically use to double down on their belief that they're in the right, and that people are trying to "suppress the truth" or whatever. Also, sometimes it's useful to have "bad" speech out in the open, where people can track it, understand it... and maybe even counter it. Indeed, often hiding that bad speech not only lets it fester, but dulls our ability to counter it, respond to it and understand who is spreading such info (and how widely).
So, really, the question comes down to whether or not these platforms should be removing these kinds of accounts. But, before we should even answer that question, there's a separate question, which is: What options are there for platforms to deal with content that they disfavor? Unfortunately, many people assume that it's a binary choice. You either keep the content up, or you take it down. But that hardly gets at the long list of possible alternatives. You can encourage good behavior and discourage bad behavior (say, with prompts if the system senses you're doing something bad, or with reminders, or by a community calling you out for bad behavior or lots of other options). Depending on the platform, you can minimize the accessibility or findability of certain content. You can minimize the reach of certain content. You can append additional information or put a "warning flag" on content. You can "shadow ban" content. You can promote "good" content to go with any content you deem to be bad. Or you can do nothing. Or you can set things up so that your users are able to help promote or minimize good or bad content. Or you can create tools that allow your users to set their own preferences and thresholds. Or you can allow third parties to build tools that do the same thing. The list goes on and on and on.
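To make that spectrum concrete, here's a rough sketch in Python (purely illustrative -- the risk scores, thresholds and action names are all hypothetical assumptions, not any platform's real system) of what a non-binary moderation decision might look like, where the platform's own estimate of how problematic an item is gets combined with each user's own tolerance setting:

```python
# A minimal, hypothetical sketch: moderation as a spectrum of responses,
# not a binary keep/remove decision. Scores, thresholds and action names
# are made up for illustration.
from dataclasses import dataclass
from enum import Enum


class Action(Enum):
    SHOW = "show"                    # leave the content alone
    LABEL = "label"                  # append a warning or extra context
    DOWNRANK = "downrank"            # keep it, but reduce its reach/findability
    INTERSTITIAL = "interstitial"    # hide it behind a click-through warning
    REMOVE = "remove"                # the last resort


@dataclass
class UserPrefs:
    # Each user picks how much borderline content they're willing to see:
    # 0.0 = "give me the whole firehose", 1.0 = "filter aggressively".
    sensitivity: float = 0.5


def choose_action(risk_score: float, prefs: UserPrefs) -> Action:
    """Map a platform-assigned risk score (0..1) plus the viewer's own
    sensitivity onto one of several graduated responses."""
    # Scale the platform's score by the viewer's sensitivity, so the same
    # item can be handled differently for different users.
    effective = risk_score * (0.5 + prefs.sensitivity)
    if effective < 0.3:
        return Action.SHOW
    if effective < 0.5:
        return Action.LABEL
    if effective < 0.7:
        return Action.DOWNRANK
    if effective < 0.9:
        return Action.INTERSTITIAL
    return Action.REMOVE


if __name__ == "__main__":
    firehose_user = UserPrefs(sensitivity=0.0)
    cautious_user = UserPrefs(sensitivity=1.0)
    for score in (0.2, 0.5, 0.8):
        print(score,
              choose_action(score, firehose_user).value,
              choose_action(score, cautious_user).value)
```

The point isn't the specific numbers; it's that the output space has more than two values, and that the same piece of content can be handled differently for different users.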
And, yet, so much of this debate ignores most of these options (other than shadowbanning, which some people pretend is somehow evil and unfair). And, indeed, what concerns me is that while various platforms have tried some combination of these things, very few seem to have really committed to these ideas -- and instead just get bounced back and forth between extreme pressure from two sides: "ban all the assholes" vs. "how dare you fucking censor my favorite idiot."
So with the question of Alex Jones or Holocaust deniers, internet platforms (again) have every right to kick them off their platforms. They don't want to be associated with assholes? Good for them. But, at the same time, it's more than a bit uncomfortable to think that anyone should want these giant internet platforms deciding who can use their platforms -- especially when having access to those platforms often feels close to necessary to take part in modern day life*. It's especially concerning when it reaches the level that basically online mobs can "demand" that someone be removed. And this is especially worrisome when many of the decisions are being made based on the claim of "hate speech," a term that not only has an amorphous and ever-changing definition, but one that has a long history of being abused against at-risk groups or those the government simply dislikes (i.e., for those who advocate for rules against "hate speech," think about what happens when the person you trust the least gets to write the definition).
* Quick aside to you if you're that guy rushing down to the comments to say something like "No one needs to use Facebook. I don't use Facebook." Shut up. Most people do use Facebook. And for many people it is important to their lives. In some cases, there are necessary services that require Facebook. And you should support that rather than getting all preachy about your own life choices, good or bad.
On top of that, I think that most people literally cannot comprehend both the scale and complexity of the decision making here when platforms are tasked with making these decisions. Figuring out which pieces of content are "okay" and which are "bad" can work when you're looking at a couple dozen pieces of content. But how about a million pieces of content every single day? Or more? Back in May, when we ran a live audience "game" in which we asked everyone at a Content Moderation Summit to judge just eight examples of content to moderate, what was striking was that out of this group of professionals in this space, there was no agreement on how to handle any piece of content. Everyone had arguments for why each piece of content should stay up, be taken down, or have a flag appended to it. So, not only do you have millions of pieces of content to judge, you have a very subjective standard, and a bunch of individuals who have to make those judgment calls -- often with little training and very little time to review or to get context.
Antonio Garcia Martinez, who worked at Facebook for a while, and has been a fairly outspoken critic of his former employer (writing an entire book about it), has reasonably warned that we should be quite careful what we wish for when asking Facebook to cut off speech, noting that the rest of the world has struggled in every attempt to define the limits of hate speech, and it's an involved and troubling process -- and yet, many people are fine with handing that over to a group of people at a company they all seem to hate. Which... seems odd. Even more on point is an article in Fortune by CDT's Emma Llanso (who designed and co-ran much of that "game" we ran back at the content moderation summit), warning about the lack of transparency when platforms determine this kind of thing, rather than, say, the courts. As we've argued for years, the lack of transparency and the lack of due process is also a significant concern (though, when Mark Zuckerberg suggested an outside due process system, people completely freaked out, thinking he was arguing for a special Facebook court system).
In the end, I think banning people should be the "very last option" on the table. And you could say that since these platforms left Jones on for so long while they had their internal debates about him, that's what happened. But I don't think that's accurate. Because there were alternative solutions that they could have tried. As Issie Lapowsky at Wired pointed out in noting that this is an unwinnable battle, the "do nothing, do nothing, do nothing... ban!" approach is unsatisfying to everyone:
When Facebook and YouTube decided to take more responsibility for what does and doesn't belong on their platforms, they were never going to satisfy all sides. But their tortured deliberations over what to do with Jones left them with only two unenviable options: Leave him alone and tacitly defend his indefensible actions, or ban him from the world's most powerful platforms and turn him into the odious martyr he now is.
Instead, we should be looking at stronger alternative ideas. Yair Rosenberg's suggestion in the Atlantic is for counterprogramming, which certainly is an appealing idea:
Truly tackling the problem of hateful misinformation online requires rejecting the false choice between leaving it alone or censoring it outright. The real solution is one that has not been entertained by either Zuckerberg or his critics: counter-programming hateful or misleading speech with better speech.
How would this work in practice?
Take the Facebook page of the “Committee for Open Debate on the Holocaust,” a long-standing Holocaust-denial front. For years, the page has operated without any objection from Facebook, just as Zuckerberg acknowledged in his interview. Now, imagine if instead of taking it down, Facebook appended a prominent disclaimer atop the page: “This page promotes the denial of the Holocaust, the systematic 20th-century attempt to exterminate the Jewish people which left 6 million of them dead, alongside millions of political dissidents, LGBT people, and others the Nazis considered undesirable. To learn more about this history and not be misled by propaganda, visit these links to our partners at the United States Holocaust Museum and Israel’s Yad Vashem.”
Obviously, this intervention would not deter a hardened Holocaust denier, but it would prevent the vast majority of normal readers who might stumble across the page and its innocuous name from being taken in. A page meant to promote anti-Semitism and misinformation would be turned into an educational tool against both.
Meanwhile, Tim Lee, over at Ars Technica, suggested another possible approach, recognizing that Facebook (in particular) serves multiple functions. It hosts content, but it also promotes certain content via its algorithm. The hosting could be more neutral, while the algorithm is already not neutral (it's designed to promote the "best" content, which is inherently a subjective decision). So, let bad content stay on the platform, but decrease its "signal" power:
It's helpful here to think of Facebook as being two separate products: a hosting product and a recommendation product (the Newsfeed). Facebook's basic approach is to apply different strategies for these different products.
For hosting content, Facebook takes an inclusive approach, only taking down content that violates a set of clearly defined policies on issues like harassment and privacy.
With the Newsfeed, by contrast, Facebook takes a more hands-on approach, downranking content it regards as low quality.
This makes sense because the Newsfeed is fundamentally an editorial product. Facebook has an algorithm that decides which content people see first, using a wide variety of criteria. There's no reason why journalistic quality, as judged by Facebook, shouldn't be one of those criteria.
Under Facebook's approach, publications with a long record of producing high-quality content can get bumped up toward the top of the news feed. Publications with a history of producing fake news can get bumped to the back of the line, where most Newsfeed users will never see it.
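To illustrate the split Lee is describing, here's a tiny sketch in Python (again, the field names and weights are hypothetical assumptions, not Facebook's actual algorithm) of a feed ranking that leaves hosting untouched but folds a publisher-quality score into the ordering, so low-quality sources sink rather than disappear:

```python
# A toy illustration of the hosting/recommendation split: nothing is deleted
# from the "host"; the feed simply multiplies predicted engagement by a
# publisher-quality score, so low-quality sources fall toward the bottom.
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    engagement: float       # e.g., predicted clicks/likes, 0..1
    source_quality: float   # the platform's judgment of the publisher, 0..1


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts for the feed; hosting is untouched, only ordering changes."""
    return sorted(posts, key=lambda p: p.engagement * p.source_quality, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("Well-reported investigation", engagement=0.6, source_quality=0.9),
        Post("Outrage-bait conspiracy post", engagement=0.9, source_quality=0.1),
        Post("Local news update", engagement=0.4, source_quality=0.8),
    ])
    for post in feed:
        print(post.title)
```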
Others, such as long-time free speech defender David French, have suggested that platforms should ditch concepts like "hate speech" that are not in US law and simply stick to the legal definitions of what's allowed:
The good news is that tech companies don’t have to rely on vague, malleable and hotly contested definitions of hate speech to deal with conspiracy theorists like Mr. Jones. The far better option would be to prohibit libel or slander on their platforms.
To be sure, this would tie their hands more: Unlike “hate speech,” libel and slander have legal meanings. There is a long history of using libel and slander laws to protect especially private figures from false claims. It’s properly more difficult to use those laws to punish allegations directed at public figures, but even then there are limits on intentionally false factual claims.
It’s a high bar. But it’s a bar that respects the marketplace of ideas, avoids the politically charged battle over ever-shifting norms in language and culture and provides protection for aggrieved parties. Nor do tech companies have to wait for sometimes yearslong legal processes to work themselves out. They can use their greater degree of freedom to conduct their own investigations. Those investigations would rightly be based on concrete legal standards, not wholly subjective measures of offensiveness.
That's certainly one way to go about it, but I actually think that would create all sorts of other problems as well. In short, determining what is and what is not defamation can often be a long, drawn out process involving lots and lots of lawyers advocating for each side. The idea that platforms could successfully "investigate" that on their own seems like a stretch. It would be fine for platforms to have a policy saying that if a court has adjudicated something to be defamatory, then they'll take it down (and, indeed, most platforms do have exactly that policy), but having them make their own determinations of what counts as defamation seems like a risky task, one that would end up in much the same state we're in today: a lot of people angry at the "judgments from on high" with little transparency or right of appeal.
As for me, I still go back to the solution I've been discussing for years: we need to move to a world of protocols instead of platforms, in which transparency rules and (importantly) control is passed down away from the centralized service to the end users. Facebook should open itself up so that end users can decide what content they can see for themselves, rather than making all the decisions in Menlo Park. Ideally, Facebook (and others) should open up so that third party tools can provide their own experiences -- and then each person could choose the service or filtering setup that they want. People who want to suck in the firehose, including all the garbage, could do so. Others could choose other filters or other experiences. Move the power down to the ends of the network, which is what the internet was supposed to be good at in the first place. If the giant platforms won't do that, then people should build more open competitors that will (hell, those should be built anyway).
But, if they were to do that, it would let them get rid of this impossible-to-solve question of who gets to use their platforms, and move the control and responsibility out to the end points. I expect that many users would quickly discover that the full firehose is unusable, and would seek alternatives that fit with how they wanted to use the platform. And, yes, that might mean some awful people create filter bubbles of nonsense and hatred, but average people could avoid those cesspools while at the same time those tasked with monitoring those kinds of idiots and their behavior could still do so.
I should note that this is a different solution than the one that Twitter's Jack Dorsey appeared to ham-fistedly suggest this week on his own platform, in which he suggested that journalists need to do the work of debunking idiots on Twitter. He's not wrong, but what an awful way to put it. Lots of people read it to mean "we set up the problem that makes this giant mess, and we'll leave it to journalists to come along and sort things out for free."
Instead, what I'm suggesting is that platforms have to get serious about moving real power out to the ends of their network so that anyone can set up systems for themselves -- or look to other third parties (or even the original platforms themselves, for a "default" or for a set of filter choices) for help. In the old days on Usenet there were killfiles. Email got swamped with spam, but there were a variety of anti-spam filters that you could plug in to filter most of it out. There are ways to manage these complex situations that don't involve Jack Dorsey choosing who stays on the island and who gets removed this week.
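For a sense of what that endpoint control could look like, here's a bare-bones sketch in Python in the spirit of a Usenet killfile (all names and rules here are illustrative assumptions, not a real protocol or API): the platform hands over the unfiltered stream, and the user's own client -- or a third-party tool they choose -- decides what gets dropped:

```python
# A bare-bones sketch of endpoint filtering in the spirit of a Usenet killfile:
# the platform provides the raw firehose; the user's own rules (or a chosen
# third-party tool) decide what to drop, client-side. Purely illustrative.
from dataclasses import dataclass


@dataclass
class Message:
    author: str
    text: str


class Killfile:
    def __init__(self, blocked_authors=None, blocked_words=None):
        self.blocked_authors = set(blocked_authors or [])
        self.blocked_words = [w.lower() for w in (blocked_words or [])]

    def allows(self, msg: Message) -> bool:
        # Drop anything from blocked authors or containing blocked words.
        if msg.author in self.blocked_authors:
            return False
        text = msg.text.lower()
        return not any(word in text for word in self.blocked_words)


def filter_stream(stream, killfile: Killfile):
    """Apply the user's own rules to the raw firehose, at the endpoint."""
    return [msg for msg in stream if killfile.allows(msg)]


if __name__ == "__main__":
    firehose = [
        Message("conspiracy_guy", "The frogs are turning everyone gay!!!"),
        Message("local_reporter", "City council meets tonight at 7pm."),
    ]
    my_rules = Killfile(blocked_authors={"conspiracy_guy"})
    for msg in filter_stream(firehose, my_rules):
        print(msg.author, "-", msg.text)
```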
Of course, this would require a fundamental shift in how these platforms operate -- and especially in how much control they have. But, given how they keep getting slammed on all sides for the decisions they both do and don't make, perhaps we're finally at a point where they'll consider this alternative. And, hey, if anyone at these big platforms wants some help thinking through these issues, feel free to contact us. These are the kinds of projects we enjoy working on, as crazy and impossible as they may feel.
Filed Under: alex jones, content moderation, deplatforming, free speech, holocaust deniers, mark zuckerberg, platforms, policing content, protocols
Companies: facebook, twitter, youtube
Reader Comments
The First Word
So Zuckerberg said (in the interview quoted above) that he didn't believe Facebook should take people off the platform just for getting things wrong, even Holocaust deniers.
So far, so good. Then he went back on this and took the stuff down anyway.
It's been said that those who do not learn from history are doomed to repeat it. Here's a bit of history that hasn't been all that widely studied, that we're currently in the early stages of repeating: hate speech laws from a century ago.
It might be surprising to learn that Weimar Germany had very strong, very modern laws against hate speech, and one of the strongest beneficiaries of those laws were the Jews. They used them to fight back against very real discrimination in their day, and the courts did "the right thing" the vast majority of the time.
One frequent target of such laws was a hate-filled guy by the name of Adolf Hitler. He ended up getting smacked down for his serial offenses so much that he eventually got injunctions against him, preventing him from holding further rallies. ("Deplatformed," to use the modern parlance.) Well, that went over perfectly and we never heard from that troublemaker again... right?
Oh, wait, no. That's not what happened at all. It made a martyr out of him. The Nazis were able to point to the way he was being censored and use it as a rallying cry, which ended up being massively successful and we all know where that led.
So yes, there is a massive ethical problem with allowing people to be censored from the modern-day public square, and the fundamental problems involved do not change one whit if those doing so are private rather than state actors. With great power comes great responsibility, and when you become powerful enough to do things that historically only governments were capable of doing, the restraints that we have historically placed upon governments must be applied as well.
Denying the Holocaust is just willful ignorance in the name of believing you're more enlightened than the masses.
Re:
These places encourage echo chambers, so why in the world would you want to allow anybody to participate in those echo chambers? They do not further discourse in any sort of good faith, period. People who subscribe to these nutjob theories aren't looking for honest and true discourse.
i thought he was going to be nuanced, not do the 'let's listen to both sides in Holocaust denial' thing, jeez.
i mean, let's all talk about how Newton was completely wrong about gravity or how in some alternate parallel world i believe in, 2+2 = 5, GOTTA LISTEN TO BOTH SIDES. /s
Re: Re:
anti-vaxxers, flat-earthers, and various other nutjobs
Anti-vaxxers may be idiots and clearly incorrect, but I'm not so sure setting a precedent of "ban people who question the pharmaceutical industry" is a great idea either. If you're banning flat-earthers, are you also banning fundamentalist Christians who believe the earth is 6,000 years old? I mean, that's equally factually incorrect, but I suspect you might get some backlash on that one.
And, in general, precisely how many scientists in how many fields is Facebook supposed to employ in order to make these determinations as new claims emerge in the future?
let's listen to both sides in Holocaust denial
Saying "attempting to ban all holocaust deniers may not be the best solution for various reasons" is not the same thing as saying they deserve your attention or "let's listen to both sides".
let's all talk about how Newton was completely wrong about gravity
I mean... he sorta was.
Re: Re: Re:
BIG SCIENCE IS MAKING MONEY OFF THE THEORY OF GRAVITY. QUESTION EVERYTHING, EVEN PROVABLE FACTS, BECAUSE FEELINGS!
ALSO, MY OPINION IS 2+2 = 5 AND IT'S JUST AS IMPORTANT AS STEPHEN HAWKING'S. HE'S HUMAN, I'M HUMAN, SAME THING. HE TELLS YOU 2+2 = 4 BUT HE'S A PHARMA SHILL AND I'M NOT.
I mean, seriously? That's all you have to say about that? Why don't you go tell your kids tonight gravity is just a theory so they should try doing 5th-floor parkour after dark because it's fun? Or better yet, how about you let your local cool conspiracy nut encourage it to your children? THE IMPORTANT PART IS HE HAS A PLATFORM TO SAY WHAT HE WANTS, RIGHT?
Hey, you know what? While we're at it, Twitter should let the 'infidels' and cartels post violently graphic pictures of them dragging bodies up and down the street BECAUSE THEY SHOULD HAVE A PLATFORM; THEY HAVE A VOICE, TOO; MAYBE THEY ARE RIGHT ABOUT SOMETHING!
i also think NASA should open up all their rocket building to public comment and then whatever gets upvoted the most is how they execute putting people in space. GONNA WORK OUT GREAT, I BET.
/s i mean, that is such a total load of B.S.
Re: Re: Re: Re:
Next time you should actually read what I wrote instead of foaming at the mouth about random nonsense.
Re: Re:
Yeah, i can't believe he's endorsing that strategy of dealing with these people. Holocaust deniers should not be given any sort of validation in any remotely mainstream publication or platform, internet or otherwise, nor should anti-vaxxers, flat-earthers, and various other nutjobs who are outright factually wrong.
I'm in agreement that none of those people deserve any sort of validation. But... seeing as I know that I'm not the one putting together the list of what content will be allowed and what won't be, I'm a bit concerned about who WILL be making that determination. There are lots of things that are acceptable today that weren't acceptable just a few decades ago. Would you have been okay if a few decades ago an equivalent of Twitter banned any talk of same sex marriage? Or go back a few more decades and perhaps it would have banned any discussion of divorce. Or civil rights. I'm not saying these things are the same, but I am saying that a few decades ago many people DID view same sex marriage in the same way that you and I might view flat-earthers today. So... be careful what you wish for.
i thought he was going to be nuanced, not do the 'let's listen to both sides in Holocaust denial' thing, jeez.
I most certainly did NOT say "listen to both sides." I would not, because I don't believe that. Did you even read the post?
Re: Re: Re:
It has been a while since I posted here. The way this is headed concerns me very much, so I thought it might be fun to play Advocatus Diaboli (devil's advocate) along many, many paths...
But... seeing as I know that I'm not the one putting together the list of what content will be allowed and what won't be, I'm a bit concerned about who WILL be making that determination
And right you should be. You speak out about copyright issues, limitations on copyright, and corporate greed... how soon until you are on the receiving end -- shadow banned on search engines and social media, with your accounts deleted online? We have already seen what government agencies are capable of when left unchecked.
There is this thing called the Streisand Effect -- maybe you have heard of it (nudge, nudge, wink, wink)?
Since Alex Jones was banned from Facebook, his app has become the most downloaded on Google's Play store (see NYT). He has added followers on Twitter 100 times faster than before. And he has become bigger than CNN -- what am I saying, I have more followers than CNN, so that is nothing, never mind.
Banning him will make him disappear
400 years ago the Catholic Church had this list called the "Index Librorum Prohibitorum" (Index of Banned Books). I believe every book on that list can be found today. You ban something, people wonder: what am I missing? What secrets are they trying to hide? Well, maybe I should take a look. This also goes to the Streisand Effect section.
We are a private company and the first amendment doesn't apply
It does if you banned certain people because the government pressured you to do so. Remind me again, how many hearings have the social media companies testified in front of? What was the implied threat? Oh yeah, regulation.
We are a publicly traded company, we have lawyers, and nothing can happen to us, we have a EULA that allows us to ban anyone we want
Until it comes out that, as a publicly traded company, you have alienated half your subscriber base and made them wonder if they are being shadow banned because of their views or thoughts. Your delete-account pages end up so swamped they no longer work -- it is as if people had changed their names to Elvis and decided to leave the building. Which in the end causes investor lawsuits, because your job was to make money and not do politics.
I could go on for hours on this. I will leave you with one final thought, since Mason Wheeler already broke Godwin's law.
First they came for (insert your least favorite group, racial, political, religious, sexual orientation, annoying neighbor, mother/father in-law, etc) I said nothing.
Then they came for (insert your second favorite here, then third, then fourth...)
Then with no one left to speak for me ...
they came for me.
This is how this crap starts. Learn from history people.
Yeah, and some of those hearings and implied threats came after suspensions, bannings, and other actions that were held up as proof of an anti-conservative bias within those companies. The government did not want Alex Jones banned from social media; if anything, they likely wanted Twitter, Facebook, etc. to keep him around because banning him would be “silencing” another conservative voice.
Re:
1) Yeah, the government threatened suspensions, regulation, and retaliation... but....
2) The government did not want to ban people... but... yeah, look at the first sentence.
Regulation does not necessarily mean “banning people”, you know.
Re:
Let's start chanting one word, one nation, one goal for our great nation, we want to be like... Venezuela, Venezuela, Venezuela, follow along with me.
Oh come chant it to me, it will be fun... sing it to the tune of "We Are the Champions" by Queen, with a broken record player...
Re:
Of course it does, to answer your original post. Any regulation is a ban. Because regulating any speech, no matter how hateful, is a ban on free speech. And ....
niger, spik, whitey, kike ... etc, etc, etc,
which will get me banned here?
Which one will not get me banned here?
Double standard anyone?
Re: Re: Re: Re:
Yeah, I remember how that poem is about how in Nazi Germany, private platforms blocked conspiracy theorists from using them.
No, wait, sorry. That poem is about how the government was rounding people up and putting them in camps.
If you're worried about governments rounding people up and putting them in camps, I suggest that perhaps you should spend less time being angry at Facebook and more time being angry at ICE.
Re:
Ironically, there is a Jewish religious tradition of this with 'fences around the law,' where the devout would proscribe things that could possibly lead to breaking the holy law. I believe one interpretation of the practice of not mixing meat with dairy is that the commandment 'not to have a calf boiled in its mother's milk' is an admonishment against senseless cruelty. If one doesn't know with one hundred percent certainty whether the source of the calf and the source of the milk line up, just don't mix the two -- even if you think you are using goat milk with veal, there could be a mix-up not noticed until later. Given that it involves both religion and law, you had better believe there are a lot of arguments about it -- don't take anything I say as more than a paraphrase of remembered tidbits.
Like the slippery slope fallacy, it can be silly or it can be principled, like "Better one hundred guilty go free than one innocent be imprisoned, for doing otherwise will lead to only contempt for the law: if following the law is no guarantee of innocence, why follow the law?"
A calf boiled in its mother's milk...
...is a specific part of a ritual wedding feast prepared in the name of the goddess Asherah.
Between the time when the oceans drank Atlantis and the rise of the sons of Aryas, [Yahweh] (before He was Yahweh) had a consort, Asherah, who was a bit more ambitious than He was, and when She became too popular, His temple priests gathered together a mob of loyalists who sacked the Asheran temple, slew its clerics and acolytes and burned it to the ground.
After that it was pronounced a capital sin to worship Nod and Asherah, and the proscription against a calf boiled in its mother's milk had to do with criminalizing Asheran practices and traditions.
Religions being what they are and the applicability of myths being eternal, this is not to say that more modern interpretations are wrong (any more than, say, the flight of Icarus being a lesson about hubris), but that's how that bit got started.
A Small Matter of Control
That seems unlikely. Control is the name of the game for Internet properties such as the ones that you are citing. I think one could make a plausible argument that control is more important than near-term profits. It seems more likely that a firm with control can earn future profits than a firm with profits can earn future control.
Which is why I have to call a wee bit o' shenanigans on:
Tactically, I agree. Strategically, I expect that the only way to "move to a world of protocols instead of platforms" will be to move off of Facebook, et. al. to other things. Partly, those "other things" (hopefully) will be "protocols instead of platforms". Partly, without loss of market share, I do not see the existing Internet properties embracing a loss of control.
Re: A Small Matter of Control
Facebook and their ilk are making it much easier by being cesspools of rage and bullshit. I know the Mastodon fediverse got another spike in Twitter refugees yesterday thanks to Jack’s explanation of why Twitter has not yet booted Alex Jones.
Re: A Small Matter of Control
FYI, a few of us have been having a pretty good conversation on this subject over in the Voting by Cell Phone thread -- with apologies for the thread-jacking, but this post wasn't up yet.
Generally, I'm with you: I think people should quit using Facebook or Twitter if they're alarmed by their actions or their policies. But I acknowledge Mike's point, too: this is much easier said than done. If you're running a business and you need to advertise it, giving up Facebook or Twitter effectively means hobbling your chances of attracting an audience. It's simply not a practical solution under those circumstances.
But if you're not? If you're just a private individual using Facebook and Twitter to keep in touch with people? You should think about whether there are alternatives that would work for you and your friends -- alternatives that may be a little less convenient, but which also don't include all the stuff you don't like about Facebook and Twitter.
(And if you're reading Techdirt, you're probably tech-savvy; you may already be aware of Mastodon, or know how to set up a Wordpress blog, or a phpBB messageboard, or some other kind of independent platform.)
My feeling is, if you're worried about how big Facebook and Twitter are, the solution is to make them smaller.
Though, again, I can't tell people who rely on those services for income to make that leap.
I mentioned in the conversation I linked up top that I think switching from the big, corporate platforms to smaller, independent ones is going to require that a whole lot of people re-evaluate how they think about the Internet, and that changing opinions like that is a very difficult thing indeed. But I mentioned that there's been, at least, some success in "buy local" campaigns in meatspace; I made the analogy that Budweiser is still doing a brisk business but microbreweries are more popular than ever. It's quite possible that there could be a rise of smaller, independent platforms that ate into Facebook's marketshare but Facebook would continue to be the biggest dog on the block.
It's also possible that the current backlash against Facebook results in subscribers quitting Facebook but just moving over to Instagram (which Facebook owns) or some other monolithic platform.
Ultimately, it's quite clear that both Facebook and Twitter are very vulnerable in this moment (which is where the Budweiser analogy breaks down; Anheuser-Busch InBev never lost 20% of its stock price in one day). There's a lot of anger and dissatisfaction. What will the results of that anger and dissatisfaction ultimately be? We'll see.
Re: Re: A Small Matter of Control
But where does this optimism come from? Are there good examples of closed platforms becoming open protocols to the benefit of their users? If anything it's the opposite. Services tend to become more closed and anti-user over time.
Re: Re: Re: A Small Matter of Control
Would email count?
Re: Re: A Small Matter of Control
This is really the key to the whole issue. The problem isn't that a private company chose to censor someone. The problem is that a private company that acts as a virtual monopoly acted to censor someone. These giant tech monopolies need to be broken up.
Re: Re: Re: A Small Matter of Control
Haven't you people seen The Simpsons? "Mono means one"!
Re: Re: Re: Re: A Small Matter of Control
https://www.theguardian.com/commentisfree/2018/aug/10/infowars-social-media-companies-conspiracy
A well-articulated viewpoint as to why banning Alex Jones is a mistake.
That source states that the Internet has 4 billion users, and 2 billion (half) of them are active on Facebook. Might not be a monopoly, but close enough to be a *huge* concern.
Re: Re: Re: Re: Re: A Small Matter of Control
You don't become a monopoly by being big, or by serving a majority of the market. You become a monopoly by being the only provider in the market.
Facebook is not the only provider - not even close. People have LOTS of options for everything Facebook does - a few very big ones, dozens of medium-sized ones, and countless small ones. Hell, it seems that today's young people don't even care about Facebook anymore - we're all just a bunch of old folks moaning about a social network that's already out of fashion and stands a good chance of falling from grace within a generation.
Remember when MySpace was the undisputed king of social media? Remember when Digg seemed to rule the internet? Where are they now?
Re: Re: Re: Re: Re: A Small Matter of Control
Or look at it this way:
How many of the 2-billion Facebook users and the 1.8-billion YouTube users stated in that article overlap? And how many also use one or more of Twitter, Reddit, Tumblr, LinkedIn, Snapchat, or Pinterest on a regular basis? I would estimate that only a very small minority are exclusively Facebook users. So how is that a monopoly?
Re: Re: A Small Matter of Control
Also, it makes some sense for YT and Facebook to take it down. They're taking down 'hateful content'. The problem is when you have LinkedIn doing the same for the person, when LinkedIn doesn't even host any of the content, or Mastercard cancelling accounts for certain people. That becomes the same thing as a store having a 'no gays' sign. Sure, go to another store. But tomorrow, that store will also have a similar sign. Then the others have a similar sign, as they have the right to do it as well, and the only way to get your groceries is to take the money you keep stashed under the floor planks in your house and get it from some back alley dealer. This is an argument for applying anti-discrimination protections, rather than for 'being regulated because it's so big'.
Re: Re: Re: A Small Matter of Control
So far, solutions to all this complaining have resulted in
1) Inaction. Result: more complaining till they couldn't ignore it any more.
2) Some action to remove the most egregious items. Result: The increase in users resulted in more egregious items and automation via keywords, etc., to remove them. Result: some innocent items got removed.
3) More action to remove contentious items. Result: even more complaining, this time about censorship.
Okay, so what do you propose? As business owners they're obliged to serve their customers or risk losing them. My Twitter feeds into my LinkedIn so it does appear there, and if someone doesn't like what they see they'll complain. Chances are, if I repeatedly post items LinkedIn users don't like, I'll be banned.
None of this is "unpersoning" anyone. It may, however, prompt more socially acceptable behaviour on the part of users wishing to post content that other users find objectionable. This, in and of itself, is entirely subjective and will shift with trends in public opinion, so if you're a rainbow flag-waving member of the LGBT community announcing the location of a Pride parade in your town it's unlikely that you'll have your post removed even if right-wingers complain, because there's a quorum of people who would complain about the complaints. However, if you advocate violence or cruelty to members of the LGBT community your post will most likely be yanked and only right-wingers will complain -- and most likely be ignored. Does that make sense?
So basically what we're experiencing now is the result of years of build-up of resentment on the part of people who have experienced bad behaviour online and have elected to push back rather than be driven off by trolls. As I predicted years ago when I first started blogging, either the platform admins would do something about bad behaviour online or users would compel them to when the number of affected people reached a certain level. They've reached the level and the chickens are coming home to roost. Actions have consequences.
Re: A Small Matter of Control
The idea of protocols instead of platforms is great from a tech and user perspective, but can you imagine trying to pitch that to some VC guys? E-mail's a protocol and it's everywhere, but who's making money off of it? How do you recoup your investment? I'm not saying you can't, but it's not immediately clear
Re: Re: A Small Matter of Control
E-mail's a protocol and it's everywhere, but who's making money off of it?
Re: Re: A Small Matter of Control
The idea of protocols instead of platforms is great from a tech and user perspective, but can you imagine trying to pitch that to some VC guys? E-mail's a protocol and it's everywhere, but who's making money off of it? How do you recoup your investment? I'm not saying you can't, but it's not immediately clear
This is beyond the scope of this topic, but have you seen how much VC money is flowing into token-based companies these days? Many are building protocols (just look at IPFS as an example). There are lots of people who think that using tokens/cryptocurrency is a way to make protocols much more sustainable and profitable.
Re: Re: Re: A Small Matter of Control
To wit: SoundCloud on the blockchain? Audius raises $5.5M to decentralize music
Might be an interesting one to watch.
KILL IT WITH FIRE
Re:
Yeah, I know, I know.
But despite its overexposure as a buzzphrase, the blockchain is quite an amazing technology, and this seems like a good application for it.
I've seen people (like Cory Doctorow) arguing for years that the proper solution for the question of how to compensate artists is to include some fixed amount as part of everybody's internet subscription fee and then track what songs get listened to and how much, and compensate artists accordingly.
This isn't quite that, as it's supported by ads or a separate subscription rather than being built into a monthly cable bill, but it sounds like a promising way of tracking what gets listened to (hopefully) without tracking who, specifically, is listening to it, and dividing up the proceeds in a fair way.
Plus, the goal seems to be to cut the labels out of the process, which I'm all for.
I'm sure there are a lot of bugs to work out, not just technically but in terms of the business strategy. But it's an interesting idea and I think it could turn out to be a model that works and is preferable to the system we've got right now, both from customers' and artists' point of view.
Re: Re:
However, how do we extract a fair payment *into* the system, since I don't listen to music???
Re: Re: Re:
If your question is about Audius, then clearly the hope is that enough people sign up that it is profitable -- but you're right, there's no guarantee of that.
If your question is about Doctorow's hypothetical of paying your ISP a fee every month to compensate rightsholders, then presumably that fee wouldn't just be divided up according to the music you listen to, but all the other media you access. Games, videos, news articles, everything.
Implementation of such a system is, of course, an open question, but I feel like -- much as I understand Stephen's impulse to roll his eyes at the mere mention of the word -- a blockchain solution is a good fit for this problem.
Re: Re:
USENET also has the free speech everyone claims to want.
CDA Section 230...
Yes, I'm commenting without reading the whole thing (yet). But there's something you repeat about CDA § 230 that I don't think is right. You say section 230 "encourages" platforms to moderate. No, it doesn't--nothing in section 230 encourages, motivates, or in any way leads platforms to moderate. The most it does is to remove a disincentive to moderate, that being the position of some courts that moderation made a platform liable for whatever appeared there. Removing a disincentive does not provide affirmative encouragement.
Now back to read the rest of the post...
Re: CDA Section 230...
The thing is that in writing that portion of the law, Congress made it pretty clear that their goal was to "encourage" and "incentivize" moderation, which is why you see that word used so often to describe s.230
So you are correct that technically the statute itself does not say anything about "encouraging" - but that was and is very much its stated purpose.
So I know very little about how it works.
Sticking an asterisk onto something seems like a fair way to deal with it.
The largest problem is, EVERYBODY & their gluten free dog, is demanding to NEVER!!!!!!!!! be offended.
A mantra I've used at least once or twice here is 'Personal Responsibility is Dead'.
Twitter has the obvious solutions: blocking accounts, your very own list of forbidden words, & if you just want to be led by the nose there are blocklists you can subscribe to so you don't even have to think about it -- they block the 'bad' people for you!
Instead they slam the report button, cry how wrong it is, tell their friends to be upset too, and Twitter puts the account in time out & gives the DUMBEST explanations as to why. (See also: https://www.techdirt.com/articles/20180419/17513039676/how-twitter-suspended-account-one-our-commenters-offending-himself.shtml?threaded=true)
I was 'promoting hatred towards others'.
They couldn't/wouldn't explain that to me.
I pointed out the insanity & someone else on the team said oh oops... giggle sorry, we goofed.
No you responded like Pavlov's Dog. The button was hit enough times and you jumped into action & that action was lock it down & demand he delete it.
They have a 'team'... who apparently don't look at timestamps.
A flurry of reports on a year-old tweet, but from none of the people in the thread, and all of the accounts reporting have submitted a large number of complaints that end up reversed (when the target bothers to try and fight back against the stupidity that is the team). It's almost like people are abusing the system to silence people! *le gasp*
I dislike Mr. Jones on SOOOO many levels, but he deserves to be able to tweet out his bullshit, just like I tweet out my bullshit. I don't have to follow him which magically seems to keep a vast portion of his bullshit from my sight. I think he is a conman fleecing the rubes & any action you take drives the rubes deeper into his hands. Leaving him alone, the best course of action, isn't possible when the 'team' kneejerk reacts to the report button being smacked ignoring anything around it. If he was tweeting out the addresses of the Sandy Hook parents, we all know that is a clear violation of the rules & should be reported. If he is claiming Hillary is a Lizard Alien with dementia... I have to ask why you bother giving him your attention. But people love to report because there is no fscking downside to doing it.
If I crank call 911 (without call spoofing to make it look like it came from the WH switchboard) I get a visit from Johnny Law & if I am flippant to Johnny Law he plays Uber... with cuffs with my ass. If I keep doing it, I end up in a cell to teach me that 911 isn't a toy.
Imagine if the braintrust that mass-reported me had gotten a minor time out for abusing the system; think they would pursue that tactic again right away? Imagine if Twitter's system showed Trust & Safety that they have reported this account 20 times, none upheld. Think they would zap the reported account instantly... or perhaps look and see if it violated any actual rules and, if not, kick off a vacation for the reporters?
The mass-reporter people think it's funny silencing people on Twitter; if their main tool for doing so had the power to silence them if they abuse it... well, once they got done screaming how unfair it was into the void, they might learn to avoid doing it again lest they end up off the platform for abusing 911.
(sees letter from Del Harvey to staff about their new and improved punishments for dehumanizing behavior)
Nevermind.
Re:
Alex Jones can say whatever the hell he wants—but he is not entitled to use someone else’s platform for that purpose.
Re: Re:
It would have taken about 3 hours before he tweeted something that violated the actual rules.
There are millions of Twitter users; I'm pretty sure I've never seen tweets from 99.9% of them. If I was seeing tweets I disliked I should block the account, & not act as if, unless I stand up and declare war on the person I disagree with, the fate of the universe falls to evil!!!!!!!!
Everybody can be somebody's asshole, so if you start booting people off for being an asshole, where & when can you stop?
So what? Those companies had the right to boot him. They exercised that right. That his getting banned made him a martyr for misguided free speech absolutists and crackpot conspiracy theorists does not change those facts.
Booting someone for being a disruptive asshole is a subjective judgment call. Booting someone for making defamatory statements or expressing hatred toward a historically oppressed minority of the population, however, is far less subjective.
Re:
It's still pretty subjective.
Determining for certain whether or not a statement is defamatory is up to the courts. There are certain reasonable guidelines that a layperson can follow to make a best guess -- is it a factual statement, or a statement of opinion based on disclosed facts? But even people who have an above-average knowledge of the law can trip up on that question.
(Try saying "Shiva Ayyadurai's claim to be the inventor of e-mail is an opinion, not a factual statement" and see how many people in the comments disagree with that, despite it being the core of Techdirt's defense and the judge's decision to dismiss the case. Indeed, here's a thread where a bunch of people jump down my throat for saying that -- including you, though at least you were a little nicer about it than the guy who called me a "pathetic loser" -- despite, again, it being a legally correct statement.)
Accurately recognizing hate speech is a difficult problem too; sometimes it's very, very obvious, but sometimes it requires context; something that appears to be hateful rhetoric on its surface may not be. There are plenty of stories on Techdirt of posts or tweets reporting abuse being flagged as abusive themselves, and of course the recent story of That Anonymous Coward being suspended from Twitter for using a homophobic slur in reference to himself. And I believe you dropped a C-bomb the other day; there are people who would see the mere use of that word as "expressing hatred toward a historically oppressed [segment] of the population" ("minority" from your original quote replaced here because women are, of course, not a minority), even though the context in which you used it was not actually demeaning toward women.
Defamation and hate speech can be easy to spot. Some examples are flagrant. But some are not. And understanding context and nuance does not work at scale.
That's another reason why I think going back to smaller communities is the best fix here: because in a small enough community, the moderators know all the members and all the inside jokes. No community is ever going to agree 100% about any moderation decision, but a moderator who's part of the community can make an informed decision about whether or not a comment is abusive, in a way that somebody reading a post completely out of context, without knowing any of the people in the conversation, can't.
[ link to this | view in chronology ]
Re: Re:
Fair points.
[ link to this | view in chronology ]
Re: Re:
Also, I appreciate two things about this comment:
That change from “minority” to “segment” when quoting me; I will try to use that particular wording from now on.
Good show, sir! 👍
[ link to this | view in chronology ]
Re: Re: Re:
Thanks. And I'd like to add that I enjoy our conversations and you give me a lot to think about too, even if we don't always agree.
And in this case it's not so much that I'm disagreeing with you as trying to tease out some exceptions to what you're saying. You're making good points, as usual; I just want to point out that they don't apply in all cases and the devil is often in the details.
[ link to this | view in chronology ]
Oh by all means, feel free to point out when I make absolutist statements that should have more nuance injected into them. I cannot learn if no one tells me what I got wrong.
Besides, being able to admit to being wrong is a good thing. Can you imagine how much of a shithead someone would be if people told him that a certain claim was wrong and he kept insisting he was right?
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
Re: Re:
[ link to this | view in chronology ]
Re: Re:
I like it.
Part of the problem is it is very rare that people will look at anything other than the single tweet presented to them by someone outraged.
I had some alt-right nutjob trying to verbally rip my throat out b/c he saw a single tweet and decided his best plan would be to attack me for mocking his hero...
The problem is his hero & I follow each other; we have a long history of inappropriate comments back and forth. We disagree on some things, but we set that shit aside and look for where we agree. I can ask questions & not get the standard "you leftie commie" etc., b/c he knows I am honest in my desire to understand his position.
So I have this unhinged idiot screaming at me, and I'm mocking him b/c, well, that's what one does when confronted with idiots. He gets so very worked up, & then I point out: he follows his hero, his hero doesn't follow him... but his hero does follow me. He got real quiet real fast.
I get a couple of these a week: someone seeing one tweet out of a tweet storm or long thread & going bonkers. It's easy to see 280 characters and get riled up, but more people need to start opening the whole thread to get the context.
[ link to this | view in chronology ]
Re: Re:
Just a bit of a side note: when you talk about a minority in terms of discrimination and equality, you aren't talking about the overall number of individuals in that position but rather the fact that that segment, however big it is, does not receive equal treatment. Women are a minority segment in our current patriarchal society. In Brazil, Black people and people of mixed Black and white ancestry ("mixed-race" is probably the word in English) make up more than half of the population, but they suffer from multiple injustices, so they are a minority group. It's about representation and equality.
Just my 2 cents. I'm not criticizing your comment, just focusing this specific issue of definitions.
[ link to this | view in chronology ]
Re: Re: Re:
While potentially correct from a grammatical point of view, the problem is that oppressors love to co-opt language. In a discussion of oppression they can play semantic games with a term like "minority" to position themselves as the oppressed minority; and of course, since they are the "minority" rather than the majority, they cannot possibly be oppressing the "majority".
While they can still co-opt more accurate terms like "oppressed segment" or just "the oppressed", those terms keep the discussion focused on their justifications for the oppression itself, making it harder for them to disguise their threadbare arguments with a veneer of intellectualism and philosophy.
[ link to this | view in chronology ]
Re:
Except they let other groups stay up for the exact same type of hate speech as Jones; they just see nothing wrong with it because it's directed at people those in charge don't like.
If they are picking and choosing whom they ban based on whether they dislike the targets, that seems like censorship based on personal views instead of the rules.
[ link to this | view in chronology ]
Which is entirely fine.
It is not “censorship” when someone gets booted from a platform they do not own, even for arbitrary and capricious reasons. Even the InfoWars terms of service document says that. A Twitter ban does not prevent someone from speaking their mind elsewhere—it only denies a platform and an audience to that person.
And by the same token, Twitter cannot be forced to host speech which its owners/operators do not want to host. The same goes for any other website. If someone were to argue that a White supremacist forum has the right to boot users for expressing disdain toward White people, they would have to accept that Twitter has the right to boot users for expressing White supremacist views.
Feel free to argue about the morality and ethics of banning people based on political viewpoints. That is a discussion worth having. But legally, the government can no more force Twitter to host Alex Jones’s speech than we can force Techdirt to host ours.
[ link to this | view in chronology ]
Re:
If you get kicked off the telegraph, or kicked off plain old telephone service (POTS), for arbitrary and capricious reasons, then whether you want to use the label “censorship” or not, it's still unlawful.
Further, along similar lines, in Turner Broadcasting v. FCC (1997), the local television stations didn't own the cable systems, but the "must-carry" provisions of the Cable Television Consumer Protection and Competition Act of 1992 were still upheld.
[ link to this | view in chronology ]
Re: Re:
That would be equivalent to being kicked off the Internet. Being kicked off of Facebook or Twitter is like being banned from a pub or cafe: you have to take your custom elsewhere. The people you want to talk to do not have to follow you to your new watering hole, but they are free to do so if they want to listen to you.
[ link to this | view in chronology ]
Re: Re: Re:
Do you understand very clearly that existing statute does not support the distinction you want to make here?
47 USC § 230(f) Definitions
So, with that clear understanding, you're invited to suggest new statutory language that would convey the distinction that lots of people would like to make.
Normatively, I may be inclined to agree that you're making a worthwhile distinction that ought to be expressed in the Communications Act of 1934, as amended by the Telecommunications Act of 1996, and as further amended.
[ link to this | view in chronology ]
Re: Re: Re: Re:
[ link to this | view in chronology ]
Re: Re: Re: Re: Re:
At this point, I'm not going to accuse you personally of arguing in bad faith, or acting as a Verizon shill.
But everyone ought to understand the bait-and-switch that's being offered.
Stone, for instance, argues that “platforms” have every right to terminate service for arbitrary and capricious reasons. But a “platform” isn't a term currently understood by courts or the law.
So, what he's arguing, in essence, is that telegraph companies and telephone carriers have every right to kick subscribers. And that's bullshit. Almost no one agrees with that. The people who do agree with that are generally paid by the telecoms.
[ link to this | view in chronology ]
If Facebook and Twitter were owned by the government and operated the same way as public utilities, you might have had a point there.
[ link to this | view in chronology ]
Re:
AT&T, back before the breakup in the eighties, might have effectively owned the government. But both before and after the MFJ, the government certainly didn't own AT&T.
Government ownership has never been a prerequisite for common carrier status.
[ link to this | view in chronology ]
Re:
Facebook and Twitter are not publishers or speakers.
[ link to this | view in chronology ]
Facebook and Twitter operate services that are platforms for legally-protected speech. Why should they have fewer rights to decide what speech will or will not be allowed on those platforms than a newspaper or a magazine publisher?
[ link to this | view in chronology ]
Re:
If that's the tack you want to take, then, why should they have fewer duties than a newspaper or magazine publisher?
Perhaps the answer to that is they're offering communications service to the public at large — at scale.
[ link to this | view in chronology ]
Like you said: Those services are neither publisher nor speaker. The companies may have a moral/ethical duty to moderate their services in a similar fashion as newspapers and magazines, but to make such moderation a legal requirement would destroy the usability of those services.
[ link to this | view in chronology ]
Re:
O'Brien v Western Union (1st Cir. 1940)
(Citations omitted.)
So why should Facebook and Twitter have fewer duties than the telegraph company? The telegraph company can't arbitrarily and capriciously deny service, any more than a railroad can.
The anti-discrimination language of the Communications Act of 1934 was sourced from earlier language in the 1887 Act. It's a deep principle.
[ link to this | view in chronology ]
Your quoted court case says it all:
You know how busy Facebook and Twitter are these days? Yeah, imagine if all that traffic ground to a halt because the admins and moderators of those services had to make sure every post was within the boundaries of the law before even checking whether it violated the terms of service. You might say that would be a good thing. But if Facebook and Twitter were slapped with that restriction, every other service and website that allows third-party content would have to follow suit: Tumblr, YouTube, Instagram, Blogger, every conceivable forum and imageboard… all of them would have to hold back posts to double-check whether they could be posted. And given how the larger services are already understaffed vis-à-vis the scale of moderation they have to do, forcing them to moderate every post would destroy the usability of those services. The Web would effectively grind to a halt.
If Twitter, Facebook, etc. have a hand in creating or posting illegal content, they should be held accountable for it. (That is why Backpage got dinged by the law.) They should not be punished or unduly burdened just for operating a service that some people will use to personally demonstrate the Greater Internet Fuckwad Theory.
[ link to this | view in chronology ]
Re:
No. That case addresses the previous question — why Twitter and Facebook, just as a telegraph company back in 1940, might have fewer duties than a publisher.
But the question I followed up with was why Twitter and Facebook, who are engaged in the business of providing communications service to the public at large, should have fewer duties than other communications service providers, such as telegraph companies. Why should Facebook and Twitter be permitted to act arbitrarily, capriciously, unreasonably, and unjustly?
I'm not saying there aren't plausible reasons for that. But you yourself haven't given any here.
And lately, it's the modern phone companies' well-known and public position that they shouldn't have any greater duties or obligations than Facebook or Twitter.
[ link to this | view in chronology ]
Re: Re:
[ link to this | view in chronology ]
Re: Re: Re:
So Comcast sits astride the gateway between a person at their residence or business and the whole World Wide Web.
Person <-(Comcast)-> WWW
Applying your stated distinction, Comcast should be able to moderate your residential connection.
You know, most people don't agree with that concept. Personally, I think your stated distinction needs some restatement.
[ link to this | view in chronology ]
Re: Re: Re: Re:
Failure to distinguish between the communications level and service level of the Internet is like failing to distinguish between the phone company and the businesses that you use the phone to communicate with.
The reason that Comcast and friends do not want to be classified as common carriers is that they sit between you and all of the services that are eating away at their cable subscriptions.
[ link to this | view in chronology ]
Because at the end of the day, Facebook and Twitter are not the following two things:
They are still, for all their size and cultural influence and monetary value, platform services controlled by privately-owned corporations. They should have no fewer and no more legally-bound duties than any other similar platform. I have asked this of multiple people, and no one has yet given me a straight answer: Why should Twitter, Facebook, etc. be forced into hosting speech which the owners and operators of those services do not want to host?
What they provide is a platform for speech; the Internet is the actual communications service.
[ link to this | view in chronology ]
Re:
In law, right now, descriptively, platforms similar to Facebook and Twitter are 47 USC § 230(f)(2) “interactive computer services”.
Yet, according to polling, five out of six people in the nation today oppose the idea that Verizon, Comcast and others ought to be permitted to restrict access to or availability of voice and/or data services.
I've tossed you a softball question twice now, a slow pitch hanging fat and square right over the plate. Instead of taking the opportunity to smash it out of the park, you continue to insist on a principle that telegraph and telephone carriers have every right to kick subscribers.
They're commercial businesses. If those for-profit business corporations don't want to obey the law Congress enacts, they can flee to Europe or China. Or they can simply get out of the business of providing communications service to the public, take their servers down, and devote them to bitcoin mining or something.
The law on this goes back to circa 1623. Unfortunately, that means the case is reported in that bastard Anglo-Norman language known as “Law French”.
Essentially, the judges there say if you don't like the business you're in, you can take your marbles and go home. Until then, you've got to obey the law for that class of business.
One way or another, the American people have a core right to organize and regulate commerce among the several states, and with foreign nations, so as—
[ link to this | view in chronology ]
…fucking what? How does that justify forcing a privately-owned service to host speech that its owners and operators do not want to host?
A privately-owned business that runs a communication platform protected by both the First Amendment and Section 230 should not be forced out of business for refusing to do something that no other kind of platform would ever be asked to do by the government. The government cannot force a newspaper or a magazine to run an article about a certain subject; why should it ever be able to tell Twitter that it absolutely must host the insane ramblings of Alex Jones or else?
[ link to this | view in chronology ]
Re:
The First Amendment does not prohibit enforcement of §§ 201 and 202 of the Communications Act of 1934 against communications providers.
Section 230 can be amended. That's what we're talking about.
Ultimately, all the rights guaranteed in the Bill of Rights belong to the people, not the corporations. Now, the people do have a fundamental right to organize (the genealogy of that right of organization goes back to the Mayflower Compact), but no one should doubt that there are limits on the right to organize in corporate form, even for political purposes.
When you're talking about the right to organize in corporate form for commercial rather than political purposes, remember that the First Amendment was adopted roughly contemporaneously with the Article I, Section 8, Clause 3 commerce power. That commerce power encompasses the provisions of the Communications Act of 1934.
[ link to this | view in chronology ]
Look, all that talk about the law and limits on organizing and corporations is going over my head. (I am a regular jackoff, not a policy wonk or a third-year law student living off Ramen noodles and spite for their parents.) So I will ask you this one more time, and I would love an answer that does not descend into wonk-ness: What makes services like Twitter and Facebook, and nothing else but those services, deserving of being told by the government that they must absolutely host specific types and examples of speech and expression “or else”?
[ link to this | view in chronology ]
Re:
Wait, before you answer, I have another question, and this one might be easier to answer in a non-wonk way: If Twitter and Facebook were to announce tomorrow that they would be going out of business and shutting down all services on Monday, what—if anything—should the government be legally allowed to do to keep those privately-owned businesses and their associated services from going dark?
[ link to this | view in chronology ]
Re: Re:
1) If twitter and Facebook were to announce their disappearance, the Gubmn't *should* do nothing. (Well, possibly let people get their data out). But it's an impossible hypothetical, and if it came to pass, it *would* do something.
2) What makes twitter and facebook special enough to deserve gubmn't attention? Outsize influence, mobs doing harm (and not just right wing ones, The Atlantic discusses some incidents in India), and oh, yeah, Russian hacking! Oh, and, don't forget the Moral Panic (TM), caused by entirely too much screen time on something mysterious and new-fangled to the current geriatric generation of elected politicians, especially congress critters!
[ link to this | view in chronology ]
Re:
It's very touching that you'd like to limit the discussion in a way that's simply unsupported by existing law. That's a cute way to dodge the definition in § 230(f)(2).
Did you graduate high-school? Are you literate enough to read a ballot? Do you need some kind of special-needs assistance to vote? Are you at least partially capable of participating in the American experiment in self-government?
If you're in the commercial business of offering and providing communications service to the American public at large, then the American public has every right to insist that you actually deliver on that offer — and that your corporation must refrain from acting arbitrarily, capriciously, unreasonably and unjustly in providing the communications service.
If your corporation wants to act arbitrarily, capriciously, unreasonably and unjustly, then the American public may demand that you pursue some other line of commerce — one that does not involve significant control over the nation's political discourse.
'Cause we have a compelling interest in maintaining effective self-government, free from the capricious whims of any small handful of corporate oligarchs.
Although, maybe you'd rather live in a dystopia where a half-dozen corporations arbitrarily decide who gets to speak and what they can say — what subjects cannot be talked about. Some people would like that, I guess. It'd be like living in a sci-fi novel.
[ link to this | view in chronology ]
If you are going to literally insult my intelligence because I am admittedly incapable of being a law wonk who can process that stuff in an instant, I have no more reason to discuss this issue with you.
[ link to this | view in chronology ]
Re: Re:
They are very much speakers through the choices that go into that transformative re-mix! That is, their recommendation algorithms are *very much* the speech of the platform, and they share at least moral culpability for some of the harm.
Now, if we can just get some good *definitions* to support our intuition before we get hamfisted bad laws involved.....
[ link to this | view in chronology ]
Re:
Leslie Jones famously got a bunch of twitter hate & people who used the N word towards her were banned. Ms. Jones frequently used the N word towards others, not even a blip. Wrong or right it added to the narrative that if you had a blue check you got special rules.
While Twitter has a right to do things however it wants on its platform, it is this obvious double standard that pissed off the natives. There are hundreds of examples of this happening, including my own time-out.
Twitter should stop trying to create justifications for their actions, it gives people who love the conspiracy theories more ammunition and the signal to noise ratio starts making the platform more problematic.
If you say a word is verboten, it should be verboten for all. Creating a grey area where it is sometimes okay and sometimes bad, depending on how many people complain about it, opens the door for abuse. The "manpower" required to review the complaints keeps growing b/c outsiders now have to attempt to apply context, or just slam the timeout button over and over and then deal with the number of appeals, all while feeding the growing (incorrect) narrative that all animals are equal but some are more equal than others.
How can you tell a campaign to silence someone from someone who genuinely offended many people?
When the decisions boil down to only the eye of the beholder, you ban the gay guy who said "faggot" ironically for "targeting hate at others"... b/c someone allegedly got offended. If the minefield is "will this word offend anyone," people will stop talking. We'll be reduced to inane things, and some asshole is still going to keep reporting people.
It is impossible for a computer to know how a word is being used; even a human still needs more input to put it into context. Twitter gives users the ability to never see the words that offend them, yet people keep slamming the report button b/c it lets them score points in their head against someone they dislike. Mass reporting is a tactic that exploits Twitter's reflex of "if it was reported, we have to act now." The only tool they reach for is silencing someone, then waiting to see if there's an appeal before investing time in looking at context, and even then, depending on the reviewer, you can get rejected.
Twitter created the problem by trying to appease people faster & faster rather than pointing at the tools already available. Twitter also screwed up in letting people know when they've been blocked; it encourages bad actors to make more accounts or recruit their social circle to make their point, rather than letting them tweet into the void and assume the person is ignoring them.
It is really hard to manage these things automagically when you have a system and refuse to consider that it isn't actually working. Take the community moderation here: a psycho starts in with his Google-shill rant, & when we hit enough votes, poof. But if you want to see it, you still can.
On Twitter, the first time it mutes that thread for me; but if I've clicked twenty times, that account never shows up in my feed anymore. I can still go to their page if I want to see what they are up to, or proactively put them back into the things I can see. It takes away the ability of a group of people to game the system & silence someone for sport b/c it offended them & they feel they are the best arbiter of what should be allowed for everyone.
The bad actor isn't aware that/which people aren't seeing their spew, can't climb up on a cross as a martyr being punished by Twitter & the Leftist Conspiracy, and users who have no interest don't have the spew getting onto them.
This removes the winning points game by how many of their side you can take down & makes users responsible for their experience & not just abdicating it to Big Twitter to protect them.
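To make that kind of escalating, reader-side muting concrete, here is a minimal sketch. The class name, method names, and the twenty-click threshold are all invented for illustration, not any platform's actual implementation.

    # Hypothetical sketch of reader-side, escalating muting: the first report from a
    # reader mutes only that thread for them; repeated reports eventually hide the
    # account from their feed. Nothing is deleted, and the reader can opt back in.
    # The class name, method names, and the threshold are all invented.

    ACCOUNT_HIDE_THRESHOLD = 20  # hypothetical number of clicks before an account is hidden

    class PersonalMuteList:
        def __init__(self):
            self.muted_threads = set()
            self.mute_clicks = {}        # account -> times this reader has muted it
            self.hidden_accounts = set()
            self.opted_back_in = set()   # accounts the reader chose to see anyway

        def report(self, account, thread_id):
            """Reader clicks 'hide': mute the thread now, hide the account after enough clicks."""
            self.muted_threads.add(thread_id)
            self.mute_clicks[account] = self.mute_clicks.get(account, 0) + 1
            if self.mute_clicks[account] >= ACCOUNT_HIDE_THRESHOLD:
                self.hidden_accounts.add(account)

        def unhide(self, account):
            """Reader proactively puts an account back into the things they can see."""
            self.opted_back_in.add(account)

        def visible(self, account, thread_id):
            """Decide whether this reader's own feed shows a post; no one else is affected."""
            if account in self.opted_back_in:
                return True
            return account not in self.hidden_accounts and thread_id not in self.muted_threads

The point of the sketch is that everything stays on the reader's side: the posting account is never notified, nothing is removed for anyone else, and the reader can always go look at what they hid.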
[ link to this | view in chronology ]
Re: Re:
Putting aside her status as a “verified user” and a celebrity, the important thing to remember about that sort of situation is this: Leslie Jones is a Black woman. If anyone has any right to use that particular word in just about any given context, she sure as hell does.
And you are absolutely free to feel pissed off about that, to complain about that, and to send Twitter management a list of your grievances with how they run the service. Whether Jack listens to you, however, is his prerogative.
I identify as queer. That word has a…certain history with the LGBT community. While I recognize that history, I believe queer, as a label, is both a more inclusive shorthand for the broader LGBT community in general (asexuals, aromantics, etc.) and an easier-to-use descriptor for my personal sexual orientation than, say, a Kinsey scale number or a detailed explanation. If another LGBT person called me “queer”, I would have little issue with it. If a straight person called me “queer” and meant it entirely as an insult, however, I would take issue with that usage.
If Twitter wanted to ban the word queer because of its history as an anti-LGBT slur, would you agree that, despite the context explained above, I should be banned for self-identifying as a queer man?
The gamification of social interaction networks has become a major issue, yes. A discussion about how to either prevent it or mitigate the damage it can do is a discussion worth having.
Agreed on both counts.
[ link to this | view in chronology ]
Re: Re: Re: Faster and Faster!
Pull out that instant gratification, and you take away that addictive dopamine reward from many trolls. That goes for bans, too; moderation should not be done instantly.
Reflecting, the real harms from Facebook and Twitter have been the mobs that form. And, with CDA230 immunity for moderating, the platforms pick and choose for us on an individual level what we see, so if I like Alex Jones, I won't see when Pizzagate is called BS, or anyone calling him loony tunes. I don't think CDA230 was intended for the case where there's so much moderation that the moderation itself becomes the message.
****
Now, the problem with words like queer and nigger is that the reaction they cause is *heavily* context-dependent, and computers are famously bad at context. The same is actually true for human moderation; go see the trolls on this post!
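To make the "computers are bad at context" point concrete, here is a deliberately naive keyword filter; the word list and sample posts are invented for illustration. It flags a self-identifying, reclaimed use and a hateful attack identically, because a bare keyword match carries no context at all.

    # Deliberately naive keyword filter, invented for illustration. It cannot tell
    # reclaimed or self-referential use from an attack, because it only sees the word.
    BANNED_WORDS = {"queer"}  # hypothetical list; imagine any slur here

    def flag_post(text):
        """Return True if the post contains a banned word, with zero context awareness."""
        words = {w.strip(".,!?'\"").lower() for w in text.split()}
        return bool(words & BANNED_WORDS)

    print(flag_post("I identify as queer and I am proud of it."))   # True: flagged
    print(flag_post("You're queer and you should be ashamed."))     # True: flagged identically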
[ link to this | view in chronology ]
You would also take away much of the reason that normal everyday people use those services. Make of that what you will.
[ link to this | view in chronology ]
Re:
So, touche, lol
[ link to this | view in chronology ]
Re: Re:
Come on.
[ link to this | view in chronology ]
Re: Re:
[citation needed]
[ link to this | view in chronology ]
Re:
If it doesn't offend somebody, it couldn't possibly interest anybody.
[ link to this | view in chronology ]
If Mike covered this in his post, I missed it, and I apologize. The post is quite long and dry, and it was hard not to skip large sections.
Now on to my questions. I struggle with the definition of censorship, which a quick search gives the following:
noun
1. the suppression or prohibition of any parts of books, films, news, etc. that are considered obscene, politically unacceptable, or a threat to security.
Classically, this would occur when the government stopped publication and/or distribution of written/audio/video materials.
So I have often thought, is a publishing house censoring content when it refuses to publish based on a moral or ethical stance? Does the ability to self publish and distribute your own content mean that you actually cannot be censored by anyone other than the government?
In the case of deplatforming, does the ability to self publish on your own internet site mean that you aren't being censored?
I see logical arguments for both sides of the debate, but like Mike, I know the answer isn't binary, black or white. And I actually cannot seem to make up my own mind on the topic. So the big question here is, where do you fall on this?
[ link to this | view in chronology ]
Re:
No.
Yes.
Yes.
You are guaranteed a right to speak your mind. You are not guaranteed an audience.
[ link to this | view in chronology ]
Re:
Really? Then why the long, dry post?
[ link to this | view in chronology ]
Didn't read your whole comment [was Re: ]
First off, I will admit that I didn't read your whole comment. This is a knee-jerk reaction—
Fuck off. And the horse you rode in on, too.
[ link to this | view in chronology ]
So Zuckerberg said:
So far, so good. Then he went back on this and took the stuff down anyway.
It's been said that those who do not learn from history are doomed to repeat it. Here's a bit of history that hasn't been all that widely studied, that we're currently in the early stages of repeating: hate speech laws from a century ago.
It might be surprising to learn that Weimar Germany had very strong, very modern laws against hate speech, and among the strongest beneficiaries of those laws were its Jewish citizens. They used them to fight back against very real discrimination in their day, and the courts did "the right thing" the vast majority of the time.
One frequent target of such laws was a hate-filled guy by the name of Adolf Hitler. He ended up getting smacked down for his serial offenses so much that he eventually got injunctions against him, preventing him from holding further rallies. ("Deplatformed," to use the modern parlance.) Well, that went over perfectly and we never heard from that troublemaker again... right?
Oh, wait, no. That's not what happened at all. It made a martyr out of him. The Nazis were able to point to the way he was being censored and use it as a rallying cry, which ended up being massively successful and we all know where that led.
So yes, there is a massive ethical problem with allowing people to be censored from the modern-day public square, and the fundamental problems involved do not change one whit if those doing so are private rather than state actors. With great power comes great responsibility, and when you become powerful enough to do things that historically only governments were capable of doing, the restraints that we have historically placed upon governments must be applied as well.
[ link to this | view in chronology ]
Small point of contention here, but...
Execution always overrides intent. If I tell a racist joke but say “I didn’t mean it in a racist way, it was just a joke”, my intent means jack shit compared to my sounding like a racist asshole.
[ link to this | view in chronology ]
Re: Small point of contention here, but...
So, if someone accidentally bumps into someone else, they should be charged with assault. Doesn't matter if they intended to or not. Got it.
[ link to this | view in chronology ]
Fair point.
[ link to this | view in chronology ]
Re: Small point of contention here, but...
[ link to this | view in chronology ]
Re: Re: Small point of contention here, but...
I saw the quoted part in your comment and felt a compulsion to reply to it. I realize that it is not part of your comment, though. Still, as I said: Execution overrides intent. I did not mean to imply anything about you in my comment, and I will try to be less reactionary toward comments here in the future. You have my sincere apologies for any discomfort or offense that I caused.
[ link to this | view in chronology ]
Re: Re: Re: Small point of contention here, but...
Mostly just confusion, TBH.
[ link to this | view in chronology ]
Re: Re: Re: Re: Small point of contention here, but...
Apologies for that as well; I can be a confusing bastard sometimes. 😄
[ link to this | view in chronology ]
Re:
the modern-day public square
People keep using this term to refer to social media as though it's now indisputably true. But it's not. In fact, it's pretty silly. Maybe you could argue that the internet as a whole is the modern-day public square, but the idea that each and every major web platform or social media service is "the public square" makes no sense.
when you become powerful enough to do things that historically only governments were capable of doing, the restraints that we have historically placed upon governments must be applied as well
The restraint placed on the government is that it is not allowed to make laws that prohibit speech. Facebook is not capable of making laws.
[ link to this | view in chronology ]
Re: Re:
You seem to imply that there can only be one public square. This was never the case. Each town had a square, and if you didn't like it you could move. But then you wouldn't be addressing your community, you'd be addressing some other community, just as if you move from Facebook to Twitter.
"All the party invitations in Cambridge come through Facebook. If you don't use Facebook you don't get to any parties, so you'll never meet any girls, you won't have any kids and your genes will die out."
Each of the well-known platforms holds more people than even the largest "public squares" ever could.
[ link to this | view in chronology ]
Re: Re: Re:
[ link to this | view in chronology ]
And when they leave that Starbucks, right across the street, there’s another Starbucks!
[ link to this | view in chronology ]
Re: Re: Re: Re:
[ link to this | view in chronology ]
Re: Re: Re: Re: Re:
If the World Wide Web is the thing that's transformative and unprecedented, why are you deeming multiple individual services like Facebook and Twitter to be the public square?
Large though those platforms may be, they still represent only a small fraction of the public's ability to publish content and engage in speech on the web.
[ link to this | view in chronology ]
Re: Re: Re: Re:
Whether and how Starbucks should be regulated as a public square, and whether it is a de-facto public square, are different questions. For now I express no opinion on the former, though I'll note in passing that California's Pruneyard decision might already be regulating Starbucks in that way.
[ link to this | view in chronology ]
Starbucks and, say, Twitter are privately-owned corporations that both offer a service to the public and, under a certain interpretation, provide a “public square” to that same public. That one operates a chain of brick-and-mortar coffeehouses and the other operates an online social interaction network, and that Twitter’s service is literally being that “public square”, makes little difference. What would make Twitter more deserving of “public square” restrictions than Starbucks?
[ link to this | view in chronology ]
Re:
What do you mean by "public square restrictions"? Prohibitions on kicking people out? And "more deserving...than Starbucks"? Starbucks may already have these restrictions in California, and no court has said Twitter does (even in California they've been chipping away at the Pruneyard ruling).
[ link to this | view in chronology ]
Yes.
If Starbucks were to have more latitude in deciding who could be kicked out of its stores than Facebook would have in deciding who could be kicked off that service, Starbucks would seem as though it was more deserving of that latitude than Facebook. I would then have to ask why that is the case.
[ link to this | view in chronology ]
Re: Re:
[ link to this | view in chronology ]
Starbucks
The rallies of the German National Socialist Party took place in beer halls. It's a valid question.
Whether or not social gathering at public places should be regulated may depend on whether or not the state / public approves of the positions of that group.
Or not.
I think in this case political speech is like armament (guns) in that we expect individuals of the public to be responsible in its care and use. That we have notions like hate speech or incitement implies that we don't trust members of the public to talk among themselves responsibly.
But then for the same reasons we might not trust people with talking or guns, we might not trust them with voting either. Look what happened in 2016.
It's ultimately a paradox of the people. They can't be trusted to be adults, but then we have no adults to supervise them. TechDirt provides a continuous flow of counterexamples to the notion our persons of authority (law enforcement, judges, legislators, et. al) are responsible and make just decisions.
Find the (Madisonian) angels that can articulate which groups are not allowed to gather in Starbucks / Beer Halls, and we'll have a notion of what groups should be allowed to speak on Facebook and Twitter.
Given Facebook's penchant for censoring Danish mermaids and breastfeeding mothers, I would argue we haven't found those angels yet.
[ link to this | view in chronology ]
Re: Re:
What's the practical difference here? If I don't like my country's speech laws, I can choose a different country (maybe), or try to get the laws changed (maybe). If I don't like Facebook's rules, I can choose a different website. They can't throw me in prison, but prison isn't the only thing the First Amendment was meant to guard against.
[ link to this | view in chronology ]
Re: Re: Re:
What's the practical difference here?
Are you honestly asking me what the practical difference between Facebook and the US government is? How much time do you have?
[ link to this | view in chronology ]
Re: Re: Re: Re:
No, I'm wondering why you felt that was the appropriate point on which the actions of Facebook et al. should be evaluated, and whether we should focus so strongly on it. You made the point that "Facebook is not capable of making laws", as if that proves something in and of itself; you didn't really explain how that helps anyone. "Not capable of making laws" gives us little solace if the actions they take can cause much of the same harm as actual laws. (Though like Mike, I'm going to hedge with "no good answers". I'm not proposing regulation, but won't dismiss it out of hand.)
Facebook is code, the First Amendment is law, and Lawrence Lessig wrote an entire book—two decades ago—from the premise that "code is law" in the modern world. The distinction between governments and private actors is, in my mind, not quite as binary as you make it seem. Not to say that being banned from Facebook is anything like being thrown in a gulag.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re:
The big difference is that the government can order you not to publish in any medium, electronic or paper-based, and throw you in jail if you disobey that order. Private actors, on the other hand, can throw you off their platforms, but they cannot stop you from publishing elsewhere.
Therefore, while Facebook can throw you off its platform, it cannot throw you off the Internet.
[ link to this | view in chronology ]
Re: Re: Re:
Facebook literally cannot stop you from posting on Twitter. The government theoretically could.
[ link to this | view in chronology ]
Re: Re: Re: Re:
If Facebook is the "public square", should people have any expectation or protection of privacy for the data they generate there? Should Facebook have any obligation or ability to enable and respect private/public settings on posts, groups, events, etc? Is all the data Facebook stores on people's conduct in the public square equivalent to government records, and subject to FOIA requests?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re:
That is a good question. Okay, it's technically wrong to declare that Facebook is simply "the public square". It's shorthand. Nobody means that everything passing through the platform, including private messages, is part of the public square.
The public parts of Facebook, like the Infowars pages, are like a public square. Anyone can see that stuff. Group/family spaces might be akin to a living room or rented library conference room that such a group might have otherwise met in; only "members" are allowed. Direct messages are perhaps more analogous to telephone or parcel-delivery systems, which are privately-owned though we've seen fit to regulate them in various ways as "common carriers"—they have to provide privacy and cannot arbitrarily refuse service.
None of this means we need to reach the same conclusions w.r.t. Facebook.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re:
The public parts of Facebook, like the Infowars pages, are like a public square.
Wouldn't that also mean that all public page operators themselves are barred from moderating content or blocking users from their pages? You can organize a rally in the public square and invite who you want, but you can't prevent other members of the public from attending.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re:
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re: Re:
Yeah, it is indeed a giant mess that's going to get worse before it's resolved. We're talking about things that never really existed before, so, naturally, we'll quickly hit the limits of every analogy. Which doesn't mean that courts won't make the same screwups... (Aereo, the third-party doctrine, ...).
We can nevertheless look at various types of regulation that cover different aspects of the service(s), and evaluate how well they've worked in their original problem-domains and whether that's something we might want to apply here. And we'll have to watch out for stupid proposed regulations, because we know damn well that that's going to happen... really, we should probably keep the legislators away from this area for at least a few more years.
WRT your earlier question about whether they'd have to let everyone speak, I don't know. Probably another sign of the analogy's limits. Is it better to say Infowars is like a newspaper, and can decide which comments to print, while Facebook is like the postal service (USPS cannot ban certain newspapers)? Or maybe public square is accurate enough. It's not like every public square was always hosting a single conversation; there were always subgroups, who might try to speak quietly for privacy or try to eject troublemakers, even when the government running the square could not. Well, Facebook fits the definition in terms of a meeting place at least, and "public square" is more succinct than "newspaper and postal service and telephone operator and directory listing and...", so I'm inclined to stick with this shorthand despite its numerous problems.
[ link to this | view in chronology ]
No, because that would give Infowars more rights in moderating the open-to-the-public section of its site than Facebook. Facebook does not deliver mail or operate schools or fix public roads. It does not operate as an arm of the government. It should not be regulated as if it were.
As someone else pointed out in this comments section, the primary issue with Facebook being “too big” is not one of arbitrary size distinctions or technical functionality, but of cultural influence. All of these arguments about censorship and regulation, then, should be focused on two connected ideas: who has the ultimate control of a force that can influence culture and society, and what they are doing with that control (and influence).
[ link to this | view in chronology ]
Re:
Facebook does deliver mail. Not physically, but electronically much like telegraph operators did, and they were regulated as common carriers despite being privately owned. Similarly for railroad and pipeline operators, and telcos.
[ link to this | view in chronology ]
Re:
One could say the same thing about political parties, and they would hate not being able to control the direction of political discussion.
(I know they would ensure laws about speech do not apply to themselves).
[ link to this | view in chronology ]
Re: Re: Re: Re:
[ link to this | view in chronology ]
Re: Re:
They are NOT the square itself, but ARE contiguous with that prime real estate.
As shops on a square lose business, they go away and get replaced by new ones, but the square remains. Similarly, MySpace and Friendster had prime addresses, but those are now Twitter or Facebook.
Still, today, anybody can take a soapbox and go speak/rant in the public square, like Times Square in NYC. But one cannot go to Times Square with a soapbox, enter the ESPN Zone restaurant, and deliver a filibuster about how the media is controlled by Jews and Disney is a tool of globalists.
Just as Alex Jones can set up his own site on the virtual town square at www.inforwars.com -- but maybe not welcome inside Facebook.
[ link to this | view in chronology ]
Re: Re: Re:
Are the .com gtld servers an essential public facility?
If you recall, the .com registry is operated by Verisign, a publicly-traded for-profit corporation, under agreement with ICANN, “formally organized as a nonprofit corporation ‘for charitable and public purposes’ under the California Nonprofit Public Benefit Corporation Law.”
Are the .com gtld servers an essential public facility?
Can Verisign kick www.inforwars.com out of the .com gtld?
[ link to this | view in chronology ]
Re: Re: Re: Re:
[ link to this | view in chronology ]
Re: Re:
[ link to this | view in chronology ]
Re:
Yes, corporations should be able to ban/edit/fiddle however they want, but I suggest that you prevent them from influencing government (campaign "donations") too.
[ link to this | view in chronology ]
Internet sites have every right in the world to kick people off
1. Would an internet site have the right to ban, say, black people?
2. If so, would a restaurant have the same right?
3. If not, then why should one and not the other?
[ link to this | view in chronology ]
Re: Internet sites have every right in the world to kick people off
There is a very narrow set of protected classes, and you probably know that there is, seeing as you jumped on the first one.
For the sake of argument, here they are: Race. Color. Religion or creed. National origin or ancestry. Sex. Age. Physical or mental disability. Veteran status. Genetic information. Citizenship.
This list doesn't apply to everything, and you have to prove that the reason you were refused service was your membership in one of the protected classes.
So, if someone can prove the internet site banned them for being black, and not just for being an asshole, then no, they can't be banned for that. Just as a restaurant can't refuse you service for being white, but it can kick you out for being disruptive.
[ link to this | view in chronology ]
Re: Re: Internet sites have every right in the world to kick people off
[ link to this | view in chronology ]
Re: Re: Internet sites have every right in the world to kick people off
During the second half of the 20th century, and into the beginning of this one, an interesting change in legal thought —or perhaps the popular conception of “legal” thought— has taken place. Back at the beginning of the 20th century there was a firmly established standard barring unreasonable discrimination in providing public accommodations and essential facilities.
“Unreasonable”, of course, was measured against the prevailing social attitudes of the day. And back in those days, Jim Crow had force, with among other things, the election of President Wilson in 1912. (Woodrow Wilson was a racist.)
Later on, tending towards the middle of the 20th century, it became established, through a number of cases at the federal level, that discrimination on the basis of race was “unreasonable”.
Further, the Supreme Court eventually came up with a concept of “inherently suspect classifications” that, if utilized, strongly tended to indicate unreasonable discrimination.
Towards the end of the century, we see the notion of “protected classes” grow into the popular mind.
At this point, today, popular thinking seems to have gone from a standard-based bar against unreasonable discrimination — to a rule-based bar against discrimination on the basis of “protected class”. People seem to have entirely forgotten about any standard-based bar against unreasonable discrimination in the provision of public accommodations and essential facilities.
In fact, some people seem to delight in the notion that large corporations may act unreasonably — and that the legislatures, courts, and the people themselves are quite powerless to stop these large corporations' unreasonable behaviour.
[ link to this | view in chronology ]
We have a good reason for that: Unreasonable discrimination these days typically takes the form of discrimination based on who someone is rather than what they do. Refusing service to someone who wants a swastika-shaped cake is reasonable; refusing service to someone only because they are Jewish is not. Booting someone from a platform for saying dumb bullshit is reasonable; booting someone from a platform only because they are Black or gay or an atheist—or, for that matter, White or straight or Christian—is exceptionally unreasonable.
[ link to this | view in chronology ]
Re:
“Reasonableness”, at bottom, is a question for the jury.
Just consider, though, that any old judge can mechanically apply some rule without sending a complicated case to a jury. And, note, observationally, there's been a drastic decline in jury trials over the past few decades.
[ link to this | view in chronology ]
Re:
That one's actually tricky. Refusing service to someone of Jewish ancestry would be discriminating based on who they are; refusing because they express support for Jewish religious teachings would be discriminating based on what they do. If we say that's not OK, does that make it wrong to discriminate against the Westboro Baptist "Church"? Should that depend on how sincere we think their beliefs to be, or offensive we find them, or how widespread the "religion" is?
Similarly, being gay is not the same thing as choosing to have gay sex. The law protects both, but the latter is technically a choice.
[ link to this | view in chronology ]
The latter would still be considered religious discrimination, because the discrimination would be based on the expression of a religious belief. You can argue whether such protections deserve to be the law, but for now, US law says we all get them.
Yes, it does. I despise those homophobes and their message, yet I believe their odious beliefs should not disqualify them from the protections of the law and the exercise of their rights.
No metric exists that can accurately measure the sincerity of a religious belief, so we cannot use that. “Offensive” is a subjective standard based on personal standards, so we cannot use that. And using the size of a religious sect to determine who we can discriminate against is just asking for trouble.
Two things.
[ link to this | view in chronology ]
Re:
So what are we left with? If Jones said his videos are expressions of his religious beliefs, would he have a legitimate religious discrimination case against Facebook? What differentiates "religious" beliefs from personal opinions, if not popularity?
Not legally, according to the EEOC. I'd hope the same protection would exist for service-refusal and eviction.
[ link to this | view in chronology ]
Re: Re:
Well, for starters, Jones’s lawyers are trying to argue in court that he is simply an “entertainer” whose outlandish statements are part of an act and thus should not be taken seriously, so there is that. But more to the point…
Hell if I know for sure. We do not and cannot have an objective standard by which we can judge the sincerity of an expressed belief or opinion, religious or otherwise. A lot of this sort of thing is a judgment call. That said: When those opinions belittle an entire segment of the population based on who they are (e.g., anti-gay religious beliefs) or defame people (e.g., Alex Jones’s “Sandy Hook was a false flag operation” claptrap), booting from a platform the people who express those opinions becomes much less morally questionable.
That only applies to federal employment. On the state level, around 30 states—last time I checked, anyway—have no such protections enacted for LGBT people.
[ link to this | view in chronology ]
Re: Re: Re:
The link that the other poster provided didn't really support his assertion. Here are some better links from the EEOC—
Note that in the EEOC context, “Title VII” generally refers to Title VII of the Civil Rights Act of 1964.
[ link to this | view in chronology ]
Re: Internet sites have every right in the world to kick people off
Anti-discrimination laws are a separate, specific thing that block the unequal provision of services on the basis of certain protected qualities such as race. They act explicitly as an exception to the more basic idea that a private entity can deny service to anyone it chooses - and generally yes, they apply equally to online services.
While there may be a separate (though related) discussion to have about such laws and their impact on various rights, they don't change this broader analysis and they certainly don't apply to the question of blocking based on political positions, viewpoints, etc.
[ link to this | view in chronology ]
Re: Internet sites have every right in the world to kick people off
[ link to this | view in chronology ]
The second: "Most people do use Facebook. And for many people it is important to their lives. In some cases, there are necessary services that require Facebook."
Once necessities start going on a platform, then the ethical calculus changes. They don't, and probably shouldn't, have a legal requirement to host everything, but if Facebook is genuinely *essential*, then there *are* ethical issues with them banning people.
[ link to this | view in chronology ]
Re:
Many of the complaints in our more trollish comments arise from the disconnect between the expectations that these are non-discriminatory public utilities and the legal reality that these are private entities.
Once something becomes a public utility, or a public accommodation, then non-discrimination becomes a legal requirement. That is what Net Neutrality is really all about. It's been argued where to legally draw the line between a small site like Techdirt and a huge platform like Facebook.
I'm very much in favor of Techdirt's approach: What I post is my speech; Facebook's algorithms promoting it without comment are Facebook's speech. So Infowars on Facebook is not a problem, but Facebook promoting Infowars is a problem; it has certainly caused some real harms.
I'd be really interested to see what would happen if all the hotels in town decided not to serve people wearing sneakers...
[ link to this | view in chronology ]
Re: Re:
That the largest internet platforms (particularly twitter and facebook) are becoming like monopolistic public utilities is largely not disputed
I'd like to dispute that, thank you very much.
[ link to this | view in chronology ]
Re: Re: Re:
[ link to this | view in chronology ]
Re: Re:
Bullpucky. I just disputed it 90 minutes ago.
I agree that Facebook and Twitter have an alarming, outsized influence on our discourse. I do not agree at all that they are monopolies, or that they bear any resemblance to public utilities.
But that's just it: the expectations that they are public utilities is disconnected from reality.
[ link to this | view in chronology ]
Re: Re: Re:
[ link to this | view in chronology ]
Does this tie in with the "Filter Bubble" problem?
If I decide what I see, and I say I want to see 'progressive' articles, who decides when to apply the 'progressive' tag to a piece of content?
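As a toy illustration of why that question matters (the tag names, articles, and the assign_tags function below are all hypothetical), the reader only controls the filter, while whoever assigns the tags controls what the filter can ever surface:

    # Toy illustration: the reader picks the tags they want, but some upstream party
    # decides which tags each article carries. Whoever writes assign_tags() effectively
    # decides what "progressive" means. All names and data here are hypothetical.

    def assign_tags(article):
        # Stand-in for the platform's (or publisher's) labeling decision.
        return article.get("tags", set())

    def personalized_feed(articles, wanted_tags):
        """Return only the articles whose assigned tags overlap the reader's choices."""
        return [a for a in articles if assign_tags(a) & set(wanted_tags)]

    articles = [
        {"title": "Tax plan analysis", "tags": {"progressive", "economics"}},
        {"title": "Campaign rally recap", "tags": {"politics"}},
    ]
    print(personalized_feed(articles, ["progressive"]))  # only what the tagger labeled "progressive"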
[ link to this | view in chronology ]
Scott Yates or beernutz, 59 comments total, 6 average per year,
because TEN AND A HALF years, 21 Dec 2007. 2 year gap after very first, which is typical, yet ODD: excited about a new site, then just drop it? HMM.
Also, it's pretty clear (to me, who has list of nearly all accounts for the last 5 years) these RARE accounts only come out when FEW comments. -- Exactly as if the site is trying to give the appearance of more interest.
[ link to this | view in chronology ]
My friend, you need to get laid by something other than a Fleshlight.
[ link to this | view in chronology ]
Re: Scott Yates or beernutz, 59 comments total, 6 average per ye
[ link to this | view in chronology ]
HOLY SHIT YOU ACTUALLY REPLIED. I thought you were literally incapable of doing that.
[ link to this | view in chronology ]
How’s the maze business bro?
[ link to this | view in chronology ]
Re: Scott Yates or beernutz, 59 comments total, 6 average per year,
(to me, who has list of nearly all accounts for the last 5 years)
It's really hard to say this without sounding insulting but you really need help. This kind of obsession is really not healthy. Please, please, please talk to somebody. I'm serious.
Alternatively, try to wean yourself off your compulsive need to visit and comment on this site. Just go watch porn or something whenever you feel the urge to come here.
[ link to this | view in chronology ]
You could make the same argument for religion, slavery, cigarette smoking, etc. Just because an institution is widespread in society doesn't make it efficient or just.
[ link to this | view in chronology ]
Not quite... but certainly, in my mind, the bar for when the government should step in and force companies to provide a particular service in a particular way should be higher than you imply. Facebook is not an essential service; if we all lose Facebook tomorrow, life goes on relatively unscathed.
Put otherwise: you create a tool, and lots of people like it and use it. At what point exactly do you cross the line from "you made an awesome tool and lots of people use it" to "your tool is now essential/important enough to society, and we can't trust you [who, unlike us, were able and insightful enough to build these essential tools in the first place] to administer it responsibly, so in the interest of the public good we are taking control [to some degree]"?
To me, this thinking is backwards. For one, it's kind of an unfair abuse of power and a removal of important freedoms; but perhaps more importantly, the only people who have demonstrated that they are at all equipped to make informed decisions about what will make these tools most useful to society are, for the most part, the very people who were able to build such useful tools in the first place.
I thought a decent solution for this might be something akin to the ESRB ratings system: a standard created by and for industry (even if only to preempt regulatory intervention) that would assist with transparency by laying out expectations for your users about how content will be moderated on your platform in a concise, clearly communicated and easily understood (if not necessarily always precise) manner.
Platforms could certainly allow their users to choose from a selection of available standards, but wouldn't necessarily be required to do so, and more popular standards would spread.
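Purely as illustration, here is a minimal sketch (in Python) of what such a machine-readable "moderation standard" descriptor might look like. Nothing like this standard actually exists; every field name, tier label and URL below is invented.

    # Hypothetical machine-readable "moderation standard" descriptor a platform
    # could publish, loosely analogous to an ESRB label. All names are invented.
    MODERATION_STANDARD = {
        "standard": "HYPOTHETICAL-MOD-1",      # which industry standard is claimed
        "tier": "M2-light-touch",              # invented tier name
        "removes": ["spam", "illegal content", "targeted harassment"],
        "labels_but_keeps": ["disputed factual claims"],
        "user_controls": ["mute", "block", "keyword filters"],
        "appeals_process": True,
        "report_url": "https://example.com/transparency",   # placeholder URL
    }

    def summarize(d):
        """Render the descriptor as a short, user-facing notice."""
        return ("This platform follows {standard} ({tier}): removes {r}; "
                "labels but keeps {k}.").format(
            standard=d["standard"], tier=d["tier"],
            r=", ".join(d["removes"]), k=", ".join(d["labels_but_keeps"]))

    print(summarize(MODERATION_STANDARD))

The point is only that the "label" could be published in a form both humans and client software can read, so users could compare platforms at a glance.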
[ link to this | view in chronology ]
Re:
Where in the article did you read any implication that the government should intervene?
I see an article that makes suggestions about how Facebook and Twitter should voluntarily provide technical solutions to their users to help mitigate this dilemma. I don't see a government mandate mentioned anywhere.
That...sounds exactly like Mike is suggesting.
[ link to this | view in chronology ]
Re: Re:
This part:
But, at the same time, it's more than a bit uncomfortable to think that anyone should want these giant internet platforms deciding who can use their platforms
The way I see it, there is only one way that this decision is actually removed from the "giant internet platforms", and that is through regulation. If they implement what Mike suggests, that is just the result of their decision (at the moment); it doesn't mean they aren't making the decision.
"That...sounds exactly like Mike is suggesting."
Yeah that was the idea, I was trying to say you could still accomplish something very similar to what Mike was suggesting using this hypothetical ratings system
My understanding of Mike's suggestion is more that every platform should offer all the possible options and let the users select what they want every time. In my mind this is expecting too much from your users, and I think it needs to be simplified. You could choose to implement only one standard, or even none, or you could allow the users to select one; the key is that what you are doing is transparent to the users so they can decide accordingly.
[ link to this | view in chronology ]
I hate to keep bringing up Mastodon when talking about this kind of subject, but the Masto protocol has per-post privacy settings: A post can be publicly visible (with the option of showing it or hiding it on the public timelines), visible only to followers, or a direct message to a mentioned user. It also has a “content warning” system that allows for putting sensitive topics (or the punchline to a bad joke) behind a warning box that a user can choose to click. That functionality extends to images as well, and an account setting can automatically hide all images behind a “sensitive image” warning. As far as block/mute functionality, a user can mute or block an individual user or, if necessary, an entire instance (useful for victims of a coördinated harassment campaign from that soon-to-be-blocked instance). It might not be exactly what Mike was suggesting, but hey, it’s a start.
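For the curious, here is a minimal sketch of how those per-post settings are exposed through Mastodon's REST API. The instance URL and access token below are placeholders, and the exact fields shown are just the commonly documented ones.

    import requests

    INSTANCE = "https://mastodon.example"   # placeholder instance URL
    TOKEN = "YOUR_ACCESS_TOKEN"             # placeholder OAuth token

    # Post a followers-only status with a content warning via POST /api/v1/statuses.
    resp = requests.post(
        f"{INSTANCE}/api/v1/statuses",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "status": "Punchline to a bad joke goes here.",
            "visibility": "private",         # public / unlisted / private / direct
            "spoiler_text": "CW: bad joke",  # text shown on the click-through warning
            "sensitive": "true",             # hide attached media behind a warning
        },
    )
    resp.raise_for_status()
    print(resp.json().get("url"))

The notable design choice is that the poster, not the platform, picks the visibility and warning per post.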
[ link to this | view in chronology ]
Re: Re: Re:
I see. But expressing discomfort with a decision isn't the same thing as saying somebody shouldn't have the right to make that decision.
[ link to this | view in chronology ]
Re: Re: Re: Re:
[ link to this | view in chronology ]
Re: Re: Re: Re: Re:
Yes. But you can be uncomfortable with something without thinking it should be legally prohibited.
Somebody brought up the Westboro Baptist Church upthread. It makes me uncomfortable that they can spout their hate outside funerals, but I absolutely support their legal right to do so.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re:
You don't say you are uncomfortable with them deciding whether or not to spew their hate speech outside funerals, because (in my mind at least) that is the same thing as saying you are uncomfortable with them having the legal right to do so (which certainly isn't the same as saying you want to remove that right).
[ link to this | view in chronology ]
A few things..
And 2 (and probably more importantly): there ARE federal laws, felonies at that, against inciting violence and riots. There are NOT, however, any federal laws against incitement of ethnic hatred (which I guess is what they are counting on).
[ link to this | view in chronology ]
Choosing what content winds up in your Newsfeed
[ link to this | view in chronology ]
Re: Choosing what content winds up in your Newsfeed
Good idea and there's already a fully established and open platform they could base this on: RSS. Pretty much every website out there already has an RSS feed ready and waiting.
I've been thinking about that a lot lately. Imagine a web where much of modern social media grew out of RSS rather than proprietary platforms. I can envision a world where the hosting/publishing functions of Twitter and Facebook are totally distributed and syndicated, and they are instead focused on being end-user applications for viewing/aggregating content in your chosen way. There would be capability tradeoffs and different engineering challenges for sure, but man does it sound better than what we've got...
It's so easy to envision how we could have gone down that path instead of the one we did, but now we face the much more challenging question of how we get from here to there.
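As a rough sketch of how far RSS already gets you on the "viewing/aggregating" side, here is a minimal client-side aggregator, assuming the third-party feedparser package; the feed URLs are just examples.

    import feedparser
    from time import mktime
    from datetime import datetime

    # Feeds the *user* chooses to follow -- example URLs only.
    FEEDS = [
        "https://example.com/alice/rss.xml",
        "https://example.org/bob/feed",
    ]

    def timeline(feed_urls, limit=20):
        """Merge several RSS/Atom feeds into one reverse-chronological timeline."""
        items = []
        for url in feed_urls:
            for entry in feedparser.parse(url).entries:
                stamp = entry.get("published_parsed") or entry.get("updated_parsed")
                if stamp:
                    items.append((datetime.fromtimestamp(mktime(stamp)),
                                  entry.get("title", ""), entry.get("link", "")))
        return sorted(items, reverse=True)[:limit]

    for when, title, link in timeline(FEEDS):
        print(when, title, link)

In that world, the "newsfeed algorithm" is whatever sort order the user's own client applies.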
[ link to this | view in chronology ]
Re: Re: Choosing what content winds up in your Newsfeed
Unfortunately, those two movies also demonstrate why that's a bad idea.
[ link to this | view in chronology ]
No one should have to think about Ready Player One.
[ link to this | view in chronology ]
Re: Re: Re: Choosing what content winds up in your Newsfeed
Ready Player One involved a centrally-controlled virtual world. Parzival's group wanted to win because otherwise IOI would have absolute power over something that ruled over everyone's lives (much more powerfully than Facebook). And they chose to run it democratically, but the story doesn't go so far as to put any guarantees against future reversals; I don't recall them making any "constitution" for the virtual world.
RSS is neat... not fully decentralized, because it usually relies on DNS, and registrars can arbitrarily revoke domains for now. In theory you can run it on a .onion address. It has the obvious problem that it's a "pull" system: millions of clients checking every few minutes whether something new has happened takes some heroic efforts (CDNs) to scale.
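One standard mitigation for that "pull" cost is HTTP conditional requests (ETag / Last-Modified), which feedparser can pass along for you. A rough sketch, with an example URL:

    import feedparser

    FEED_URL = "https://example.com/feed.xml"   # example URL

    def poll(url, etag=None, modified=None):
        """Re-fetch a feed only if it changed since the last poll (HTTP 304 otherwise)."""
        d = feedparser.parse(url, etag=etag, modified=modified)
        if getattr(d, "status", None) == 304:   # server says: nothing new
            return [], etag, modified
        return d.entries, d.get("etag"), d.get("modified")

    # First poll fetches everything; later polls are cheap 304s when nothing changed.
    entries, etag, modified = poll(FEED_URL)
    entries, etag, modified = poll(FEED_URL, etag, modified)

It doesn't eliminate the polling problem at millions-of-clients scale, but it makes each "nothing new" poll nearly free for the server.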
[ link to this | view in chronology ]
Re: Choosing what content winds up in your Newsfeed
In the social media arena, this is one of the areas where Facebook and Twitter have an advantage over a distributed protocol: they can handle the user/subscriber issues better than small instances of a distributed protocol can, and they can deal with a large number of followers or subscribers to a single person's feed.
[ link to this | view in chronology ]
If you put prominent disclaimers on pages, or append them to posts by speakers who are known to be kooks promoting outright falsehoods, it's nice to think it would be educational, but does that open the site that puts up the disclaimer to free speech complaints? The speech is not censored per se, but how viable would a complaint be that the disclaimer is itself a form of censorship and free speech interference?
I think it's a good idea in theory for extreme views that are clearly bullshit and unfounded, but I can also see it going badly for views that are founded in reality but are politically or publicly unsupported. True, it would be better than outright deleting or no-platforming unpopular views, but even the disclaimer idea, with the website actively giving its implicit approval or disapproval to particular viewpoints, is somewhat troubling. Do we want large website moderation teams acting as arbiters of truth, rubber-stamping posts instead of simply deleting inconvenient ones?
I do think it's a better idea than simply saying websites should hunt down and delete bad content, though, because some good content is going to be lost to overzealous moderation (moderation being subjective by nature), and also because it won't push bad content into the dark where it can't be publicly refuted. But it's not a perfect solution either, and it's possible those disclaimers could have a similar no-platforming martyrdom effect for groups such as the Nazis, the KKK, etc., who could use the disclaimers as evidence of a conspiracy.
On small websites, troublesome users can be dealt with individually without the martyrdom effect, but it's a real problem figuring out how to deal with and get rid of someone who is a crap member of the community without giving them the means to cry foul.
[ link to this | view in chronology ]
Re:
People can complain about whatever they like, so, yes. It's probably even illegal, if done by a government, but the chance of a court finding against a private company on this basis is negligible. The idea of anyone having a right to speak on a specific platform is itself shaky.
[ link to this | view in chronology ]
For really the first time we have something that looks sort of like competition in this marketplace, instead of everyone abandoning Geocities for LiveJournal, then abandoning LiveJournal for Myspace, etc. - but it's not exclusive competition. If I use Twitter, that doesn't mean I don't also use FB, LinkedIn, YouTube, and Snapchat. What they're competing for is time and attention, to keep eyes on ads - but that makes each of those sites, large and powerful as they seem to us, uniquely vulnerable to public outcry, and so they cave to vocal minorities. We all know this. The question is, is there a solution that's satisfactory to both end users and the bean counters? Is there another revenue stream they could focus on that didn't make them so vulnerable to passing public whimsy? Is there an option they could explore that would leverage their incredible public presence to lessen the absurd tribal animosity currently prevalent in social and political discussions and undermine brigading types of behavior?
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
Re: Re:
Yes, people make money off e-mail in all sorts of ways, but that wasn't always the case, only now that it's an established protocol in wide use
I'm not sure that's true. In my experience, in the early days of widespread public adoption of the internet, email service was very much a part of what you were paying your ISP for.
Most people had ISP-hosted email, and it was more clearly understood by the early adopters that by buying internet service you were getting the necessary infrastructure to make use of multiple different protocols - not just the ability to make http requests and access the world wide web, but also things like an account on a news server (owned and operated and maintained by your ISP) to access Usenet, and an account on a mail server (same) in order to send and receive emails.
While the email protocol itself was open and free, it was only later that ways to make use of it for free became ubiquitous - and indeed in the early days of free webmail, people questioned whether it was even a sustainable service to offer, much less a profitable one.
[ link to this | view in chronology ]
Re: Re: Re:
[ link to this | view in chronology ]
Gmail in particular snagged a lot of users partially because it kept advertising the amount of storage it offered users over other free and pay-to-use email services.
[ link to this | view in chronology ]
Re: Re: Re: Re:
[ link to this | view in chronology ]
Re: Re: Re: Re:
If you look at the history of the ARPAnet, email was the feature that caused it to grow. Instant communications between university researchers were the reason universities made the effort to connect.
[ link to this | view in chronology ]
Re: Re:
[ link to this | view in chronology ]
Re: Re: Re:
[ link to this | view in chronology ]
Re:
I disagree; GeoCities, LiveJournal, and MySpace were never anywhere near as dominant as Facebook or Twitter.
[ link to this | view in chronology ]
Re: Re:
USENET also has the free speech everyone claims to want.
AOL was more dominant from 1993-1996, and abused their censorship power to the point of destroying the company. They were the only service at the time that had reliable, instantaneous e-mail (same server), they had keywords before websites, a payment system that could have been Paypal, and yet they threw it all away by chasing everyone onto the web, which slowly caught up technologically.
There will always be a hungry young internet company that wants to gain market share by offering free speech, just as surely as once it gains that market share, it will want to clamp down on dissent or anything it can't control.
[ link to this | view in chronology ]
Re: Re:
The directory listings, though, were still important in the early days. Yahoo's listing service might have been the closest thing to a monopoly at the time.
[ link to this | view in chronology ]
What about phone companies?
[ link to this | view in chronology ]
Re: What about phone companies?
[ link to this | view in chronology ]
Yeah, everyone knows storing the contents of phone calls is the job of the NSA, and we wouldn’t want to put them out of business, would we?
…wait—
[ link to this | view in chronology ]
Re: What about phone companies?
Bomb threats?
Calling people up and pushing their religion?
There are still rules for phone use. It's pretty neutral, but not absolute.
[ link to this | view in chronology ]
Ouch.
[ link to this | view in chronology ]
And if a service is, as you say, "close to necessary to take part in modern day life," perhaps it shouldn't have the right to ban people after all.
Or on the other hand, perhaps the service should instead be made less necessary.
Either way, there is definitely something very not right in a situation in which a person could be excluded from "necessary" functions of society in retaliation for expressing his/her opinions.
[ link to this | view in chronology ]
I've said it before...
You are, GASP, not special. The internet does not exist just for you alone.
Social media's problem is that they try to moderate content to protect some groups. Short of someone posting threats, kiddie porn, incitement to riot or rebellion, or other unlawful acts, QUIT MODERATING CONTENT!!!
If it's not unlawful, it's a self correcting problem. People who don't like or are offended by someone's post won't be back...problem solved.
FFS everyone needs to grow the fsck up and quit crying when they see something they don't like.
[ link to this | view in chronology ]
Re: I've said it before...
I'm generally in favor of free speech, but when the algorithms on a very influential platform such as facebook lead to mobs, we've got a problem.
[ link to this | view in chronology ]
Re: I've said it before...
[ link to this | view in chronology ]
Re: I've said it before...
And yes, it's true that "people who don't like or are offended by someone's post won't be back" and... here's the thing: social media companies want users, they don't want people to leave and never come back. And they also want to be a place frequented by celebrities, experts, politicians, interesting people - because that's good for their business and their brand.
So when persistent toxic behavior by some subgroup of users is proliferating and driving away other users, especially high-quality users, it becomes a business issue for the platform to figure out how to foster a better community.
Plus, at the end of the day, some people don't *want* to be the owners and operators of a forum overrun with Holocaust denial or misogynistic harassment or racist insults or what-have-you. Many talented people don't want to work at a place like that. Companies and advertisers don't want to partner or be associated with a place like that.
As this post points out, this doesn't mean the solution is "just try to ban all the bad stuff". But, sorry, your "grow the fuck up" attitude isn't going to fix anything either, or change this situation at all. And if you want to see what a totally unmoderated social media platform looks like, try signing up for gab.ai
[ link to this | view in chronology ]
Re: Re: I've said it before...
I dunno, I think they can just scroll down a little and get a pretty good idea.
[ link to this | view in chronology ]
You've put your finger on something...
OK, by putting everyone in their filter bubble, Facebook and Twitter have been able to avoid the business issue....or is there more to it??? Or am I just not looking on a long enough timescale?
[ link to this | view in chronology ]
Re: You've put your finger on something...
And that's the situation these platforms seem to find themselves in so often. Someone from r/TheDonald cross-posts something to r/Politics and instead of everyone saying "Wow, that guy's an idiot, moving on..." they engage, yell at each other, and the place becomes toxic. Now the admins feel they have to act before the whole place becomes a cesspool, but they're a) human, b) understaffed to deal with the volume of content, and c) never going to please everyone with any attempt at censorship - even if it's warranted and totally within their rights.
[ link to this | view in chronology ]
See also: 4chan.
[ link to this | view in chronology ]
Re: You've put your finger on something...
OK, by putting everyone in their filter bubble, Facebook and Twitter have been able to avoid the business issue....or is there more to it??? Or am I just not looking on a long enough timescale?
I think the "filter bubble" aspect is somewhat overstated. The analysis does diverge a bit for Twitter and Facebook though.
For Twitter, a huge part of the appeal and core function of the site is the ability to connect with people outside your immediate circle - inasmuch as it has filter bubbles, they are extremely porous. This is reinforced by nearly every aspect of the design of the site: retweets, subtweets, notifications of what people in your network like or follow, hashtags that link to an open feed of other people from all across twitter, prominent trending topic links that do the same, etc.
Moreover, core to Twitter's appeal is public figures or just interesting people maintaining a public presence. Twitter and its userbase do not benefit if more people make use of its "filter bubble"-ish capabilities like having a private account or muting all replies.
So to take a specific and widespread example, there is a huge problem with constant and frankly insane harassment of women who have a large twitter presence, especially in certain industries like game design. They face a disproportionate amount of aggression - including coordinated harassment campaigns employing tactics to get around the muting/blocking features that exist. This has driven many women off the platform. Twitter doesn't want this. The women could also switch to private accounts and put themselves in a stronger filter bubble, but Twitter doesn't want that either - nor do they.
Facebook is somewhat different because it has many different usage patterns and a larger "private, immediate circle" aspect in some respects. However, interconnection is still a big deal: public pages and events are very important to Facebook, and important to its advertising business model. Facebook provides lots of routes out of your "bubble", showing you popular pages or those your friends interact with, etc. It is also home to several large and more public general-interest forums, such as the Facebook pages of major news organizations, popular TV shows, etc. - and it does not want these places ruined by toxicity either, because then these all-important large organizations might pack up and leave.
And that's just the briefest look at these platforms and how people use them. I won't even get into YouTube, iTunes, Steam, Wordpress, Wikipedia, app stores - all platforms facing these same challenges and all with unique needs.
The whole idea of "filter bubbles" is real and it is very much a factor and a force in how we communicate and consume information in 2018, but it is not an absolute or even necessarily the most dominant trend - and it's not an automatic solution for the challenges of Twitter or Facebook.
[ link to this | view in chronology ]
Switch Boards, Party Lines
[ link to this | view in chronology ]
"No, I'll surrender more of my Rights to corporations than you!
What looks like discussion above is a few fanboys eager for fascism. Yes, there are a couple who stick up for actual free speech, which is anything within common law and NOT to be controlled by corporations, but most of the comments are eagerness to be ruled by corporate royalty.
[ link to this | view in chronology ]
Is Section 230 CDA to benefit The Public or corporations?
Obviously The Public by providing speech outlets. You could not find a single politician who'd say it's to benefit corporations. -- Though it's entirely possible that was and is the intent: a stealthy form of censorship.
Corporations have PR departments with large budgets to get their message out. It's only individual "natural" persons who need outlets for their views.
Masnick highlights that CDA 230 provides immunity that allows corporations to HOST content without the liability of PUBLISHING it:
Note first that states causes valid in common law. It's a requirement for simple decency. Not controversial so far...
[ link to this | view in chronology ]
Corporatists try to twist CDA into control of OUR PUBLISHING!
How did "Communications Decency Act" get twisted into authorizing complete corporate control of The Public's new forums?
Because the ultimate purpose of Masnick / EFF blather (both funded by Google) is to sweep on to:
They claim that corporations can use that "restrict access or availability of" clause to, whenever they wish -- even explicitly over YOUR First Amendment Rights -- step in and EDIT comments, to point of becoming THE publisher, even to PREVENT we "natural" persons from publishing on "their" platforms at all!
But those are OUR platforms, The Public's, NOT theirs.
Corporations are allowed (by The Public) merely to operate the machinery which is to convey The Public's views. Corporations are NOT to control who gets on, nor WHAT The Public publishes, except under OUR clear common law terms.
But Masnick is for corporations CONTROLLING the speech and outlets of "natural" persons. That's repeated often here, can't be mistaken:
"And, I think it's fairly important to state that these platforms have their own First Amendment rights, which allow them to deny service to anyone."
https://www.techdirt.com/articles/20170825/01300738081/nazis-internet-policing-content-free-speech.shtml
Masnick is not hedging "lawyers say and I don't entirely agree", or "that isn't what I call serving The Public", but STATES FLATLY that corporations have arbitrary control of MAJOR outlets for The Public! He claims that YOUR Constitutional First Amendment Right in Public Forums are over-arched by what MERE STATUTE lays out!
Such control (by ANY entity) to remove First Amendment Rights from The Public CANNOT be purpose of ANY statute. It'd be null and void because directly UN-Constitutional.
It's NOT law, only the assertion of corporatists: no court has yet supported what Masnick claims. Corporatists are reaching for the moon without even a step-ladder. It's simply a trick Fascists are trying to pull.
The provisions over-riding First Amendment Rights ONLY apply if done in "good faith" for The Public's purposes. A corporation intent on stopping YOUR publishing in favor of its own views CANNOT be "good faith" for The Public, but only de facto tyranny and censorship.
[ link to this | view in chronology ]
Posting on a "private web-site" is NOT a "privilege",
any more than is reading it. THAT'S THE PURPOSE.
1) What does "private" even mean when published and invites entire world?
2) WHO owns a "web-site", anyway? Like physical business, if allow The Public in, then have CEDED some right to "private property". The Public gains, NOT loses. That's the deal.
3) Where is this "corporation"? Show it to me. And UNDER WHAT PRIVILEGE AND RULES is it even allowed to exist? -- By The Public giving it permission, and NOT for the gain of a few, but for PUBLIC USE.
4) Again, mere statute doesn't over-ride The Public's Constitutional Right. And no, corporations are NOT persons, do not have rights, they are FICTIONS.
5) The Public's use is the PURPOSE of any and every web-site. If allows comments, then it's governed only by common law terms: no arbitrary exclusion. Two-way communications is the purpose of teh internets.
[ link to this | view in chronology ]
Re: Posting on a "private web-site" is NOT a "privilege",
2) Techdirt has an owner. Exactly who, I don't care, they keep up their end of the bargain nicely and the public gains from what's posted and the discussion thereof. CDA 230 gives those owners a shield from legal liability if you, a member of the public, abuse that discussion facility in the owner's opinion.
3) Corporations are a legal fiction of the common law, recognised in the laws of all 50 states. The purpose is to allow large undertakings (classically, steel mills and factories) that are too large for any one individual to finance. Techdirt is small enough it might be a sole proprietorship.
4) Most of the discussion here is about the rights of the public in respect of two websites/corporations with outsized influence. We generally agree that the 1st amendment prohibits the government from deciding what sites should do; but when we get to a private site like Techdirt it should be obvious that the owner should be in ultimate control. We have been arguing back and forth about whether the outsizedness of the two websites should make them more "public", or more "private". See laws about public accommodations.
5) Not every website has to serve the PUBLIC directly. CDA 230 specifically tells Techdirt it's not *legally* liable for whatever it may or may not decide to do with comments from the public, or for whatever the public may say in those comments. This prevents all manner of harm and allows Techdirt maximum freedom to make a beneficial website. Part of that is not driving away users/readers with a comment section full of trolls.
[ link to this | view in chronology ]
As I have explained before (and you have refused to acknowledge due to either willful ignorance or brain damage), private does not mean the same thing as privately owned. A brick-and-mortar store such as WalMart is privately owned yet open to the public; the same can be said of a service like Facebook.
[ link to this | view in chronology ]
Re: Whichever of mine, I can't see because censored...
Yeah, but you 'splaining ain't all there is to it! You've got a word trick there, is all.
Why don't you 'splain why you're FOR corporations controlling The Public's speech in the forums that were explicitly created FOR The Public? -- It's even against your own interests! No matter how you despise Alex Jones, nothing he says is outside common law. YOU are setting up a real censorship regime, NOT ME.
I'm just typing in some text on a web-site that has HTML for the purpose. Why do you even want to hide it? Need to 'splain that too! -- As the real AC below asks.
[ link to this | view in chronology ]
Re: Re: Whichever of mine, I can't see because censored...
You've got a word trick there, is all.
Ah yes, that awful trick where words have meanings.
[ link to this | view in chronology ]
Re: Re: Re: Whichever of mine, I can't see because censored...
Not an answer, either.
Since you started on new point, I will too.
My repeat has been up 44 minutes now WITHOUT being censored. Do you (or any person) have ANY control over the "hiding", or is it entirely automatic, from some number of clicks? -- I've asked this dozens of times, but here you are responding, so I can hope for an answer at last? ...
[ link to this | view in chronology ]
Re: Re: Re: Re: Whichever of mine, I can't see because censored...
Gently down the stream....
Funny, I can click on the underlined gray link to show me the flagged posts and see them, no problem!
I'm not sure *exactly* how flagging works to auto-hide posts, but I think it takes more than one flag, and Techdirt Insiders might have more of a vote than me. So it's a collective action. And I'm a volunteer, and I'm going on strike right now, inconveniencing no one!
[ link to this | view in chronology ]
No, I do not. But since you seem unable to parse English well on your own, Ricky, let me explain.
Going by the Oxford English Dictionary, the first definition of private is “belonging to or for the use of one particular person or group of people only”. Several sub-definitions refer to things “not to be revealed to others” and a place that is “quiet and free from people who may interrupt”. A private space, for example, would be one reserved for a handful of people by someone in that group for the purpose of a private meeting.
The third definition, however, refers to services and industry. That definition reads as such: “Provided or owned by an individual or an independent, commercial company rather than the state.” By turning the adjective private into the adverb privately, we can logically conclude that a privately owned space is one owned by an individual or an independent commercial company instead of the state.
A privately owned service such as Twitter can be private if the service owner limits usage of the service to a small number of people and prevents anyone else from seeing anything that happens on the service. It can also be open to the public like Twitter is while retaining its privately-owned status. Because that service is a privately owned platform for speech, the platform’s owners and operators have every right to decide what is and is not acceptable speech as well as who will or will not be allowed on that platform.
[ link to this | view in chronology ]
Re: Then Twitter is NOT private!
Uh, I may have neglected to mention that distinction is irrelevant in my opinion, besides that it's not the focus I wish.
But, since you just PROVED that Twitter is NOT a private site, then my point stands: that The Public has been CEDED rights just as Wal-Mart does by inviting person physically.
Oh, and READ the Sandvig decision. It expressly notes that "internet" forums have become PUBLIC SPACES. -- Don't get hung up on the word "spaces" now, doesn't mean physical...
[ link to this | view in chronology ]
You are free to ignore the meaning of words. By the same token, we are free to mock your open admission of willful ignorance by insulting you.
Even if forums and social interaction networks are open to the public, unless they are owned by the public/the government, they remain privately-owned services. You literally cannot force Twitter, in any way, to provide a platform for your speech or guarantee you an audience for that speech.
[ link to this | view in chronology ]
Re: Re: Then Twitter is NOT private!
So what? Common areas in malls are also "public spaces" that are privately owned. If you tried to set up a soapbox and a PA system in a mall so you could spout your nonsense, you would be asked to leave. If you refused to leave, you could be legally trespassed from the property. This would not be a violation of your First Amendment rights because it's not the government restricting your speech, it's a private entity.
The same concept applies on the internet. If you want a platform to express yourself on without restrictions, either buy your own servers or build your own mall.
[ link to this | view in chronology ]
Yet fanboys promote mega-corporations to rule over themselves!
THAT is the BIG puzzle at Techdirt. What's in it for fanboys? -- They're happy to throw away their own First Amendment Rights! And another part of the US Constitution, too, which is interesting to compare: they say content producers have NO right to income from or control copies of creations, but these pirates then stand up for even larger corporations to control their own SPEECH!
On that point, I'm going to assume they're just aping Masnick.
And what is the (mere) statute or court decisions that grant corporations the RIGHT to over-rule First Amendment to absolutely and arbitrarily control "platforms" as Masnick says? -- There is NONE! It's just his assertion. -- AND EVEN IF WERE, THAT CAN CHANGE.
So WHY assert that corporations operating "platforms" are empowered to control the speech of "natural" persons? That's EXACT OPPOSITE INTENT OF ALL LAW.
Only Mitt Romney and Mike Masnick will say that corporations are "persons" -- Romney got roundly hooted for it, and Masnick only does it here in this little walled garden where he's cultivated vegetables who don't question him.
Masnick is a total corporatist. Only the mistaken presumption that he acts in "good faith" and shares YOUR views gives him any credibility. -- Take away that presumption for a week, and READ what he writes: he's very open about particularly that "platforms" have an alleged First Amendment Right to arbitrarily control access of we "natural" persons. Masnick believes not only that The Public can be denied access, but since Google controls search, that it can effectively "hide" speech even on alternative smaller outlets you're forced to use. -- Masnick uses "hiding" right here to disadvantage dissenters until they give up and quit commenting. He can thereby claim doesn't censor.
Corporatists are going for TOTAL control over "natural" persons, period.
[ link to this | view in chronology ]
Re: Yet fanboys promote mega-corporations to rule over themselves!
until they give up and quit commenting
Is that part coming soon, do you think? Pretty please?
[ link to this | view in chronology ]
Re: Yet fanboys promote mega-corporations to rule over themselves!
And for not being able to realize that good thoughts require time and patience to develop, minds require time and patience to change.
We've all come to realize that a few operators (or is that nations of the internet?) have outsized influence that has led to real, meatspace harm when a site like InfoWars was promoted, and we are trying to work out a rational response. Ham-fisted laws aren't gonna do it. Education might do it, but will take years. Competition doesn't look too good either. It's complicated. Partisanship doesn't help.
[ link to this | view in chronology ]
No one is forcing anyone to go to Jones social media page
[ link to this | view in chronology ]
Re: No one is forcing anyone to go to Jones social media page
Twitter, Facebook and YouTube are a monopoly
And The Beatles were a mighty fine trio.
[ link to this | view in chronology ]
Wait a minute…
[ link to this | view in chronology ]
Re: Re: No one is forcing anyone to go to Jones social media page
There's a pretty direct connection between Alex Jones, promoting Alex Jones on Twitter and Facebook, and some real meatspace harm.
At scale, editorial decisions and creating newsfeeds *are* speech. Twitter and Facebook have lent their credibility to the nonsense from Mr Jones, such as Pizzagate and his interpretation of the Sandy Hook tragedy. This may not be purposeful, but it is the effect. Disclaimers may not un-do it, either.
For Techdirt, note that the scale means this effect of lending credibility is different in kind from what happens when I post my nutty comments here. "I have a bridge for sale", lol.
[ link to this | view in chronology ]
Re: Re: Re: No one is forcing anyone to go to Jones social media page
There's a pretty direct connection between Alex Jones, promoting Alex Jones on Twitter and Facebook, and some real meatspace harm.
I didn't say there wasn't, did I?
Not sure if this is the comment you meant to reply to, but my snarky mention of the Beatles as a trio was simply pointing out the inherent absurdity of listing three companies and saying they are a "monopoly" since that, y'know, is by definition not what that word means.
[ link to this | view in chronology ]
Re: Re: Re: Re: No one is forcing anyone to go to Jones social media page
Facebook is close, if the market is defined as posts between friends with a newsfeed from outside the immediate circle.
Twitter is close, if the market is short announcements. Twitterverse is very close to being in dictionaries.
Youtube, if you are sharing the video form, same.
[ link to this | view in chronology ]
Not really. While some social interaction networks might not have newsfeeds, they share similar base functions with Facebook. Hell, I could call Facebook an evolved version of MySpace or LiveJournal and still be somewhat right.
Not really. Anyone can make short announcements on Tumblr, Facebook, YouTube, Instagram, or literally any other SIN.
You could call that one close, yes, since the direct alternatives to YouTube have not taken off in the same way YouTube did.
[ link to this | view in chronology ]
Re:
You could call that one close, yes, since the direct alternatives to YouTube have not taken off in the same way YouTube did
And even that's not entirely true, as both Facebook and Twitter also host videos. Also, Facebook can easily be used as a public microblogging service like Twitter, via public posts and its "subscribers" feature. And Twitter can be used with a private account that only follows friends, with optional access to additional newsfeeds via lists & trending topics, thus making it work much like Facebook.
All these services are in direct competition with each other, even though many people use all three.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: No one is forcing anyone to go to Jones social media page
Facebook is close, if the market is defined as posts between friends with a newsfeed from outside the immediate circle.
Well yeah, if you narrowly define a market as a very specific set of features then everything is a monopoly.
Subway definitely has a monopoly on restaurants that serve submarine sandwiches alongside wraps and baked cookies. The Cartoon Network has a monopoly on cable television networks that exclusively air animated shows. Dave & Buster's has a monopoly on chain restaurants with multiple card-operated arcade games. Nintendo has a monopoly on family-focused home game consoles with motion controls.
And yet all these companies in fact face huge amounts of fierce competition.
[ link to this | view in chronology ]
All in one convenient post. I see "Leigh Beadon" is watching,
so perhaps he's the Censor, not the lie of "the community" without any Administrator okaying.
Is Section 230 CDA to benefit The Public or corporations?
Obviously The Public by providing speech outlets. You could not find a single politician who'd say it's to benefit corporations. -- Though it's entirely possible that was and is the intent: a stealthy form of censorship.
Corporations have PR departments with large budgets to get their message out. It's only individual "natural" persons who need outlets for their views.
Masnick highlights that CDA 230 provides immunity that allows corporations to HOST content without the liability of PUBLISHING it:
Note first that states causes valid in common law. It's a requirement for simple decency. Not controversial so far...
Corporatists try to twist CDA into control of OUR PUBLISHING!
How did "Communications Decency Act" get twisted into authorizing complete corporate control of The Public's new forums?
Because the ultimate purpose of Masnick / EFF blather (both funded by Google) is to sweep on to:
They claim that corporations can use that "restrict access or availability of" clause to, whenever they wish -- even explicitly over YOUR First Amendment Rights -- step in and EDIT comments, to point of becoming THE publisher, even to PREVENT we "natural" persons from publishing on "their" platforms at all!
But those are OUR platforms, The Public's, NOT theirs.
Corporations are allowed (by The Public) merely to operate the machinery which is to convey The Public's views. Corporations are NOT to control who gets on, nor WHAT The Public publishes, except under OUR clear common law terms.
But Masnick is for corporations CONTROLLING the speech and outlets of "natural" persons. That's repeated often here, can't be mistaken:
"And, I think it's fairly important to state that these platforms have their own First Amendment rights, which allow them to deny service to anyone."
https://www.techdirt.com/articles/20170825/01300738081/nazis-internet-policing-content-free-speech.shtml
Masnick is not hedging "lawyers say and I don't entirely agree", or "that isn't what I call serving The Public", but STATES FLATLY that corporations have arbitrary control of MAJOR outlets for The Public! He claims that YOUR Constitutional First Amendment Right in Public Forums are over-arched by what MERE STATUTE lays out!
Such control (by ANY entity) to remove First Amendment Rights from The Public CANNOT be purpose of ANY statute. It'd be null and void because directly UN-Constitutional.
It's NOT law, only the assertion of corporatists: no court has yet supported what Masnick claims. Corporatists are reaching for the moon without even a step-ladder. It's simply a trick Fascists are trying to pull.
The provisions over-riding First Amendment Rights ONLY apply if done in "good faith" for The Public's purposes. A corporation intent on stopping YOUR publishing in favor of its own views CANNOT be "good faith" for The Public, but only de facto tyranny and censorship.
Posting on a "private web-site" is NOT a "privilege",
any more than is reading it. THAT'S THE PURPOSE.
1) What does "private" even mean when published to and invites entire world?
2) WHO owns a "web-site", anyway? Like physical business, if allow The Public in, then have CEDED some right to "private property". The Public gains, NOT loses. That's the deal.
3) Where is this "corporation"? Show it to me. And UNDER WHAT PRIVILEGE AND RULES is it even allowed to exist? -- By The Public giving it permission, and NOT for the gain of a few, but for PUBLIC USE.
4) Again, mere statute doesn't over-ride The Public's Constitutional Right. And no, corporations are NOT persons, do not have rights, they are FICTIONS.
5) The Public's use is the PURPOSE of any and every web-site. If allows comments, then it's governed only by common law terms: no arbitrary exclusion. Two-way communications is the purpose of teh internets.
Yet fanboys promote mega-corporations to rule over themselves!
THAT is the BIG puzzle at Techdirt. What's in it for fanboys? -- They're happy to throw away their own First Amendment Rights! And another part of the US Constitution, too, which is interesting to compare: they say content producers have NO right to income from or control copies of creations, but these pirates then stand up for even larger corporations to control their own SPEECH!
On that point, I'm going to assume they're just aping Masnick.
And what is the (mere) statute or court decisions that grant corporations the RIGHT to over-rule First Amendment to absolutely and arbitrarily control "platforms" as Masnick says? -- There is NONE! It's just his assertion. -- AND EVEN IF WERE, THAT CAN CHANGE.
So WHY assert that corporations operating "platforms" are empowered to control the speech of "natural" persons? That's EXACT OPPOSITE INTENT OF ALL LAW.
Only Mitt Romney and Mike Masnick will say that corporations are "persons" -- Romney got roundly hooted for it, and Masnick only does it here in this little walled garden where he's cultivated vegetables who don't question him.
Masnick is a total corporatist. Only the mistaken presumption that he acts in "good faith" and shares YOUR views gives him any credibility. -- Take away that presumption for a week, and READ what he writes: he's very open about particularly that "platforms" have an alleged First Amendment Right to arbitrarily control access of we "natural" persons. Masnick believes not only that The Public can be denied access, but since Google controls search, that it can effectively "hide" speech even on alternative smaller outlets you're forced to use. -- Masnick uses "hiding" right here to disadvantage dissenters until they give up and quit commenting. He can thereby claim doesn't censor.
Corporatists are going for TOTAL control over "natural" persons, period.
[ link to this | view in chronology ]
Please define “common law” in clear and concise terms so that we may all understand what in the blue hell you are talking about.
[ link to this | view in chronology ]
Re: I'm not Wikipedia. Try there: it's reliable for such.
Common law is known TO YOU. Here you're just pretending to question while dodging mine.
This piece I've long had ready and shows EXACTLY that YOU know full well about "common law":
Here you are admitting that after harassing me for months, you KNEW The Law:
https://www.techdirt.com/articles/20180130/19040439126/court-shuts-down-troopers-attempt-to-portray-new-ish-minivans-with-imperfect-drivers-as-justification-traffic-stop.shtml#c868
Me: You dodged both "natural" person and "must" terms.
You: I didn't address the "natural" "person" shit because SovCits and their lingo should only ever be mocked.
You: As for "must": Technically, no, one does not need a license to drive a car. Then again, if you get caught driving without a license, your ass is in trouble no matter how much you stress that you are an individual who is traveling under the auspices of common law as determined by the Founding Fathers under a gold-fringed American flag.
TECHNICALLY IS THE LAW. YOU BELIEVE EXACTLY WHAT "SOVCITS" DO.
You admit that, even mention "traveling", then without apology try to divert with advocating obedience to the police state and more mockery.
Indeed, you implicitly state that "SovCits" are right to worry about an oppressive police state that violates the letter of the law.
I say that you had the rare, precise, TRUE knowledge used for "outing" a Constitutionalist such as an Undercover Agent for the FBI would. -- Or other intelligence agency / secret police.
Last bit of evidence is that you only mocked that charge, didn't deny it. That's a tactic indicative of those who might later have to admit a lie. So, AGAIN, you are DODGING AND LYING.
Oh, and you're arguing with one you say is an idiot troll. So WHY respond on that, too?
[ link to this | view in chronology ]
Anderson: Who are we?
Iscariots: The necessary evil!
Anderson: Why are we necessary?
Iscariots: To purge the world of evil worse than man!
Anderson: And why are we God's chosen few, ordained to undertake this unholy task?
Iscariots: Because no one else will!
Anderson: And because it’s fuckin’ fun!
[ link to this | view in chronology ]
Re: Yeah, you're long past substance on topic, "Stone".
You've lost. As happens every time I dig in and respond.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
But at least he stays on topic! Flagged again, re-posting the rant without a *concise* re-write means only the *dedicated* are willing to read you!
[ link to this | view in chronology ]
Re: Why so interested and monitoring site closely late at night?
"Christenson"? You're in six minutes after mine. Sheerly for interest of those who'd like to so closely monitor Techdirt, how EXACTLY do you DO that? Hit "refresh" every minute?
Oh, so YOU set the standards for writing here. Where exactly is your guideline so I won't run afoul of it again?
Your "reasons" to "flag" sound specious, trivial at best, and your quickness to comment and attack look a whole lot like ASTRO-TURFING.
[ link to this | view in chronology ]
Pot, kettle, black.
[ link to this | view in chronology ]
Re: "Pot, kettle, black." -- Huh? Doesn't even make sense.
How 'bout you answer? Sitting there hitting refresh?
[ link to this | view in chronology ]
How ’bout you suck on bofa.
[ link to this | view in chronology ]
Re: Re: Why so interested and monitoring site closely late at night?
And no, *I* don't set the standards here. Maybe Techdirt will reply with what it takes to get hidden by flagging, but I suspect it's more than one of us getting a bad opinion of your post and raising up and hitting that little red button. I write even this much at significant risk of getting flagged myself; I'm OK if that happens.
Let me re-state the reasons for my flag: You re-posted *exactly* what you said before. That's generally a no-no, it irritates almost everyone. I think your post is too long, too. So I'm trying to tell you that you will get further with everything if you calm down and shorten what you have to say, and take the time to say it well.
[ link to this | view in chronology ]
Re: Re: Re: Why so interested and monitoring site closely late at night?
That's my protest.
BUT NOW you're dodging having stated that you "flagged" it the first time! -- WHY? -- 5 posts separates it into areas, my preference. Same icon, anyone can skip over with two Page Downs? Not permitted by YOU? And where is YOUR authority written again? Because you seemed to speak with authority as if have actual control, and rather gleeful at "flagging"! -- As if you knew that those would soon be hidden.
Keep writing. ALL responses help me; you're bound to slip if as I believe, more than an "AC"...
[ link to this | view in chronology ]
Re: Re: Re: Re: Why so interested and monitoring site closely late at night?
Kill it all here Mike Masnick, my posts included. Thanks!
[ link to this | view in chronology ]
Nah, Russians would act dumber than him.
[ link to this | view in chronology ]
What are you afraid of?
Now I look again, and I see whole swaths of comments censored (hidden). As I open them, I read much more interesting content, another point of view that is well expressed and interesting, albeit in direct opposition to the article.
Why are you so afraid of these views? Is it because you know they are persuasive? Do you think hiding them serves your cause, or amplifies your fear for the whole world to see? Are you peeing in your pants?
Personally, I think the whole tyrannical censorship binge on Facebook, Twitter, YouTube and Techdirt is going to backfire. You all look like disingenuous, dishonest, fearful cowards quivering in your censored safe spaces. You look weak and pitiful, and unworthy of holding any power at all. The more you hide the words of others, the more you amplify their credibility.
What are you afraid of?
[ link to this | view in chronology ]
Re: What are you afraid of? -- I'm commenting only to
show I'm currently on another IP address. Of course, that doesn't mean it isn't ME on another computer, but it's NOT. Real comment, real person, really PUZZLED by the fake "free speech" at Techdirt.
You must be new here, AC. Whether comments agree with Techdirt is the only thing that matters. -- And this done in a piece where I'm EXACTLY on topic!
[ link to this | view in chronology ]
Re: Re: What are you afraid of? -- I'm commenting only to
OKAY, AC. There ya go! Now YOUR comment is censored!
I'm one of two who KNOWS that isn't ME. I raised the possibility, for what little it's worth here at this den of astro-turfing.
But anyone new (IF are any!) reading will see what happens here on "free speech" Techdirt.
You CANNOT dissent. Period. Expand this soft, safe, sane, with excuse of "for public good" corporate control to all who question The Establishment, and you see even mild questions will be erased, too dangerous to masnicks!
[ link to this | view in chronology ]
Re: Re: Re: What are you afraid of? -- I'm commenting only to
[ link to this | view in chronology ]
Nobody here is afraid of those alleged “views”. Everyone here who is not one of our typical trolls is tired of seeing the same anti-Techdirt screed over and over again. The trolls never make any compelling arguments nor engage in on-topic, on-point, good faith debate. They never offer anything insightful or amusing; they offer bile and insults and, in the case of “Hamilton”, empty rhetoric designed to make them look “reasonable” while they ultimately say a lot of nothing.
If you do not want to get flagged, engage with the commenter community here on good terms and in good faith. You can disagree with people here and express views that go against the supposed orthodoxy of Techdirt without being a shithead about it.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
I can imagine how someone with brain damage would believe that, sure.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
No, this is someone who tries to argue in good faith seeing someone trying to argue in bad faith and treating them with the disdain and contempt they deserve.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
Keep using yes-and-no questions in an attempt to goad me into a “gotcha” statement. Knowing your trick makes me far less likely to fall for it by accident.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
Re: Re:
[ link to this | view in chronology ]
Re: Re: Wasting your time on the "Stone" account, AC!
Ad hom is all you'll get. He gets much nastier. -- And THAT sheer ad hom is NEVER censored here.
[ link to this | view in chronology ]
Not true! I also have insults about your mother and pop culture references.
[ link to this | view in chronology ]
Re:
This is your statement, right, Stephen? Do you feel a little silly now? Or do you feel justified and reasonable.
Because you look ridiculous.
[ link to this | view in chronology ]
I figured out this gimmick long ago. It will not work on me.
Not really. You never had any intention of discussing the issues laid out in this article or in the serious comments posted before you showed up to troll—well, not in good faith, anyway. I have no problem with mocking you because you did nothing to prove you deserve being taken seriously.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
Re: Re:
Dumbass
[ link to this | view in chronology ]
Your censorship speaks loudly
[ link to this | view in chronology ]
Re: Your censorship speaks loudly
[ link to this | view in chronology ]
This is a sincere (without wax) question
Really, this is a sincere question. You are doing a lot of policing, why not just restrict membership to people you trust and agree with? What is the point of “appearing” to be open to opinion, and then silencing opinion?
[ link to this | view in chronology ]
Re: This is a sincere (without wax) question
(And by the way, when you have to repeatedly state that your question is sincere, it's a rather good tipoff that you know it's not.)
[ link to this | view in chronology ]
Re: Re: This is a sincere (without wax) question
[ link to this | view in chronology ]
Re: This is a sincere (without wax) question
If you don’t want to hear opinions that you don’t like, why not just make this a “members only” site?
We actually like and appreciate opinions we don't like. But, what the community clearly does not like are disingenuous trolls.
There are lots of conversations that we have on the site where people disagree in respectful, non-disingenuous ways. And those don't seem to get flagged.
[ link to this | view in chronology ]
Re: Re: This is a sincere (without wax) question
Masnick, you label just so you can skip the points raised. That's a key problem in this topic.
I'm not disingenuous, just stating my views. That YOU and your wrong views come in for reference is unavoidable.
My views are based in reality and shared by large numbers of persons including Congress and Supreme Court. I point you again to the Sandvig v Sessions decision, from page 7 on, "A. THE INTERNET AS PUBLIC FORUM". Need to read the whole, of course, but it's clear just from these quotes that the Supreme Court is nearly to exactly the view that I state in the posts which you censored. (By the way, I say censored because clearly deliberate and targeted viewpoint discrimination. You have not responded to dozens of questions asking whether an Administrator is involved, so I conclude that one is.)
The discussion is businesses verus "natural" persons.
Key point: "the same principles are applicable." -- Again, that's applying to "natural" persons who in the instant case are accessing web-sites against TOS and corporate wishes, which of course is EXACTLY apposite to using forums and requiring them to be NEUTRAL.
Nothing in The Constitution supports the Corporatist view that "platforms" under CDA 230 are authorized to discriminate against viewpoints rather than on common law terms. And the CDA always requires "good faith", as I show in the already censored comments.
The bottom-line question for readers: Do YOU want to be SUBJECT to Corporate Control? -- If so, just follow Masnick blindly, he'll lead you into the high-tech prison!
[ link to this | view in chronology ]
Re: Re: Re: This is a sincere (without wax) question
[ link to this | view in chronology ]
…the idea that anyone, including the government, can legally force any kind of platform for First Amendment activity to host certain kinds of speech.
[ link to this | view in chronology ]
Re: Re: Re: This is a sincere (without wax) question
The Internet is a collection of websites, and so long as you can create and control your own website, you are part of that forum. That is, you can set up your own room in the common forum, and if nobody enters, that is not a free speech issue, but rather people deciding they do not want to hear what you have to say. If you have that problem, you are not entitled to barge into somebody else's room and try to take over their audience.
[ link to this | view in chronology ]
Re: Re: Re: Re: This is a sincere (without wax) question
Most people these days use a hosting company for their website. So, should hosting companies be considered common carriers?
47 USC § 201 and 47 USC § 202(a) are the relevant provisions. As the D.C. Circuit explained in Verizon v. FCC (2014), those provisions are at the core of the concept of common carriage.
Should internet website hosting companies be classified as common carriers?
How else would you ensure that “you can create and control your own website” ?
[ link to this | view in chronology ]
Buying your own servers.
[ link to this | view in chronology ]
Re:
Incidentally, I don't agree with Supreme Court nominee Kavanaugh's position that a showing of market power is necessary for a common carriage determination. As a matter of Supreme Court precedent, I think he's wrong. On the other hand, he's the nominee and I'm not.
All the same, here, in this context, I'd agree that the market for internet website hosting providers isn't that concentrated, and there are plenty of alternative providers. And buying your own servers is a feasible option, as opposed to renting.
But what about Cloudflare?
Unfortunately, these days, DDoS protection is probably essential for a modern website of any prominence. Is Cloudflare an essential facility for practical website hosting?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: This is a sincere (without wax) question
Note: free speech means freedom to have your say; it does not mean that you can distribute your speech for free. Other than standing on the street corner, distributing speech has always cost the speaker money, unless a publisher decided that what they had to say was worth publishing.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: This is a sincere (without wax) question
Ajit Pai does NOT agree with you. Title II of the Communications Act of 1934 governs “Common Carriers”. Ajit Pai says the ISPs aren't common carriers.
[ link to this | view in chronology ]
Ajit Pai also said the FCC was hacked. Do you always believe what he says?
[ link to this | view in chronology ]
Of course he did
The U.S. Court of Appeals for the D.C. Circuit disagrees.
At this point I'm pretty sure that if he thought it would be beneficial to them (and he could get away with it) he'd be willing to classify them as planets, so the fact that he says they aren't common carriers isn't really that surprising.
Adding to the humor, the companies themselves have a rather bipolar view on the subject: they want to be considered as falling under Title II when it benefits them, but object when it doesn't.
[ link to this | view in chronology ]
Is it possible to create communities that accept censorship?
[ link to this | view in chronology ]
Re: Is it possible to create communities that accept censorship?
[ link to this | view in chronology ]
Re: Re: Is it possible to create communities that accept censorship?
But deaf people are more interesting.
[ link to this | view in chronology ]
Sockpuppetry [was Re: Re: Is it possible to create...]
Care to explain why you're apparently engaging in “dialogue” with three comments (1, 2, 3) all from a poster using the same identicon?
[ link to this | view in chronology ]
Re: Sockpuppetry [was Re: Re: Is it possible to create...]
[ link to this | view in chronology ]
Re: Re: Sockpuppetry [was Re: Re: Is it possible to create...]
Hitting report on all comments with that identicon.
[ link to this | view in chronology ]
Re: Re: Re: Sockpuppetry [was Re: Re: Is it possible to create...]
If you go back in time a little, there was some powerful mathematical writing by a fellow named Gödel (often rendered Goedel or Godel, depending on who cites him). He explained something notable: formal systems powerful enough to refer to themselves cannot be both complete and consistent, while systems too weak for self-reference are not powerful enough to express anything interesting.
This same truth was expressed by Bach in his “endlessly rising canon”, and by Escher in artistic form. Self-reference is just the most expressive medium in which to express a profundity.
That is, when you express something in literature by using the two-person conversational form, you can say much more with many fewer words. The same goes for Bach and for Escher, both of whom are widely acclaimed for this fundamental expression of truth.
I was using shorthand to get my idea across. Sorry if that got your knickers in a knot.
[ link to this | view in chronology ]
Nah, fam, you were trying to sockpuppet and you hoped no one would notice.
[ link to this | view in chronology ]
TD Comment Theater presents: 'Me, Myself and I.'
Aw, you told them how to spot the same person across multiple comments; I was waiting to see if a few more 'people' would chime in to talk about how that first person was absolutely right (not that it's at all hard to spot them even without the gravatar).
[ link to this | view in chronology ]
Re: TD Comment Theater presents: 'Me, Myself and I.'
Oh, it's easy to spot me across multiple comments even crossing multiple articles— if I'm not the absolutely worst offender in following up on my own comments, then I'm certainly in the top two or three in that regard. You can spot me every time like that.
In my defense, I'll tell you that it often takes me considerable time to put together a short note with well-researched supporting hyperlinks. How much time should it take to put together a mere comment? And then it just often turns out that short note requires correction, clarification or other follow-up. I apologize for the error. Frequently.
Or I may be closely following an evolving story in the news… In that case, it's like putting together the pieces of a not-quite-well-fitting puzzle — a puzzle which fucking mutates and grows tendrils as its picture takes shape.
[ link to this | view in chronology ]
Re: Re: TD Comment Theater presents: 'Me, Myself and I.'
And by the way, just from the tone and the timing, I suspected the sock-puppetry.
[ link to this | view in chronology ]
Re: Re: Re: TD Comment Theater presents: 'Me, Myself and I.'
No one was trying to deceive anyone; it was a literary form, as I clearly explained. Everyone can see that posts are marked with an icon that identifies the writer by IP address. Just reset your router to change the icon. Easy peasy.
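For readers wondering how that icon works: identicons of this kind are typically generated by hashing the poster's IP address (usually with a site-specific salt) and turning the digest into a small symmetric pattern, which is why the same connection always shows the same picture and a new IP shows a new one. A minimal sketch in Python; the salt, hash choice, and 5x5 layout are illustrative assumptions, not Techdirt's actual implementation:

    import hashlib

    def identicon_pattern(ip_address, salt="site-secret"):
        """Derive a deterministic 5x5 identicon pattern from an IP address.

        Purely illustrative: the salt, hash, and layout are assumptions,
        not the scheme any real site uses.
        """
        digest = hashlib.sha256((salt + ip_address).encode()).digest()
        pattern = []
        for row in range(5):
            half = [digest[row * 3 + col] % 2 == 0 for col in range(3)]
            pattern.append(half + half[-2::-1])  # mirror for left/right symmetry
        return pattern

    # Same IP -> same icon; a different IP -> (almost certainly) a different icon.
    print(identicon_pattern("203.0.113.7") == identicon_pattern("203.0.113.7"))  # True
    print(identicon_pattern("203.0.113.7") == identicon_pattern("203.0.113.8"))  # False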
If we could all live by the logic you propose, that ad hominem attacks are shameful to the writer and not their target, this would be a considerably improved forum.
[ link to this | view in chronology ]
Re: Re: Re: Re: TD Comment Theater presents: 'Me, Myself and I.'
Anyone? Or is silence and submission the only acceptable virtue signaling of group membership for this community? I suspect the latter, and those who do not want to be abused by the Techdirt mob know they cannot safely respond.
[ link to this | view in chronology ]
I think the age of anyone caring about comment policing is past
[ link to this | view in chronology ]
Re: I think the age of anyone caring about comment policing is past
Up until the election of Donald Trump, the left had every societal tool at their disposal: the news media, the print media, and electronic media, with almost no exceptions. He steered America towards its own demise and replacement, with a revolution of the minorities who felt disenfranchised at the expense of all the rest of us.
This revolution failed. Even with all the tools of power at his disposal, Trump was elected. He was so radical in his implementation, corrupting the DOJ, the FBI, the IRS, schools, universities, health care, foreign policy, etc., that Trump looked like the best option to the American people. Trump, hairy head and all.
The time to care about policing social media is past. No one cares. From this point forward, the conservative side of American politics will dominate, I predict, for a hundred years or more. Now students are being schooled in real American history again. Universities will follow, as will the DOJ, the FBI, the IRS, all of it. It was saved from destruction by the American people, who will never relinquish power again.
No one cares who polices this stuff. The American people have already won the war, and the left will now simply eat its own children. It's not important.
Witness Donald Trump. Welcome to the Century of Conservative Rule.
[ link to this | view in chronology ]
Re: Re: I think the age of anyone caring about comment policing is past
America has been refining its idea of itself for over 200 years. We have, through our Constitution, incredible flexibility as a society. Not unlimited flexibility, of course, but as long as people consent to new creative approaches to solving problems by voting for their favorite political candidates, we have wandered far and wide over the ideological map.
What has changed now is the speed at which we can conceive, communicate, challenge and discuss. In the last few years, we have heard every argument from the left, and we have heard every argument from the right. The speed at which these arguments can be made, distributed, endorsed or condemned has been accelerated far beyond what Hamilton and the other Federalist Papers authors could have anticipated.
In Hamilton's time, it took days or weeks or months for arguments to be presented and either endorsed or rebuffed. Now it takes seconds. We have heard all the important arguments. They fly by in the wink of an eye now. The opinion pendulum of American voters is no longer delayed by the Pony Express.
And we have decided, as a society, once and for all. This is not the past, this is the future. Communication is faster, resolution is faster, and America chose Donald Trump as an implementer of the American Dream, which he spoke to directly and repeatedly. We all know what we want him to do, and he is doing it. To the majority of the American people, that is how they see the America of today and tomorrow.
Nothing is going to change for the foreseeable future. The same censorship is going to take place, and the same anarchists are going to espouse the same positions they have since Soviet Russia; there really is nothing new under the sun. We've seen and read it all. Several times.
And Americans have decided. There will be no pendulum swinging back any time soon, not until there are creative ideas to drive it: new ideas that have not yet been presented. Until then, welcome to the Century of Conservatives.
Your only hope is to stop censoring and start creating. Censoring doesn’t work for long, and in today’s world, by demonstration, it doesn’t work at all.
[ link to this | view in chronology ]
Re: Re: Re: I think the age of anyone caring about comment policing is past
Perhaps my point was not as clear as it could have been
Yes it was. At least if your point was that you're completely detached from reality.
[ link to this | view in chronology ]
Re: Re: Re: Re: I think the age of anyone caring about comment policing is past
How is it going? Well, they are getting the “participation” trophy, and they are giving it to themselves. In electronic media, it is a very one-sided discussion in their favor. In the “real world”, to address your perception of reality and mine, they are losing every important election. Going forward, they will just lose more.
That is, in reality, the left are a bunch of losers, bigly. The rest of us are getting promotions, buying nice things, paying off our debts, and saving for our futures with stock accounts that are up 30%+. Trump’s popularity is nearly double what it was on Election Day. Real enough for you?
[ link to this | view in chronology ]
They lost in 2016 because the Republican presidential candidate was a White man who used xenophobia and racism as his primary campaign promises (building the wall, dismantling Obamacare because fuck Obama, only letting the “right” immigrants in, the Muslim travel ban) and Republican voters seized on the chance to slow the so-called “browning of America”.
Define “rest of us”, because I know people who live paycheck-to-paycheck hoping that nothing major happens to drain what little money they have in their checking account. I doubt they are seeing the success you claim they are.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
14. BREACH, REVOCATION AND CANCELLATION.
14.1. In the event that you breach any provision of this Agreement, you agree that we may immediately terminate your use of our Services and System.
14.2. In the event such a breach occurs by you, we may post on the Website that you have violated our terms and conditions of service.
So, according to InfoWars, they reserve the right to name and shame you on their web site for violating the Terms of Service. While I'm sure that's perfectly legal, there just seems to be a juvenile element to it, especially considering that they feel the need to explicitly point it out.
[ link to this | view in chronology ]
Re: Any breach of service must be due to common law cause.
Corporatist Masnick is explicitly turning even the First Amendment, the prime guarantee of free speech, into a means for corporations to regulate speech that he doesn't like. It's not going to please you to be suppressed by a corporation rather than the gov't.
Read my reply to Masnick above, which references a recent Supreme Court decision strongly implying that internet forums are the new public forums, to be protected against arbitrary denial of service.
[ link to this | view in chronology ]
Re: Re: Any breach of service must be due to common law cause.
That's corporatism, simply the present way to grab power.
[ link to this | view in chronology ]
Re: Re: Any breach of service must be due to common law cause.
[ link to this | view in chronology ]
Re: Re: Re: Any breach of service must be due to common law cause.
There's an interesting suit allowed to go forward against Twitter (I believe) which is based on that being fraud. Inviting people in, saying you're for free speech even when difficult to bear, that "comment is open to all" -- and here on Techdirt there are ZERO commenting guidelines, nor any warning that comments can be edited (which "hiding" is, adding an editorial comment that they're too dangerous to view without a warning) -- and then discriminating against viewpoints: that's civil FRAUD.
By the way, Masnick trots out the continuing LIE that it's all the mysterious "community" with an opaque "voting system", won't state whether an Administrator is ever making a decision. The observable fact right above is that "Stone" gets to make vile senseless one-liner ad hominem comments, and jeer that he can do more, while my on-topic lengthy comments, into which I put time and thought, are censored.
[ link to this | view in chronology ]
Re: Re: Re: Re: Any breach of service must be due to common law cause.
[ link to this | view in chronology ]
Try being funny to anyone but yourself; see if that changes things.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
The only thing I enjoy about engaging you is finding better ways of insulting you.
[ link to this | view in chronology ]
Re: Re:
And Terms of Service?? It's a sophisticated crowd here, encouraged by "Funniest/Most Insightful Comments of the Week", so, sorry if you can't find explicit terms of service.
The terms amount to
"Don't be an ass".
[ link to this | view in chronology ]
Re: Re: Re:
[ link to this | view in chronology ]
“Don’t be an ass”, in the context of the post you replied to, is not an accusation. Your post is like a set of used Dragon Balls: inert.
[ link to this | view in chronology ]
Re: Re: Re: Re:
It's *always* good to look in the mirror and ask:
"Am I being an ass???" "How could I be less of an ass?"
And BTW, I stole that rule from the very same website with "auto-nanny", which shadow-banned 7 words but not nigger or faggot or queer! lol.
[ link to this | view in chronology ]
I think we can draw quite a lot of conclusions about the social media moderation dilemma from that.
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
Re: Re:
Eh, I'm more of a switch.
[ link to this | view in chronology ]
The more sites that talk about this, the more people will hear of it.
[ link to this | view in chronology ]
shadowbanning etc.
As for definitions:
Hate Speech = SPEECH I HATE!
Fake News = FALSEHOODS THAT DISAGREE WITH MY RIGHTEOUS BELIEFS!
Good, thought-provoking article. One major problem is the tyranny of the masses: Group Think has led to all sorts of weirdness, e.g. cultural appropriation. IMHO the idiots should be free and untrammeled when they make it evident that they are idiots!
No compulsion in the world is stronger than the urge to edit someone else's document.
H. G. Wells
[ link to this | view in chronology ]
In the censored comments, anyone see outside common law?
Of course not. So why censored?
Because Masnick doesn't want actual debate, only to spread the poisonous corporatist propaganda that we "natural" persons MUST accept a new method of censorship.
The real KEY is not a gov't / corporate divide, though; it's that BOTH gov't and corporate methods serve The Rich in their aim for total control.
Again, you are eventually not going to like corporatized censorship, even though you DO like it currently targeting Alex Jones. The system is made to control you, too. It'll always need new targets, so if it suppresses "conservatives" today, then you're next.
Corporations have no ideology except power.
[ link to this | view in chronology ]
Re: In the censored comments, anyone see outside common law?
Go ahead and target Alex Jones - he has gotten more publicity in the last few days than he ever got before. Have at it, maybe he will be elected as the next president if they censor him enough - witness Hitler, as Mason Wheeler aptly pointed out at the top of this section.
Americans are capable of sorting this all out for themselves, as they publicly demonstrated with the election of Donald Trump. The left’s control of YouTube, Facebook, Twitter and Techdirt did them no good at all during the last election, and I guarantee you that their power will continue to diminish.
Nearly all men can withstand adversity, but if you want to test a man’s character, give him power. Witness the tyranny of Obama, Zuckerberg, Masnick and others, and compare them to Trump. Trump has more character than all those leftist tyrants combined, by demonstration. Don’t expect Masnick to change, and don’t try to take away what incredibly feeble power he wields. He is performing a public service by demonstrating the results of a tyrant in a teapot. A very small teapot.
[ link to this | view in chronology ]
Re: Re: In the censored comments, anyone see outside common law?
Defend your public validation and endorsement of convicted traitors and how you are helping to build a better world for everyone, not just the criminal few who set out to damage America. It is difficult to comprehend that you are as shallow, conniving, fearful and hateful of others as you seem; presumably you have a reason for your rampant censorship of the educated while promoting the hateful and disgusting rhetoric of the ignorant and retarded.
Give us a few words, Mr. Masnick, and bring yourself into a perspective that normal people can understand. You are a talented writer; express your character as Lincoln suggested. You certainly have the power to do it; there is no denying that by anyone. This is your site, and you can do whatever you want, legally, as you have laid out many times. You have the ultimate corporate power here. Expose your character; we are all interested.
[ link to this | view in chronology ]
Whoa, there are ladies present here, you freak.
[ link to this | view in chronology ]
Re: Re: Re: In the censored comments, anyone see outside common law?
[ link to this | view in chronology ]
Re: Re: Re: Re: In the censored comments, anyone see outside common law?
Walking away from someone spouting nonsense or hatred has always been part of public discussion. If people are turning their backs on you, it is your problem for driving them away.
[ link to this | view in chronology ]
Hate speech is a vague concept. Censoring it would be a daunting task. There would be false negatives and false positives, which would cause severe problems.
I also think that online platforms have too much power in controlling speech. Google, Facebook and Twitter can prevent you from reaching an audience. Microsoft and Amazon can shut down your website.
If companies must comply with government censorship, it's totalitarian. And if they censor of their own free will, that's terrible too.
Yet with no censorship at all, liars are spreading their message more easily than ever. Finding solutions isn't easy, especially when one half claims censorship is too harsh and the other half claims it's too soft.
[ link to this | view in chronology ]
Re:
Unlike Globalist Socialist Zombies, Americans, at their foundation, make their own decisions. Americans understand tyrannical attempts to control them. The more tyrannical the platform, the more Americans HATE them and elect their nemesis.
100 years of Conservative Rule. That’s my prediction. We have entered the ideological equivalent of WW I where everyone is dug into their trenches, and the line does not move. Short of a nuclear attack (which is in Trump’s full control) nothing is going to change for the foreseeable future.
Censorship in America is counterbalanced by American beliefs and ideology. Anyone who thinks otherwise is no longer in power anyway, so who cares? When I was in grade school, we had lessons in critical thinking. They never really wear off.
Except for the retards, that is.
[ link to this | view in chronology ]
Re:
This is why American law does not recognize “hate speech” as a form of unprotected/illegal speech. No court has yet come up with a working definition of the term that could strike down as narrow a selection of speech as possible while leaving room for, say, social commentary or satire.
Depends on what you mean by control.
If you want to discuss concerns about how Twitter, Facebook, etc. have “too much” cultural influence and what not? Go right ahead. But please avoid implying that those companies owe you an audience for your speech. You are not legally entitled to an audience of any size.
That is a legitimate and pressing concern, as are situations such as Cloudflare’s refusal to serve the Daily Stormer and the ensuing fight over that site’s domain names. (Even the Cloudflare CEO said as much.) Being unable to reach an audience on a specific service is not something I worry about. Being unable to reach any audience at all because someone said I could not? That should worry the shit out of everybody, regardless of whether that someone is a government agent or a corporate CEO.
(And yes, while I would typically refer to something like Amazon kicking your site off its service for violating the terms of service as “moderation”, it can also be censorship. Nuance!)
There is no ultimate “solution”—no magic problem-solver, no algorithm, nothing. Any moderation of a given platform is bound to run into mistakes and make bad calls. Automated tools are only as good as the people who make them, and human beings are irrational creatures. Those tools—and sometimes those people—might ignore the context of an “offensive” statement and smack down someone who, for example, uses “queer” as a self-identifying descriptor instead of an anti-gay slur.
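To make that “queer as a self-identifier” failure mode concrete, here is a deliberately naive, context-blind keyword filter in Python. The word list and sample messages are invented for illustration and are not any real platform's rules:

    def naive_keyword_filter(text, banned=("queer",)):
        """Block any post containing a banned word, with no notion of context."""
        words = {w.strip(".,!?\"'").lower() for w in text.split()}
        return any(word in words for word in banned)  # True means "block"

    # False positive: self-identification gets flagged...
    print(naive_keyword_filter("As a queer woman, I loved this article."))       # True
    # ...while abuse that avoids the exact keyword sails straight through.
    print(naive_keyword_filter("People like you shouldn't be allowed online."))  # False

Real moderation systems are far more sophisticated, but the underlying tension (precision versus recall, with context doing most of the work) is the same.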
We will always make mistakes. The goal, then, is to improve the systems we have so we can do better in the future. Better rules, better tools, better understandings of context—all things we can put into practice so we can make those systems better. We can never make a system “perfect”, but as an old aphorism says: “Perfect is the enemy of good.”
[ link to this | view in chronology ]
Re: Re:
Alex Jones and Infowars, through Facebook and Twitter, have done some serious harm that is worth addressing. The Atlantic has reported on racial violence in India driven largely by Facebook, and on racial tension over a screw-up in a restaurant where the restaurant and the customer didn't speak the same language.
Censoring that idiot Alex Jones only Streisands him and validates him... just as it did Mr. Hitler before he started the Second World War. Censoring screeds might have been a short-term victory, but it is a long-term failure. The same goes for those we down-moderate here in the comments.
Many feel that Twitter and Facebook, being of great influence, free, and open to everyone who is not "ill-behaved", are in fact public accommodations that should not be arbitrarily censoring users. Cloudflare's CEO said it well when he dropped the Daily Stormer, noting that while he had every right to reject them as a customer because he woke up in a bad mood, that didn't make it a good thing. Other commenters have noted that "nondiscriminatory", as originally intended, did not refer to a protected class... it meant everyone. Otherwise it would have given, for example, the telegraph operator or railroad (or city garbage collector) inordinate power.
Moderators disagree over the exact same content, and moderating content is very context sensitive.
The normal speed of the internet is a factor in all of this.
******
From a prophetic standpoint, Congress is jumping up and down saying "Do Something!... Do Something!" in a panic. Something *will* change; the question is how to not end up with bigger problems than we started with.
From the evidence above, which is on reasonably firm ground, I take the following:
The difference between Techdirt and Facebook is one of scale. With user generated content, both decide which content will be most easily seen; that is speech by the website and not by the users, and currently protected by CDA 230.
One possible way to legally distinguish Techdirt and Facebook is the degree to which Facebook users live in a "filter bubble". Essentially everyone sees the same Techdirt. No two people see the same Facebook or Twitter, and you never see the same Facebook twice. The choices on Facebook are out of the end user's control.
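One way to picture that distinction in code: a Techdirt-style page applies the same ordering for every reader, while a Facebook-style feed re-ranks per viewer using signals the viewer does not directly control. The sketch below is a rough illustration; the "affinity" scores stand in for whatever engagement signals a real platform actually uses:

    from dataclasses import dataclass

    @dataclass
    class Post:
        post_id: int
        timestamp: float
        author: str

    def global_feed(posts):
        """Everyone sees the same thing: newest first."""
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)

    def personalized_feed(posts, affinity):
        """Each viewer sees a different ordering, driven by per-viewer signals
        (here, a made-up affinity score for each author)."""
        return sorted(posts, key=lambda p: affinity.get(p.author, 0.0), reverse=True)

    posts = [Post(1, 100.0, "alice"), Post(2, 200.0, "bob"), Post(3, 300.0, "carol")]
    print([p.post_id for p in global_feed(posts)])                        # [3, 2, 1] for everyone
    print([p.post_id for p in personalized_feed(posts, {"alice": 9.0})])  # alice's post first, for this viewer only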
I seriously don't think removing Infowars (or any other post) actually helps with the underlying problems, which I think involve a sense of panic, a degree of credulity, and a lack of reflection. Recall that the US has recently made gay marriage a thing; while the liberals said "it's about time!", the conservatives felt like it was legislated immorality.
Perhaps speech in the opposite direction might make more sense.
I have an acquaintance who has been going on about #qanon for months, and another who thinks Antifa is an actual organized group. I'm not sure how to reach these folks, but prying the garbage from their cold, dead hands is just gonna make them grab it harder.
[ link to this | view in chronology ]
Re: Re: Re:
The tricky thing about the Alex Jones situation lies largely with the Outrage Machine mentality of most social interaction networks: Expressing anger about something is both easier and more eyeball-grabbing than expressing positivity. (The exception is cute animal videos.) Jones, and the current POTUS to a similar-yet-larger extent, feed the Outrage Machine by saying a bunch of stupid bullshit and letting human nature take its course.
Now that Jones is gone from most of the high-profile SINs, his feeding of the Outrage Machine is stalled. Yes, he still says dumb bullshit on both Twitter and his own platform, but his reach and profile will most likely suffer as a result. Not all Streisanding leads to positive results for the person who would nominally benefit from one. Milo Yiannopoulos went from being a high-profile name in the so-called “alt-right” movement to being a nothingburger practically overnight; that happened because he was deplatformed by Twitter for being a rulebreaking ass…and by several outlets to which he was contributing, although that was for his appearing to defend pedophilia. His profile dropped so hard and so fast that he is barely even a presence on either side of the Outrage Machine.
In regards to the filter bubble talk: Yeah, I got nothin’ on that one. A big problem with hyper-partisan news sites and news networks is the presenting of a specific set of facts under a specific lens that paints “the other side” as villains in a battle for the soul of humanity or the country or whatever. Fox News does this by presenting Democrats as monsters trying to get in the way of Republicans who absolutely want to do what is best for all Americans. MSNBC does much the same to Republicans, albeit to a somewhat lesser extent. We have to address both the hyper-partisanship of journalism and the filter bubble issue simultaneously, or else we will end up right back where we started…or worse.
(P.S. ~ That description of the difference between Techdirt and Facebook is outstanding.)
[ link to this | view in chronology ]
Re: Re: Re: Re:
Let users sort the accounts they interact with into tiers and set their own filters for each tier (see the sketch below):
1) Friends, where they can apply little or no filtering;
2) Acquaintances, where they can apply more selective and restrictive filters;
3) Commercial/business relationships where applicable, including companies that the person buys from;
4) Everybody else, where they can be most selective, including blocking identified sources.
The site should not be applying any filters other than blocking outright spam and phishing attacks.
Making the users responsible for their own moderation eliminates the risk of activists abusing the moderation system to try to impose their morals and politics on society.
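A minimal sketch of what that user-side tiering could look like; the tier names mirror the list above, and everything else (class name, methods, the spam flag) is hypothetical:

    from enum import Enum

    class Tier(Enum):
        FRIEND = 1          # little or no filtering
        ACQUAINTANCE = 2    # more selective, restrictive filters
        BUSINESS = 3        # commercial/business relationships
        EVERYONE_ELSE = 4   # most selective; may be blocked outright

    class UserModeration:
        """Each user owns their own filters; the site removes only spam/phishing."""

        def __init__(self):
            self.tiers = {}       # author -> Tier
            self.blocked = set()  # authors this user never wants to see

        def assign(self, author, tier):
            self.tiers[author] = tier

        def block(self, author):
            self.blocked.add(author)

        def filter_level(self, author):
            return self.tiers.get(author, Tier.EVERYONE_ELSE)

        def wants_to_see(self, author, flagged_as_spam=False):
            # The only site-wide rule is spam/phishing removal; the rest is the user's call.
            return not flagged_as_spam and author not in self.blocked

    me = UserModeration()
    me.assign("old_friend", Tier.FRIEND)
    me.block("spammy_troll")
    print(me.wants_to_see("old_friend"))    # True
    print(me.wants_to_see("spammy_troll"))  # False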
[ link to this | view in chronology ]
I agree for the most part; any service should have tools in place to help users mitigate abusive behavior, and users should be better curators of their own experience. I also believe the service operators should always step in to moderate and guide the community by getting rid of toxic elements and applying consequences to bad behavior. If that means banning people on the basis of speech, so be it—Twitter, Facebook, etc. still have that right regardless of their size and cultural influence.
[ link to this | view in chronology ]
Re:
That is, always ensure that people can choose whether to see a moderated site or carry out their own moderation. I always feel uncomfortable about driving those with extremist views underground, as that only feeds their extremism, while leaving them to operate in public view does allow the less extreme to try to modify their views, even if most of society decides to turn its back on them and leave them to their little corner of a site.
[ link to this | view in chronology ]
On one hand, I understand the logic behind this thought—“better to shine a light on evil than to let it fester in the darkness”, or something like that. In an ideal world, I would eagerly agree.
On the other hand, this is not an ideal world. By treating racist, sexist, and anti-LGBT speech and ideas (amongst other things) as merely “something to debate”, platforms give those ideas credibility and help move the Overton Window a little closer to normalizing those ideas. Some ideas have no place in the public sphere; no privately-owned platform has an obligation to give those ideas an audience.
Yes, at a certain size, a platform like Twitter can no longer effectively moderate the service through people alone. The solution should not be to force free speech absolutism upon Twitter at the expense of making the user 100% responsible for their Twitter experience. (As I said before, users have a responsibility to curate their experience—but the service has a responsibility to watch out for and punish abuse carried out by bad actors.) A better solution would be to never let a single platform get that fucking big in the first place. “Protocols over platforms” and decentralization and all that jazz has a point: We do not need to be connected with the entire world just to communicate with friends and find new people.
(Hell, for my money, getting rid of “instant gratification” social interaction networks like Twitter and Facebook by way of abandonment would be the best thing possible. Those services thrive on a reactionary state of mind, whereas all the old, “slow” forms of Internet communication like email and forum replies and blog posts do not require us to have our eyeballs glued to our timelines for the latest outrage...or the latest cat video.)
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
Re: Re: Re: Re:
0) One more way in which Techdirt differs: I can expect a ban, swift condemnation, and other action if I suggest someone do something bad to our favorite troll, OOTB. In fact, if a mob were to take up that banner, I would expect Techdirt to find itself in the position of the ACLU: defending OOTB, whom Techdirt likely detests, against his attackers in various ways.
1) Most users don't even know what a "filter bubble" is...controlling your own experience is a lot of work!
2) If everyone controls their own filter, then there's no way to stop the next cheetoh from feeding the outrage machine.
3) The hyper-partisanship was architected by Roger Ailes. Without a doubt, prior to Fox News, there was some liberal bias in the media because being afraid of that strange-looking, strange-talking person just doing their job is like a routine car accident: it's just not dramatic enough to sell news. Bad things happening, typically to underdogs? That sells news through the outrage machine.
Quoting The Atlantic:
https://www.theatlantic.com/politics/archive/2018/08/the-battle-that-erupted-in-charlottesville-is-far-from-over/567167/
White nationalists win by activating white panic, by frightening a sufficient number of white people into believing that their safety and livelihoods can only be protected by defining American citizenship in racial terms, and by convincing them that American politics is a zero-sum game in which white people only win when people of color lose. While this dynamic has always been present in American politics, it has been decades since the White House has been occupied by a president who so visibly delights in exploiting it, aided by a right-wing media infrastructure that has come to see it as a ratings strategy.
I think we need to deal with that panic. These people are suffering from future shock. Most of them have no head for complex, nuanced reasoning. Hitler got his foothold when Germany was being made poor by England, and likewise these white nationalists are getting their footholds as the billionaires on wall street are making the rest of us poorer.
By the way, there's a fire burning and getting close: climate change.
[ link to this | view in chronology ]
shadowbanning?
Um, it IS evil and unfair, because you're deceiving people. They go about their business posting, uploading, etc., not realizing that fewer people (or none at all) are seeing them. At least they used to; most people at this point know how to tell if they've been shadowbanned in relatively short order. Morality aside, it creates yet another layer of waste in the economy. Be forthright and timely with people about their behavior and they immediately have to adjust or put their energy towards other things altogether, whereas shadowbanning encourages people to keep pounding away at the keyboard for no reason.
[ link to this | view in chronology ]
A useful tool
https://www.npr.org/2018/08/09/634991713/lies-my-teacher-told-me-and-how-american-history-can-be-used-as-a-weapon
[ link to this | view in chronology ]
Mostly on target, "decentralization" an epic fail
The staffers deny shadow banning Twitterers, but admit further in their post:
"We do not shadow ban. You are always able to see the tweets from accounts you follow (although you may have to do more work to find them, like go directly to their profile)."
Speaking in plain language: your Tweets may not be seen by anyone on Twitter who doesn't follow you, and even the people who do follow you may have to do an unusual amount of work to see your Tweets.
That's shadow banning by the Wikipedia definition:
"By making a user's contributions invisible or less prominent to other members of the service, the hope may be that in the absence of reactions to their comments, the problematic or otherwise out-of-favour user will become bored or frustrated and leave the site"
How does Twitter decide to do this?
"Here are some of the signals we use to determine bad-faith actors:
Specific account properties that indicate authenticity (e.g. whether you have a confirmed email address, how recently your account was created, whether you uploaded a profile image, etc)
What actions you take on Twitter (e.g. who you follow, who you retweet, etc)
How other accounts interact with you (e.g. who mutes you, who follows you, who retweets you, who blocks you, etc)"
Let's focus on "who mutes you, who follows you, who retweets you, who blocks you, etc" for a moment.
"Decentralizing policy enforcement" in this case puts at least some of the power to shadow ban Twitter accounts in the hands of "who mutes you, who follows you, who retweets you, who blocks you, etc".
Part of the marked discrepancy between conservatives and liberals on Twitter's shadow ban list may be that liberal political organizations famously began using social media on the Internet for organizing of all sorts.
Given that the discrepancy exists, that few leftists are so banned, and that the criteria for shadow banning have been set forth by Twitter, it's not unreasonable to suspect that the algorithms Twitter uses to decide who gets a shadow ban are being gamed. You wouldn't have to have many followers: have them block or mute everyone on a list you draw up, and you can get the algorithm to shadow ban anyone you don't like.
Such lists exist. I'm on two of them, curated by Marethyu (@scathachultor), helpfully labeled "Scum" and "Propagandists/Liars". Given his apparent social skills, it's perhaps unsurprising Marethyu only has 92 followers.
The question, I guess, is how many blocks or mutes do you have to have to get shadow banned on Twitter? Twitter won't say.
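To see why those signals are gameable, consider a toy visibility score built from the same categories of signal Twitter lists. The weights and threshold below are inventions for illustration, not Twitter's actual algorithm, but they show how a modest coordinated block list could push a perfectly ordinary account over the line:

    def bad_faith_score(mutes, blocks, reports, confirmed_email=True, has_avatar=True):
        """Toy score loosely shaped like the published signal categories.

        All weights are made up; this is not Twitter's algorithm.
        """
        score = 1.0 * mutes + 2.0 * blocks + 3.0 * reports
        if not confirmed_email:
            score += 5.0
        if not has_avatar:
            score += 2.0
        return score

    VISIBILITY_THRESHOLD = 100.0  # hypothetical cutoff for reduced visibility

    # Fifty coordinated accounts blocking one target is enough under these weights,
    # even though the target's account looks completely "authentic".
    print(bad_faith_score(mutes=0, blocks=50, reports=0) >= VISIBILITY_THRESHOLD)  # True
    # An organic account with a handful of blocks stays well under the line.
    print(bad_faith_score(mutes=3, blocks=2, reports=0) >= VISIBILITY_THRESHOLD)   # False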
[ link to this | view in chronology ]
The Shibboleth of Choice
Did I miss where the dangers associated with this will either (1) be mitigated by something, OR (2) are analyzed to be less dangerous than the current state?
(Basically, did you address the emergent segregation that arises from "choice"? https://en.wikipedia.org/wiki/Thomas_Schelling)
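For readers who don't follow the link: Schelling's segregation model shows that agents with only a mild preference for similar neighbors still end up in strongly segregated clusters, which is exactly the worry about letting every user filter their own feed. A compact one-dimensional sketch in Python, with arbitrary parameters:

    import random

    def schelling_1d(n=60, threshold=0.34, steps=2000, seed=1):
        """1-D Schelling model: an 'A' or 'B' agent moves to a random empty cell
        whenever fewer than `threshold` of its occupied neighbors match it."""
        random.seed(seed)
        cells = ["A"] * (n // 3) + ["B"] * (n // 3) + [None] * (n - 2 * (n // 3))
        random.shuffle(cells)

        def unhappy(i):
            me = cells[i]
            neighbors = [cells[j] for j in (i - 1, i + 1)
                         if 0 <= j < n and cells[j] is not None]
            return bool(neighbors) and sum(x == me for x in neighbors) / len(neighbors) < threshold

        for _ in range(steps):
            movers = [i for i, c in enumerate(cells) if c is not None and unhappy(i)]
            empties = [i for i, c in enumerate(cells) if c is None]
            if not movers or not empties:
                break
            src, dst = random.choice(movers), random.choice(empties)
            cells[dst], cells[src] = cells[src], None

        return "".join(c or "." for c in cells)

    # Mild individual preferences, strongly clustered outcome:
    print(schelling_1d())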
[ link to this | view in chronology ]