Republicans Blame CDA 230 For Letting Platforms Censor Too Much; Democrats Blame CDA 230 For Platforms Not Censoring Enough
from the which-is-it? dept
It certainly appears that politicians on both sides of the political aisle have decided that if they can agree on one thing, it's that social media companies are bad, that they're bad because of Section 230, and that this needs to change. The problem, of course, is that beyond that point of agreement, they disagree entirely on the reasons why. On the Republican side, you have people like Rep. Louie Gohmert and Senator Ted Cruz, who are upset that platforms use Section 230's protections to moderate content they find objectionable. Cruz and Gohmert want to amend CDA 230 to say that's not allowed.
Meanwhile, on the Democratic side, we've seen Nancy Pelosi attack CDA 230, incorrectly saying that it's somehow a "gift" to the tech industry because it allows them not to moderate content. Pelosi's big complaint is that the platforms aren't censoring enough, and she blames 230 for that, while the Republicans are saying the platforms are censoring too much -- and incredibly, both are saying this is the fault of CDA 230.
Now another powerful Democrat, Rep. Frank Pallone, the chair of the House Energy and Commerce Committee (which has some level of "oversight" over the internet) has sided with Pelosi in attacking CDA 230 and arguing that companies are using it "as a shield" to not remove things like the doctored video of Pelosi:
.@Facebook’s failure to appropriately address intentional political disinformation harms its users, the public discourse, and our democracy. Sec 230 is meant to enable platforms to take down harmful content. It should not be a shield for inaction. https://t.co/HMJ9ARhKo9
— Rep. Frank Pallone (@FrankPallone) May 30, 2019
But, of course, the contrasting (and contradictory) positions of these grandstanding politicians on both sides of the aisle should -- by itself -- demonstrate why mucking with Section 230 is so dangerous. The whole point and value of Section 230 was in how it crafted the incentive structure. Again, it's important to read both parts of part (c) of Section 230, because the two elements work together to deal with both of the issues described above.
(c) Protection for “Good Samaritan” blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
It's these two elements together that make Section 230 so powerful. The first says that we don't blame the platform for any of the actions/content posted by users. This should be fairly straightforward. It's about the proper application of liability to the party who actually violated the law, rather than to the tools and services they used to violate it. Some people want to change this, but much of that push is coming from lawyers who just want bigger pockets to sue. It leads to what I've referred to as "Steve Dallas lawsuits," after the character in the classic comic strip Bloom County who explains why you should always focus on suing those with the deepest pockets, no matter how tangential they are to the actual violation of the law.
But part (2) of the law is also important. It's the part that actually allows platforms to moderate. Section 230 was an explicit response to the ruling in Stratton Oakmont v. Prodigy, in which a NY state judge ruled that because Prodigy wanted to provide a "family friendly" service, and moderated out content it found objectionable in support of that goal, it automatically became liable for any of the content that was left up. But, of course, that's crazy. The end result of such a rule would be that platforms either wouldn't do anything to moderate content -- meaning everything would be a total free-for-all, you couldn't have a "family friendly" forum at all, and everything would quickly fill up with spam/porn/harassment/abuse/etc. -- or would restrict almost everything, creating a totally anodyne and boring existence.
The genius of Section 230 is that it enabled a balance that allowed for experimentation, including experimentation with different forms of moderation. Everyone focuses on Facebook, YouTube and Twitter -- which all take somewhat different approaches -- but Section 230 is also what allowed for the radically different approaches taken by other sites, like Wikipedia and Reddit (and even us at Techdirt). These sites use very different approaches, some of which work better than others, and much of what works is community-dependent. It's that experimentation that is good.
But the very fact that both sides of the political aisle seem to be attacking CDA 230 but for completely opposite reasons really should highlight why messing with CDA 230 would be such a disaster. If Congress moves the law in the direction that Gohmert/Cruz want, then you'd likely get many fewer platforms, and some would just be overrun by messes, while others would be locked down and barely usable. If Congress moves the law in the direction that Pelosi/Pallone seem to want, then you would end up with effectively the same result: much greater censorship as companies try to avoid liability.
Neither solution is a good one, and neither would truly satisfy the critics in the first place. That's part of the reason why this debate is so silly. Everyone's mad at these platforms for how they moderate, but what they're really mad at is humanity. Sometimes people say mean and awful things. Or they spread disinformation. Or defamation. And those are real concerns. But there need to be better ways of dealing with it than Congress stepping in (against the restriction put on it by the 1st Amendment), and saying that the internet platforms themselves either must police humanity... or need to stop policing humanity altogether. Neither is a solution to the problems of humanity.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: cda 230, censorship, content moderation, frank pallone, louis gohmert, nancy pelosi, section 230, ted cruz
Reader Comments
The First Word
“Any imposition of neutrality would constitute a breach of the First Amendment.
You mean like the way we do with the phone company or the USPS violates the First Amendment? Common carrier is unconstitutional now?
Until the Supreme Court says otherwise, corporations have the right of association — and that includes the right to avoid association with certain people/kinds of speech. Imposing content neutrality on social interaction networks would violate that right.
Congress has the right to pass a law which says that internet sites who have UGC and are federal contractors shall be treated as common carriers.
They're both wrong
Independents blame CDA 230 for causing Republicans and Democrats.
Seriously though
Wouldn't either move be a 1st Amendment violation? After all, "Congress shall make no law" is not just prominent, but the first words of the Amendment. Either proposal would be engaging in 'prohibiting the free exercise'. If platforms do it, it is not the government. If the government tells platforms what to do, it is the government.
Umm... huh? The only place that those words appear in the First Amendment, it's immediately followed by the word "thereof", making it clear that it refers to the thing that was discussed immediately prior, which is religion, not speech.
(You're not wrong about this being a likely violation of the First Amendment; only about how it applies here.)
Re: Seriously though
I don't know what amending 230 would do, but repealing it entirely would simply move the status quo back to what it was before the CDA was passed -- viz, the Stratton Oakmont v. Prodigy decision that if platforms moderate content, they're liable for whatever content they don't remove.
This would be disastrous, for reasons Techdirt has covered repeatedly and at length. There would be legal challenges, there would be lobbying, there would be an awful lot of frivolous suits, and most US sites would either shut down comments entirely or not moderate them at all (including spam filters).
As for constitutional challenges? Maybe. The Prodigy case wasn't appealed to the Supreme Court, so there's always the possibility that a new challenge could make it up to SCOTUS and the precedent could be reversed. But that would take years.
Sounds like a win
The fact that both parties are unhappy with it, for opposite reasons, makes it appear to be the rare good law, IMHO.
TechDirt Et al., you owe me!
1.) Read the title of article.
2.) Face-Palmed so hard I lost a crown!
3.) Mike, It is clearly your fault for publishing "Absolute Idiocy"
4.) CDA 230 should not protect you from My Palm.
And these clashing views from both parties will definitely slow down congress from messing with S230 in my opinion.
Re: Seriously though
The first amendment isn't an absolute right. If it were, then child porn wouldn't be illegal.
False Dichotomy. Too little moderation AND target conservatives
This is Masnick's usual attempt to position Un-Constitutional Section 230 as "opposed by both, therefore must be good".
But in fact, both complaints are true.
Masnick also wants Section 230 to provide corporations with absolute immunity AND government-conferred authority to control all speech.
That's just his wishes for corporations. It's not the law.
"Good Samaritans" must be GOOD. Inarguable. It's right there, black letter law.
Masnick's duplicity on this is shown by that when arguing with me, he simply DELETED the "in good faith" requirement! -- And then blows it off as not important:
https://www.techdirt.com/articles/20190201/00025041506/us-newspapers-now-salivating-over-bringing-google-snippet-tax-stateside.shtml#c530
Now, WHERE did Masnick get that exact text other than by himself manually deleting characters? -- Go ahead. Search teh internets with his precious Google to find that exact phrase. I'll wait. ... It appears nowhere else, which means that Masnick deliberately falsified the very law under discussion. Probably because trying to keep me from pointing out that for Section 230 to be valid defense of hosts, they must act "in good faith" (to The Public), NOT as partisans discriminating against those they believe are foes.
Re: Re: Seriously though
Child porn is illegal due to the harms it causes to a disadvantaged group, i.e. children. This isn't a free speech issue at all. Your comment is a nice strawman but it burned far too quickly to be viable.
Re: False Dichotomy. Too little moderation AND target conservati
Forgot to point out that mere statute CANNOT empower any entity to violate Constitutional Rights. Section 230 is therefore null and void. -- Yes, no matter how often used in cases to get immunity (usually rightly), it STILL cannot empower corporations on the "material is constitutionally protected" point.
That's the actual crux of argument. Masnick tries to buttress the censorship with non-controversial parts. -- Because there's BIG money in being able to control all speech. If corporations are able to shunt opposition into tiny outlets, they automatically win.
Re: False Dichotomy. Too little moderation AND target conservati
You're an idiot. Please stop talking and leave the grownups to their discussion.
On balance?
"Republicans Blame CDA 230 For Letting Platforms Censor Too Much; Democrats Blame CDA 230 For Platforms Not Censoring Enough"
If both parties are annoyed, I would say CDA230 got the balance just about right.
Re: Re: Seriously though
Ah, Trope Three.
Re: False Dichotomy. Too little moderation AND target conservati
No 230, no comments sections, and no censorship either.
Re: Sounds like a win
This time.
As Techdirt reminded us last week, sometimes when a solution makes every side equally unhappy that's just because it's a shitty solution.
Also in 230:
Was that a hint of the FCC's abolished Fairness Doctrine? While all the big platforms certainly started out with minimal censorship, the vise keeps tightening ever so slowly. Will Conservatives and other Wrongthinkers be boiled alive like the proverbial frog, or will an online equivalent of Fox News emerge that splits social media in much the same way as cable news, along political, cultural, and ideological lines?
Re: Re: Sounds like a win
Yep. YouTube messed that up. But I'm much more okay with YouTube making bad decisions while trying to please everyone, than YouTube making the same (or opposite) decisions because the law tells them to.
Re:
Better question: Why should the law force websites to host certain kinds of speech?
Re:
I read it as more of a riff on the "marketplace of ideas".
Remember, Section 230 was a direct reaction to Stratton Oakmont v Prodigy, a decision which held that because Prodigy moderated content, it was legally liable for content it didn't remove.
(I was a Prodigy kid. I can assure you that Prodigy moderated content aggressively.)
230 was explicitly built on the premise that platforms can moderate content as they see fit.
It's these two elements together that make Section 230 so powerful. The first says that we don't blame the platform for any of the actions/content posted by users.
The harm inflicted by a platform in amplifying/spreading defamation is separate from the harm inflicted by the user. Every country in the world EXCEPT the US recognizes this, and even the US did with distributor liability.
Section 230 allows people to weaponize search engines, and if IP addresses don't prove authorship, or people use a "burner" IP that can't be traced (or are judgment-proof or posting from another country), the target of defamation is defenseless.
I'm sure if someone ever used an untraceable IP address to post reviews of pro-230 lawyers and claim that they sexually abused children or female clients, the lawyers would scream bloody murder and their pro-230 position might change. Of course I'm not recommending anyone DO this but instead just demonstrating the potential for harm.
The other problem is that people believe what they read online, then repeat it in their own words without linking to the original post and that makes them a publisher and liable for being dumb enough to believe and repeat what they read. Sometimes the defamation is on a questionable site (like a white-supremacist or anti-Semitic site) so they can't quote it but by not quoting it they become liable.
Someone who wanted to game the system could easily have defamation about themselves planted online, go around arguing with people, wait for the people they argue with to Google them, then let nature take its course and sue those people for libel once they repeat what they've read (they can plant the defamation on a site the pawn wouldn't want to link to for maximum impact).
Employers who believe defamation and deny someone a job because of it should be sued into bankruptcy.
Section 230 is fatally flawed.
Re: Re:
The presumption was that such moderation would be politically neutral, especially by a platform with global influence.
Re: Re: Seriously though
Notice-and-takedown would work just fine. It's what we've had offline for centuries.
Libel laws were designed to replace DUELING.
Re: Re: Re: Seriously though
How "fine" does it work for copyright? Because I've read lots of stories about false or faulty takedowns.
Re: Re: Re: Seriously though
Did you quit posting as John Smith just to get around my filter? I know that's why Blue started changing his name.
On the one hand, there's something weirdly flattering about that. On the other, that's some kinda creepy stalker shit, Johnny.
But then, when has "that's some kinda creepy stalker shit" ever stopped you before?
Re: Re: Re:
Why should a platform be forced by law to host any kind of content?
Re:
Are you naturally this stupid, or did you intentionally give yourself brain damage?
Re: Re: Re:
Have you considered that it is politically neutral and if there is any imbalance between bans of liberals and conservatives it is due to the greater tendency of one of those to violate terms of service or basic human dignity?
Re: Re:
Jhon is naturally that stupid yet exacerbated the problem by headbutting moving cars as a child.
Re: Re: Re: Re:
The terms of service are biased and subjectively enforced.
The public tolerates this by NOT boycotting companies who sponsor it, however, so the marketplace has spoken.
If the public demanded that USENET rules apply or they won't buy anything advertised on the site, that's what we'd have, or USENET itself would still be populated more heavily.
Re: Re:
Those who debate with ad-hominems like you are using are the ones generally thought stupid.
Re: Re: Re: Re: Seriously though
Wow, you're impressed with yourself!
I blame CDA 230 for walking down an alley drunk, and dressed like that, in that part of town.
Re: Re: Re:
Citation needed.
However, should I be similarly questioned for a citation, I offer this transcript of the Congressional Record when the amendment was read and several speakers commented - and none of them spoke of a political neutrality requirement for the immunity conferred upon service providers. The relevant section starts with "amendment offered by mr. cox of California"
So what?
And yet, here you are, expressing an idea about “weaponizing” defamation and Section 230 that has not, and will never, become a reality. Sounds like that “ad hom” has more truth to it than you care to admit.
DEAR INTERNET..
Its TIMe to declare yourself an independent nation..
Start making deals/trade agreements with every nation.
Start Charging for access to YOUR GOODS..(Charge the ISP's for giving access to your customers)(Cable/sat does it, why not you)
Re:
So what? The right of anonymous speech has existed prior to the Constitution of the United States and has been consistently upheld by our courts as a First Amendment right. The one difference between the right to anonymity and other 1A rights is the fact that once you give it up (or it's taken from you) you can never reclaim it. Once again to paraphrase Blackstone: "I'd rather 100 defamation cases go unpunished as opposed to one persons right to anonymity be stripped from them."
Yes, some people are gullible and stupid, but that doesn't mean I have to give up my rights because of their shortcomings.
No, it's not. It's because of Section 230 that we are able to have this discussion in this comment section in the first place because this comment section wouldn't exist without it.
Personally, I made the decision to keep my personal/professional identity and my online identity separate back in the late 90's and have never regretted it.
Re: Re:
Personally, I made the decision to keep my personal/professional identity and my online identity separate back in the late 90's and have never regretted it.
You mean to ATTEMPT to keep them separate.
Re: Re:
The other problem is that people believe what they read online...
Yes, some people are gullible and stupid, but that doesn't mean I have to give up my rights because of their shortcomings.
Exactly, and these people are sitting ducks for being manipulated into being sued by those who would use them as pawns.
Some 4chan idiot wants to poison my coworker against me, the coworker grabs the bait, winds up sued, and then blames ME. Pathetic.
Re: Re: Re:
If these gullible people are being duped into believing something untrue about you by some bad actor, then aren't they victims of the bad actor as well as you?
Blaming you for suing them isn't that unreasonable - after all, you are choosing to sue them.
Re: Re: Re: Re:
They mentioned "good faith" moderation.
Either way, the law doesn't have to require neutrality for Congress to impose the condition on a law that circumvents two centuries of precedent in this country, and runs counter to all other countries.
Re:
Bias in structure and enforcement means it's censorship.
The public doesn't care though so it's really a nonstarter for those who would like more fairness. USENET still works for that diminishing crowd.
Re: Re: Re: Re:
Why should a platform be forced by law to host any kind of content?
If they are a state actor who should be treated as a common carrier, they should be.
I support a law that treats federal contractors as common carriers, meaning if they want to censor, don't do it while playing with federal money.
Politicians apparently see the Constitution and the Bill of Rights as a hindrance to good government.
Re:
Politicians and large sections of the government apparently see the Constitution and the Bill of Rights as a hindrance to good government.
Fixed for unfortunate accuracy.
Be careful what you wish for...
The 'funny' part of this is that if they do get their way neither side is going to be happy with the result.
Reinstate a penalty for moderation and sites are likely to go one of two ways, either moderating nothing, which will anger the idiots who think they're not doing enough, or moderating heavily anything that even might be objectionable, angering the idiots who think that sites are already moderating too much.
In their rush to throw tantrums because social media platforms aren't doing what they want them to they've completely missed that even if they win social media platforms still won't do what they want them to.
Re: Re: Re: Re: Re:
Yes, they did. But that's got nothing to do with political neutrality. The commenters were quite clear that they didn't want providers to do nothing, but they also didn't want them to face liability if they did something. There's also a few bits in there about how there really isn't a definition for what should or shouldn't be removed, and it's probably not a good idea for the government to create a definition.
Now, if you want to argue that a particular platform's moderation is not being done in good faith, you could certainly do that - I've not heard of any CDA230-related cases that make that argument, so it might be an interesting angle to take but also would probably be much harder to convince a judge that it was actually happening.
The law (CDA230) doesn't have to require neutrality for Congress to impose the condition (a neutrality requirement) on a law (CDA 230)....?
Um, what?
So? Just because it was done that way before, or is done that way elsewhere, doesn't mean it's the right way to do it.
If you are so sure this plan of yours would work and is obvious to anyone with half a brain, please show us one instance of it working. At all. On any level.
[ link to this | view in thread ]
Any imposition of neutrality would constitute a breach of the First Amendment. Until the Supreme Court says otherwise, corporations have the right of association — and that includes the right to avoid association with certain people/kinds of speech. Imposing content neutrality on social interaction networks would violate that right.
Re: Re: Re: Seriously though
It would work fine for scammers, dodgy businesses etc. but not for the public who benefit from bad reviews etc warning them of possible problems.
Ah, yes, the cries of “censorship”. How does YouTube booting some Nazi fuckboi prevent them from expressing themselves in any way, again?
Please explain the reasoning behind your belief that any website that hosts user-generated content should be classified as a government-controlled “common carrier” and thus forced to carry any content it otherwise would not host.
FurAffinity, a site wholly dedicated to UGC, chooses not to host certain kinds of artwork. For what reason should the government have a right to make FurAffinity do otherwise?
Re:
If "any website" is a state actor then it's hard to disagree with the prior poster. The government may not censor speech. FurAffinity is not a state actor and is exempt from that restriction.
Re: Re:
There is no law, not even 230, that requires fairness in protected speech. Moderation has been recognized as a form of speech. So long as that moderation isn't performed by a state actor there is no violation of any law, rule, restriction or anything else. The government can't even intervene here and kill 230, a rule protecting free speech, without running afoul of the constitution.
The world is unfair. You'll get used to it eventually. This is particularly funny since it was right wingers who first called left-wingers "snowflakes" but look how the right wing melts when their "unique and beautiful" views aren't well accepted.
Re: Re:
Which platforms involved in this issue are state actors?
I’m inclined to agree with the Democrats here, to an extent. While sites absolutely don’t have to delete content, and should not be compelled to, that doesn’t mean that sites shouldn’t want to. Certainly, were I running a site that allowed user content to be posted, I would be greatly concerned about not providing any assistance to harmful speech: not providing a platform for it or for the people who engage in it, not offering them connections to my users, and not doing business with businesses that did tolerate it. Let malicious users go elsewhere to exercise their right of free speech.
Now explain how Twitter, Facebook, and YouTube do not have the same exemption.
Ebony and Ivory, e.g., Louis Gohmert and Nancy Pelosi
When the keys themselves are out of tune, color matters not - no harmony is forthcoming.
Re:
While sites absolutely don’t have to delete content, and should not be compelled to, that doesn’t mean that sites shouldn’t want to.
If that was as far as it went most people on TD would likely agree with you(with the discussion then shifting to what should be removed, how it would be done, how to minimize collateral-damage...), it's when 'should' shifts to 'should be required to' that the problems and objections crop up.
Re: Re: Seriously though
Do you go out of your way just to post things that are clearly ridiculous?
Money on the table
If Techdirt were to sell replacement sarcasm detectors and padded headbands to reduce facepalm-related head trauma they would quickly find themselves absolutely swimming in cash, given how often both of them are needed from those reading the site.
Re: Re: Re:
None of them
Re:
these people are sitting ducks for being manipulated into being sued by those who would use them as pawns
If you are so sure this plan of yours would work and is obvious to anyone with half a brain, please show us one instance of it working. At all. On any level.
You mean name names and put targets on people's back. Not necessary.
I did cite a case where "reiterating" content was a key element in proving one was a publisher rather than a distributor, and that should be sufficient.
We know that if someone posts, without attribution, a defamatory statement that they are a publisher and not a distributor.
We also know that there are people who will repeat what they find in Google without bothering to link to the original source, which makes them publisher.
One need not jump off a building to know that doing so is likely to cause death. The demand for specifics is therefore more indicative of a desire to target the people named.
Re: Re:
Do you get upset when a restaurant tells you to leave because you have no shirt and no shoes?
Re: Re: Re: Re:
If these gullible people are being duped into believing something untrue about you by some bad actor, then aren't they victims of the bad actor as well as you?
Yes they are. People are predisposed to believe the worst about those with whom they disagree. Experienced internet users know how to manipulate this predisposition to turn these people into unwitting pawns.
Blaming you for suing them isn't that unreasonable - after all, you are choosing to sue them.
Yes, in that situation I would be choosing to defend my rights, and the lawsuit would be caused by the pawn's willingness to believe something defamatory about someone they don't like written by someone they never met. In fact, trying to warn them of this will often just empower them to make even more defamatory statements.
Smart people won't fall into this trap, but not everyone is smart. The trap is not set by the plaintiff, who was simply targeted by those who didn't like him or her, but was set by instigators who cannot be located but who write very serious-sounding posts designed to induce third parties to grind their axe.
Perhaps if enough people fall into this trap, or the wrong person does, it will be prevented, but we're not there yet. While it's not Section 230's "fault," the law definitely makes it possible, and without 230, it wouldn't happen because ISPs wouldn't let themselves be tricked, though I'd imagine if some admin didn't like a poster they might go out on a limb and get sued.
[ link to this | view in thread ]
Re:
And yet, here you are, expressing an idea about “weaponizing” defamation and Section 230 that has not, and will never, become a reality. Sounds like that “ad hom” has more truth to it than you care to admit.
There are entire forums and websites devoted to weaponizing 230 by posting content for the explicit purpose of defaming people and having that defamation turn up when one's name is searched.
The "don't date that guy" type of site is one (no, I've never been named on one).
[ link to this | view in thread ]
Re:
So you don't think Ripoff Report exploits Section 230 or relies on sites like Google to spread their words?
[ link to this | view in thread ]
Bullshit. I haven’t seen you post one link to a court case where someone used (or attempted to use) your plan as you described it.
[ link to this | view in thread ]
Re:
Any imposition of neutrality would constitute a breach of the First Amendment.
You mean the way we treat the phone company or the USPS violates the First Amendment? Common carrier is unconstitutional now?
Until the Supreme Court says otherwise, corporations have the right of association — and that includes the right to avoid association with certain people/kinds of speech. Imposing content neutrality on social interaction networks would violate that right.
Congress has the right to pass a law which says that internet sites who have UGC and are federal contractors shall be treated as common carriers.
[ link to this | view in thread ]
Re: Re: Re: Re: Re: Re:
The law (CDA230) doesn't have to require neutrality for Congress to impose the condition (a neutrality requirement) on a law (CDA 230)....?
Um, what?
Congress can decide to change 230 to require neutrality.
[ link to this | view in thread ]
The people being (allegedly) defamed can sue over the content. They can have it declared defamatory and ask for its removal via court order. The existence of CDA 230 doesn’t prevent either action.
[ link to this | view in thread ]
No more than you exploit CDA 230 to post your bullshit, but go off, I guess.
[ link to this | view in thread ]
Then Congress can get its collective ass smacked down by the Supreme Court. Requiring a privately-owned platform to host content it would otherwise not host is a gross violation of the First Amendment. For what reason should YouTube be required by law to host, say, White supremacist propaganda?
[ link to this | view in thread ]
Re: Re: Re: Re: Seriously though
And determining whether something is libelous is significantly more difficult than determining whether something is copyright infringement.
[ link to this | view in thread ]
Re: Re: Re: Re: Seriously though
For someone who doesn't know either the writer of the allegedly defamatory post or the subject of it, determining whether something is defamatory is not going to be easy.
[ link to this | view in thread ]
Re: Re: Re: Re: Seriously though
All we need is for Jhon to say "copyright terms prevent publishers from murdering authors" and I'll have finished my bingo sheet!
[ link to this | view in thread ]
You say that like it’s a bad thing.
[ link to this | view in thread ]
Re: Re: Re: Re:
Right, so then:
But none of them are state actors, so why even bring it up?
[ link to this | view in thread ]
Re: Re:
You mean name names and put targets on people's back. Not necessary.
Yes, it's truly not. In such cases you can simply blank out the people's names and offer a court document instead.
Nobody needs to know names. What needs to be known is that your interpretation of the law exists outside of your fevered, fanciful imagination.
[ link to this | view in thread ]
Re: Re:
You seem to have a huge problem with websites that offer advice for people on how not to get scammed or raped. Now why is that?
[ link to this | view in thread ]
Point to a single court case that proves someone did it. Otherwise, quit talking out of your ass.
[ link to this | view in thread ]
Re: Re:
Congress also has the right to declare my cat the new state bird, but that doesn't make the idea any less stupid, either.
[ link to this | view in thread ]
Re: Re: Re:
"The presumption"
Who presumed that? Can you link to the legally relevant discussion of that being the intent? Or are you just angry because popular platforms have decided they no longer wish to have Nazis on their property?
[ link to this | view in thread ]
Re: Re: Re: Re: Re:
"If they are a state actor who should be treated as a common carrier, they should be."
Then why are you whining about private platforms that are no such thing?
[ link to this | view in thread ]
Re: Re:
"Bias in structure and enforcement means it's censorship."
Private platforms can censor you as much as they want without violating your rights. Go use a competitor - of which there are many - rather than whine that your own actions got you banned from the most popular places normal people congregate.
[ link to this | view in thread ]
Re: Re:
"There are entire forums and websites devoted to weaponizing 230"
There are entire forums devoted to the idea of zombie invasion, that doesn't mean that it has actually happened.
Why are you always unable to give actual proof of your claims? If it doesn't happen with any kind of regularity, it doesn't justify stripping the rights of millions and the employment of thousands as you are demanding.
[ link to this | view in thread ]
Re: Re:
I think they exploit the idea that con artists and other criminals can't shut them down because they don't like being exposed.
Why do you have a problem with that, I wonder?
[ link to this | view in thread ]
Re: Re:
"I did cite a case"
Did you? Would you mind linking again, since you refuse to offer people a way to search your previous posts?
"We know that if someone posts, without attribution, a defamatory statement that they are a publisher and not a distributor."
Yes - but the person who posts that would be liable, not the platform they used to post it, and certainly not someone showing them where that platform is.
"We also know that there are people who will repeat what they find in Google without bothering to link to the original source, which makes them publisher."
No, the fact that stupid people exist does not change the nature of a business.
[ link to this | view in thread ]
Re: Re: False Dichotomy. Too little moderation AND target conser
John does not even know or believe what he talks about lol
[ link to this | view in thread ]
Re: Re: Re: Re: Re: Seriously though
"All we need is for Jhon to say "copyright terms prevent publishers from murdering authors" and I'll have finished my bingo sheet!"
You must have been absent when he made that claim.
Although I'm pretty sure that was the claim made by his nickname "Bobmail" at torrentfreak, quite some time back.
[ link to this | view in thread ]
Re: Re:
So you don't think Ripoff Report exploits Section 230 or relies on sites like Google to spread their words?
As one who was defamed on ROR, no. It has the option to moderate or not. If they get sued, they will only remove the words deemed defamatory in a court of law; the rest of the negativity (the parts that are purely opinion) remains up. Section 230 isn't responsible for this, they are. Search engines don't spread anything; they simply index content. I blame no one but the troll who posted that content for what was posted there.
In ROR's defense, they allowed me to post a rebuttal, so every time someone reads the troll post, they can read the rebuttal too.
[ link to this | view in thread ]
Re: Re: Seriously though
"The first amendment isn't an absolute right. If it were, then child porn wouldn't be illegal."
The first amendment is certainly absolute until someone sees fit to rewrite it.
There are plenty of exceptions to free speech, all of which have in common that it has taken one or more supreme court decisions to formulate them. Until such rulings exist, however, the first amendment is indeed an unassailable absolute right.
[ link to this | view in thread ]
Re: False Dichotomy. Too little moderation AND target conservati
"This is Masnick's usual attempt to position Un-Constitutional Section 230 as "opposed by both, therefore must be good"."
By "un-constitutional" you mean as in "protects the constitutional rights of both commenters and platform owners"?
Section 230 is nothing other than the online equivalent of the right a home owner has to, in the real world, decide for themselves on whether a visitor gets to shout their opinions while standing in said home owner's living room.
As usual, Baghdad Bob, you conflate the United States Constitution with the ruleset adopted by Borat the Dictator.
[ link to this | view in thread ]
Re: Re: False Dichotomy. Too little moderation AND target conser
As with many of his type - he confuses freedom of speech with freedom from consequences of that speech. He refuses to exercise the freedoms he has and demands that someone else make everyone else conform to what he wants. He's a child.
[ link to this | view in thread ]
Re:
What Stephen said. When the troll came after me, I was able to prove it was a troll post. I not only kept my job, I was promoted.
[ link to this | view in thread ]
Re: Re:
^This. Any moderation on a site with a large readership won't work well at scale; the admins would have to automate moderation via keywords, because the cost of hiring people to manually go through each reported post (assuming that's when the moderation kicks in) to see what is or isn't acceptable would be prohibitive.
[ link to this | view in thread ]
Re: Re: Re:
"assuming that's when the moderation kicks in'
Well, that's the big problem here - it's not. They're not trying to hold platforms responsible for not dealing with reports properly. They're trying to hold them responsible for anything that ends up on the site. Which means that pretty much any site of any size would need to use some kind of automated filter - even if you can personally deal with the normal level of traffic you get, can you really deal with any potential spikes, or deal with it when you're asleep?
Which means the end of most sources of user interaction. We'll be left with a few sites with deep enough pockets to deal with lawsuits (read: the already entrenched giants) and everybody else reduced to a broadcast model.
[ link to this | view in thread ]
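For what it's worth, the keyword filtering the comments above describe is easy to sketch, and the sketch itself shows why it moderates poorly at scale. Everything here (the blocklist entries, the sample posts) is made up for illustration, not taken from any real platform:

```python
# A minimal sketch of keyword-based auto-moderation, the only approach
# that stays affordable once a site is too big for human review.
# Blocklist entries and posts are hypothetical examples.

BLOCKLIST = {"spamword", "slur1", "scamlink.example"}

def flag_post(text: str) -> bool:
    """Return True if any word in the post matches the blocklist."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return bool(words & BLOCKLIST)

posts = [
    "totally normal comment",
    "buy now at scamlink.example!",
    "an article *about* slurs gets flagged if it quotes slur1",
]
print([flag_post(p) for p in posts])
```

Note the third post: a filter this crude can't tell discussion of bad content from bad content, which is exactly the over-blocking problem the thread is arguing about.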
Re: Re: Re: Seriously though
[citation needed]
There's no notice-and-takedown for libel. Not without a court being involved.
[ link to this | view in thread ]
Re: Re: Re: Re: Seriously though
You're talking about the guy whose response to Masnick upon Shiva Ayyadurai failing to destroy this site was this:
"Your ugly POS wife is a better laugh. Your shit stain children even better. You backed down like the little pussy you are. The one who can't get top-shelf women."
Stalker is Jhon Herrick Smith's middle-middle name!
[ link to this | view in thread ]
Re: Re: Re: Re: Re:
No rights for anyone because this clown's coworker might be stupid.
[ link to this | view in thread ]
Re: Re:
"Common carrier" status applies to point-to-point pipes, or their switched equivalents, not to something that's effectively a multicast-to-broadcast medium by default.
[ link to this | view in thread ]
Re: Re: Re:
I remember Jhon citing a case.
I also remember that it didn't actually give any support to his arguments outside of his imagination.
[ link to this | view in thread ]
Re: Re: Re:
I suspect that the worst posters are comparatively few in number; a social graph is probably the way to go. Don’t just delete posts, delete posters.
Given the financial resources of the major sites, I’d suggest coordinating with anti-hate groups (SPLC, ADL, etc.) to basically dox the people in question so that they can be excluded en masse, and infiltrate their private boards so that you can avoid having to always be reactive.
[ link to this | view in thread ]
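The "delete posters, not posts" idea above amounts to clustering accounts by shared signals so a ban hits a person's alts too. A hypothetical sketch, using union-find over made-up account signals (no real platform works exactly this way, and the privacy objections raised further down the thread apply in full):

```python
# Hypothetical sketch: group accounts that share an identifying signal
# (IP, email, etc.) so banning one account can cover its likely alts.
from collections import defaultdict

def cluster_accounts(signals):
    """signals: {account: set of signal strings}. Returns list of sets
    of accounts connected by at least one shared signal (union-find)."""
    parent = {a: a for a in signals}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    by_signal = defaultdict(list)
    for acct, sigs in signals.items():
        for s in sigs:
            by_signal[s].append(acct)
    for accts in by_signal.values():        # union accounts sharing a signal
        root = find(accts[0])
        for a in accts[1:]:
            parent[find(a)] = root

    groups = defaultdict(set)
    for a in signals:
        groups[find(a)].add(a)
    return list(groups.values())

accounts = {
    "troll1": {"ip:1.2.3.4"},
    "troll1_alt": {"ip:1.2.3.4", "mail:x@example.com"},
    "bystander": {"ip:9.9.9.9"},
}
print(cluster_accounts(accounts))
```

The obvious failure mode is also visible here: shared signals like a coffee-shop IP would merge strangers into one "poster," which is the mistaken-association risk raised in the replies below.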
Re: Re: Re: Re:
Oh, I certainly expect that to be the case. But I might as well give him an opportunity to present an alternative to support his cause. After all, there are apparently so many that entire industries need to be destroyed in order to protect the people affected, so there must be plenty of good examples to choose from!
[ link to this | view in thread ]
Re: Re: Re: Re:
That's, uh, a little Orwellian.
[ link to this | view in thread ]
Re: Re: Re: Re: Re:
I don’t mind anonymous or pseudonymous speech, but if someone is abusive, and a platform claims to be serious about not allowing such things, trying nothing and being all out of ideas is not a great plan.
If you’re serious about not providing support to neo-nazis or whomever, you’d better know who they are.
That said, it shouldn’t be mandatory. But effectively shunning the dregs of society is not so far out there that good citizens should be unwilling to do it of their own free will.
[ link to this | view in thread ]
In fairness, I think Herrick is referring to sites like Kiwifarms, which took the concept of “atrocity tourism” sites such as Encyclopedia Dramatica and ran with it to its natural conclusion.
[ link to this | view in thread ]
I have no issue with racist assholes being outed and given the boot from a platform. But to effectively run a campaign of doxxing, possible harassment, and “social silencing” with the help of multiple outside groups would be a bit much, don’t you think?
[ link to this | view in thread ]
Re: Seriously though
Not necessarily. CDA 230 specifically protects companies from the consequences of speech that's not protected by the First Amendment, like threats and libel. For example, true threats are not protected by the First Amendment; if someone sends you a death threat on Facebook, CDA 230 makes it so you can sue the person who sent it, but you can't sue Facebook.
That said, removing CDA 230 would seriously jeopardize the ability of social media platforms to exist, and I wouldn't be surprised if the Supreme Court stepped in and ruled that they still don't count as publishers.
[ link to this | view in thread ]
Re:
What I suggest is that the platforms take advantage of section 230 to voluntarily and effectively identify and boot such users from the platforms.
They should not engage in harassment or publicly doxxing the users. But I am not averse to them comparing notes or seeking assistance from above-board groups who are apt to be better at connecting the dots and staying on top of trends, and who are, within certain boundaries that would need to be understood, unlikely to themselves be penetrated or corrupted.
I admit, it is kind of like Red Channels except for assholes, and this gives me pause, but I think people can agree that this is a serious problem that does not seem to have good solutions. The Hollywood Ten were not running people down with cars, shooting people, spreading communicable diseases because they refused to get vaccinated, etc.
It’s not a panacea, and it shouldn’t be the only thing that is done, but I think platforms have a social, though not a legal responsibility to keep their platforms from being used maliciously and that they should do something effective to accomplish this.
Given how easily any existing measures have been circumvented, it’s time to take it up a notch. But if you have a suggestion that goes beyond what’s being done now, please make it.
[ link to this | view in thread ]
Re: Re:
No, what you said was:
That isn’t “tak[ing] advantage of section 230”, that is outright authoritarian bullshit — and it is bullshit for which you openly and unapologetically advocate. I mean, have you thought through the consequences of Twitter, Google, and Facebook pooling together resources to effectively spy on the entire goddamned Internet so they can keep assholes off Twitter, Google, and Facebook?
[ link to this | view in thread ]
Re: Re: Re:
I would imagine that it's about as much spying as they do now in order to advertise to people. I am skeptical that people are good at maintaining totally separate identities online, and if the ad companies are as good as they're made out to be, it only takes a little information to irreversibly connect a person's commercial identity (for ordering things online) to their "anonymous" or "pseudonymous" posting identity as a troll, nazi, etc. So the information is likely already known to Google and almost certainly to a collaboration of Google, Facebook, and Amazon.
Other than that people are already creeped out about it, is there a major consequence that isn’t already happening? If you’re worried about intelligence agencies doing the same thing or piggybacking, that ship has probably already sailed.
Deplatforming of this nature should be done with a light touch, but at the end of the day it is relatively harmless. No one is kicking anyone off the net, no one is preventing assholes from making their own version of Google and Facebook with hookers and blackjack (like Conservapedia) and there’s probably few enough of them that a modicum of civility and reason could be restored by kicking out a small number of hard-core troublemakers.
While I get that it is a distressing idea that we may have come to this point (and we certainly do not want to go further and let the government get involved in deplatforming people) there is a sickness and it’s not clearing up on its own. Some sort of affirmative treatment is called for before things get worse.
What’s your suggestion for the malicious malaise afflicting society these days? Make popcorn? I’m still happy to hear about milder yet effective alternatives. And you didn’t actually say what harms you anticipate from my suggestion, either.
[ link to this | view in thread ]
You are calling for the major tech companies to spy on the entire Internet, with the help of third-party companies, so they can effectively punish assholes if they post bullshit on a platform owned by a major tech company. Imagine if you could be banned from Twitter because of something you said here, or vice versa.
If you see no issues with that proposition, I cannot help you.
[ link to this | view in thread ]
Re:
I think the problem is a typical one when dealing with normal, decent people - they call for tools but do not consider the way the tools can be abused. It makes sense if people who think in a similar way are given those tools. Unfortunately, people in the real world will not always think that way.
It is sadly better for society to put up with trolls, abuse, hatred, etc. than to face the alternative where good people are attacked with the tools we would use to stop that.
[ link to this | view in thread ]
Re: Re: Re: Re:
Which would reduce our ability to interact with each other just because some people can't behave themselves. Individuals are personally responsible for their own behaviour. It's ridiculous to hold a platform responsible for user behaviour.
[ link to this | view in thread ]
Re: Re:
Yes, but that's what mute and block buttons are for. I use them all the time when people annoy me. We don't have to put up with trolls, etc., at all.
Honestly, it seems to me that refusing to engage with them is the better way. Too many people see a need to interact with them and have the last word. It's a stupid way to behave. Ignore, mute or block, and move on.
[ link to this | view in thread ]
Re: Re: Re: Re: Re: Seriously though
Icing on the cake is his other post, saying that Masnick is "lying" when he doesn't know who Hamilton is.
Always knew that MyNameHere and Hamilton were lovers, but it takes a special sort of screwed in the head to go yandere when someone else's crush ignores them...
[ link to this | view in thread ]
Re: Re: Re: Re:
Are you confident that the ban line will be one that you agree with, you will never step over it? Are you confident that the system will never make a mistake and associate you with somebody else's speech?
[ link to this | view in thread ]
Do you think these clashing views will hinder efforts to get rid of S230?
[ link to this | view in thread ]
Re: Re: Re: Re: Re: Re: Seriously though
Actually I was around. It's why I'm waiting for it to inevitably show up as the cherry on top of his triple-decker shit sandwich he calls a cake.
[ link to this | view in thread ]
Re: Re: Re: Re:
I don't mind them booting nazis. Problem is who is determining who is and is not a nazi? Do grammar nazis count as well? I am fine with that also. Feminazis? Where shall we stop and who makes that call? You? Me? Let the box of macaroonies vote on it?
[ link to this | view in thread ]
Re: Re:
You mean name names and put targets on people's back. Not necessary.
The target is already there or they wouldn't have been manipulated into being sued.
[ link to this | view in thread ]
Re: Re: Re: Re: Re:
"Which would reduce our ability to interact with each other just because some people can't behave themselves."
The solution, it seems, is to quit whining about content you don't like and move on to content you do. People whining about a few bad actors are going to fuck it up for everybody.
It is no different than the stories Mike forces people to read on this site that they had no desire to read and are wondering what the Techdirt angle is. Get over it. Or build a bridge and get under it would seem more appropriate for the whiners.
[ link to this | view in thread ]
Re: Re: Re: Re: Re:
How about we stop with the idiotic nicknames and just say that the community can set its own standards for who is acceptable in their particular community? Then those people negatively affected can go to communities where they are accepted, rather than whining that someone else isn't forced to play with them.
There's plenty of places out there that will accept you, but you don't have the right to use someone else's property just because they're more popular than you are.
[ link to this | view in thread ]