Rep. Louie Gohmert Wants To Strip Section 230 Immunity From Social Media Platforms That Aren't 'Neutral'
from the making-everything-'fair'-by-making-it-suck-for-everyone dept
Rep. Louie Gohmert is one of the most technologically inept Congressmen we have the misfortune of being "served" by. Getting to the top of this list isn't easy. The halls of Congress are filled with people who truly don't understand the tech they're attempting to regulate. Nor do they appear to be making any effort to educate themselves. Gohmert, however, seems to believe his position as an elected official gives him tech smarts he actually doesn't have, so he spends a great deal of time embarrassing himself when grilling tech reps during Congressional hearings.
Gohmert was one of the participants in the Social Media Bloodsport Hearings of 2018. Held over the course of several months, the hearings were 75% grandstanding and 20% misunderstanding the issues at hand. Social media services have been hit hard recently for appearing to bury/deplatform right-wing accounts while simultaneously allowing themselves to be overrun with foreign state-operated bots. It's ugly, but the ignorance displayed by Gohmert and others during the hearings was just as galling.
It was at these hearings that a new myth about internet platform immunity came into being. Somehow, these lawmakers looked at Section 230 of the CDA and decided it required platforms to be "neutral" to avail themselves of this protection. A Senate hearing in April featured Sen. Ted Cruz demanding to know if Facebook considered itself a "neutral public forum." Mark Zuckerberg said he'd look into it, claiming he wasn't familiar with the "specifics" of the "law [Cruz] was speaking to."
Bad answer. And the bad answer made Cruz look like he'd just played a successful round of "Stump the Tech Magnate." But he had done nothing more than state something not backed by actual law. That should have been the end of it, but people who really wanted to believe Section 230 immunity requires "neutral" moderation used Cruz's ignorance as the starting point for stupid lawsuits almost certainly destined for quick dismissals.
It's one thing for the public to make bad assumptions about federal laws. It's quite another when federal lawmakers do it. Rep. Gohmert, playing to the home crowd [read the replies], has declared he's going to strip immunity from service providers who "use algorithms to hide, promote, or filter user content."
Introduced a bill today that would remove liability protections for social media companies that use algorithms to hide, promote, or filter user content. Read more about it, here: https://t.co/qTDnQyuABr
— Louie Gohmert (@replouiegohmert) December 21, 2018
That would be all service providers. Gohmert wants to strip immunity from all platforms solely because he believes in Ted Cruz's ignorant fiction. The bill hasn't been written yet, but the statement issued by Gohmert explains the basis for this incredibly idiotic legislative proposal:
Social media companies like Facebook, Twitter, and Google are now among the largest and most powerful companies in the world. More and more people are turning to a social media platform for news than ever before, arguably making these companies more powerful than traditional media outlets. Yet, social media companies enjoy special legal protections under Section 230 of the Communications Act of 1934, protections not shared by other media. Instead of acting like the neutral platforms they claim to be in order to obtain their immunity, these companies have turned Section 230 into a license to potentially defraud and defame with impunity.
Section 230 does not require neutrality. It never has. It does not forbid content moderation. It actually encourages good faith efforts to keep platforms free of content they don't want. Twitter and Facebook could remove every right-leaning account on their platforms without losing Section 230 immunity -- which solely shields them from being held liable for content posted by third parties. It does not insulate them from charges of fraud or defamation if, in fact, either of these were committed by the companies, rather than their users.
For Gohmert's proposal to work, he would either need to add the missing "neutrality" component or do away with Section 230 immunity altogether. Both of these are terrible ideas. Neutrality would be impossible to define, much less enforce. And the removal of immunity would mean the end of social media platforms as we know them, as companies would not be willing to be sued for content created by platform users.
Gohmert's disingenuous idiocy doesn't end there.
In one hearing, one of the internet social media executives indicated a desire to be treated like Fox News. Fox News does not have their immunity and this bill will fulfill that unwitting request. Since there still appears to be no sincere effort to stop this disconcerting behavior, it is time for social media companies to be liable for any biased and unethical impropriety of their employees as any other media company. If these companies want to continue to act like a biased medium and publish their own agendas to the detriment of others, they need to be held accountable.
The difference between Fox News and Twitter is Fox News creates the content it publishes. Twitter does not. That's why Twitter has immunity and Fox News doesn't. Maybe some tech exec said something stupid during a stupid hearing filled with chest-beating and misconceptions, but that doesn't make Gohmert's proposal any less moronic.
Make no mistake: the same people agitating for "neutral public forums" are the people who will be deplatformed first if Section 230 immunity is removed. It's already happening while the immunity remains in place. Anyone trafficking in controversy will be shown the door before they can do any damage to the platforms that used to host them. If you want more blanket moderation and faster banning, by all means, gripe about immunity and neutrality. If you actually value the free flow of speech, keep dimwits like Gohmert out of office.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: cda 230, content moderation, gohmert, intermediary liability, louis gohmert, neutral platform, neutrality, section 230
Reader Comments
So ContentID would open YouTube up to liability?
It is so wonderful to see them pandering to the base who believe the lies. I wonder if we should have a law that says every time a Congresscritter lies they should be ejected from office. I have to think the sheer number of lies they tell dwarfs the actions of media platforms, who have every right to run their business any fscking way they want.
We are the home of the brave, land of the free... unless we think there is a conspiracy, then we can throw out all of the rights so we can force you to behave how we want!! They are just as bad or worse than the SJWs who demand changes & once the camel's nose is in the tent the entire herd comes in. (See also: Ban R. Kelly k thks, now here is a list of other artists we feel shouldn't be on the platform!!!)
Re:
[starts cooking popcorn]
Re:
Just like a newspaper couldn't be sued for accurately reporting what a third party said (we had that article last week), we don't hold Facebook responsible for accurately displaying the speech of its users.
If a statement does cause a harm that creates legal liability, the harmed party can still go after the party that caused the harm. But [insert big business here] did not cause the harm.
Either representative Gohmert doesn't know how to find the law he wants to change, or he's trying to imply that he's fixing an old law that doesn't apply to the modern internet.
Either way, that's really sloppy.
Re:
You say that like it's a bad thing. I, for one, would love to see them get smacked around for inflicting it on us!
Re: Section 230
Look, many of us may feel that by their very size, the big internet platforms (facebook, etc) should be required to act more like common carriers than a small website like Techdirt.
We may even think that little guys are more important than big companies.
But that is not section 230. Section 230 came about because on Compuserve, some evil individual posted some bad stock advice, and people believed that person. When the financially injured individuals sued, they also sued Compuserve, and Congress agreed *that* wasn't right, because without some legal protection, Compuserve would not allow much of any content.
Section 230 gives the content host the legal right, but not an obligation, to moderate whatever you or I may post, in any way the host sees fit, without incurring legal liability as if the host had written the material. For example, Daily Stormer and Techdirt are free to kick me off for no reason at all.
My only recourse for such arbitrary behavior is to find another platform. Now, when we talk google or facebook or twitter or some other huge platform, there might not be another platform, because they are so large, and I think we need much better anti-monopoly efforts. But the law does not say this; the large platforms are only fairly neutral because of the commercial incentives.
And before you go saying the law should require neutrality, consider that Techdirt has done a good job of demonstrating that moderation at scale is an unsolved and likely unsolvable technical problem.
Re:
I do not own or run a "big internet company".
My company's annual profit is barely $50k.
It would be 4x this amount if it wasn't for bogus DMCA take down notices we receive simply because some fucking asshole blankets companies using KEYWORDS. Not actual content. FUCKING KEYWORDS.
If the alternative is an ICE takedown due to finger pointing accusations of "infringement" (based on KEYWORDS), then I'll defend Section 230.
Also, fuck you for thinking this law only applies to "big internet companies".
Seriously, FUCK. YOU.
Re: Re: Re:
The issue being that the current proposal also makes illegal all of search, any ability to filter an information feed, almost any ability for me to find content I am looking for. Because you need to use algorithms to do it.
Re: Re: Content ID
I'm pretty sure TAC thinks ContentID is a problem because it is used mindlessly for malicious purposes. See Dancing Baby case. But algorithmic filtering can also be extremely useful.
Here's another: Techdirt does a certain amount of algorithmic content moderation. My accidental blank-body posts disappear nicely, thank you. Should such rules apply to Techdirt? What about our voting which posts to hide by flagging?
REALLY?? Let him do it..
Then every group on this list can go onto the net and say and SHOW as they please..
YOU CAN NO LONGER STOP ANYONE FROM POSTING..
You can't have the RIGHT to be forgotten.
You can't remove derogatory comments.
You can't Sue a person for anything he says, because of this.
You can't Sue Google/youtube/Bing/any service, because someone Crosslinked to a news article..
Porn away...back to bestiality sites.. Pain, and disgusting habits..
We couldn't Stop him from turning every advert into a Post for political status..
Do it ya idiot...pass this law, and everything you have done in the last 18 years is GONE...
The corps will love you. They won't need to run Multi million dollar Filters, the net will be that much faster.
Youtube and all the video storage on the Net will be faster... Kim Dotcom will Love you.. As he can say his site was/is for those with Opinions.
Re: Re: Re: Content ID
Please stop conflating the two when discussing the law.
Re: Re:
Explain that in detail. And besides, that "potential" profit isn't guaranteed to anyone (most especially not to content creators here at TD), so how are you losing?
Re: Re: Section 230
Actually it was because a good person posted good stock advice.
You’re confusing the two major cases on this that predate the passage of the CDA. Cubby v. Compuserve (in which people using Compuserve disparaged a startup media business) came out in favor of the service provider; Compuserve was not liable. It’s the Stratton Oakmont v. Prodigy case that dealt with stock advice. And the advice was good, which was that Stratton Oakmont was an untrustworthy firm. Turns out, they were untrustworthy — they’re the trading firm in Wolf of Wall Street, if you saw that movie. That case held the other way, that Prodigy, their service provider, was liable for statements of their users because they could and did moderate their boards.
It’s one of the delightful ironies of the whole thing.
Penalizing federal and state officials for lying
I'm pretty sure there are laws that do this but are just not enforced. It's a felony to lie to an official, but when a congressperson lies on the floor, that's exactly what they're doing.
But our prosecutors don't go after officials any more than they go after law enforcement officers.
In the meantime, a law that prevents social media sites from moderating would stand in opposition to many other current customs that are at least defined by precedent if not by law. It would mean copyright could not be enforced, nor could advertisements for trafficked humans be removed. It would also be impossible to comb out child porn.
Re: Re: Re:
What I'd prefer is a situation that puts copyright back in its proper place, with its proper perspective, rather than the inside-out insanity of the current regime.
What I'd prefer is for principles of Due Process and the Presumption of Innocence, which are quite uncontroversial in other contexts, to be applied here: that content accused of copyright violation is innocent until proven guilty in a court of law.
What I'd prefer is for great power to actually come with an unavoidable great responsibility, rather than bringing with it the power to dodge responsibility, as is all too often the case.
Re:
Never mind ContentID - that list includes "promote", so even a "suggested videos" or "trending videos" or "watch next" link would lead to liability. It seems that even allowing a user to completely manually select a set of topics they are interested in, then showing them algorithmically selected videos that relate to those topics, would count. I mean really everything that happens on every website in any way involves several "algorithms" - loading a video and playing it after someone clicks on it is an algorithm; returning even completely-unfiltered results for a search is an algorithm.
Of course we haven't seen the full bill yet, so in theory it could be written with exceptions for those kinds of cases. And I am totally completely confident that Rep. Gohmert and those he will work with are fully technologically literate enough to do that properly. Aren't you? ;)
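To make the "everything is an algorithm" point concrete, here is a minimal, purely hypothetical sketch in Python -- the Post record, the search() helper, and the sample posts are all invented for illustration and are not drawn from the bill's text or from any platform's actual code. Even this bare-bones keyword search "hides, promotes, and filters user content" in the sense Gohmert's statement uses those words, because it drops non-matching posts and ranks the rest.

# Hypothetical illustration only -- not any platform's real code and not language from the bill.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int

def search(posts: list[Post], query: str) -> list[Post]:
    # "Filter"/"hide": keep only posts whose text contains the query term.
    matches = [p for p in posts if query.lower() in p.text.lower()]
    # "Promote": rank what survives, pushing some posts above others.
    return sorted(matches, key=lambda p: p.likes, reverse=True)

feed = [
    Post("alice", "Section 230 explained", 42),
    Post("bob", "My lunch today", 7),
    Post("carol", "Section 230 is misunderstood", 3),
]
for post in search(feed, "section 230"):
    print(post.author, "-", post.text)

If liability turned on using any such routine, it is hard to see what interactive service would fall outside the bill's reach -- which is the commenter's point.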
Re: Re: Re:
That these takedowns are not shutting down his business, but merely imposing downtime, highlights a lack of validity in the claims, which implies abuse of those takedown notices.
He is not claiming entitlement to that profit, but rather that he is suffering downtime due to the abusive behavior of others misusing the DMCA, and that based on current revenue trends, the downtime is significantly impacting his revenues.
His number (4 times current revenue) might not be accurate, but significant downtime on a website that would be the main avenue for the advertisement and sale of a product or service would almost certainly impact revenue.
Nor does any of that being false undermine his claims that SEC 230 defends his small business. Even if he couldn't earn another penny, Sec 230 protects him as much as anyone else.
Re:
In 1996, Congress enacted the Telecommunications Act of 1996, which amended the Communications Act of 1934. Part of the 1996 Act was the Communications Decency Act of 1996, and part of that was Section 509, which starts out by saying “Title II of the Communications Act of 1934 (47 USC 201 et seq) is amended by adding at the end the following new section: ‘Section 230. Protection for Private Blocking and Screening of Offensive Material’” and it goes on from there.
When people say section 230 of the Communications Decency Act, they really mean 47 USC 230. Using the actual US Code cites is way more useful. No one looks stuff up by the public law once it’s enacted.
NO law in America can RESULT in over-turning First Amendment.
Nope. Whether gov't directly or attempting end-around by way of corporations (which is literally fascism): if RESULT is same then it's forbidden by We The People.
Mere statute CANNOT empower corporations to become de facto censors of political views on the very "platforms" that Section 230 creates precisely so can publish without either gov't or corporate A PRIORI censoring.
1) The hosting corporations are made immune for what "natural" persons choose to publish. That does not mean the corporations are publishing, NOR that they're forced to host what they don't want to associate with: it's a DEAL to help The Public and corporations, not investing censoring power in the latter.
2) Persons can still be liable for what they publish. But that's their freedom.
3) Corporations can in "good faith" remove content which is outside common law terms. They cannot make up their own definitions of "hate speech" by which to censor.
That is what Section 230 says. It DOES NOT empower corporations to arbitrarily censor.
The latest citations show "platforms" are Neutral Public Forums,
where "natural" persons have First Amendment Rights, NOT money-machines for corporations having unlimited power to stifle us:
In the Sandvig v Sessions decision, from page 7 on, "A. THE INTERNET AS PUBLIC FORUM".
The discussion is businesses versus "natural" persons.
You'll need to read the whole. Minion here clearly didn't bother because refutes his premise.
Key point: "the same principles are applicable." -- Again, that's applying to "natural" persons who in that case were accessing web-sites against TOS and corporate wishes, which of course is EXACTLY apposite to using forums and requiring them to be NEUTRAL.
The bottom-line question: Do YOU want to be SUBJECT to Corporate Control? -- If so, just follow Masnick blindly, he's leading you into the high-tech prison where you won't have any "platform" to complain!
Re:
Because his gang were the majority that voted in the election, and they vote for the gang and not the man.
Re: The latest citations show "platforms" are Neutral Public For
The bottom-line question: Do YOU want to be SUBJECT to Corporate Control?
No, I don't! Which is why I'm very happy s.230 protects me from being sued by corporations if someone posts a defamatory comment about them on a small blog or forum I operate. Why are you so desperate to give corporations and the wealthy more power to censor the public via lawsuits?
Answer one question for me: What law, statute, or court ruling says the government can force Twitter, Tumblr, Facebook, etc. to host speech that the owners/operators of those platforms do not want on those platforms? Make sure to cite your answer properly.
Re: The latest citations show "platforms" are Neutral Public For
Sandvig v. Sessions addressed whether it is a 1A violation for congress to create a law that makes TOS violations a crime, or more specifically for courts to interpret and enforce the CFAA in that way.
It had nothing to do with whether websites are allowed to have TOS agreements, or whether they are allowed to enforce them and deny (or attempt to deny) service to anyone who violates them.
Couldn't have said it better
Yes, it most certainly does create the content they broadcast on their stations. Reporting is so much easier when you simply make it up. ;)
Re:
Offline platforms don't have immunity. Section 230 even immunizes websites against illegal housing and employment ads. It used to immunize sex-trafficking ads but no more.
Distributor liability for defamation has 150 years of precedent behind it. Libel laws were an alternative to DUELING.
Re: Re: The latest citations show "platforms" are Neutral Public
Section 230 destroys an individual's ability to protect their name or their business. Russians engage in "reputation blackmail."
So you think Ripoff Report is a good thing then.
Re: Re:
Section 230 is also at the root of false advertising, hate marketing, cyberbullying, etc. People have won large judgments in Australia and the UK over what appears in search results.
Section 230 literally immunizes those who inflict the harm of defamation (search engines), to the point where even if one "sues the original publisher" the engine still doesn't have to remove what was posted.
This is unique to the US. It is not the law globally, for a reason. One judge in NY wondered why he couldn't sue eBay when someone put him up for sale, btw.
Re: Re: Re: The latest citations show "platforms" are Neutral Pu
Do YOU want one person with an axe to grind to be able to destroy someone's reputation (or business) by weaponizing search engines?
I assume you mean in a circumstance where said person is knowingly using libelous falsehoods to do so, rather than simply saying things that are true? In that case no, which is why I'm glad the person doing that is subject to defamation law.
Re: Re: Re:
The next two lines? Please present evidence in support of your assertions. I have no reason to believe what you have said, since they are simply assertions and nothing else.
Final line:
That judge should realize that it's because eBay did not put him up for sale. He can try and sue the guy who posted it. But eBay didn't post it. One judge's woeful lack of understanding is... one judge's woeful lack of understanding.
Remember, 230 says that if some dude walks into the park you own and commits a crime, you can't be sued just because you own the park.
Re: Re:
They can host whatever they want, doesn't mean they have to have immunity.
I understand what you are saying, but the original commenter in this thread has a more expansive position that they have established firmly over time: they believe that it is an illegal, unconstitutional violation of the first amendment for an internet platform to moderate user content. They assert that even under the s.230 immunity provisions as currently written, any platform that has (for example) a hate speech policy or a harassment policy is somehow in direct violation of the US constitution and should face serious penalties, and the government should force them to eliminate those policies.
Can you prove the people who own and operate Twitter have created defamatory speech or have actively encouraged/helped someone publish defamatory speech? If not, for what reason should Twitter be held legally liable for defamatory speech written by a third party?
Re: Re: Re: The latest citations show "platforms" are Neutral Pu
So you think Ripoff Report is a good thing then.
I'd like you to take a moment here and think about what you're saying: that if someone supports a law, they must believe that everything which it permits/enables is broadly "good".
I don't think that's a very tenable position.
I also support laws preventing police from kicking down any door they feel like without a warrant and searching houses on a whim. And yet, funny thing, that doesn't mean I believe everything everyone does inside a house is "a good thing".
You should sell this scarecrow to a farm, it’s got some damn fine straw in it.
The amount of wrong…
No, no it is not. That would be caused by shitty people being shitty people. Specifically, shitty people who are absolutely liable for their own shitty actions. Section 230 does nothing to stop anyone from going after them.
Which makes no sense at all. Why on earth would the platform need to be liable for that?
Section 230 does not immunize anyone who defames. Search engines are also not the ones doing the defamation in whatever example you're thinking of. That would be like suing the company that made the megaphone when someone yells a lie through it.
Oh, and as an added bonus, your note about the engine not having to remove the offending content is flat out wrong. This has happened many times before. They can refuse to remove it in some cases involving default judgements, but that's an exception.
I believe it was also a judge in NY who posited that prior restraint was the answer to all problems of "Internet" as well. People say stupid things, and judges are no exception. Neither are you.
Ah politics...
Where the dumber you are the easier your job is, and you can feel safe in the knowledge that no one will call out your monumental stupidity to your face, allowing you to lie and/or make a fool of yourself all you want.
Idiots like this should never be allowed in positions of power, as you can be damn sure (as evidenced now) that they'll take their stupidity and run with it, causing immense damage in their quest to look like they 'stuck it to those dastardly companies', despite the fact that everyone but the large companies he's jousting against will get screwed over vastly more.
If there's one silver lining to this idiot's actions it's that if by some disaster he does manage to cram his train-wreck of a law through, I can guarantee it will not go the way he and the buffoons cheering him on think it will. If they think they're being treated 'unfairly' now, when sites aren't liable for what users post, they will not like what happens to them and theirs when sites are liable.
All or nothing John, all or nothing
Re: All or nothing John, all or nothing
It's almost like copyright enforcement and supporters have a significant overlap with scam artists. I'm shocked, I tell you - shocked!
'We can survive a few lawsuits, can YOU?'
More than that, you could argue that 230 protects those smaller individuals/groups more than it does 'big internet companies', as holding the platform liable stands to do a lot more damage to those that don't have massive amounts of resources for excessive filtration and/or the inevitable lawsuits.
A platform like YT can survive several lawsuits by some idiot going after them rather than the one who posted the content being sued over, but for a smaller platform even one lawsuit could very well drive them under and cause the platform to shut down as not worth the risk.
Much like the stupidity in the EU over the link tax and mandatory filters (despite lies to the contrary), efforts to undermine 230 stand to help large companies more than they stand to hurt them, with the small fry (individual blogs, up and coming platforms that might provide competition to the bigger players, and so on) being the ones who stand to lose the most.
Re: The amount of wrong…
Oh, and as an added bonus, your note about the engine not having to remove the offending content is flat out wrong. This has happened many times before. They can refuse to remove it in some cases involving default judgements, but that's an exception.
Just to clarify here, while I disagree with the larger argument the comment you're responding to is making, he is correct that under 230 a search engine is not legally required to remove content, even after it's been determined by a court to be defamatory. What that commenter conveniently ignores is that nearly every search engine WILL remove such content upon receipt of a legitimate court ruling to that effect (meaning that the "harm" the commenter describes is mostly mythical).
Indeed, the fact that most sites will remove such content when presented with a court ruling to that effect created a big business in creating fake court orders (or worse, using fake "defendants" who quickly "settle" allowing a misleading "agreement" to be sent to the search engine in an effort to delete access to content someone dislikes).
Re: Re: Re: 230
Who would be harmed by 230?
"A doctor who receives a blackmail letter from a Russian who says "Pay me 410,000 or I ruin your reputation online" would be harmed."
So... The Google can't deliver mail until they are sure it is safe?
Brilliant!!
And they have a LOT of blackmail material on Gohmert's various sexual liaisons with various social strata and age groups. (underage rent boys being a favorite of his!).
So expect chest-beating as said then nothing at all as Comcast whispers in his ear.
Re: Re: Re: Re: Re:
blue isn't exactly subtle when it comes to things he disagrees with even if it technically meshes with his worldview. He doesn't care a jot for creators, just the corporations that buy them over.
Let's drain the swamp..
Please please, DON'T blame the educational system, these folks are the ones that changed it..
So WHY are they SO DUMB?? They want us as bad as they are..
Anyone test the congress chambers for Airborne drugs?? Hallucinogens..
Re: Re: Re:
He would be harmed by the Russian, not by anything related to section 230. Section 230 is not involved, either in the threat, or in the doctor's options in going after the people responsible for the blackmail and any harm caused.
All that section 230 means is that the doctor can't sue the services used by the Russian because they're easier to find than the Russian himself, just as they're not allowed to sue the post office for delivering the initial letter. The Russian is still culpable, the doctor would just have to go after him for reparations rather than the nearest related innocent 3rd party.
As ever, you're lying about the very root of the issues you are opposing. All you are rooting for is that more innocent parties be held liable for things they did not do.
Re: Re: Re: Re:
You're still blaming the wrong people if you're blaming YouTube. The reason they went "above and beyond" is because they were being sued for all sorts of random shit - even content the labels had uploaded themselves - and the climate at the time was that the major content groups were too scared to offer legal channels. It was necessary to show some proactive filtering to avoid a situation where a court found them liable for encouraging infringement. If that had happened, no service like YouTube would have been legally allowed. Forget the occasional mistakes you're talking about - how about if the independent creators didn't have a platform at all?
ContentID is a pain in the arse for many reasons, but leave the blame for it where it belongs. Watching Google suffer will do absolutely nothing to address the problems that created ContentID, and may in fact encourage far worse ways of dealing with problems. You're attacking YouTube because they occasionally make mistakes while filtering millions of videos an hour, but not attacking those who require the filters to be in place - and against YouTube's original wishes.
Re: Re: Re: Re: The latest citations show "platforms" are Neutra
The potential to abuse made possible by Section 230 is obvious. All I said was that supporters of the law are placing the rights of big internet companies above the reputation of this class of individuals.
If search engines were smart they'd eliminate this concern by allowing this class of individuals to protect their reputations. If they don't, it will likely be the undoing of section 230, probably when some beauty queen is Google-bombed by an ex-boyfriend and becomes the poster child for unaccountability.
Re: All or nothing John, all or nothing
Police have been involved for a while.
The people who have gotten visits from the cops don't post about it.
Masnick, please put a litigation hold on these comments. Thanks.
It's not as simple…
citizens and earning U.S. dollars from advertisers.
Re: Re: Re: Re: Re: The latest citations show "platforms" are Ne
Right, it isn't the villain that's causing the damage when he destroys that city, it's the civilians' susceptibility to death by nuclear fire that's the root problem.
The potential for abuse is obvious.
Re: Re: All or nothing John, all or nothing
The truth is nobody's going to lose any sleep over some intentional no-name chucklenut who's too afraid to expose himself. You couldn't bring a suit to the drycleaners, John.
Re: Re: Re: Re: Re:
No it wasn't. What was necessary was to push back and say "this is what the law requires, this is all the law requires, and screw you if you want more than that." Instead, they tried to appease the extortionists, and the result was... well... exactly what you'd expect from someone who tries to appease extortionists. And we're all worse off for it.
Re:
Oh, yeah - I'm quite certain there are ample opportunities for these "platforms" in other countries where they would not be subjected to ridiculous governmental / business restrictions.
lol
Re: Re: Re: Re: Re: Re:
No it wasn't. What was necessary was to push back and say "this is what the law requires, this is all the law requires, and screw you if you want more than that."
Yeah, judges respond really well to stuff like that. Great plan.
Re: Re: Re: Re: Re: The latest citations show "platforms" are Ne
How, exactly? If you just allowed search engines to be sued for indexing defamatory stuff, they'd respond by just not linking to anything negative about anyone, except for court decisions, because they can't tell ahead of time what's defamatory or not. And that's assuming that there's some algorithmic way to distinguish potentially defamatory material from material which can't be defamatory.
If the solution is to have a law saying that search engines must take down links to negative material about John Doe if John Doe claims that the material is defamatory, how do you prevent that from being abused by people who claim negative material about them is defamatory when it's actually true?
Re: It's not as simple…
Had Kim Dotcom not had any of his servers in the US, he would not be facing extradition to the US, because US laws would not have applied.
Re: Re:
While not Section 230, the DMCA rule actually has liability for false reporting. It would not be YouTube that gets sanctioned, but the reporting party. Fortunately the rule was written so well that it cannot be used, said the **AA exec.
Had MegaUpload servers all been outside the US
Given that US laws weren't applying anyway (hence the shotgun-load of ambiguous charges like conspiracy) I don't think that would have kept ICE from Dotcom's door.
They were acting as MPAA / RIAA mercenaries under the color of law enforcement as it was.
Thanks for the laugh
I gotta say, probably the funniest part of your comments is that you think anyone, up to and including the owner of the site, would believe someone so grossly dishonest, a person who has a history of making unsupported claims/threats and then running away and/or changing the subject when asked to support them.
It's like watching a habitual liar of a little kid threaten a full grown adult by claiming that their dad could totally beat them up, so they'd better do what the kid wants and take them seriously.
Gohmert advocating FOR Net Neutrality?
Re: Re:
Maybe try to stop being a fat sack of shit?
Re: Re: Re: Re: Re: The latest citations show "platforms" are Ne
So, you don't know how search engines actually work? Figures.
" the person could be judgment-proof, or in a country where they can't be sued"
So what? It being less convenient for you to attack the person doing the deed does not make it acceptable to go after innocent 3rd parties who happen to be easier to get to.
"The potential to abuse made possible by Section 230 is obvious"
Not nearly as obvious as the abuse that would happen without it.
"All I said was that supporters of the law are placing the rights of big internet companies above the reputation of this class of individuals."
No, they also very much support individuals, but your Google hate boner is too strong to allow you to think about the real arguments.
Re: Re: Re: Re: Re: Re:
You are a major part of the problem.
Re: Re: Re: Re: Re: Re: Re: Re:
Again, ContentID is crap, but to blame YouTube and only YouTube is not only blaming the victim, it's utterly ignorant of what was happening at the time.
Re: Re: Re: Re: Re: Re: Re:
Reacting to extortion in the wrong way is something blameworthy, because it legitimizes the extortion and empowers it. YouTube refusing to stand up for their rights -- for our rights -- to flip the script and take the fight to the enemy, and choosing instead to appease and collude with them, is directly responsible for a large part of the mess we find ourselves in today.
There's a reason why phrases like "millions for defense but not one cent for tribute" and "we do not negotiate with terrorists" are a thing. Because when you go outside of that mindset, this is what happens.
Re: Re: Re: Re: Re: Re: Re: Re: Re:
Yes, they didn't create the problem, but they absolutely did make the problem worse, and for that they deserve to be blamed, completely independent of whatever anyone else may have done.
Re: Re: Re: Re: Re: Re: Re: Re:
It is also part of the reason that YouTube still exists, as the people attacking them are experts in using the legal system to bankrupt people who do not give in to their demands. When the legal system does not allow you to recover your costs, you can easily go bankrupt while winning your case, and YouTube could have won the case, and gone out of business because of the cost of winning.
Re:
-why he does not either
I’m sorry
Re:
Done PER LAW however this comes out, because you know he has no idea how to use anything other than a typewriter, barely.
And if he has a problem, he can DEAL WITH IT.
Re: Re: Re:
You run away every single time someone proves exactly how full of shit you are.
Re:
See: http://transition.fcc.gov/Reports/1934new.pdf
Re: Re: Re: Re: Re: Re: Re: Re: Re:
As a perfect example, one needs look no further than Veoh, which won case after case yet was still driven into the ground regardless.
Re: Re: Re: Re: Re: Re: Re: Re:
But, not worthy of the majority of blame in the situation, and certainly not something that needs you gloating like an idiot over the entire industry suffering because you'd prefer them to have done something different.
Stop being a prick, and place the blame where it belongs.
"There's a reason why phrases like "millions for defense but not one cent for tribute" and "we do not negotiate with terrorists" are a thing. "
YouTube had 2 choices at the time - play the game as it was being presented, or not only have themselves shut down but have a legal precedent on the books to ban user created content online completely.
You're pretty pathetic if you think they should have chosen the second option.
Re: Re: Re: Re: Re: Re: Re: Re:
Oh, and there's a reason why "perfect is the enemy of good" and similar phrases exist. See if you can work it out.
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:
Did you attack anyone else in your rant? I must have missed that.
"Yes, they didn't create the problem, but they absolutely did make the problem worse"
They really didn't if you actually understood the other options that were on the table at the time. Would you prefer YouTube to have been shut down, and thus everyone else who didn't have Google's weight behind them?
Whether or not you have a physical presence…
Physical evasion does not negate laws or regulation. Those DOING BUSINESS in any jurisdiction are subject to its laws and regulations, no matter their location. The Dotcom case is an example where Biden caused U.S. authorities to ignore normal legal processes which could have worked and tried to make an experimental criminal precedent. That created the quagmire which outlived Biden's VP career and is likely to see him sued for costs once it collapses.
Re: Re:
They don't have internet, but Generally, there are Fiber lines through their country..
Build it and they will come..
But then you get Nations Blocking access to THAT country..
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:
You apparently missed the meaning of the word "extortion" if you think I didn't blame anyone else for what happened.
Once again, typing this nice and slow so you'll comprehend it: simply because one party did something wrong doesn't mean that someone else responding to it couldn't have *also* done something wrong that made the whole situation worse.
> They really didn't if you actually understood the other options that were on the table at the time. Would you prefer YouTube to have been shut down, and thus everyone else who didn't have Google's weight behind them?
Re: Re: Re: Re: Re: Re: Re: Re: Re:
False dichotomy, and if you really think that some movie studio had a big enough war chest to do to Google, of all companies, what was done to Veoh, you have no room to be throwing around words like "pathetic."
What Google could have done to resolve this, if they actually had the spine to, was to take one look at the lawsuit and say "let's handle this like businessmen," and initiate a hostile takeover of Viacom. (Among other things. There were other very viable options available, such as sticking to the actual law and pushing back against attempts to abuse the court system to expand it beyond its bounds, but that's the one I like the best. "Play the game as it was being presented" when the person presenting it is your enemy who is making it up as they go was the worst possible response.)
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:
First off, Viacom is hardly "some movie studio". Would Google in 2008 have been able to just buy them out? Almost certainly not. But, even if they did - so what? The Viacom case is only notable because of how clearly stupid it was (they listed content they'd uploaded themselves as infringing). There were hundreds more already in place, and more and more people were lining up with lawsuits both frivolous and justified, that number increasing as the sharks smelt the blood in the water. Should they just buy everyone out? Where does it end?
Then, this still wasn't a core part of Google's business. They could have taken a major hit and dropped YouTube completely and survived. Veoh was a video hosting company. Google was an advertising and search company that happened to have bought a video host. It would have hurt, but they could have said "screw this, we're no longer interested" and sold YouTube off or written it down. Even if they didn't we'd still be facing the same lawsuits right now.
They would have been fine. But, the rest of the sector with new legal precedents and investor uncertainty would have been screwed. Especially given that even Google weren't able to negotiate a lot of its licensing deals until after this was in place. How would anyone else have had a chance?
ContentID is far from perfect, but again it's better than the realistic alternatives, and no, buying the people suing them is not realistic.
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:
But teh lawsuits! But teh lawsuits! But teh lawsuits! But teh lawsuits!
What about them? Once again, for the zillionth time, the law was on YouTube's side! (As evidenced by the fact that they won the case!) You keep ignoring this simple fact. ContentID was never necessary. The law was on their side, and if they had said "screw you, the law is on our side and we're sticking to our guns," that "new legal precedent" you keep talking about would indeed have come down, but it would have been a good one. But they didn't do that. They caved and took the easy way out, and that did set a precedent--not one in a court of law, but a very real precedent notwithstanding--that's been screwing over everyone except the major publishing interests to this very day. And that was the wrong thing for them to have done.
I don't know how it's possible for you to know as much about this whole issue as you evidently do, and yet remain unaware of this basic fact. But somehow you've managed it!
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:
FFS, you're missing the fact that this was ultimately irrelevant. One of the core components of those cases being brought forwards was the idea that YouTube was somehow complicit in infringing because they didn't do anything proactively to look for or filter infringing content.
You're focussing on the one lawsuit, I'm focussing on the entire body of hundreds, or maybe even thousands, of upcoming suits that used the same types of arguments - and those others might have had more compelling evidence than the obviously ridiculous Viacom one.
Sure, they may well have won all the cases eventually, but they'd probably still be fighting them now with zero movement forward in the industry.
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:
Yes, and that's my entire point: this is what they call a "novel legal theory," which is legalese for "you're making up stupid crap that has no foundation in the actual law." The law did not require them to proactively look for or filter infringing content. In fact, it specifically said they had no obligation whatsoever to do so. The. Law. Was. On. Their. Side.
I'm focusing on one lawsuit because one is all it takes. All they had to do was get the one win, and then they could use that as a precedent to say "this case needs to be thrown out because it's based on the exact same faulty reasoning as this other case that we won" for all the rest of them.
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re:
It took 6 years and multiple court proceedings to settle the case (ultimately settled out of court), and Viacom specifically changed their demands part way through as a direct result of ContentID being put into place.
https://www.nytimes.com/2010/06/24/technology/24google.html
"To a large extent, the case addressed past conduct, as Viacom said it was not seeking damages for any actions since Google put in its filtering system, known as content ID, in early 2008."
In other words, had they not created ContentID, they would have continued to seek damages for current activity, about which they may have been able to formulate a good enough argument to win.
"All they had to do was get the one win, and then they could use that as a precedent"
...and all they needed was one loss and it could be used as precedent against not only YouTube, but every other video provider that didn't have Google's warchest. Had Google not created ContentID, and had Viacom been competent enough to not include videos they uploaded themselves as the primary evidence, there is a risk that either they could have lost, or that Google decided the potential losses weren't worth the potential gains and written off YouTube completely before the case was settled, leaving the entire sector at risk.
Again, ContentID is not particularly well implemented and is annoying as hell, but you're deliberately missing the context about why it exists.