As Intermediary Liability Is Under Attack, Stanford Releases Updated Tool To Document The State Of Play Globally
from the useful-stuff dept
We've spent many years talking about the issue of intermediary liability on the internet. While nearly everyone agrees the term sounds boring as anything, it's incredibly important in protecting your rights to express yourself online. The key issue is who is liable for speech that is posted online. The common-sense reaction should be that "the speaker" is responsible for any speech they make online. However, for reasons I still don't fully comprehend, many, many, many people would prefer that the site hosting the speech should be liable. In many cases, this might not seem to matter. But it can actually matter quite a bit for absolutely everyone. While most speech is perfectly legal, there remain some exceptions (including copyright, defamation, true threats and more).
And while some people think those exceptions are narrow enough that pinning liability on websites shouldn't be a big deal, that's not true in practice. If you say that the website (the intermediary or platform) is liable for the speech, then merely accusing a piece of speech of being illegal is very likely to get protected speech censored. That's because most platforms will take down any speech that is reported, in an attempt to avoid potentially crippling legal liability. Indeed, in many cases, platforms are then pressured (by law, the threat of laws, or legal action) to pre-filter or moderate certain content just to avoid even the possibility of legal liability.
And because of that, lots of perfectly legitimate, protected speech gets blocked and censored. Much of this is abusive, because once you've supplied a tool that allows someone to flag content for removal, that tool gets used, even when the content doesn't really qualify, and the internet platform is heavily incentivized to remove that content to avoid liability.
That's why this matters so much. That's why we're so concerned about attempts to chip away at intermediary liability protections in the US, such as the immunity provisions of CDA 230 or the safe harbors of DMCA 512. But the US is, of course, just one country among many. And lots of other countries have their own (frequently changing) laws on intermediary liability. For years, Stanford's Center for Internet and Society has hosted a World Intermediary Liability Map, and that map has just been updated. It is an incredibly thorough and useful tool for understanding how these laws play out in other countries, how they differ, and even what impact they have.
With the updated version, you can also drill down into topic pages on specific types of liability regimes: for example, how the Right to be Forgotten has been spreading around the globe, how different countries handle intermediary liability for copyright, or what monitoring obligations various laws impose.
For those of us who continue to believe that proper intermediary liability laws are key to a functioning internet and freedom of expression online, this is a fantastic tool -- only slightly marred by the fact that so many of the developments concerning intermediary liability (including here in the US) have involved successful attempts at chipping away at those principles, leading inevitably to greater censorship.
Filed Under: copyright, free speech, intermediary liability, laws, liability, right to be forgotten, world map
Companies: stanford
Reader Comments
Better yet…
Re: Better yet…
If you stand at the top of the hill, you can see cars coming both ways, but if you're standing to one side or the other, you can't see cars coming until they're one or two seconds away from running over you.
Guess where the pedestrian crossing is? Not at the top of the hill.
So there's your argument for bad road design contributing to potential accidents and why the government should be liable.
Re: Re: Better yet…
That isn't intermediary liability though. The government decided where to place the crosswalk, so they'd be directly liable for their choice. Intermediary liability would mean holding the contractor, who only painted the crosswalk where the government told him to, liable for the government's choice.
Re: Re: Better yet…
That could, and probably would, severely limit the amount of speech published on the Internet, especially as anything that raised the slightest question would be dropped or lost in the ever-growing moderation queue.
Re:
It would also limit the number of platforms. Who in their right mind would open a service like Twitter if they knew they could be held legally liable for someone else's speech?
Re:
Re:
These folks are the same ones who want the internet to be just like broadcast TV was: no interaction, just a one-way spew of whatever propaganda they like.
"Well yeah, but the light's better over here."
However, for reasons I still don't fully comprehend, many, many, many people would prefer that the site hosting the speech should be liable.
I suspect it's a mix of having an easier target, offloading the work onto someone else, deniability when it comes to censorship, and not having an easy 'fix' if they had to go after the actual speaker.
A big company tends to be a lot easier to get hold of, and a lot more visible, whereas going after an individual would require personal data that might not be available immediately, and would likely mean convincing the site that you have solid grounds to be given that information, which may very well require taking the matter to court.
By offloading the liability to the platform you also offload the work of finding 'objectionable' material, which means all the work is on their end, you can blame them if they don't catch 'everything', and any claims of censorship can be laid at their feet should anyone object, rather than at the feet of the one demanding/'persuading' that the content be taken down.
Lastly, for content that is 'objectionable' but not actually illegal or in violation of a site's ToS: if you had to deal with the speaker directly, you're not likely to have much luck finding an excuse as to why the content should be pulled. If you make the site responsible, though, and make it clear that if they don't do a 'good enough job' then they're in trouble, they are very likely to go overboard and remove even questionable stuff 'just in case', likely taking care of the 'problem' without you having to do a thing.
Re: "Well yeah, but the light's better over here."
Re: Re: "Well yeah, but the light's better over here."
I dearly hope you're misremembering that one; the idea that a judge allowed a lawsuit against a party that was only marginally involved simply because they had more money is insane. They might as well have flat-out said 'profit is more important than sound legal footing' with a ruling like that.
"Profit is more important than sound legal footing"
That actually sounds like typical judge logic here in the US.
Other favorites are:
Screw defendant rights if the crime is awful enough
Screw defendant rights if it's close to a border
Screw defendant rights if he's black / a terrorist / a pedophile / I don't like him
The loot seized is high value? Rightful forfeiture
OMG, good faith exception? Well, that changes everything!
The defendant is law enforcement? Acquitted!
The crime is too complicated to understand? Guilty!
Judges in the US are commonly biased and/or idiots. What I can't tell is whether they're typically biased or typically idiots.
Re: "Well yeah, but the light's better over here."
Re: Re: "Well yeah, but the light's better over here."
Re: "Well yeah, but the light's better over here."
I suspect that many people are working from the (probably implicit and subconscious) idea that "by agreeing to host the content - either when you know what the content is in advance, or by continuing to host it after you learn what it is - you are speaking that content yourself, and therefore you are the speaker, and can be held liable for the speech".
That is, it's not that they'd object to the original speaker being liable - they just consider anyone who chooses to cooperate / collaborate in the act of speaking, such as someone who chooses to host the speech, to be equally responsible for what is said.
That's why that whole tangle of ideas, including terms like "red-flag knowledge" and "knew or should have known", develops in the first place.
The result of all of that is the setup you describe, but I think that setup is all based on this deeper root.
A familiar occurrence goes something like this:
A small independent site owner gets an email saying that a posted comment is not just incorrect, but libelous, defamatory, or whatever else, and demanding that it immediately be taken down under the threat, whether actual or implied, of a potentially expensive lawsuit. So the comment naturally gets deleted as a purely economic and/or survival decision.
I've had product reviews taken down this way, and I had no hard feelings toward the site owners for doing what was necessary to survive. Large sites that have the money to fight bogus lawsuits can be just as risk-averse, however, as a more bottom-line-focused corporate mentality replaces the more ethically or ideologically charged principles of the random young guy running a site: someone who passionately hates bullies but knows his limitations and recognizes the risks of getting sued, especially by someone with deeper pockets.
Re:
If webhosts end up liable for user-created content...
...everyone is going to discover the full glory of adversarial input.
It won't be pretty. It'll be beautiful.
Internet Bad. Must force someone to fix.
We *do* need a better model for trust and attribution, though.
Just
Because their current goal is back-handed demonetization of an ever-increasing wholesale amount of content, backed by cheerleaders who fool nobody (cheerleaders like you), it is ALWAYS ABUSIVE in the hands of Google and its subsidiaries, primarily YouTube.
Re: Demonetization...it's a fact of life!
My cell phone will video and sound record anything important, for free...
Printed books are killing live storytellers! Ban printed books!
I think we're becoming Cardassia...
I specifically remember a scene where the Doctor says to Garak, "I'm sick of these Cardassian mystery novels. There's no mystery; everybody is always guilty!"
Garak replies: "Of course, Doctor, the mystery is: who is guilty of what?"
- - - - - - - - - - - - -
Can you imagine the world we'll create if this becomes common and normal? Nobody will start a new internet business unless they can already afford cutting-edge censorship techniques and an army of lawyers.
Re: I think we're becoming Cardassia...
It's a war of attrition and obfuscation, and they are doing an excellent job of moving the goalposts every time we get close. Ooh, look over there, a shiny... Did something happen while we were looking the other way? (Probably, but we won't know for a while yet.)
Surely the reason is that the sites can easily be found, as can, in most cases, those running and acting as admins on the sites, whereas the individuals who are posting on the sites are much more difficult, and sometimes damn near impossible, to find! It's a lot easier to blame Joe, if he can be found, regardless of whether he has done anything wrong or not, than to find xy3, who is hiding somewhere! On top of that, if blaming Joe means he is told to do something, refuses, and has to go to court, the cost will probably shut the site anyway, so it's a win-win situation for those who want to impose their will on the site and will do the same, eventually, everywhere!
People seem to be oblivious as to what is actually going on, world-wide, where the rich, the famous, the powerful few and their friends are taking control of everything, by locking things up, locking us out and preventing us from knowing what the fuckers are up to, while knowing the ins and outs of everything to do with us, 24/7!!
But you're assuming all platforms are just passive hosts
Website #2: hitmentruecrimestories.com. Come on our site and post your stories about famous hit men and interact with those who share your passion.
Same type of platform?
Re: But you're assuming all platforms are just passive hosts
Different types of platforms. Laws that shield against intermediary liability, like 230, don't do squat if the platform is deliberately involved in something illegal.
If the site is actively involved, rather than passively, then they can be held liable.
As for your two examples: #1 would probably not be protected, as it seems to be actively engaged in facilitating illegal activity (assuming it's not a honeypot anyway), whereas #2 would be, as the description you list seems to suggest it's just a site to share stories, which isn't illegal even if the activity described is.