As Intermediary Liability Is Under Attack, Stanford Releases Updated Tool To Document The State Of Play Globally

from the useful-stuff dept

We've spent many years talking about the issue of intermediary liability on the internet. While nearly everyone agrees the term sounds boring as anything, it's incredibly important in protecting your right to express yourself online. The key issue is who is liable for speech that is posted online. The common-sense reaction should be that "the speaker" is responsible for any speech they make online. However, for reasons I still don't fully comprehend, many, many, many people would prefer that the site hosting the speech should be liable. In many cases, this might not seem to matter. But it can actually matter quite a bit for absolutely everyone. While most speech is perfectly legal, there remain some exceptions (including copyright, defamation, true threats and more).

And while some people think that those exceptions are narrow enough that pinning liability on websites shouldn't be a big deal, that's not true in practice. If you say that the website (the intermediary or platform) is liable for the speech, then a mere accusation that the speech is illegal is likely to result in the censorship of protected speech. That's because most platforms will take down speech that is reported, in an attempt to avoid potentially crippling legal liability. Indeed, in many cases, platforms are then pressured (either by law or threat of laws or legal action) to pre-filter or moderate certain content just to avoid even the possibility of legal liability.

And because of that, lots of perfectly legitimate, protected speech gets blocked and censored. Much of this is abusive. Because once you've supplied a tool that allows someone to flag certain content for censorship, that tool gets used, even if the content doesn't really qualify, and the internet platform is heavily incentivized to remove that content to avoid liability.

That's why this matters so much. That's why we're so concerned about attempts to chip away at intermediary liability protections in the US, such as the immunity provisions of CDA 230 or the safe harbors of DMCA 512. But the US is, of course, just one of many countries. And lots of other countries have their own (frequently changing) laws on intermediary liability. For years, Stanford's Center for Internet and Society has hosted a World Intermediary Liability Map, and that map has just been updated. It's an incredibly thorough and useful tool for understanding how these laws play out in other countries, how they differ, and even what impact they have in practice.

With the updated version, you can also drill down into topic pages on specific types of liability regimes, such as how the Right to be Forgotten has been spreading around the globe, how intermediary liability for copyright is handled in different countries, or the monitoring obligations imposed by various laws.

For those of us who continue to believe that proper intermediary liability laws are key to a functioning internet and freedom of expression online, this is a fantastic tool -- only slightly marred by the fact that so many recent developments concerning intermediary liability (including here in the US) have involved successful attempts at chipping away at those principles, leading inevitably to greater censorship.

Filed Under: copyright, free speech, intermediary liability, laws, liability, right to be forgotten, world map
Companies: stanford


Reader Comments

  1. PlagueSD (profile), 16 May 2018 @ 4:25pm

    Saying the site hosting the speech should be liable is like saying the car manufacturer should be held liable if you get pulled over and charged with a DUI.

  2. Anonymous Coward, 16 May 2018 @ 4:38pm

    >Indeed, in many cases, platforms are then pressured (either by law or threat of laws or legal action) to pre-filter or moderate certain content just to avoid even the possibility of legal liability.

    That could, and probably would severely limit the amount of speech published on the Internet, especially as anything that raised the slightest question would be dropped or lost in the ever growing moderation queue.

  3. Stephen T. Stone (profile), 16 May 2018 @ 4:42pm

    Re:

    That could, and probably would severely limit the amount of speech published on the Internet

    It would also limit the number of platforms. Who in their right mind would open a service like Twitter if they knew they could be held legally liable for someone else's speech?

  4. That One Guy (profile), 16 May 2018 @ 4:44pm

    Re:

    As the traditional publishers/gatekeepers, and those who might object to people saying the 'wrong' things, would say (were they honest enough to do so): 'That's a feature, not a bug.'

  5. That One Guy (profile), 16 May 2018 @ 4:55pm

    "Well yeah, but the light's better over here."

    However, for reasons I still don't fully comprehend, many, many, many people would prefer that the site hosting the speech should be liable.

    I suspect it's a mix of having an easier target, offloading the work onto someone else, deniability when it comes to censorship, and not having an easy 'fix' if they had to go after the actual speaker.

    A big company tends to be a lot easier to get a hold of, and a lot more visible, whereas going after an individual would require personal data that might not be available immediately, and would likely require that you convince the site that you have solid grounds to be given that information, which may very well require taking the matter to court.

    By offloading the liability to the platform, you also offload the work of finding 'objectionable' material, which means all the work is on their end, you can blame them if they don't catch 'everything', and any claims of censorship can be laid at their feet should anyone object, rather than at the feet of the one demanding/'persuading' that the content be taken down.

    Lastly, for content that is 'objectionable' but not actually illegal or in violation of a site's ToS, if you had to deal with the speaker directly you wouldn't be likely to have much luck finding an excuse as to why the content should be pulled. If you make the site responsible, though, and make it clear that if they don't do a 'good enough job' then they're in trouble, they are very likely to go overboard and remove even questionable stuff 'just in case', likely taking care of the 'problem' without you having to do a thing.

  6. Anonymous Coward, 16 May 2018 @ 5:03pm

    It would seem that, unfortunately, the laws regarding intermediary liability are probably irrelevant in most cases.

    A familiar occurrence goes something like this:

    A small independent site owner gets an email saying that a posted comment is not just incorrect but libelous, defamatory, or whatever else, and demanding that it immediately be taken down under the threat, whether actual or implied, of a potentially expensive lawsuit. So the comment naturally gets deleted as a purely economic and/or survival decision.

    I've had product reviews taken down this way, and I had no hard feelings toward the site owners for doing what was necessary to survive. Large sites that have the money to fight bogus lawsuits can be just as risk-averse, however, as a more bottom-line focused corporate mentality replaces the more ethically or ideologically-charged principles of the random young guy running a site, someone who passionately hates bullies but knows his limitations and recognizes the risks of getting sued, especially by someone with deeper pockets.

  7. Uriel-238 (profile), 16 May 2018 @ 5:34pm

    If webhosts end up liable for user-created content...

    ...everyone is going to discover the full glory of adversarial input.

    It won't be pretty. It'll be beautiful.

  8. Anonymous Coward, 16 May 2018 @ 5:34pm

    Re: "Well yeah, but the light's better over here."

    Several years ago I read a summary of a case where a judge allowed a lawsuit to go forward against an intermediary provider specifically because the actual speaker likely wouldn't have any money to pay damages while the provider would.

  9. Christenson, 16 May 2018 @ 6:39pm

    Internet Bad. Must force someone to fix.

    Me Igor. lol

    We *do* need a better model for trust and attribution, though.

  10. ECA (profile), 16 May 2018 @ 7:25pm

    Just

    Just a way to bypass the laws.. don't need proof, YOU NEED PROOF you didn't do it..

  11. Aaron Walkhouse (profile), 16 May 2018 @ 7:52pm

    Better yet…

    …the government which built the road is liable for all deaths and all fines in all cases. ;]

  12. Daydream, 16 May 2018 @ 8:26pm

    Re: Better yet…

    I live next to a secondary school; the road in between my house and said school is actually on a slight hill.
    If you stand at the top of the hill, you can see cars coming both ways, but if you're standing to one side or the other, you can't see cars coming until they're one or two seconds away from running over you.

    Guess where the pedestrian crossing is? Not at the top of the hill.

    So there's your argument for bad road design contributing to potential accidents and why the government should be liable.

  13. TKnarr (profile), 16 May 2018 @ 8:41pm

    Re: Re: Better yet…

    That isn't intermediary liability, though. The government decided where to place the crosswalk, so they'd be directly liable for their choice. Intermediary liability would be holding the contractor who painted the crosswalk where the government told him to paint it liable for the government's choice.

  14. That One Guy (profile), 16 May 2018 @ 9:07pm

    Re: Re: "Well yeah, but the light's better over here."

    I dearly hope you're misremembering that one; the idea that the judge allowed a lawsuit against a party that was only marginally involved simply because they had more money is insane. They might as well have flat-out said 'Profit is more important than sound legal footing' with a ruling like that.

  15. Anonymous Coward, 17 May 2018 @ 12:58am

    "And because of that, lots of perfectly legitimate, protected speech gets blocked and censored. Much of this is abusive. Because once you've supplied a tool that allows someone to flag certain content for censorship, that tool gets used, even if the content doesn't really qualify, and the internet platform is heavily incentivized to remove that content to avoid liability."

    Because their current goal is back-handed demonetization of an ever-increasing amount of content, backed by cheerleaders who fool nobody (cheerleaders like you), it is ALWAYS ABUSIVE in the hands of Google and its subsidiaries, primarily YouTube.

  16. :Lobo Santo (profile), 17 May 2018 @ 1:12am

    I think we're becoming Cardassia...

    (a fictional planet & race from Star Trek: Deep Space 9)

    I specifically remember a scene where the Doctor says to Garak, "I'm sick of these Cardassian mystery novels. There's no mystery, everybody is always guilty!".

    Garak replies: "Of course, Doctor, the mystery is: who is guilty of what?"

    - - - - - - - - - - - - -

    Can you imagine the world we'll create if this becomes common and normal? Nobody will start a new internet business unless they can already afford cutting-edge censorship techniques and an army of lawyers.

  17. PaulT (profile), 17 May 2018 @ 1:42am

    Re: Re: Better yet…

    Yes, the government should be liable for the decision they themselves made. That's not in question. The question is why people who did not have a say in that decision (say, the guys who installed it or the manufacturers of the lights on that crossing) should be held liable.

  18. Christenson, 17 May 2018 @ 7:23am

    Re: Demonetization...it's a fact of life!

    With ubiquitous copy-content machines (computers and the internet) everywhere, how exactly do you plan on charging for content?

    My cell phone will video and sound record anything important, for free...

    Printed books are killing live storytellers! Ban printed books!

  19. Anonymous Coward, 17 May 2018 @ 7:41am

    Re:

    Said site could post a review of the situation explaining in detail how they were extorted into the removal. Keeping it fact based will put the burden upon the idiot(s).

  20. Ninja (profile), 17 May 2018 @ 9:06am

    Re: "Well yeah, but the light's better over here."

    So you mean people are a bunch of lazy dirtbags that would rather go after the money than fix the situation by punishing the source of the problem (the speaker). I'm shocked.

  21. Anonymous Coward, 17 May 2018 @ 10:22am

    'for reasons I still don't fully comprehend'

    surely the reason is because the sites can easily be found, as can, in most cases, those running and acting as admins on the sites, whereas the individuals who are posting on the sites are much more difficult and sometimes damn near impossible to find! there's nothing easier than being able to blame joe, if he can be found, regardless of whether he has done anything wrong or not, than finding xy3 who is hiding somewhere! on top of that, if blaming joe means he is told to do something, refuses and has to go to court, the cost will probably shut the site anyway, so it's a win-win situation for those who want to impose their will on the site and will do the same, eventually, everywhere!

    people seem to be oblivious as to what is actually going on, world-wide, where the rich, the famous, the powerful few and their friends are actually taking control of everything, by locking things up, locking us out and preventing us from knowing what the fuckers are up to, while knowing the ins and outs of everything to do with us, 24/7!!

  22. Anonymous Coward, 17 May 2018 @ 11:45am

    Re:

    I thought that was their objective - stopping the minions from having a voice.

    These folk are the same ones who want the internet to be just like broadcast tv was. No interaction, just a one way spew of whatever propaganda they like.

  23. Anonymous Coward, 17 May 2018 @ 11:46am

    Re: Re: "Well yeah, but the light's better over here."

    They worship money and laugh at morals 'n ethics.

  24. Anonymous Coward, 17 May 2018 @ 11:46am

    Re: I think we're becoming Cardassia...

    and having NO NEW INTERNET COMPANIES is exactly what one industry (three guesses which one, and the first 2 don't count) wants, and this whole 'liability' shuffle is just the first step in this cold war (and make no mistake, we have been at war with USAA* since the first recording Mr Bell made.)

    It's a war of attrition and obfuscation and they are doing an excellent job of moving the goal posts every time we get close, ooh look over there a shiny... Did something happen while we were looking the other way (probably but we won't know for a while yet).

  25. Contrarian, 17 May 2018 @ 11:51am

    But you're assuming all platforms are just passive hosts

    Website #1: www.findahitman.com. Make an account, pay us $20, and we'll provide you a list of professional hit men. Hell, we may even make the introduction.

    Website #2: hitmentruecrimestories.com. Come on our site and post your stories about famous hit men and interact with those who share your passion.

    Same type of platform?

  26. Uriel-238 (profile), 17 May 2018 @ 1:36pm

    "Profit is more important than sound legal footing"

    That actually sounds like typical judge logic here in the US.

    Other favorites are:

    Screw defendant rights if the crime is awful enough

    Screw defendant rights if it's close to a border

    Screw defendant rights if he's black / a terrorist / a pedophile / I don't like him

    The loot seized is high value? Rightful forfeiture

    OMG Good faith exception? well that changes everything!

    The defendant is law enforcement? Acquitted!

    The crime is too complicated to understand? Guilty!

    Judges in the US are commonly biased and/or idiots. What I can't tell is whether they're typically biased or idiots.

  27. That One Guy (profile), 17 May 2018 @ 3:29pm

    Re: But you're assuming all platforms are just passive hosts

    Different types of platforms. Laws that shield against intermediary liability, like 230, don't do squat if the platform is deliberately involved in something illegal.

    If the site is actively involved, rather than passively, then they can be held liable.

    As for your two examples, #1 would probably not be protected, as that seems to be actively engaged in facilitating illegal activity (assuming it's not a honey-pot anyway), whereas #2 would be, as the description you list seems to suggest it's just a site to share stories, which isn't illegal even if the activity described is.

  28. The Wanderer (profile), 23 May 2018 @ 6:00am

    Re: "Well yeah, but the light's better over here."

    That certainly covers a lot of it, but I'm not so sure it's the root of the reason in many cases.

    I suspect that many people are working from the (probably implicit and subconscious) idea that "by agreeing to host the content - either when you know what the content is in advance, or by continuing to host it after you learn what it is - you are speaking that content yourself, and therefore you are the speaker, and can be held liable for the speech".

    That is, it's not that they'd object to the original speaker being liable - they just consider anyone who chooses to cooperate / collaborate in the act of speaking, such as someone who chooses to host the speech, to be equally responsible for what is said.

    That's why that whole tangle of ideas including terms like "red-flag knowledge" and "knew or should have known" develops in the first place.

    The result of all of that is the setup you describe, but I think that setup is all based on this deeper root.
