How Free Speech Online Is Enabled By Not Blaming Websites For The Actions Of Their Users
from the save-section-230 dept
We've written many, many times about the importance of Section 230 of the Communications Decency Act, which protects internet service providers from secondary liability claims (with an exception for intellectual property). In its simplest form, Section 230 says that if a user does something illegal on a site, you need to blame the person who did it, rather than the site. As we've said repeatedly, this makes so much sense it's almost troubling that you need a law to make the point. You blame the person who broke the law, not the tools they used. Duh. You don't blame Ford because one of its cars was the getaway vehicle in a bank robbery, and you don't blame AT&T because an extortion call was made over an AT&T phone line.
But, on the internet, blame often gets thrown around wildly, and websites themselves are too frequently the targets of misguided legal claims. Section 230 protects against that and lets companies quickly extricate themselves from bogus lawsuits. Unfortunately, Section 230 has been under attack lately from a number of parties. There are a bunch of state Attorneys General who are looking to change the law so that it doesn't apply to cases they might bring against sites. There are short-sighted law professors who think that the way to fix "bad" sites online is to gut Section 230. And, unfortunately, there is one really bad district court ruling (which disagrees with every other ruling on the law) that hopefully will be reversed on appeal.
Much of the academic discussions concerning the importance of Section 230 have, quite reasonably, focused on the impact on innovation. As some have pointed out, Section 230 can, in many ways, be credited with "giving us the modern internet." So many of the sites you know, love and use today probably wouldn't exist, or would exist in greatly limited forms, without Section 230. It's truly enabled a revolution in innovation online, allowing sites to build powerful tools for the public, without having to be liable for how the public then uses those tools.
And that brings us to a second important point about Section 230 that gets less attention, though it probably deserves much more: Section 230 has helped enable a tremendous amount of free speech online. On a recent edition of his Surprisingly Free podcast, Jerry Brito interviewed Anupam Chander, who recently co-authored a paper (with Uyen Le) highlighting "the free speech foundations of cyberlaw" and how Section 230 has been truly key in guaranteeing a great deal of free speech online -- basically, how the First Amendment and Section 230 work well together. The paper also points out that it is the First Amendment that underpins all of this, and that we should be wary of challenges to the law that might undermine the First Amendment.
If you know the history of the Communications Decency Act, you know that the bill as a whole was very, very different from Section 230. The original CDA was basically the opposite: it was a bill to censor websites that carried "indecent" communications. Most of that got thrown out (after quite a legal fight) as unconstitutional under the First Amendment. What remained, and should continue to remain, is Section 230 -- the one part of the bill that was consistent with the First Amendment.
The paper also goes into how copyright law, separately, has been chipping away at the First Amendment online, and how dangerous that is as well. Remember, Section 230 doesn't apply to copyright law; there is Section 512 of the DMCA, which is similar but not nearly as strong. The paper notes that things like SOPA really were a direct attack on the First Amendment, which is why it's a good thing that it failed.
All in all, it's a good reminder of both how important protecting free speech online is today and how fragile it is, since the laws that have made it possible are so constantly under attack.
Filed Under: cda, dmca, first amendment, free speech, secondary liability, section 230, sopa
Reader Comments
At least I hope that humanity hasn't become so backward that it wants to break the things it's afraid of.
Re:
Unfortunately, the bad habit of breaking things that people are afraid of has deep roots, and it is the basis of tribalism. To a large extent, it is this human habit that is harnessed by those who seek power as king, emperor, pope, prophet, etc.
Re: Re:
Also, there is a recent tendency to blame the owner for the behavior of others, over which the owner has limited control, if any.
Good arguments, but...
If I have a billboard company that allows anyone to put up any advertising and entirely refuses to take it down, even if it's obscene, abusive, illegal, or slanderous... what would happen? Clearly, as the billboard company, I would be in a position to stop such speech. In fact, I am a key part of the process of making that speech: without the billboard, the speech (good or bad) could not be made in that manner.
So the billboard owner isn't just a passive "service provider", but more of an enabler, one who aids in making such speech. The result is that advertising companies of all sorts (and any other publishing company) have certain responsibilities under the law to deal with such issues.
Section 230 is wonderful, except that it grants a level of protection above and beyond what would be found in the real world. If you anonymously put up slanderous posters on every light pole in town, you can be sure that someone would work to figure out who put them up, and take you to court for it. Heck, you might even end up with a criminal case (harassment) to deal with as well.
A website owner has some responsibility to deal with what is on their site. Most take this for granted, at least to some extent; as an example, I don't think we can post child porn or death threats here in the comments on Techdirt. Like it or not, you have set Section 230 aside to do what is right, morally and legally. Having crossed that line already for CERTAIN illegal or distasteful acts, why would you think that you have no liability for the rest of them?
Section 230 would work and be supported by all sides if it didn't create a special "on the internet" class that allows users to hide behind a website's or ISP's skirts. It creates a hole where the website / service provider won't take action, and at the same time it impedes access to the information an injured party needs in order to take action. It creates a situation where the abuser gets a double helping of protection, and the abused party gets told to pound sand.
The hope should be that over time, the internet remains free with a little sideline of "but you don't get more rights here".
Re: Good arguments, but...
I disagree. The company can only reactively remove the "speech" from their property. They have no control over who comes there to post things, so they can't "stop" a damn thing; they can only try to make sure that anything unauthorised is not seen for long. You can perhaps argue that they're responsible if they refuse to take it down, but they sure as hell aren't liable for the "speech" being present in the first place.
It's a bad analogy, anyway. What billboard company allows the finite space that's its entire business to be overrun by people posting their own stuff for free? In this case, the "on the internet" part is absolutely vital, since it changes the whole concept once you're not dealing with finite, expensive real estate.
You're basically arguing that the owner of a wall should be liable for the content of the graffiti painted on it. It's idiotic and short-sighted, and it will achieve nothing if enforced, since the owner can only ever react to something being posted. Unless you're arguing that people should have to screen everything ever posted -- which, ironically, probably means I wouldn't have been able to read this comment of yours and reply to it, at least not yet, since the volume of comments being screened by a human being would cause ridiculous delays.
Re: Re: Good arguments, but...
You're basically arguing that the owner of a wall should be liable for the content of the graffiti painted on it.
Is that such a bad standard?
I agree that it would be absurd to hold the owner of the wall strictly liable for graffiti painted on it. But let's agree that the graffiti is somehow violating a third party's rights.
Then, I think nobody would say the wall owner is liable at the moment that the graffiti is placed there. After all, he had nothing to do with creating the graffiti, and probably didn't even want it there.
But would you argue that he should never become liable? What if he gets a letter from the subject of the graffiti, bringing it to his attention? What if it is easy to remove? And yet the wall owner decides that he just doesn't care? At some point, wouldn't you want the wall owner to bear some responsibility?
Re: Re: Re: Good arguments, but...
And this will never be in dispute: corporations will always be correct in their assertion of being violated, whereas the common folk will always be incorrect when they claim violations. Furthermore, simply stating this will become a violation of someone's rights. Off to a for-profit prison, you insulter of kings.
Re: Re: Re: Good arguments, but...
If the owner of the wall is ordered by a court to remove it, and they fail to comply with that order, then it might be acceptable for them to be charged with violating the court order. But they should NEVER be held responsible for the graffiti itself. The writer of the graffiti should always be the only one responsible, not the easiest-to-target innocent third party.
Re: Re: Re: Re: Good arguments, but...
Including if the aggrieved party is willing to clean the graffiti off himself, but the wall owner says "nah, I like it there" -- or "well, I might not have written it, but so many people come by to see it, that it has actually improved my property values, so I am leaving it."
Should the wall owner be automatically or immediately liable? I say no. However, I see no reason why the wall owner should be able to reap the benefits of the content without enjoying the responsibility for it as well. That responsibility should not attach immediately, nor should the system have "clean the wall" as its default. But, if a party has a legitimate legal grievance, is willing to bear the cost of cleaning the wall, and asks the wall owner for permission to do so, then the wall owner should either have to permit it, or make the words his own.
The reflexive desire to protect Section 230, in its current form, at all costs, borders on irrational.
Only Sith think in absolutes, PaulT.
Re: Re: Re: Re: Re: Good arguments, but...
If believing that a person who did not write a comment should never be held liable for its content makes me a Sith, then call me Darth Paul. A person should never be held directly liable for the actions of others. However, this does not mean that I excuse the owner from their own actions (e.g., refusing to allow access to remove the graffiti, refusing to comply with a valid court order, etc.).
This is a very simple concept, not an irrational one. A third party should never be held liable for actions committed by others, only for actions they themselves committed. I've yet to hear a sane argument for why they should that doesn't fall into the realm of "but it's too hard to go after the people actually responsible", and that's not an acceptable excuse in my book.
Re: Re: Re: Re: Re: Re: Good arguments, but...
Further, you seem to be ok with that concept -- you just draw the line at the wall (or website) owner being liable only after a court order. I'm not arguing for liability prior to a court order. I'm arguing that the website owner should get the choice - once put on notice - of either defending the content or complying with a takedown request (with appropriate safeguards and blowback for bogus complaints).
Re: Re: Re: Re: Re: Re: Re: Good arguments, but...
...and that's exactly the point I'm trying to get at. If the content is found by the appropriate authority to require a takedown for whatever reason, it's perfectly acceptable for the site owner to be charged with failing to comply with that legal request. Safeguards to prevent the abuse we've often seen with takedown requests are also a great idea. But, nowhere do I think that the owner should be liable for the content of the message they refuse to take down, only the act of refusing the court order to do so.
Either we're agreeing and you're arguing semantics, or you're not understanding my basic point.
Re: Re: Re: Good arguments, but...
If it bothered the person that much, I would probably give them permission to clean it off.
I can't be responsible for what everyone or anyone may find offensive. Maybe the offended party needs to take a little responsibility for their feelings.
What would happen to TV
What would TV channels do if they were liable for commercials and TV programs?
How many pharmacy commercials would be WIPED off of TV?
How many money-loan commercials?
Fox News? GONE.
Re: Re: What would happen to TV
Being able to SUE anyone for having a different opinion?
Having to prove facts?
Half the doctor shows,
MOST of the cop shows,
Don't want religion on TV? Sue them.
The only things left on TV would be FACT or REAL fantasy...
MORE cartoons...
Re: Re: What would happen to TV
The big advantage of the Internet is that creative people can find an audience without having to convince a publisher to carry their work. This is enabling many more authors, musicians, artists, etc. to find an audience, and much of this work would not be available to the public if it had to be approved by a third party.
Requiring that all content be approved before publication is an insidious form of censorship, in that it lets a third party, usually an editor, decide which created works will be offered to the public.
Re: Re: What would happen to TV
TV stations also "opt out" of that responsibility by broadcasting a bit of legalese stating that "the following is the views and opinions of others and not necessarily the views or opinions of this station or its owners".
Section 230 does that as the blanket default for websites. How is that any different from what the TV stations do, if you omit the opt-in part?
Re: Re: What would happen to TV
It's a horrible analogy anyway, since advertising complaint authorities exist and regularly uphold complaints from the public. So -- by your own example -- even if sites screened everything ever posted, they would probably still be held liable for the wrongdoing of others.
I do, however, find it amusing that an Anonymous Coward is arguing for controls that would probably remove their right to post as such. It seems to me that someone hasn't thought this through.
Then web forums were created, giving the censors targets to attack. Everything went downhill from there.
Why can't we go back to that?
Laws prohibit, not enable.
Let's pretend there are no laws. Now make one law that allows me to do something that I can't already do.
toxic reader comments
It is, sadly, the de facto responsibility of webmasters and bloggers to closely monitor all comments and quickly delete any that might result in a knock on the door -- or worse -- from police or federal agents. This is especially true if the blogger tends to write about controversial subjects, especially with an attitude that angers law enforcement authorities or the people/"legal persons" that influence them. Let's not forget that it's one thing to be in legal compliance, but entirely another to attract unwanted attention from law enforcement despite your legal compliance; not surprisingly, few people want to become the latest martyr by exercising their legal rights to the fullest.
People who run websites of a controversial nature tend to develop a habit of vigorously patrolling the comment section and reflexively deleting the more extreme or inflammatory remarks (and banning the user, sometimes even back-deleting all previous comments) since experience has taught them they were likely put there by an agent provocateur trying to get them in trouble.
Frankly, it's almost shocking that Techdirt still has a very liberal reader-comment policy after being online so many years. Although most sites start out adhering to the concept of 'free speech' (minus the obvious, universal no-no's, of course), this gradually erodes as more and more restrictions are put in place -- at least that's been the case on every site I've ever followed. Power and control being an ever-present human temptation, I can understand the site owner's (apparent) point of view: reader comments become increasingly time-consuming to monitor, tiresome to read, and the liability danger more acknowledged (and fear is a BIG motivator in overriding a person's sense of, and adherence to, ethics and principle).
Re: toxic reader comments
Techdirt intentionally holds comments for moderation when they come from people they don't like. Basically, while Mike talks a good game about freedom of speech, he clearly isn't above manipulating it to make himself look better. Techdirt as a whole dislikes any viewpoint that doesn't echo whatever the trend of the month is on this site, and Mike and his moderators are very careful to keep things all uniform and positive in the messages presented.
It's the snowy white world of secret, quiet, and discreet management of viewpoints by blocking those they don't agree with. Some would call it censorship, but they get around that by allowing posts to finally appear days after they are submitted, when people are no longer actively reading the thread. It's pretty sneaky, even for Techdirt.
Re: Re: toxic reader comments
No, they hold comments that come from spamming or trolling idiots whose previous comments have regularly been reported as such by other users. If you're one of the people who suffers from moderation on a regular basis, you might want to read your comments and work out why. Unless they just contain lots of links, of course, in which case they're usually moderated and most of us (including myself) have had that happen. They DO get approved if they're acceptable, however.
In short, stop lying and you'll stop getting filtered - although I find it amusing that *this* is the level of censorship you find unacceptable. You don't get out much, do you?
Re: toxic reader comments
No, it's not the responsibility, de facto or otherwise. Sites that do this are simply choosing to give up freedom in exchange for (apparent) safety. That is their right, and in some cases it may even be understandable, but it is not a responsibility.
"Frankly, it's almost shocking that Techdirt still has a very liberal reader-comment policy after being online so many years."
Not really. Such policies are still not rare amongst the better sites. Sites that do delete comments for fear of a "knock on the door" are sites that don't have comment sections worth reading.
Note that I differentiate between deleting comments for fear of attention from the authorities and deleting comments because they are against the stated policies of the site in question. Enforcing an editorial policy is fair, acting out of fear is just cowardly.
Re: toxic reader comments
I imagine there are sites that patrol their comment sections out of fear. However, it is probably more prevalent that sites patrol their comment sections because they want to censor things they do not agree with.
"reader comments become increasingly time-consuming to monitor, tiresome to read ..."
Or, they tire of the additional overhead of going to legal counsel. Has anyone done a study of the effect on traffic (and ad revenue) when the comment section(s) are stifled or even completely removed?
Re: toxic reader comments
Frankly, it's almost shocking that Techdirt still has a very liberal reader-comment policy after being online so many years.
Well, the agents provocateurs we get here are OOTB&co., so really, it's not shocking at all.
Lets not totally close our minds...
Section 230 is one of those beliefs. When I look back at all the Section 230 cases that I have both handled and researched, it seems to me that it might be superior, from a policy perspective, to have some kind of notice-and-responsibility provision. If an OSP receives a notice that content somehow violates the complainant's rights, then the OSP can either (a) take it down, or (b) accept legal responsibility for the content.
I would, however, suggest that it not be quite that simple. There should be prevailing-party attorneys' fees on both sides, so if the OSP takes responsibility and gets sued, the content had better actually violate some legal right. I'd also like to see a provision that the author receive statutory damages (payable by the complaining party) if the content was taken down due to a bogus complaint. And perhaps even the right of the OSP to bill the complaining party for its fees in having a lawyer review the complaint, whether the complaint is bogus or legitimate (after all, the OSP should not be taxed with the cost of a letter-writing campaign by someone with censorious intent).
Section 230 has been a good engine for development of countless services, and that is a good thing. Unfortunately, it has also been a good engine for harmful arrogance on the part of a lot of 650/415 area code businesses.
There are companies that exercise responsibility. In my experience, Automattic is pretty protective of its users' rights, but it will not simply cover its ears and chant "Section 230, Section 230, Section 230" when concerns are brought to its attention. I've represented a number of companies that have internal responsibility policies that make me proud to represent them. I have Section 230 clients who don't give a shit too, and for as long as that is the law, I'll defend their right to not give a shit until they tell me to relent.
But, the problem with Section 230 is that it has become a license to not give a shit. The more irresponsible, the more profitable. That's not really desirable.
Re: Lets not totally close our minds...
Now, the attorneys' fees provision would help that a bit, but even with a guarantee of recovering all the costs you incurred going to court (or just the cost of a lawyer, should it not get that far), heading to court would still be a major hassle -- one that a lot of sites, and likely all smaller sites with user-created content, would rather avoid, leading to said content likely being pulled at the first sign of trouble.
I think the current system is probably the best way to handle it: if someone has a problem with content on a site, they go after the one who put it there. Afterwards, should they win in court, they can present that win to the site owner and have the content pulled. The site itself is only involved at the very end, and has minimal work to do.
Re: Re: Lets not totally close our minds...
But, I can't say that I don't have personal feelings of sympathy for the poor bastards trying to deal with that. Did the original poster use Tor? Tough shit for you then. Is the original poster judgment-proof? Tough shit for you then.
And, if someone does wind up proving that the content violates some right, they do so only after five-to-six figures in attorneys' fees and two years of fighting. Meanwhile, the OSP could have made a judgment call.
Now sometimes they do. I've represented Sect. 230 businesses who would act responsibly, and take clearly violative content down. I've also represented aggrieved parties, and sent Sect. 230 businesses letters stating "I know you're protected by Section 230, but here's why this should come down..." and I have had good results.
The current system is very nice for me, Google, and Facebook. I love the money I make defending Section 230. Google loves having as much content as it can, without giving half a shit about anyone else. But, to think that the current system is the best one is to think quite narrowly.
Re: Re: Re: Lets not totally close our minds...
However, there is no such thing. Whatever is put in place will be gamed and those that censor will have free rein with little recourse.
Re: Re: Re: Lets not totally close our minds...
How does this fit into the long-standing legal tradition of treating anonymous speech as protected by the First Amendment? From your tone in what I quoted, it sounds like you may wish to erode that protection, but I'm not sure, so I'm asking.
Re: Re: Re: Re: Lets not totally close our minds...
To answer your question, I do not wish to erode the protection for anonymous speech. I think that the balance we have is pretty good -- when the courts actually follow it. I think a very good case articulating the standard is Krinsky v. Doe 6: http://www.dmlp.org/sites/citmedialaw.org/files/2008-02-06-Krinsky_v._Doe_Opinion.pdf
The bottom line is, if the speech is truly actionable (and shows that before revealing the speaker's identity), then the speaker should not be able to evade liability only because the speaker manages to hide. On the other hand, I do not think that plaintiffs should be able to unmask anonymous speakers without making a showing that the speech is actionable, because First Amendment.
Corporations have their Mission Statements and claims of being community members but when it comes to the bottom line, money matters most. I would not lay the blame for this at the feet of section 230.
Abused
Should they be sued and exposed?