Explainer: How Letting Platforms Decide What Content To Facilitate Is What Makes Section 230 Work
from the Congress-got-this-right dept
There seems to be some recurrent confusion about Section 230: how can it let a website be immune from liability for its users' content, and yet still get to affect whether and how that content is delivered? Isn't that inconsistent?
The answer is no: platforms don't lose Section 230 protection if they aren't neutral with respect to the content they carry. There are a few reasons, one being constitutional. The First Amendment protects editorial discretion, even for companies.
But another big reason is statutory, which is what this post is about. Platforms have the discretion to choose what content to enable because making those moderation choices is one of the things Section 230 explicitly protects them in doing.
The key here is that Section 230 in fact provides two interrelated forms of protection for Internet platforms as part of one comprehensive policy approach to online content. It does this because Congress had two problems it was trying to solve when it passed the law. One was that Congress was worried about there being too much harmful content online. We see this evidenced in the fact that Section 230 was ultimately passed as part of the "Communications Decency Act," a larger bill aimed at minimizing undesirable material online.
Meanwhile, Congress was also worried about losing beneficial online content. This latter concern was particularly acute in the wake of the Stratton Oakmont v. Prodigy case, where an online platform was held liable for its user's content. If platforms could be held liable for the user content they facilitated, they would be unlikely to facilitate it. That would mean a reduction in beneficial online activity and expression, which, as the first two subsections of Section 230 itself make clear, was something Congress wanted to encourage.
To address these twin concerns, Congress passed Section 230 with two complementary objectives: encourage the most good content, and the least bad. Section 230 was purposefully designed to achieve both these ends by providing online platforms with what are ultimately two complementary forms of protection.
The first is the one that people are most familiar with, the one that keeps platforms from being held liable for how users use their systems and services. It's at 47 U.S.C. Section 230(c)(1).
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
It's important to remember that all this provision does is say that the platform cannot be held liable for what users do online; it in no way prevents the users themselves from being held liable. It just means that platforms won't have to fear their users' online activity and thus feel pressured to overly restrict it.
Meanwhile, there's also another, lesser-known form of protection built into Section 230, at 47 U.S.C. Section 230(c)(2). This provision makes it safe for platforms to moderate their services if they choose to, because it means they can choose to.
No provider or user of an interactive computer service shall be held liable on account of
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
Some courts have even read subsection (c)(1) to cover these moderation decisions too. But ultimately, the wisdom of Section 230 is that it recognizes that to get the best results – the most good content and also the least bad – it needs to ensure platforms can feel safe to do what they can to advance both of these goals. If they had to fear liability for how they chose to be platforms, they would be much less effective partners in achieving either. For instance, if platforms had to fear legal consequences for removing user content, they simply wouldn't remove it. (We know this from FOSTA, which, by severely weakening Section 230, has created disincentives for platforms to try to police user content.) And if platforms had to fear liability for enabling user activity on their systems, they wouldn't do that either. They would instead end up engaging in undue censorship, or cease to exist at all. (We also know this from FOSTA, which, by weakening Section 230, has driven platforms to censor wide swaths of content, or even stop providing platform services to lawful expression.)
But even if Section 230 protected platforms from only one of these potential forms of liability, it would not just be less effective at achieving Congress's overall goal of getting both the most good and least bad content online than protecting them in both ways is; it would be less effective at achieving even one of those outcomes than the balanced approach is. The problem is that whenever platforms find themselves needing to act defensively, out of fear of liability, it undermines their ability to deliver the best results on either front. The fear of legal liability forces platforms to divert resources away from the things they could be doing to best ensure they facilitate the most good, and least bad, content, and to spend them instead only on whatever shields them from the particular legal threat commanding their outsized attention.
As an example, see what happens under the DMCA, where Section 230 is inapplicable and liability protection for platforms is highly conditional. Platforms are so fearful of copyright liability that they regularly over-delete lawful, and often even beneficial, content (a result inconsistent with Congress's legislative intent), or waste resources weeding out bad takedown demands. It's at least fortunate that the DMCA expressly does not demand that platforms actively police their users' content for infringement, because if they had to spend their resources policing content in this way, it would come at the expense of policing it in ways more valuable to the user community and the public at large. Section 230 works because it ensures that platforms are free to devote their resources to being the best platforms they can be, enabling the most good and disabling the most bad content, instead of having to spend them on activities focused only on what protects them from liability.
To say, then, that a platform that monitors user content must lose its Section 230 protection is simply wrong, because Congress specifically wanted platforms to do this. Furthermore, even if you think that platforms, even with all this protection, still don't do a good enough job meeting Congress's objectives, it would still be a mistake to strip them of what protection they have, since removing it will not help any platform, current or future, ever do any better.
What tends to confuse people is the idea that curating the user content appearing on a platform turns that content into something the platform should now be liable for. When people throw around the imaginary "publisher/platform" distinction as a basis for losing Section 230 protection, this is what they are getting at: the notion that by exercising editorial discretion over the content appearing on their sites, platforms somehow make that content something they should now be liable for.
But that's not how the law works, nor how it could work, and Congress knew that. At minimum, platforms simply facilitate far too much content for them to be held accountable for all of it. Even when they do moderate content, it is still often at a scale beyond which it could ever be fair or reasonable to hold them accountable for whatever remains online.
Section 230 never required platform neutrality as a condition for a platform getting to benefit from its protection. Instead, the question of whether a platform can benefit from its protection against liability in user content has always been contingent on who created that content. So long as the "information content provider" (whoever created the content) is not the "interactive computer service provider" (the platform), Section 230 applies. Curating, moderating, and even editing that user content to some degree doesn't change this basic equation. Under Section 230 it is always appropriate to seek to hold responsible whoever created the objectionable content. But it is never okay to hold liable the platform they used to create it, which did not.
Filed Under: balance, cda 230, congress, content moderation, free speech, section 230
Reader Comments
"if they aren't neutral"
It may be beneficial to their argument if they were to define exactly what the word "neutral" means to them. I imagine there would be plenty of differing responses.
Except the editorial decisions in question were not supposed to reflect political bias, but instead the level of "offensiveness" in a nonpartisan sense.
The bug in the system...
The problem with section 230 is that under the provisions you describe above, a platform that simply allows users to upload content and then does post-hoc moderation is exempt from pretty much all forms of liability related to that content. An organization (like a traditional publisher) that does pre-publication moderation/curation and enters into a paid agreement with a creator does not gain the same immunity.
This encourages a "slave labor" market for content creators. Or at best, an expansion of the "gig economy" where creators are dependent on the scraps of income they can get from ad revenue sharing, viewer donations and merch sales. We're encouraging freedom of the "press" at the expense of further hollowing out of incomes for the middle class. Only a select few get the views and clicks that earn them a steady income.
While this "luck of the draw" has always been true in entertainment media, the time investment has changed to make the effort more risky for creators. In the old model, a creator has to audition or apply for projects. But once they get accepted for a project, they have a contracted rate of pay. Applying for a project requires time investment for updating your C.V., rehearsing audtion scripts, or whatever, but if the projects are available, you can pretty much apply continuously until you get a gig.
Under the platform model, a creator has to select a small number of projects and focus on them more intensely over a longer period of time to actually create the content, with no guarantee of reward. The risk increase comes from the fact that there are fewer eggs in the basket of opportunities under the new model. Some content creators will enjoy the new "lotto" economy: benefits to creators under the new system include a greater share of the income if they do score big. Others prefer less risk and more stability.
There's an unresolved tension of benefits and trade-offs between the old and new models. I'm not saying section 230 needs to change, but a new publication model that mitigates the economic risks of pure "platform" publication while still maintaining the Section 230 immunities for the platform host would be useful here.
And you can create that model without having to wring your hands over CDA 230 and the “publisher vs platform” issue.
Re: The bug in the system...
What you are ignoring is that under the publisher pre-moderated model, most creative content never gets looked at. Those that get selected for publication have won a lottery ticket and have a chance at a big prize, while everybody else gets nothing and has no chance of getting anything for their creative efforts.
Preaching to the choir and pissing into the wind
This is another good, clear-cut article on why Section 230 is just fine and frequently misunderstood. At this point, though, these articles all say essentially the same thing and are delivered to an audience that already knows and agrees with them for the most part. The few outliers/trolls will never be convinced no matter how many sane articles are published and will continue to attack the people who post and comment on them rather than offering any kind of reasonable argument against them, all out of a willful misunderstanding of how things work.
In short, all these stories do now is offer yet another opportunity for the densest among us to spew their bile and hate, further demonstrating their ignorance.
It's a critically important topic and one we need to get our densest representatives to leave be. But there must be something more we can do than keep posting pro-230 articles in the same venues (shades of echo chamber) and fruitlessly writing our representatives to advance good sense.
Re: The bug in the system...
Very simple explanation here: think of this site right here, where you are commenting. TD should in no way whatsoever be held liable for the comments, which they did not post. They have every right to moderate as they see fit in order to maintain a sense of community and to rid the comments section of spam.
Now, if they post something to their site, such as the article you just read, they can be held liable for that post because TD is the entity that did the posting.
Why is it so hard for people to understand that liability should be placed where it belongs, on the person who made the post, and not on the site where the post was made?
Re:
No, clarity is rarely useful in politics. Defining words makes it much harder to weasel out of criticism by saying "of course that's not what I meant". Facts can be refuted; emotional arguments can't.
Re: Re:
Hence, no one believes their bs
I don't think the people who claim they want "neutrality" actually want that. They just want the content they approve of to be immune from removal. When you then point out that if a platform is forced to host their content, the platform (to maintain its neutrality) must host other content that the complainers might find objectionable, they sputter out, "well, er, um... that's DIFFERENT!" without explaining why it's different or how such a difference would be codified into the "neutrality" rule and how a platform would be expected to be able to filter one but not the other.
Re: Re: The bug in the system...
At this point, I don't think people do have trouble understanding that.
I think people understand that, but don't want it to be that way, because it means they can't go after the organizations that actually have money.
It all comes down to greed, masquerading as concern for these poor defamed people. Note they can never point to specific examples.
Re: The bug in the system...
That's an interesting straw man you have there... would be a shame if something happened to it...
Oh, wait, old model, "enters into a paid agreement with a creator" vs new model, "posts to website and receives payment from interested individuals". You are trying to compare a contracted service for creative individuals (i.e. work for hire, so the result is owned by the 'corporation' that the artist sold their soul to) to content creators creating what they want and selling it. GEE, I wonder why they aren't the same? (Hint: one heavily benefits the 'corporation' while the other has the potential to benefit the actual 'creator'. I know it's hard to see the difference when your job depends on you not understanding, but we'll wait; it's a fairly simple topic.)
Re: Re: Re: The bug in the system...
Nope, the issue is not greed, unless you want to talk about Kamala Harris, now a senator, going after Backpage.
The issue is that some bad things (TM) have gone down on the big platforms, and they were easy. Harassment, Alex Jones style? That is now easy; Mr Jones himself did not do it! Radicalization, let’s turn you into a terrorist, all with Facebook or what have you. And let’s not forget the moral panic over porn or child porn, or to mix in Hollywood unhappy about you possibly seeing its movies without paying, or, god forbid, watching them in Urdu!
The result? DO SOMETHING! (Especially if it is wrong, lol)
What we actually need is for the big platforms to start offering those technical means mentioned, including on a federated basis. That federation of interested users voting on posts is a big part of what makes Techdirt work.
For another proposal, suppose Alex Jones’s confederate harassers started getting legal notices when they contacted Sandy Hook victims, warning that they were about to be de-anonymized and risked liability if they continued.
Re: Preaching to the choir and pissing into the wind
Thing is, this choir needs some new ideas.
How would you combat an Alex Jones harassment campaign or an anti-vaxxer idiot? Oh, and don’t call those two THEM, because they could be you or me; that is why censorship does not work.
Re:
I think that's the reason they all sink to insulting the authors and commenters. It's their last line of defense for their terrible ideas.
Re: Re:
Are they really ideas? More like psychosis.
Spin spin spin
Re: Re: The bug in the system...
You know what, that's a good point -- particularly for writers and to a lesser extent for directors and musicians. For actors, makeup artists and crew, maybe not as much. But it still knocks a lot of the wind out of the original argument.
Re: Re: The bug in the system...
My point wasn't about who should be liable for action; it's about how the non-paying model is a crapshoot for creators. The poster above pointed out that a lot of creators like writers and musicians still have to put in way more effort up front under both of the publishing models I was comparing, which is a much more relevant argument.
Re:
CDA 230 isn't the bug I'm talking about here.
I'm not wringing my hands over CDA 230, I'm wringing my hands over how the new social media platforms are just as predatory as the old school publishers in terms of getting something for nothing out of underpaid creators. IMO CDA (as an unintended consequence) incentivizes development of platform-based businesses over publishing businesses, but that doesn't necessarily mean that CDA is what needs to change.
For platforms that prioritize views and clicks, simple economics incentivizes unpaid user contributions over paid user contributions within a platform. Plus it's not worth it to them to open the can of worms for the courts to decide whether paid user content would cross the line from service provider to publisher. Originally, I was claiming that the new platforms are even more predatory than the old school, but in retrospect, that was ridiculous (Hollywood accounting, among other examples).
I'm not here just to complain about CDA or about the platforms though, I'm trying to envision if there's a business model that can:
a) properly compensate creators.
b) avoid the injection of obscene amounts of obscenity into the site, resulting in huge moderation costs.
c) fall into the realm of a media platform service.
and the catch is that there's a fundamental conflict between a) and b). If the platform is big enough for creators to get rich, it's big enough that the trolls will move in. Or if it's small enough to slip under the radar of the general public, it's too niche to make money on.
One way to mitigate this is with platforms where creators aren't expecting compensation, such as this comment section. But comments are a drop in the ocean of user-created content.
Re: Re: Re: The bug in the system...
Except for the paragraph that I quoted where you say exactly that.
But as to your point about paid/non-paid, when has it ever been anything but a crapshoot? In the old gatekeeper model, you have <1%; in the new model, you have >1%. Either way, it's still a crapshoot.
Re: Re:
I get what you are trying to say, but how can you properly compensate a creator on your platform if that creator is not generating any revenue for the platform?
Let's take a hypothetical: I am a musician. My chances of signing a record contract (if that is what I really wanted) are very slim. Instead, I upload my music to YouTube (or some other site), and it is very unpopular and nobody likes it and basically I suck as a musician. How should YouTube properly compensate me?
What happens if I actually am a talented musician and my work becomes popular, therefore earning the platform some revenue? How should YouTube compensate me? Isn't their revenue sharing model a good thing, in that I have an opportunity to make money on my music, something that I could never have had in the gatekeeper model? I may not get rich off of my music, but I am making a lot more money off it than I could have in the old model.
Re: Re:
Publishing for free on the Internet is a voluntary activity, as is most creation of new works. That said, the Internet has enabled more people to build a fan base and go full time with their creativity, mainly relying on services like Patreon, than the pre-Internet find-a-publisher model ever did.
Creating new works has never been the problem for creators, while gaining a fan base always has. It is now no longer necessary to gain a publishing contract to gain a world wide audience for anyone who can use social media to their advantage.
So, need to get Trumpians in. And you'd have no objection.
You've here given up the philosophical basis.
Therefore, your whole support of these mega-corporations is based on that you think "the right people" are in charge, and that "The Right" won't ever be.
Cool. Just don't complain if corporations turn out to not actually share your views, but to be amoral money-machines. Because YOU have enabled them to control ALL speech on the internet.
Re: So, need to get Trumpians in. And you'd have no objection.
Fuck off nazi scum
Re:
Have a FOSTA vote, blue. Brought to you by your favorite corporations! How about them apples?
Re:
What a strange comment. What in the article is "spin"? It's a rather detailed explanation from a practicing lawyer.
And from where did you get your Law Degree Mike??? oh... wait ..... what??? Fuck!
Re:
Where’s your degree from bro? Uncle McDonald’s Fry Academy?
Conservatives are reaping what they've sown
It's so bizarre watching conservatives complain about corporate tyranny when they defended it for generations. To quote the Epistle to the Galatians, you reap what you sow.
It was conservatives who supported the Telecommunications Act of 1996 in particular and defanging antitrust law in general, yet they complain about how Facebook and Google have no real competition.
When malls were taking over as the new town square, conservatives vehemently defended their property rights and were incensed by the Pruneyard decision in California. As recently as a few years ago, they were in favor of the Mall of America ejecting Black Lives Matter protesters. But now that it's them getting kicked off social media, they cry foul.
Conservatives defended the monstrosity of at-will employment for generations -- the United States is the only developed country with such a hideous arrangement. In at least one state (Colorado) it was put on the ballot, and conservatives overwhelmingly voted in favor of it. But now that it's them getting fired for off-the-clock, non-performance related issues (most often bigoted social media posts), they whine about how unjust it is.
Conservatives stood by and cheered as unions were destroyed in this country -- a union with a collective bargaining agreement is one of the few bulwarks against at-will employment, and one of the few organizations that will go to bat for wage earners.
In America, if you lose your job you often also lose your healthcare -- and it was conservatives who rallied against public healthcare plans by President Clinton in the 1990s, President Obama in 2010, and Medicare for All by left-leaning politicians such as Bernie Sanders and AOC even today.
Conservatives are finally realizing that giving near total control of our society over to a group of unelected robber barons (and contrary to the Horatio Alger nonsense we are force fed in school, the overwhelming majority of them did nothing to earn it except be born to wealthy parents -- most wealth is inherited) was not such a good idea.
Re: Re:
Woooooshhh.... That was the sound of that comment going over your head.
Did you not notice that Mike didn't write this article, and the author who did write the article actually does have a law degree?
I gave the comment a funny vote; why did you have a hard time understanding it?
Re: Re: Re: Re: The bug in the system...
In politics, greed is always an issue.
Re: Conservatives are reaping what they've sown
If it does not support their latest get-rich-quick scheme, it gets tossed like a used prophylactic.
Re: Re: So, need to get Trumpians in. And you'd have no objection.
Wow, solid content here! Glad to see more people in the camp of "Everyone I disagree with is a nazi!". This is not productive and provides nothing to the conversation, since you aren't debunking anything he/she/it is adding to the conversation.
Re: Re: Re: So, need to get Trumpians in. And you'd have no objection.
The OP is constantly being debunked and proven wrong but still returns and spews the same racist BS on just about every post on this site. So, sometimes the retort needs to just be short and to the point.
Re: So, need to get Trumpians in. And you'd have no objection.
Nope, only the speech on THEIR platforms.
I simply want these companies like Google/YouTube to say what they are. Are they a content provider or a Publisher? You can't have it both ways.
If you're a content provider, then you're protected under 230. It's open to EVERYONE. You can't just ban a person because YOU don't like what they are saying. That's not how it works under 230.
If you're a Publisher, there is no 230 protection. But you can ban whoever and whatever to your heart's content. At least you're upfront about it.
You can't have it both ways. I see what's going on with YouTube as the elections get closer. More people are getting banned, or getting blocked from being paid by YouTube. These are leftist sites. These leftists are banning the right, while the leftist channels with the real hate and violence continue on, getting away with clear rules violations.
If YouTube wants to continue down that path, fine, but you are then a Publisher and you no longer have 230 protection. Do a good job with being a publisher.
Re:
It's funny how people with extremist leanings scream about their rights when it could be to their advantage, while at the same time urging people to trample the rights of those they hate.
Re:
Um, that's exactly how it works under 230. You set the rules for your site, and you can moderate at your discretion. 230 protects you if you moderate some things but do not moderate other things - it's plainly stated in the language of the law.
If someone who gets banned wants to argue that the moderation is not occurring "in good faith", they could take that up in the courts. It'd be a high bar to clear, though, starting with needing actual evidence, not anecdotal information.
Re: Re: Re: Re: So, need to get Trumpians in. And you'd have no objection.
"Everyone I disagree with is a nazi" is getting quite tiresome.
There must be a bigger nazi problem than I thought, for so many to be trying to desensitize us to it.
I did nazi it coming