Wired's Big Cover Story On Facebook Gets Key Legal Point Totally Backwards, Demonstrating Why CDA 230 Is Actually Important
from the bad-reporting dept
If you haven't read it yet, I highly recommend the latest Wired cover story by Nicholas Thompson and Fred Vogelstein, detailing the past two years at Facebook and how the company has struggled to come to grips with the fact that its platform can be used by people to do great harm (such as sowing discontent and influencing elections). It's a good read, deeply reported (by two excellent reporters), with some great anecdotes, including the belief that an investigation into Facebook by then-Connecticut Attorney General Richard Blumenthal a decade ago was really an astroturfing campaign by MySpace:
Back in 2007, Facebook had come under criticism from 49 state attorneys general for failing to protect young Facebook users from sexual predators and inappropriate content. Concerned parents had written to Connecticut attorney general Richard Blumenthal, who opened an investigation, and to The New York Times, which published a story. But according to a former Facebook executive in a position to know, the company believed that many of the Facebook accounts and the predatory behavior the letters referenced were fakes, traceable to News Corp lawyers or others working for Murdoch, who owned Facebook’s biggest competitor, MySpace. “We traced the creation of the Facebook accounts to IP addresses at the Apple store a block away from the MySpace offices in Santa Monica,” the executive says. “Facebook then traced interactions with those accounts to News Corp lawyers. When it comes to Facebook, Murdoch has been playing every angle he can for a long time.”
That's a pretty amazing story, and it certainly could be true. After all, just a few years later came the famous NY Times article about how companies were courting state attorneys general to attack their competitors (which came up again later, when the MPAA -- after reading that NY Times article -- decided to use the same strategy to go after Google). And Blumenthal had a long history, as Attorney General, of grandstanding about tech companies.
But, for all the fascinating reporting in the piece, it's troubling that Thompson and Vogelstein get some very basic facts wrong -- and, unfortunately, one of those basic facts is a core peg used to hold up the story. Specifically, the article incorrectly points to Section 230 of the Communications Decency Act as a major hindrance to Facebook improving its platform. Here's how the law is incorrectly described, in a longer paragraph explaining why Facebook "ignored" the "problem" of "fake news" (scare quotes on purpose):
And then there was the ever-present issue of Section 230 of the 1996 Communications Decency Act. If the company started taking responsibility for fake news, it might have to take responsibility for a lot more. Facebook had plenty of reasons to keep its head in the sand.
That's... wrong. I mean, it's not just wrong by degree, it's flat out, totally and completely wrong. It's wrong to the point that you have to wonder if Wired's fact checkers decided to just skip it, even though it's a fundamental claim in the story.
Indeed, the whole point of CDA 230 is exactly the opposite of what the article claims. As you can see if you read the law yourself, it specifically encourages platforms to moderate the content they host, by saying that the moderation choices they make do not create liability. This is the very core of CDA 230:
No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected
This is the "Good Samaritan" clause of CDA 230, and it encourages platforms like Facebook to "take responsibility for fake news" by making clear that, no matter what moderation choices Facebook makes, those choices won't create liability for the content it reviews. Changing CDA 230, as many people are trying to do right now, is what would create incentives for Facebook to put its head in the sand.
And yet, Thompson and Vogelstein repeat this false claim:
But if anyone inside Facebook is unconvinced by religion, there is also Section 230 of the 1996 Communications Decency Act to recommend the idea. This is the section of US law that shelters internet intermediaries from liability for the content their users post. If Facebook were to start creating or editing content on its platform, it would risk losing that immunity—and it’s hard to imagine how Facebook could exist if it were liable for the many billion pieces of content a day that users post on its site.
This one is half right, but half misleading. It's true -- under the Roommates case -- that if Facebook creates content that breaks the law, it remains liable for that content. But it is not liable for editing or moderating the content users post on its platform, as that sentence implies.
Indeed, this is a big part of the problem we have with the ongoing debates around CDA 230. So many people insist that CDA 230 incentivizes platforms to "do nothing" or "look the other way" or, as Wired erroneously puts it, to keep their "head in the sand." But that's not true at all. CDA 230 not only enables, but encourages, platforms to be more active moderators, by making it clear that the choices they make about moderating content (outside the context of copyright, which uses a whole different set of rules) don't create new liability for them. That's why so many platforms are trying so many different things (as we recently explored in our series of stories on content moderation by internet platforms).
What's really troubling about this is that people are going to use the Wired cover story as yet another argument for doing away with (or at least punching giant holes in) CDA 230. They'll argue that we need to change the law to encourage companies like Facebook not to ignore the bad behavior on their platforms. But the real lesson of the story -- which should have come through if the reporting had been done more carefully -- is that CDA 230 is exactly what encourages that behavior. Facebook is able and willing to change and experiment in response to increasing public pressure only because CDA 230 gives the company the freedom to do so. Adding liability for wrong decisions is what would actually make the problem worse, and would encourage platforms like Facebook to do less.
It's tragic that in such a high-profile, carefully reported story, a key part -- indeed, a part on which much of the story itself hinges -- is simply, factually wrong.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: cda 230, content moderation, fake news, filtering, moderation, section 230, wired
Companies: facebook
Reader Comments
"Good Samaritans" must be GOOD is key point. And must not be arbitrary, nor is a "provider" given full power over "users". -- It's for The Public's good, in any case, NOT the "provider" as such.
Of course, corporatists can only read that section without "good faith" or "user" because want corporations to be given power besides immunities which print publishers don't get.
In short, Section 230 is an unworkable tangle that must be revised in light of what's now known of how corporations abuse the power precisely to deny "natural" persons outlet for Constitutionally protected speech.
A nice little "Erratum" item at Wired would go a long way toward clearing that up, and would be a searchable citation for fact checkers on the next publication's CDA 230 story.
Re: "Good Samaritans" must be GOOD is key point. And must not be arbitrary, nor is a "provider" given full power over "users". -- It's for The Public's good, in any case, NOT the "provider" as such.
"By the way, from quoted: a 'user' is given same authority as 'provider', so if users considers, say, Facebook's advertising objectionable, then surely must be provided a way to block it!"
The law does not, in any way, say that you have to be given tools to block (though such tools can be a good feature if platforms decide to offer them), but I do find it bizarre that you make this claim in the same comment where you insist that the law should not allow any blocking at all. Which one is it? Are you for or against content moderation?
And no one ignores the "good faith" aspect of the law, despite your constant assertions to the contrary.
Re: "Good Samaritans" must be GOOD is key point. And must not be arbitrary, nor is a "provider" given full power over "users". -- It's for The Public's good, in any case, NOT the "provider" as such.
Facebook is free to do whatever it pleases with its platform. In striving to provide a good environment for most users, it moderates the platform. CDA 230 ensures it is not liable for any user action even if it tries to moderate. I wonder where you got the delusional idea that the platform has to do everything the user wants.
"Of course, corporatists can only read that section without "good faith" or "user" because want corporations to be given power besides immunities which print publishers don't get."
Comparing apples with ostriches, are we? If they publish user-generated content, they get the protections. Most print publishers have online counterparts, and those are protected as well. They may be liable for content they produce themselves, as has been seen in other cases where platforms lost their CDA 230 protections.
"In short, Section 230 is an unworkable tangle that must be revised in light of what's now known of how corporations abuse the power precisely to deny "natural" persons outlet for Constitutionally protected speech."
Nobody is obliged to let you say whatever you want except the government. The 1st Amendment applies to the government, not to corporations. CDA 230 is what prevents idiots like you from trying to make platforms liable for what their users produce. It's what has allowed the Internet to flourish.
not surprising
Hey, there should be an FDA regulation! When something that was good gets fucked up by some media mogul douche, they should have to run a picture of said douche on the front page for at least 4 issues. That way everybody knows who to punch in the face if they see them on the street. /s
Re: "Good Samaritans" must be GOOD is key point. And must not be arbitrary, nor is a "provider" given full power over "users". -- It's for The Public's good, in any case, NOT the "provider" as such.
Also, if you decide to run your own sites, you will have to vet everything that you allow to be published, and if you accidentally let something through that is against the law, you will be held liable.
In other words, what you keep demanding would most likely silence you completely.
Re: "Good Samaritans" must be GOOD is key point. And must not be arbitrary, nor is a "provider" given full power over "users". -- It's for The Public's good, in any case, NOT the "provider" as such.
Where does it say the user is given the same authority as the provider? I don't see where it says that anywhere in CDA 230.
Also: grammar, man! Learn it! Good lord, I can barely make out what that gibberish is trying to say.
Re: not surprising
That's a real shame. Following the space industry, I'd occasionally see good (and often surprised) things said about them. I even kept a couple quotes:
I seem to remember some case law...
Re: I seem to remember some case law...
Also note that most of those pushing for the removal of Section 230 want to make sites liable for user-published content -- and if it is repealed, expect the RIAA/MPAA to go on a rampage.
Re: Re: "Good Samaritans" must be GOOD is key point. And must not be arbitrary, nor is a "provider" given full power over "users". -- It's for The Public's good, in any case, NOT the "provider" as such.
Platforms like Facebook and Twitter, even YouTube, will find it next to impossible to continue to operate at all. The costs of moderation would destroy their business model. We'll go back to the days of media being controlled entirely by media corporations, all the power back in their hands. And this fool (blue) will be partly to blame.
The edit issue
"the ever-present issue of Section 230"
The issue I see is that 230 is not ever-present. That "safe harbour" provision is under threat, and Facebook would become legally vulnerable were it to disappear.
"If Facebook were to start creating or editing content"
The hypothetical here is if Facebook were to publish articles the way BuzzFeed does. Then it gets into legal hot water, because there is no clear division between what it "allowed" and what it "contributed". You know, like Wikileaks.
So: Techdirt, my question is: did you reach out to the Wired authors for comment on why they bashed 230? I hope so.
hold it..
Understand all the laws that would apply to ANY FORUM, from 200 countries.
IF you were made responsible for ANYTHING said/posted/sent to your site... OMFG.
Isn't this a form of censorship?? And keeping up with 200 countries, deleting anything and everything, would leave EVERY SITE BARE AND NAKED..
Some countries you CAN'T NAME officials..
Some you can't criticize officials..
NO bad words,
No spitting,
No pubic hairs,
NO WORDS TO EXPRESS ALLAH..
NO FEMALE POSTS..
No opinions against/for a religion..
Anyone?
Could you say/post/send anything??
Free speech is strange, as there are strange people. Strange reasonings...
Re:
Please show any place in the Constitution, US Code, or caselaw that states that anyone, corporation or otherwise, is required to give you a platform for your speech.
Be careful what you wish for...
"We'll go back to the days of media being controlled entirely by media corporations, all the power back in their hands. And this fool (blue) will be partly to blame."
Which is extra funny when you consider how rabidly anti-corporation they like to present themselves as. Not only would they be forced to either create an account or not post, but they'd have provided a huge boon to the very groups they claim to hate so much.
Truly, Blue is a gift that keeps on giving.
Brilliant
Write a solid article, and get one of the core points not just wrong but dead backwards. I'm not sure which is the better possibility -- that they screwed up so epically on purpose, or by accident -- but neither case is good.
Of course they would never think of letting fake news through, or modding (I don't know if that's the right term) posts up because they fit their cause du jour.
It's not like anyone is holding their feet to the fire on anything nowadays.
I think that outside of posts that advertise or advocate unlawful things, or things that bog their site down like spam, nothing should be marked up or down by the site itself.
Sites should not be immune if they want to moderate things up or down to suit their agenda. Users who have a clue would be blocking others who are obnoxious, or maybe they would give up their Crackbook altogether.
And yes, I despise Facebook and its methods. My GF gets all her "news" from her FB feed. No point in arguing with her, because everybody on FB "knows" whatever is true, since everybody on FB is reposting it or whatever they do.
Sorry for getting off topic a bit in that last paragraph.
The first casualty of media was fact checkers....
They should fix it.
They won't.
Maybe pay the top guy a little less & bring back the fact checkers; it seems that without them you have a serious problem with fake news on your platform... and 230 won't protect you.
Re: a bunch of rot
On a related note, our resident dipshits have worked out a means to avoid having their comments hidden by overloading the subject lines with their blather.
I suggest that some means be adopted to also hide the portions of subject lines that originate from hidden comments. It would take some work on the back end, but it seems more in line with this place than the second-best solution: limiting the amount of text that can go in the subject line.
Just a thought.
Re:
While I'm sure it's nice that the strawman Mike you've got in your head did just what you wanted it to, here in the real world your comment doesn't even come close to what was actually written, as those of us who read it can clearly see.
Nice try though.
Re:
This is groundbreaking!
Re:
"Phew, I was worried for a minute that the other post about Facebook today wouldn't be offset by a 'we should let Facebook do whatever they want, forever' post by Masnick. Crisis averted."
I recognize you're trolling, but can you explain where I have ever said that Facebook should be able to do whatever it wants forever? Or can we just agree that you make up stupid strawmen that the imaginary "Mike Masnick in your head" says, rather than responding to anything I actually say?
Because that would make this a lot faster.
Re: Re: Re: a bunch of rot
Just sayin'.
One feature that would really help with the clutter…
if the first post were flagged and then hidden from view.
Without that clutter we'd see more adult conversations.
[ …and fewer of us would waste time arguing with fools. ;]