Why The Ninth Circuit's Decision In Lemmon v. Snap Is Wrong On Section 230 And Bad For Online Speech

from the another-hard-case dept

Foes of Section 230 are always happy to see a case where a court denies a platform its protection. What's alarming about Lemmon v. Snap is how comfortable so many of the statute's frequent defenders seem to be with the Ninth Circuit reversing the district court to deny Snapchat this defense. They mistakenly believe that this case raises a form of liability Section 230 was never intended to reach. On the contrary: the entire theory of the case is predicated on the idea that Snapchat let people talk about something they were doing. This expressive conduct is at the heart of what Section 230 was intended to protect, and denying the statute's protection here invites exactly the sort of harm to expression that the law was passed to prevent.

The trouble with this case, like so many other cases with horrible facts, is that it can be hard for courts to see that bigger picture. As we wrote in an amicus brief in the Armslist case, which was another case involving Section 230 with nightmarish facts obscuring the important speech issues in play:

"Tragic events like the one at the heart of this case can often challenge the proper adjudication of litigation brought against Internet platforms. Justice would seem to call for a remedy, and if it appears that some twenty-year old federal statute is all that stands between a worthy plaintiff and a remedy, it can be tempting for courts to ignore it in order to find a way to grant that relief."

Here some teenagers were killed in a horrific high-speed car crash, and of course the tragedy of the situation creates an enormous temptation to find someone to blame. But while we can be sympathetic to the court's instinct, we cannot condone the facile reasoning it employed to look past the speech issues in play because acknowledging them would have interfered with the conclusion the court was determined to reach. Especially because at one point the court even recognized that this was a case about user speech, before continuing on with an analysis that ignored its import:

Shortly before the crash, Landen opened Snapchat, a smartphone application, to document how fast the boys were going. [p.5] (emphasis added)

This sentence, noting that the boys were trying to document how fast they were going, captures the crux of the case: the users were using the service to express themselves, albeit in a way that was harmful. But that is exactly what Section 230 is built for: to insulate service providers from liability when people use their services to express themselves in harmful ways, because, let's face it, people do it all the time. The court here wants us to believe that this case is somehow different from the sort of matter where Section 230 would apply, and that this "negligent design" claim involves a sort of harm that Section 230 was never intended to reach. Unfortunately that view is not supported by the statutory text or most precedent, and for good reason: as explained below, it would eviscerate Section 230's critical protection for everyone.

As it did in the Homeaway case, the court repeatedly tried to split an invisible hair to pretend it wasn't imposing liability arising out of the users' own speech. [See, e.g., p. 10, misapplying Barnes v. Yahoo]. Of course, a claim that a service for facilitating expression was negligently designed is inherently premised on the idea that there was a problem with the resulting expression. And the fact that the case was not about a specific form of legal liability manifest in the users' speech did not put it outside of Section 230. Section 230 is a purposefully broad law ("No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."), and here the court wants the platform to take responsibility for how its users used its services to express themselves. [p. 15, misapplying the Roommates.com case].

Section 230 also covers everything that could be wrong with expression unless the thing wrong with it happens to fall into one of the few exceptions the statute enumerates: it involves an intellectual property right, violates federal criminal law, or otherwise implicates FOSTA. None of those exceptions applies here, and, in fact, the same section of the law that sets forth these few exceptions also contains a pre-emption provision explicitly barring any state law from becoming the basis of any new ones. Yet that is exactly what the Ninth Circuit has now allowed to happen by giving the go-ahead to a state law-based tort claim of "negligent design."

It hurts online speech if courts can carve out new exceptions. If judges can ever take a post hoc look at a situation where expressive activity has led to harm and decide the degree of harm warrants stripping service providers of their Section 230 protection, then there is basically no point in having Section 230 on the books. If platforms have to litigate over whether the statute protects them, then it hardly matters whether it does or not, because they will already have lost so much of the value the protection was supposed to afford them, the very value that makes it possible for them to facilitate others' expression in the first place. The inevitable consequence of this functional loss of statutory protection is that there will be fewer service providers available to facilitate as much user expression, if any at all.

But even if there were some limiting principle that could be derived from this case to constrain courts from inventing any other new exceptions, just having this particular "negligent design" one will still harm plenty of speech. To begin with, one troubling aspect of the decision is that it is not particularly coherent, and one area of confusion relates to what the court actually thinks constitutes the negligent design. [see, e.g., p. 15]. The court spends time complaining that Snapchat somehow deliberately encourages users to drive at unsafe speeds, even though the court itself acknowledged that while Snapchat apparently rewards users with "trophies, streaks, and social recognitions" to encourage them to keep using its service [p. 5], it "does not tell its users how to earn these various achievements" [p. 5]. It is a leap to say that Snap is somehow wrongfully encouraging users to do anything when it is not actually saying anything of the kind. [See p. 6 ("Many of Snapchat’s users suspect, if not actually 'believe,' that Snapchat will reward them for 'recording a 100-MPH or faster [s]nap' using the Speed Filter.")]. In fact, as the decision itself cites, Snapchat actually cautioned against reckless posting behavior. [See p. 6 with the screenshot including the text, "Don't snap and drive."] If the case were actually about Snap explicitly encouraging dangerous behavior ("Drive 100 mph and win a prize!") then there might legitimately be a claim predicated on the platform's own harmful speech, to which Section 230 wouldn't apply. But the record does not support that sort of theory; the theory of liability was predicated on a user's apparently harmful speech, and in any case the alleged encouragement wasn't really what the plaintiffs were charging was negligently designed anyway.

Instead, what was at issue was the "speed filter," a tool that helped users document how fast they were traveling. Unlike the district court, the Ninth Circuit could not seem to fathom that a tool that helped document speed could be used for anything other than unsafe purposes. But of course it can. Whether traveling at speed is dangerous depends entirely on context. A user in a plane could easily document traveling at significant speed perfectly safely, while a user on a bike documenting travel at a much slower speed could still be in tremendous peril. One reason we have Section 230 is that it is impossible for the service provider to effectively police all the uses of its platform, and even if it could, it would be unlikely to know whether the speeding was safe or not. But in denying Snapchat Section 230 protection with the presumption that such speech is always unsafe, the court has effectively decided that no one can ever document that they are traveling quickly, even in a safe way, because it is now too legally risky for the platform to give users the tools to do it.

Furthermore, if a platform could lose its Section 230 protection because the design of its services enabled speech that was harmful, it would eviscerate Section 230, because there are few platforms, if any, whose design would not. For example, Twitter's design lets people post harmful expression. Perhaps one might argue it even encourages them to by making it so easy to post such garbage. Of course, Twitter also makes it easy to post things that are not harmful, but the Ninth Circuit's decision here does not seem to care that a design eliciting user expression might be used for both good and bad ends. Per this decision, which asserts a state law-created "duty to design a reasonably safe product" [see p. 13, misapplying the Doe 14 v. Internet Brands case], even a service that meets the definition of an "interactive computer service" set forth in Section 230 (a statute that also contains the pre-emption provision discussed above) loses the statute's protection if its design could be used to induce bad expression. But that would effectively mean that everyone could always plead around Section 230, because nearly every Section 230 case arises from someone having used the service in a harmful way the service enabled. It is unfortunate that the Ninth Circuit has now opened the door to such litigation, as the consequences stand to be chilling to all kinds of online speech and services Section 230 was designed to protect.


Filed Under: 9th circuit, intermediary liability, lemmon, negligence, product liability, section 230, speech, speed filter
Companies: snap


Reader Comments


  1. Pixelation, 8 Jun 2021 @ 4:11pm

    "Furthermore, if a platform could lose its Section 230 platform because the design of its services enabled speech that was harmful..."

    According to the court document, this case wasn't about harmful speech. It was about induced, dangerous behavior. Whether that is correct or not, Snapchat wasn't being sued because of the speech; it was being sued because (according to the plaintiffs) it induced the behavior and should have known it would cause dangerous behavior.

    Section 230 as a defense in this case is a bit of a stretch in my mind. IANAL and don't even play one on TV.

  2. Anonymous Coward, 8 Jun 2021 @ 4:20pm

    Re:

    If Snapchat is responsible for a driver's bad decision, why was the car manufacturer not included in the case for having installed a speedo that shows illegal speeds in a car capable of high speeds?

  3. Pixelation, 8 Jun 2021 @ 4:24pm

    Re: Re:

    I would guess that if the automakers gave out rewards like Snapchat did to induce the behavior, they would get sued as well.

  4. Anonymous Coward, 8 Jun 2021 @ 4:27pm

    The EFF says there's nothing to worry about, and they are usually right; they do not believe it will eviscerate Section 230's protections for everyone through the backdoor.

  5. Anonymous Coward, 8 Jun 2021 @ 4:33pm

    Re: Re: Re:

    Nobody has shown that speeding won awards.

  6. Anonymous Coward, 8 Jun 2021 @ 4:37pm

    The idea here is this: Snap publishes these filters. The speed filter isn't content that users generated. It's a filter provided by Snapchat, who wants its users to then use it. Snap's demographic skews largely to younger users, and they should've expected that them publishing this kind of tool could easily encourage teenagers, i.e. those with less developed senses of personal judgment or regard for safety, to go for huge numbers in the only vehicles they could reliably get their hands on to then show off to their friends and other people.

    There's a reasonable expectation here that a company like Snap, with their scale, reach, and understanding of their core demographics, would use better judgment. I agree with the EFF and the Ninth Circuit here; this case should go forward based on its merits and Snapchat doesn't get to use Section 230 as a defense.

  7. Anonymous Coward, 8 Jun 2021 @ 5:12pm

    Re:

    And then can society as a whole, or localities involved, or people who were travelling on the road at the same time, sue the plaintiffs for raising morons?

  8. much anonymous very coward, 8 Jun 2021 @ 5:17pm

    I disagree with this piece, for three reasons.

    1) It's a little overblown to say the Ninth Circuit opinion carves a hole in Section 230. This is just a motion to dismiss at the earliest part of the case. Just because the Ninth Circuit thinks it's possible that Snap negligently designed its app doesn't mean it's likely. When this first came out, my initial reaction was also horror because I didn't dig into the reasoning.

    But it's important to focus on why the lawsuit feels silly. It's not because Snap is being sued for snaps sent on the service. It's because the lawsuit claims that offering a speed filter encourages users, particularly young ones, to use a car to drive at high, dangerous speeds. That's a really big leap in logic, but it's different than suing over third party content. Section 230 only covers suits over third-party content.

    2) This piece blurs the issue of causation. The piece argues that the court took a "post hoc look at a situation where expressive activity has led to harm." But that's not what happened. The kids who got in a car crash didn't create or receive posts that hurt them. The harm is (supposedly) because Snap chose to create a filter that created an incentive to drive unsafely. I think that's a bullshit argument, but again, it's not the same as saying third-party content "led to" harm.

    3) This is not the typical negligent design case. Often, "negligent design" is used by lawyers as a way to get around Section 230, like the Herrick v. Grindr case where the argument was basically "I was harassed and stalked by an ex on Grindr, and his stalking is Grindr's fault because they built an app to make stalking easy. My proof: it happened to me." It's pretty obvious what the lawsuit is about, and it's not that they actually want to redesign the product. They want Grindr to pay them because their service was used by someone in a bad way.

    But the car crash in this case wasn't the result of "using" Snapchat in a bad way. It was (allegedly) the result of Snap's product design decision encouraging dangerous real-life behavior. If we want to avoid politicians making false claims about Section 230, we also have to avoid making inaccurate claims about what Section 230 does and doesn't cover.

    tl;dr The plaintiffs in this case didn't die because of a snap they received, they died because they drove dangerously. That's probably not Snap's fault, but for different reasons than Section 230.

  9. Anonymous Coward, 8 Jun 2021 @ 5:28pm

    Re:

    I believe you are probably referring to this eff deeplink.

    From the decision:

    In short, Snap, Inc. was sued for the predictable consequences of designing Snapchat in such a way that it allegedly encouraged dangerous behavior. Accordingly, the panel concluded that Snap, Inc. did not enjoy immunity from this suit under § 230(c)(1) of the CDA.

    EFF: This isn't a matter of blaming Snapchat for the kids' speech; § 230 is intact.
    Gellis: The problem is that if the design could be used to induce bad expression, then the platform no longer qualifies for Section 230's protection.

    Consider another Twitter example: some kid sees a gazillion posts saying that drinking bleach cures covid, and so drinks bleach and dies. The parents sue, claiming that because Twitter didn't prevent someone from using automation to create bot accounts and spread the lie, Twitter was a defective product, despite every part of Twitter working correctly and as designed, and without accusing Twitter of being responsible for the troll's speech.

    I think that the EFF's take is short-sighted, as this effectively illustrates a viable way to cut 230 out of the loop. And the appeals court has to view the accusations in the light most favorable to the non-moving party (the plaintiffs, in this case). So... back to district court this goes, where Snapchat has a pretty good chance of prevailing anyway. But without being able to use 230 to cut to the chase.

  10. Anonymous Coward, 8 Jun 2021 @ 5:33pm

    Re: Re:

    Perhaps. Unfortunately, the kids used the same logic that says, "I've only had three beers, I should still be able to get home safely", or "this shortcut through this dark alley is probably safe", or "I've got a cunning plan, it's sure to work this time!" You know, where you've got odds highly in your favor, even though the consequences for losing are dire? And especially the young are not going to take the consequences into consideration.

  11. Anonymous Coward, 8 Jun 2021 @ 5:36pm

    Re:

    It's a little overblown to say the Ninth Circuit opinion carves a hole in Section 230. This is just a motion to dismiss at the earliest part of the case.

    Since the motion to dismiss is the whole advantage of section 230, it is more than a little dissonant to put those sentences together like that.

  12. sumgai (profile), 8 Jun 2021 @ 7:26pm

    While this is a civil case (wrongful death), it is my experience that such cases often come about when a criminal case cannot be sustained. (In this case, the proximate perpetrator died.) Now, if he had lived, the State would have charged him with criminal manslaughter, at the least. Supposing he presented a defense of "But Snapchat made me do it", that would've been dismissed for one simple reason: Each and every State in the Union has a simple declaratory rule upon issuing a driver's license (usually formally called an Operator's License (or permit)): "You as a driver in control of a motor vehicle must assume and accept total responsibility for your actions while in charge of said vehicle. There are no exceptions to this rule."

    Court cases have abounded over the years, pondering this law and how it should/must be applied to contested situations. But the fact of the matter is, regardless of any distractions whatsoever, you are responsible for your actions, plain and simple. You can claim all you want that "The devil made me do it!", and no matter what guise the devil may take, you're still responsible, end of story. Snapchat's defense should rest on only one ideal - "This kid was breaking the law, and we made no inducements to persuade him to do so. He abdicated his personal responsibility to the State of his own volition, and we should not be used as a substitute for restitution to a harmed party..... particularly when we had no prior agreement between ourselves and any of the remaining parties to this action."

    It is a given in law that no party can be made to assume responsibility for another without a prior agreement (usually in writing). If Party A strikes Party B, and in falling down Party B strikes Party C, Party C cannot directly capture Party A for damages, instead he must go through Party B, the most direct proximate cause of the tort. (The sole exception is parent/child relations.)

    Now to the heart of the matter. The car maker analogy is a good one. If we start making third parties suffer the consequences of our (STUPID) actions, then we might as well go back to the cave man days, because a man's word is no longer his bond. By no rational person's personal gauge are we a civilized society if one can point his finger at another person and shift the blame/guilt in that manner. That is heresy of the highest order, and morally reprehensible.

    But legal?? It is starting to look that way, isn't it. Sigh.


    Side comment: On the face of it, this case deserves adjudication. But if we get crass and look at the underside, we see that it's nothing more than "muh feelz". Which then says that the case is a direct result of:

    a) Too fucking many lawyers, most of them acting like ambulance chasers;

    and

    b) Judges refusing to meaningfully punish lawyers who continually prove that George Santayana was correct.


    tl;dr:

    Final scorecard for Snapchat: A for effort, but F for lack of validity.
    Attempting to side-track the actual issue does not often win the day. The issue was never about speech; it is, and will always be, about personal responsibility.

  13. Stephen T. Stone (profile), 8 Jun 2021 @ 7:57pm

    Other than a Darwin Award, maybe…

  14. Toom1275 (profile), 8 Jun 2021 @ 8:01pm

    Re:

    The speed filter isn't content that users generated.

    The number the filter displays, along with any significance such holds, is solely and exclusively user-determined. Your narrative doesn't hold water.

  15. TKnarr (profile), 8 Jun 2021 @ 10:54pm

    Re:

    By that logic the camera on the cell phone used should render the phone manufacturer liable. After all, it was the phone manufacturer, not the user, that put the camera in there and made the phone. And surely they knew that, since that camera can be used to document how fast you're travelling, teenagers would use it to document themselves going for exactly those same high speed numbers to show off just as you say. Similarly for the car manufacturers, knowing that simply putting a speedometer in the car would encourage reckless teenagers to capture an image of it to demonstrate just how fast they were capable of going.

    I don't buy that.

  16. BJC (profile), 8 Jun 2021 @ 11:29pm

    Section 230 Does Not Mandate Libertarianism

    Once again, this is an argument against a Section 230 decision which is really, "I think this part of not-specific-to-the-internet law is dumb and I wish the internet was allowed to disrupt it," which is not how Section 230 works.

    If this was a camera with a speedometer that, even though almost exclusively used for meme posting, required attaching a cable or moving a memory card to upload the pictures to the internet, this wouldn't be a Section 230 issue. The makers of the camera and its programming are not "content publishers." What would happen to the picture after it's taken does not change that it's the ability to take it that's the issue.

    And so, as I see it, Snap slapped a camera with a speedometer on its social app. When it gets sued over the camera with a speedometer bit, it can't say, "well, we're a social app" to get out of it.

    The same is true for the Homeaway decision. The law said, "if you get a taste of the money that flows to renting apartments, you have to follow the regulations." Facilitating payments is, on its own, regulatable, and giving people who use your Section 230-protected apartment listing app payment facilitation services doesn't somehow get you out of money handling regulation.

    I'm not anti-libertarian and I am sympathetic to the arguments that current US product liability law and the extent that a municipal government can put its hand into your pocket are BAD POLICY. But Section 230 doesn't automatically invalidate bad policy. It only prevents regulation of hosting what other people say.

  17. Anonymous Coward, 8 Jun 2021 @ 11:38pm

    If Party A strikes Party B, and in falling down Party B strikes Party C, Party C cannot directly capture Party A for damages, instead he must go through Party B, the most direct proximate cause of the tort.

    Party A rams their car into the back of Party B's car. Party B's car, as a result of the impact, is pushed into Party C's car. If I'm understanding you correctly, Party C would need to go through Party B's insurance? That doesn't seem accurate, because I was Party B once, and C went directly to A for recovery.

  18. Anonymous Coward, 8 Jun 2021 @ 11:58pm

    Re: Re:

    By that logic the camera on the cell phone used should render the phone manufacturer liable.

    The speedometer reading on the phone is calculated via GPS isn't it? So then wouldn't the GPS system itself be responsible as well, by the same logic?

  19. Anonymous Coward, 9 Jun 2021 @ 12:11am

    Stunning Hypocrisy

    So, when a person using Social Media (a tool) is involved in a tragic event that results in an avoidable death, it's the tool's fault and the person is beyond reproach.

    When a person using a gun (another tool) is involved in a tragic event that results in an avoidable death, it's the person's fault and the tool is beyond reproach.

    Amazing

  20. Narcissus (profile), 9 Jun 2021 @ 2:44am

    Re:

    We can't have a society where people have no personal responsibility or obligation to make proper risk assessments. That way madness lies.

    Most cars can go (way) over the legal speed limit. There is a reasonable expectation that if you buy a car that can go that fast, some people will go that fast. You can't sue your car manufacturer for the consequences of irresponsible driving though. Note that speed limiters are an extremely easy and existing fix for driving over the speed limit but nobody sued Ford because they didn't build them into each car.

  21. Anonymous Coward, 9 Jun 2021 @ 4:53am

    Note:

    Tbf, there are some exceptions, and this is one of them.

    The EFF might be aware of the fact that they're short-sighted, but considering the fact that there's no exception for harm, I'm pretty sure the EFF wouldn't approve if Snapchat had this case dismissed because of 230.

    That said, your example is pretty much something the EFF is hoping that other courts won't resort to.

  22. Anonymous Coward, 9 Jun 2021 @ 4:55am

    Re:

    Exactly.

  23. Anonymous Coward, 9 Jun 2021 @ 5:29am

    Re:

    May vary. I've been in a four-car chain, and yes, the insurance companies all maintained that chain of liability.

  24. BJC (profile), 9 Jun 2021 @ 6:27am

    Re: Re:

    We can't have a society where people have no personal responsibility or obligation to make proper risk assessments. That way madness lies.

    Is this a statement of what you think California law is, or what California law should be?

    Because my understanding is that California's pretty liberal (i.e., what the U.S. Chamber of Commerce might call "jackpot justice" or a "judicial hellhole") when it comes to products liability.

    It's reasonable for you to think that kind of law is insane and society-breakingly bad, but just thinking that doesn't make it less likely to be the law.

    Regardless, the whole reason this is on Techdirt instead of a tort reform blog is the Section 230 defense, which is what the original commenter was speaking to; if California's general products liability law gives bags of money to morons, Section 230 isn't a great defense when the app as-installed without any communication or publication is claimed to be defective.

  25. BJC (profile), 9 Jun 2021 @ 7:05am

    Re: Re:

    Tort law is, basically, taking a scale of behavior from "obviously fine" to "obviously horrible," drawing an infinite number of hairsplitting distinctions along it, then picking one particular point of distinction to say, "yeah, it's this one thing that makes someone responsible."

    (An aside: this line is not arbitrary, but it's not "objective"; based on your philosophy, your first principles of things like "the greatest good" and/or "do no harm" will lead you to different places than other people)

    There's a line between what Snapchat did and your examples. Snapchat made the risk-taking behavior easier. Snapchat's functionality took the camera and the GPS and the other existing components and made it easier (and possibly more entertaining) to take one's picture at high speed than it would have been otherwise.

    So, what's the line of liability for a device or function that makes high-speed selfies easy? Is it always on the user to be safe? Is there some level of "gamification" that makes using the app so attractive to adrenaline junkies that it would create liability?

  26. Portent, 9 Jun 2021 @ 7:16am

    I'm sorry, but the "speed filter" is not 3rd party content. It is a feature that was added to the app by SnapChat itself. Section 230 was never intended to give immunity to "internet service" providers for their own content. SnapChat can argue in court that they should not be held liable for the speed filter encouraging the children to drive recklessly, but in no way should they be given immunity under Section 230.

  27. Portent, 9 Jun 2021 @ 7:18am

    Re: Re:

    If, say, Mercedes included a feature that caused the windshield to give a light-speed effect when the car exceeded 100 mph, you can bet your ass that Mercedes would be sued into oblivion.

  28. Cathy Gellis (profile), 9 Jun 2021 @ 7:46am

    Re: Re: Re:

    There was an idea I was wrestling with that I couldn't get into the piece, about why not sue the car manufacturer too? I left it out because I wasn't sure they hadn't, and wasn't going to be able to easily ascertain the answer.

    I think it might have been easier to see why this decision would have been bad if it had been about the car manufacturer. If you give someone the tools that could be used in good or bad ways, should you be liable? Some people do think yes, but there's a cost to that, because it means you'll take away the tools from someone who wanted to use them for good.

    But in this case, the cost of taking the tools away is even higher because it means we're taking away the tools for SPEECH. And we pointedly have Section 230 to make sure we don't lose those tools.

  29. sumgai (profile), 9 Jun 2021 @ 8:34am

    Re: Re:

    I should've used the car analogy. But in my defense, I was indeed a Party C, many years ago. I mean, I even saw it coming, and still couldn't get out of the way fast enough. Bad times.

    Yes, state laws do vary, but in the main, the chain of evidence usually proves to be easiest to follow if one doesn't attempt to jump/skip over pieces of evidence. In that manner, a court can find the proper ration of justice for each party.

  30. Anonymous Coward, 9 Jun 2021 @ 9:28am

    There should be a law that says no company is responsible for its customers' stupidity.

  31. Pixelation, 9 Jun 2021 @ 10:06am

    Re: Re: Re: Re:

    "Nobody has shown that speeding won awards."

    True, I've never heard of automakers offering rewards/awards for speeding. On the other hand, if you are suggesting awards given to drivers on race tracks, it would no longer be speeding, since there is no longer a legal speed limit there.

  32. TaboToka (profile), 9 Jun 2021 @ 11:37am

    Re:

    There should be a law

    There is: CDA §230.

  33. BJC (profile), 9 Jun 2021 @ 11:42am

    Re: Re: Re: Re:

    Only Snap was sued. Look in the C.D. Cal. PACER for the same case (complaint was filed May 23, 2019).

    Your analysis, though, misses part IV of the 9th Circuit's decision (starting on pg. 17), which I'd paraphrase as:
    "So, Snap asks us to say that this is a dumb products liability suit. The lower court judge didn't decide that, and didn't even decide if this was governed by California or Wisconsin law so we would really be considering stuff for the first time -- an appellate court no-no -- to make that judgment. Because of that, we have to assume that, but for Section 230, this would be a perfectly OK products liability suit."

    So, because the court has to assume that having a camera app with a speedometer incurs liability, the Section 230 question becomes, to the Ninth Circuit:
    "Does the fact that the camera app with speedometer is integral to a social media platform give Section 230 protection to the camera app?"

    And I don't think it's crazy to say that the answer is no.

    Imagine a hypothetical where Snap is actually Snap Holding, and there are two subsidiaries:

    1. Snap Posting Things on the Internet, and
    2. Snap Creating a Containerized, Black Box Camera App by Folks in a Faraday Cage Who Deliver a Physical Medium Containing the Code to Any Licensee and Never Personally or Professionally Connect to the Internet

    And Snap Posting licenses the camera app from Snap Creating.

    If the plaintiffs in this case were suing Snap Creating (either because they were smart or because Snap Holding won a motion substituting them as the real party), would Section 230 protection apply? I don't think it would, because the app itself isn't a content service.

    In the real world, it's messy, there's no obvious division, everything's blended together. But that doesn't make the camera programming inseparable conceptually from the social app.

    And, once again, whether this is fundamentally a bad tort claim doesn't matter because, in front of the Ninth Circuit, procedurally they have to assume it's viable.

  34. BJC (profile), 9 Jun 2021 @ 11:49am

    Re: Re:

    Section 230 is only for communications by third parties on the internet.

    If I make a "memeable chainsaw" that people hurt themselves with while doing avoidably stupid things for the internet, and my products liability jurisdiction is one where it says, "yes, you have to go the extra mile to prevent morons from doing stupid things or we make you pay them bags of money," then Section 230 doesn't stop me from being sued for the chainsaw.

    The question here is closer, because it's a non-communicative function of a social media app (you can save your snaps to memories and to your phone gallery without ever transmitting them, after all), but the issue is the characterization of the app as communication or non-communication, not the thought process of the user.

  35. Lostinlodos (profile), 10 Jun 2021 @ 4:46pm

    See, one thing missing from the 230 argument is how things were before it.
    Section 230 didn’t suddenly protect platforms from user material. They were protected as long as they didn’t prune anything.
    Its creation allowed companies to delete non-illegal content they didn’t wish to host and still maintain the distributor protection whilst removing content they disagree with.

    The biggest thing that is pushing the anti-230 movement is that intentional misstatement.
    230 did NOT create protections from user content. That protection was already there prior to 230.
    It created a protection for platforms that censor content. (Pro-230 people downplay the act of censorship as moderation).

    That a private business should have a right to censor anyone it wants falls well within property rights law, and is the reason I lean in support of 230 despite the limitations on reach of speech.

  36. Lostinlodos (profile), 10 Jun 2021 @ 4:53pm

    Wtf!

    How the hell is Snap liable for anything?

    Stupid kids posted a picture, illegally using a cell phone while driving, to post evidence of their criminal driving speed.
    The restaurant isn’t responsible for you dumping your coffee in your lap. You are.

    Amazon isn’t responsible for your toddler eating a battery they had no business having access to. You are.

    Snap isn’t responsible for a bunch of kids being reckless illegal fools. The kids are.

    About the only way this could even remotely bite Snap is if someone believes that pictures of criminal activity are themselves criminal.
    And that has far greater repercussions.

    People need to take responsibility for themselves. Snap didn’t make them do anything.

  37. BJC (profile), 10 Jun 2021 @ 6:56pm

    Re: Wtf!

    You know this is irrelevant to the Ninth Circuit's decision, right?

    The way the District Court looked at the case, which is the way a lot of courts look at legal defenses, was to say something like:

    "Section 230 is a complete immunity if the actions fall under its umbrella. Since I think Section 230 applies, I'm not going to decide the underlying case as it doesn't matter whether it's a strong or weak case absent Section 230 immunity."

    And then the Ninth Circuit said:

    "Snap wants us to notice that the underlying lawsuit seems crazy dumb. Problem is, the judge below didn't rule on that. So we have no ruling to review; the district judge will have to decide that when the case gets back there."

    Snap may ultimately not be responsible because California or Wisconsin law isn't plaintiff-friendly enough to allow this case to go forward. But that's totally separate from the 230 analysis, which is whether the camera app part of SnapChat is covered by Section 230.

  38. Lostinlodos (profile), 10 Jun 2021 @ 7:20pm

    Re: Re: Wtf!

    Problem here is the suit should have been tossed walking in the door. Ever since hot coffee (the dumb lady, not the funny mini game), this country has been going further and further out of its way (legally) to absolve people of personal responsibility.

    Much of this has to do with implied liability. Guns don’t kill people. People kill people.
    This is the same
    Snap didn’t kill anyone. Stupid kids driving in excess of the speed limit took out their phone to take a picture and died.
    Snap didn’t do anything themselves to encourage this.

  39. Rocky, 11 Jun 2021 @ 1:05am

    Re:

    See, one thing missing from the 230 argument is how things were before it.
    Section 230 didn’t suddenly protect platforms from user material. They were protected as long as they didn’t prune anything.

    Not quite. Before Section 230, a site could be sued if they moderated, but they could also be sued for not moderating. See Cubby v. CompuServe and Stratton Oakmont v. Prodigy:

    These two cases were at odds, which meant internet services/platforms were screwed regardless of what they did.

  40. Lostinlodos (profile), 11 Jun 2021 @ 3:08am

    Re: Re:

    You miss a key point. Compuserve only deleted materials upon court order.

    Prodigy chose to actively censor anything they didn’t agree with under the guise of family accessibility.

    The prodigy ruling is quite clear in the difference.
    Compuserve didn’t censor. Prodigy did.
    The basis of 230 is to allow the likes of Prodigy to do as they will content wise without ramifications.
    230 was designed to protect companies that removed content from liability for content they didn’t remove.
    230 places private property over speech.

  41. Rocky, 11 Jun 2021 @ 7:01am

    Re: Re: Re:

    You miss a key point. Compuserve only deleted materials upon court order.

    That's not a key point; that has always been the case under the right circumstances, plus they actually removed content that broke federal and state laws. Regardless, that really didn't stop them from being frivolously sued.

    Prodigy chose to actively censor anything they didn’t agree with under the guise of family accessibility.

    This is a question that you actually need to answer since you brought it up: what content did Prodigy remove because they didn't agree with it under the guise of family accessibility? If you can't answer it, your assertion has no meaning at all. Remember, removing content that can be considered libel isn't "censorship"...

    The prodigy ruling is quite clear in the difference.
    Compuserve didn’t censor. Prodigy did.

    Please point to where in the ruling it says it was censorship? Also, at this point you should be able to explain the difference in the rulings and the ramifications? If you can, perhaps then you will understand the reasoning behind section 230.

    The basis of 230 is to allow the likes of Prodigy to do as they will content wise without ramifications.

    Well, it also allows you to post here, doesn't it? Actually, it enables ALL interactive websites to host comments and user generated content without the fear of frivolous lawsuits. It also allows them to say "we don't talk about politics here and if you do your post will be deleted", but that's censorship according to your definition.

    230 was designed to protect companies that removed content from liability for content they didn’t remove.

    No, it was designed to allow internet services to choose what they wanted their service to be without being sued all the time. I can link to the article where one of the creators explains the reasoning behind section 230 if you want. Or do you think that you know more about the design of 230 than the creators of it?

    230 places private property over speech.

    You still don't understand. You can exercise your free speech anywhere you want, but not at the cost of other persons' rights because your rights stop where theirs begin. If you think otherwise, you seriously have to come up with a good argument why YOUR rights are more important and should infringe the rights of others.

  42. BJC (profile), 11 Jun 2021 @ 7:02am

    Re: Re: Re: Wtf!

    I'm not sure what you mean that the suit should have been tossed "walking in the door."

    Judges don't, on their own initiative, throw out suits on the basis of being ridiculous unless they're filed by prisoners trying to waive the filing fee to sue, or the plaintiff is trying to sue Satan or some other individual who does not exist physically or legally in this plane of reality and so cannot show up and file a defense.

    For everything else, ridiculous or otherwise, the judge waits for a motion to dismiss. That's the earliest time a case gets thrown out.

    Now, Snap did that. And they did say the case was pretty dumb on the merits.

    And you know what? The district judge mostly agreed! The appellate decision points out that the judge at the lower level said something along the lines that, if there was no Section 230 defense, the case would be "dismissed with leave to amend," meaning the plaintiffs would have to plead more facts (which they probably didn't have) to show how there was liability in this case.

    Why would the District Court want more facts before it dismissed with prejudice? Well, there's a choice of law question -- whether it's a California or Wisconsin law case -- and it would be nice to have that cleared up in the pleadings so that wasn't a point of appeal. And requiring the plaintiffs to come back with more facts avoids an appeal on the basis of, "the court should have taken these super-broad pleadings in the best possible light." And keeping it to the pleadings is still the cheapest stage of the lawsuit.

    But the district court went on to say that it didn't have to dismiss with leave to amend because Section 230 immunity dismissed the whole case right now.

    That was a bad call because Section 230 doesn't fit well to this products suit. So it's back to the district court to dismiss with leave to amend, see if the plaintiffs can actually find some facts that make a case, and then it'll be dismissed.

    So, for me, I don't get the dudgeon here. Let's say, instead of Section 230, which seems to apply to this case if you squint while tilting the pleadings the right way, Snap made an argument for something more left field to dismiss the case, like the Tax Anti-Injunction Act. And the district judge agreed, and then the Ninth Circuit said, "no! This isn't a case about a tax," would you then be saying, "because this case is so ridiculous, the court should let it be dismissed for any reason claimed, even if it makes no legal sense?"

  43. Lostinlodos (profile), 11 Jun 2021 @ 12:24pm

    Re: Re: Re: Re:

    “That's not a key point”
    It is, though. They only removed content when they were informed of it and what laws it broke, or when they had a court order to remove something.
    They didn’t “moderate.”

    “ what content did Prodigy remove because they didn't agree with it under the guise of family accessibility?”
    Porn. Violent content. Access to hacking tools. Erotic entertainment. Many unfiltered portals to usenet access. They “moderated” language in postings. Etc.

    “If you can, perhaps then you will understand the reasoning behind section 230“
    Yes. 230 was enacted to offer a guarantee of protection for actively monitoring content and removing it. It added protection against the claim that if they actively monitor they should be responsible for what they miss.


    Is that the op-ed where the author stated they intended to “encourage moderation”?

    “You still don't understand. You can exercise your free speech anywhere you want, but not at the cost of other persons rights because your rights stop where theirs begin.”
    I fully understand. Which is why I changed my leaning on 230. Public rights end at private property.

  44. Toom1275 (profile), 12 Jun 2021 @ 10:47am

    Re: Re: Re: Re: Re:

    Yes. 230 was enacted to offer a guarantee of protection for actively monitoring content and removing it.

    The First Amendment and free speech rights gave that power; all Section 230 does is make it financially affordable to do so.

    It added protection against the claim that if they actively monitor they should be responsible for what they miss.

    That's how things always were before the flawed Prodigy ruling; Congress saw how Prodigy was completely wrong and contrary to how things were supposed to be, and so made the law to make clear how secondary liability online should remain.

  45. Toom1275 (profile), 12 Jun 2021 @ 12:26pm

    Re: Re: Re: Re: Re:

    I fully understand.

    [Asserts facts not in evidence]
