Netflix Files Anti-SLAPP Motion To Dismiss Lawsuit Claiming One Of Its Series Caused A Teen To Commit Suicide

from the even-an-algorithm-is-protected-speech dept

Because Netflix is big, it draws lawsuits. It has been sued for defamation, for copyright infringement, and, oddly, for defamation via the use of a private prison's logo in a fictional TV show. It has also been sued for supposedly contributing to a teen's suicide with its series "13 Reasons Why," which tackled a lot of the disturbing subject matter teens deal with daily, like bullying, sexual assault, and -- most relevant here -- suicide. The final episode of the first season contained a suicide scene, one Netflix removed two years after the show debuted.

The death is undeniably a tragedy, but the attempt to blame Netflix for it is severely misguided. The lawsuit filed by the teen's survivors alleges that Netflix had a duty to warn viewers about the show's content (content warnings were added a year after its release), that it failed to do so, and that it is therefore indirectly liable for the death.

Netflix is now trying to get this lawsuit dismissed using California's anti-SLAPP law because, as it argues persuasively, this is all about protected speech, no matter how the plaintiffs try to portray it as a consumer protection issue. (h/t Reason)

Netflix's anti-SLAPP motion [PDF] points out this isn't the first time teen suicide has been depicted in pop culture, nor is it the first time people have tried to sue creators over the content of their creations. None of those lawsuits have been successful.

13 Reasons Why is not the first work to tell a story about teen suicide. The subject has been explored in countless literary works, motion pictures, songs, and TV shows—everything from Romeo and Juliet to Dead Poets Society. And this is not the first lawsuit that has claimed that the media’s depiction of suicide and violence is to blame, and should be held legally liable, for real-life suicides and other tragic events. Courts, however, have repeatedly rejected such suits.

Since this lawsuit directly implicates creative expression, Netflix says California's anti-SLAPP law applies.

Without question, 13 Reasons Why is protected expression under the First Amendment. And the FAC itself, which is replete with citations to articles about 13 Reasons Why and its subject matter, makes plain that 13 Reasons Why’s speech is in “connection with a public issue.” Plaintiffs try to sidestep the anti-SLAPP statute and the First Amendment by insisting that their claims are not based on the content of 13 Reasons Why, but on an alleged “failure to warn” or breach of a purported duty to protect “vulnerable populations” from the content. But without the allegations that the show’s content is “dangerous,” Plaintiffs’ theories fall apart. Not only do those theories strike at the free expression embodied in the show, they target Netflix’s conduct “in furtherance” of the distribution of the show, and therefore bring this lawsuit squarely within the ambit of the anti-SLAPP statute.

Not only is the content of the series protected expression, but so is the algorithm that may have recommended the show to the teen. The plaintiffs' "failure to protect" theory argues that Netflix's recommendation algorithm is itself reckless and dangerous because it prompts users to select titles that may contain disturbing subject matter. But whether a human or an algorithm makes the recommendation makes no difference: either way, it's a form of editorial control, which is protected by the First Amendment.

The recommendations system, and the display of suggested titles, is speech. It evinces “[a]n intent to convey a particularized message,” Spence v. Washington, 418 U.S. 405, 410–11 (1974)—namely, a message about what shows and movies a viewer might choose from to watch. The recommendations fall within the well-recognized right to exercise “editorial control and judgment.” Miami Herald Pub. Co. v. Tornillo, 418 U.S. 241, 258 (1974). Plaintiffs allege that the recommendations here are different because they are dictated by an algorithm. But the fact that the recommendations “may be produced algorithmically” makes no difference to the analysis.

[...]

The suggestion that a viewer watch a particular show falls within the broad scope of conduct in furtherance of the right to free speech. The subject of the recommendation—13 Reasons Why—is itself protected speech, and the recommendations facilitate Netflix’s protected distribution and dissemination of that speech.

The plaintiffs' claims can't be separated from the undeniable fact that the allegations are all about creative expression, which is exactly why the state's anti-SLAPP law should apply.

Finally, like Plaintiffs’ failure-to-warn theory, Plaintiffs’ recommendation theory cannot be disentangled from Plaintiffs’ allegation that the underlying content is “dangerous”: the entire premise of the claim is that Netflix had a duty to identify which viewers are “vulnerable,” and ensure that the algorithm does not recommend 13 Reasons Why (or other content Plaintiffs deem unsuitable) to such viewers.

If, as the plaintiffs argue, creators and producers of disturbing content could be held directly liable for the actions of viewers, the end result would be self-censorship and a dearth of options for creators to distribute their creations. Suicide has been the subject matter of countless creative efforts for hundreds of years. A lack of content warnings ahead of viewing doesn't make a viewer's death Netflix's fault. Viewers control what they watch and, given the vast amount of information available on the internet about movies and TV series, there's little reason to believe viewers have to go into any show "blind."

As tragic as this situation is, the blame (if there is any) doesn't lie with Netflix. Netflix is no more responsible for this death than the creators of the series or the writer of the novel that inspired it.



Filed Under: 13 reasons why, 1st amendment, anti-slapp, blame, california, liability, protected speech, suicide
Companies: netflix


Reader Comments



  • Pixelation, 2 Nov 2021 @ 11:23am [flagged by the community]

    I understand

    Some of the Netflix shows are so bad I want to blow my brains out.


  • That Anonymous Coward (profile), 2 Nov 2021 @ 12:06pm

    Something something we bear no fault in not noticing our child was suicidal & this never would have happened if this tv program hadn't infected our perfectly fine & stable child with these wild thoughts.

    We all know I am not nice people, but am I the only one wondering if the child who committed suicide watched the final episode? (and was it the version with the suicide shown or not).

    Something something parental controls on Netflix?
    Something something parents looking at what the kids are viewing?
    Something something corporations aren't good selections to raise your kids.

    One would think a good lawyer would have explained that there wasn't a case here & referred them to counseling, but hey, let's throw shit at the wall, see if anything sticks, we might get a good payday... and if we don't, we still got the retainer.


  • Glenn, 2 Nov 2021 @ 3:53pm

    Dear parents: if your kid committed suicide, then you are the ones almost entirely--maybe even totally--to blame. Congratulations on raising a kid with so little self-worth; it's all on you. Maybe pay actual attention to your kids from now on? (Feel bad? ...good.)


    • nasch (profile), 2 Nov 2021 @ 4:33pm

      Re:

      Congratulations on raising a kid with so little self-worth; it's all on you.

      You sound like someone who has never dealt with a family member suffering from depression. I hope that you never have to, and that you develop some empathy for those who do at some point, hopefully soon.


    • That Anonymous Coward (profile), 2 Nov 2021 @ 5:08pm

      Re:

      I really am the last one to have sympathy for parents showing up to blame everyone else for their kids' actions, but wtf dude?

      Yes, parents need to pay more attention, but kids hide all sorts of things because, as a society, y'all suck at dealing with the hard stuff. There isn't a warning light on kids' heads that flashes when they are having suicidal thoughts.

      Suicide isn't a self-worth thing; I've seen highly successful teens who seemed to have it all kill themselves.

      Mental health is a train wreck...
      We tell kids to suck it up, call them names when they are upset & then are shocked, just shocked, they don't reach out for help.

      Parents are more concerned with making sure their kids have a better life than they had, but the only measure applied is how much is spent on them, rather than whether the kid is actually happy.

      We've wasted how much taxpayer money on hearings to see if FB is indeed the devil & how to deal with it...
      How much time have they spent on dealing with the lack of mental health treatment available to citizens?


      • sumgai (profile), 2 Nov 2021 @ 6:32pm

        Re: Re:

        How much time have they spent on dealing with the lack of mental health treatment available to citizens?

        Obviously not enough. 74.2 million American voters are proof of that particular lack.


    • PaulT (profile), 3 Nov 2021 @ 1:36am

      Re:

      The parents may have some responsibility for missing warning signs, but if you honestly think that suicide can't be caused by clinical and other factors completely beyond their knowledge and control, I suggest you fuck right off.


  • nasch (profile), 2 Nov 2021 @ 4:36pm

    Speech

    The recommendations system, and the display of suggested titles, is speech.

    This is a little off topic, but it got me thinking about speech protections. Recommendations such as those by Netflix, YouTube, and Facebook are speech by the platform itself. As such, they are protected by the 1st Amendment, but not by Section 230 - correct? Have there not been a bunch of lawsuits specifically targeting the recommendations, knowing that they cannot be quickly dismissed via 230?


  • Anonymous Coward, 2 Nov 2021 @ 4:42pm

    Deeply wrongheaded mentality

    I know this is the motivated reasoning of ambulance-chasers after deep pockets, but they literally stated that they think it is /the duty/ of a company to analyze everything about a user to speculate whether they qualify as having frail mental health. Even by the standards of post-9/11 hysterical Orwellianism and helicopter parenting, that is way out there. Especially since anybody with even a rudimentary understanding of statistics and current ML accuracy could tell you how preposterous the rates of false positives and negatives would be. Even if they had the patient's medical history instead of their watch history!
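
    [A quick back-of-the-envelope sketch of the base-rate point above, in Python. Every number here is an assumption chosen for illustration -- none of it comes from the lawsuit, the FAC, or any real Netflix system:]

        # Hypothetical illustration of the base-rate problem: even a
        # generously accurate "vulnerability" classifier would be wrong
        # about most of the viewers it flags. All figures are assumed.
        sensitivity = 0.95   # P(flagged | vulnerable)          -- assumption
        specificity = 0.95   # P(not flagged | not vulnerable)  -- assumption
        prevalence  = 0.02   # P(vulnerable)                    -- assumption

        # Bayes' theorem: P(vulnerable | flagged)
        p_flagged = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        ppv = sensitivity * prevalence / p_flagged

        print(f"P(flagged)              = {p_flagged:.3f}")  # ~0.068
        print(f"P(vulnerable | flagged) = {ppv:.3f}")        # ~0.279, i.e. ~72% of flags are false positives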


    • sumgai (profile), 2 Nov 2021 @ 7:07pm

      Re: Deeply wrongheaded mentality

      Yeah, the issue here is the veracity of the applicant. How does any website determine the true age of a given viewer? Answer: they don't. They depend on self-honesty, and that's it. (OTOH, those sites dealing with money do take steps to make sure they won't get scammed. It still happens, but at least they try to prevent it. For them, responses to applicants are almost never "instant".)

      BTW, "helicopter parent" has been replaced by "bungee parent". Gotta keep up with the way things move faster and faster on the 'web, eh? ;)


  • Anonymous Coward, 2 Nov 2021 @ 5:07pm

    The Biz: We show commercials because they change human behavior and more stuff gets sold!

    Also the Biz: It's impossible for our shows and movies to change human behavior!

    Welcome to clown world.


    • Strawb (profile), 3 Nov 2021 @ 1:50am

      Re:

      Yes, because convincing someone to buy a car and having them kill themselves is completely the same thing.

      Dumbass.


    • PaulT (profile), 3 Nov 2021 @ 2:59am

      Re:

      You're right, but only if you look at it as stupidly as possible. Commercials can't get you to buy a car if you're not in the market for a car. They might get you to favour a particular brand of vehicle or choose a particular special offer when you are in the market, but not if you aren't. If you live in an inner city apartment with no parking, you don't have a licence and you walk or use public transport every day, you're not going to suddenly go out and buy a car. Similarly, it doesn't matter how many times I see a commercial for tampons - I'm never going to pick them up as an impulse purchase for myself.

      Same with shows about suicide. You can't show something to someone without any related issues and drive them to suicide; you can only potentially trigger people with those tendencies, and even that is dependent on various factors outside of the show itself.

      "Welcome to clown world."

      Population: you


  • xebikr (profile), 2 Nov 2021 @ 8:34pm

    I agree that Netflix cannot be held legally responsible for this teen's suicide. However, that does not mean that they bear no responsibility at all. The studies about 'suicide contagion' are extensive and clear (see https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1124845/, for example). There are guidelines for reporting and media portrayals of suicide to minimize the potential harm, but Netflix did not follow these. Honestly, I was angry way back when "13 Reasons Why" first came out because I knew this would be the result. So, no, not legally responsible, but there is still blood on their hands.


  • Scary Devil Monastery (profile), 3 Nov 2021 @ 7:41am

    Well, this is a blast from the past...

    For any casual student of literary history, I'd suggest perusing the hue and cry resulting from that musty old book "Die Leiden des jungen Werthers".

    There's a nice wiki entry on the English version - "The Sorrows of Young Werther".

    Suffice to say that impressionable young people committing copycat suicide isn't a new thing. And today, just as it was then, it usually turns out that the catalyst was only the final spark propelling an already depressed or damaged person into self-destruction.


  • Toom1275 (profile), 3 Nov 2021 @ 11:28am

    13 Reasons Why was a book before being adapted by Netflix, but the lawsuits only start when the latter brings in $$$.

    Hmmm...


    • Scary Devil Monastery (profile), 4 Nov 2021 @ 3:43am

      Re:

      "...but the lawsuits only start when the latter brings in $$$."

      We're playing Jeopardy now? I'll take "What is it that makes US tort law a textbook example of 'piss-poor law'?" for 20k, then.

      Legal redress should be the last resort, not the first. It should be considered a potentially expensive disincentive to committing malfeasance vis-à-vis another, not a great way to make out like a bandit as long as you can make a judge believe - truthfully or not - that you were wronged.



Follow Techdirt
Essential Reading
Techdirt Deals
Report this ad  |  Hide Techdirt ads
Techdirt Insider Discord

The latest chatter on the Techdirt Insider Discord channel...

Loading...
Recent Stories

This site, like most other sites on the web, uses cookies. For more information, see our privacy policy. Got it
Close

Email This

This feature is only available to registered users. Register or sign in to use it.