Mother's Lawsuit Attempts To Hold Snapchat, Instagram Responsible For Her Daughter's Suicide
from the tragic,-but-not-actionable dept
In the wake of a tragedy, it's human nature to seek some form of justice or closure. The feeling is that someone should be held accountable for a senseless death, even when there's no one to blame directly. This tends to result in misguided lawsuits, like the multiple suits filed by (far too opportunistic) law firms that seek to hold social media platforms accountable for the actions of mass shooters and terrorists.
The desire to respond with litigation remains even when there's a single victim -- one who has taken their own life. That's the case in this lawsuit, which comes to us via Courthouse News Service. Plaintiff Tammy Rodriguez's eleven-year-old daughter committed suicide. Her daughter was allegedly a heavy user of both Snapchat and Instagram. The connection between the platforms and her daughter's suicide is alluded to and alleged, but nothing in the lawsuit [PDF] shows how either of the companies is directly responsible for the suicide.
Here's how the complaint hopes to achieve these questionable ends:
Defendants have designed Instagram and Snapchat to allow minor users to use, become addicted to, and abuse their products without the consent of the users’ parents, like Tammy Rodriguez.
Defendants have specifically designed Instagram and Snapchat to be attractive nuisances to underage users but failed to exercise ordinary care owed to underage business invitees to prevent the rampant solicitation of underage girls by anonymous older users who do not disclose their real identities, and mass message underage users with the goal of grooming and sexually exploiting minors.
Defendants not only failed to warn Tammy and Selena Rodriguez of the dangers of addiction, sleep deprivation, and problematic use of their applications, but misrepresented the safety, utility, and addictive properties of their products. For example, the head of Instagram falsely testified under oath at a December 8, 2021 Senate Committee hearing that Instagram does not addict its users.
As a result of Selena Rodriguez’s addictive and problematic use of Instagram and Snapchat, she developed numerous mental health conditions including multiple inpatient psychiatric admissions, an eating disorder, self-harm, and physically and mentally abusive behaviors toward her mother and siblings.
As a proximate result of her addiction to Instagram and Snapchat, Selena Rodriguez committed suicide on July 21, 2021. She was eleven years old at the time.
There's the first problem with the lawsuit: "proximate result." While it's possible to show an indirect connection between a person's act and the actions of the services they use, you really can't argue "proximate cause" while also arguing "strict product liability." Either this suicide was the direct result of the design flaws or warning failures or it wasn't. It can't be both strict and proximate.
Inside those claims are further problems. What's stated in the product liability arguments is sometimes directly contradicted by the narrative of the lawsuit. For instance, under the claims about violations of California's Unfair Competition Law, the plaintiff says this:
Defendants engaged in fraudulent and deceptive business practices in violation of the UCL by promoting products to underage users, including Selena Rodriguez, while concealing critical information regarding the addictive nature and risk of harm these products pose. Defendants knew and should have known that their statements and omissions regarding the addictive and harmful nature of their products were misleading and therefore likely to deceive the members of the public who use Defendants’ products and who permit their underage children to use Defendants’ products. Had Plaintiff known of the dangerous nature of Defendants’ products, she would have taken early and aggressive steps to stop or limit her daughter’s use of Defendants’ products.
But the plaintiff did know, as is clearly stated earlier in the lawsuit:
Plaintiff Tammy Rodriguez, Selena’s mother, attempted multiple times to reduce or limit her daughter’s use of social media, which caused a severe reaction by Selena due to her addiction to Defendants’ products. Because Defendants’ products do not permit parental controls, the only way for Tammy Rodriguez to effectively limit access to Defendants’ products would be to physically confiscate Selena’s internet-enabled devices, which simply caused Selena to run away in order to access her social media accounts on other devices.
Plaintiff Tammy Rodriguez attempted to get Selena mental health treatment on multiple occasions. An outpatient therapist who evaluated Selena remarked that she had never seen a patient as addicted to social media as Selena. In the months leading up to Selena’s suicide, she was experiencing severe sleep deprivation that was caused and aggravated by her addiction to Instagram and Snapchat, and the constant 24-hour stream of notifications and alerts Defendants sent to Selena Rodriguez.
So, the plaintiff was aware -- not only from what she had personally observed but from what she had been told by her daughter's therapist. That undercuts not only the arguments made in the state law claims, but also the very next paragraph from the narrative of the lawsuit.
Throughout the period of Selena’s use of social media, Tammy Rodriguez was unaware of the clinically addictive and mentally harmful effects of Instagram and Snapchat.
Now, I'm not saying any of this should excuse the slot machine feedback loops that permeate so many social media services. But it's a huge stretch to say social media platforms are directly (or proximately) responsible for violent acts by users, whether they're mass shootings or singular suicides. This lawsuit seems to admit there's no direct link (even though it definitely doesn't want to) by undercutting its own claims with factual assertions about the last several months of this child's life.
While it's true the average person may not understand the personal and psychological aspects of social media addiction, most people are familiar with the symptoms of addiction and, if they're a concerned parent like the one filing this suit, will act accordingly. The narrative in this lawsuit attempts to show how dangerous these platforms are, but in doing so, it shows the plaintiff was well aware of the negative side effects. That this knowledge failed to prevent a tragedy does not automatically transfer full responsibility to the internet services her daughter used.
A lawsuit like this may prompt the creation of better parental controls or age verification procedures, but it's unlikely to result in Instagram and Snapchat being ordered to compensate this mother for the tragic death of her child. The connections are too tenuous and the allegations too conclusory to survive a motion to dismiss. The lawsuit's internal contradictions aren't going to help. And, despite some concerning comments from the Ninth Circuit about the supposed overbreadth of Section 230 immunity in recent months, this suicide was ultimately the act of someone who used these services, rather than an act encouraged or abetted by the services being sued.
–The Techdirt Team
Filed Under: intermediary liability, section 230, selena rodriguez, suicide, tammy rodriguez
Companies: instagram, snapchat
Reader Comments
This subject is touchy but...
I don't want to finger point anymore than people probably already have but maybe the mom should consider herself to blame? Perhaps you didn't monitor your child's activity enough and allowed them to use too much social media. At some point you have to take responsibility as a parent.
I really feel for the mom; a parent should never have to bury their child. But shifting the blame elsewhere does more harm than good and doesn't help anybody.
I would also like to sue Twitter for making me stay on the toilet too long, resulting in rectal prolapse.
My thought is that this was written by two lawyers who didn't talk to each other. And sure enough, I was correct - one's in SF, the other is in Seattle. Hmmmm....
All I saw was effectively negativity - "should have done this or that", and the like. Nothing concrete, only supposition. Usually not enough to get a conviction. (But as ever, courts sometimes come to unfathomable conclusions.) Lacking anything positive, such as "they did directly do this or that, as a well established known cause", I see no chance of survival in the current form. However, if an amended complaint is permitted.... that'll be interesting to see.
Re: This subject is touchy but...
Your second paragraph is spot on, but sadly, shifting the blame for being an ineffective parent does help someone - the lawyers. Specifically, their wallets. IMO, these are worse than ambulance chasers; they should've told her right off the bat that her desired suit has no chance of success.
In your first 'graph, I need to point out that TFS repeatedly found, in analysis of the complaint, that the mother did know of the daughter's problems, and did try to do something about them. Again sadly, she just didn't try hard enough. But the 'monitoring' was definitely going on, make no mistake.
Tough love - not everyone is capable of it.
Proximate Cause / Strict Liability
I agree with almost everything in this article -- just going to take issue with one niggling detail. The article says that you cannot argue "proximate cause" while also arguing "strict product liability." I think it's rare that one would have circumstances that allow for both causes of action, but it is possible. If you have an inherently dangerous product (e.g., a water heater with defective heat sensors or pressure release valves that make it prone to explode), there can be strict liability, and there is no need for a plaintiff making that argument to also prove that the defendant's negligent conduct proximately caused the alleged injury. But you can imagine a situation where there was also proximate cause (e.g., imagine the water heater was sold with a label saying "Prone to explode -- stay at least 100 feet away at all times" and the retailer had replaced that label with one saying "All reported defects have been repaired").
From Instagram help page:
Instagram requires everyone to be at least 13 years old before they can create an account (in some jurisdictions, this age limit may be higher). Accounts that represent someone under the age of 13 must clearly state in the account’s bio that the account is managed by a parent or manager
Couldn't be bothered to look up Snapchat but I assume it is similar.
Why was a 10 or 11 year old allowed unmonitored access to these networks?
I am not trying to blame the victim (or her Mum), but something went wrong with the family communications, and it probably wasn't really anything to do with Snap or Insta.
proximate result
Regarding "proximate result": this probably doesn't refer to an indirect result, but rather the legal concept of proximate cause (see https://en.wikipedia.org/wiki/Proximate_cause). Proximate cause is basically a legally recognized cause of an event, as opposed to a "but for" cause.
But for Starbucks taking 10 minutes to make my coffee, I would not have missed the train, so I would not have taken an Uber, so I would not have been in the car crash. So Starbucks' actions were a but for cause of my injuries, but not a proximate cause.
The claim in the matter is that Instagram and Snapchat's actions/inactions/policies are legally recognized causes of the suicide.
It's bad, but it's not inconsistent.
I think this analysis tries too hard to "gotcha" the pleading.
This is not a winner of a case, IMO, for two reasons:
But I don't think the complaint is internally inconsistent in getting to that point, just that it's not a winning argument.
First, at the complaint stage, the assertion that you can't argue "proximate cause" while also arguing "strict product liability" is just plain wrong. As I read the caselaw:
Sikkelee v. Precision Airmotive Corp., 907 F. 3d 701, 710 (3d Cir. 2018) (citations omitted; emphasis mine). See also Colgate v. Juul Labs, Inc., 345 F.Supp.3d 1178, 1193 (N.D. Cal. 2018) (holding that for a California strict liability design defect claim, "a plaintiff has met his burden if he establishes that there was a defect in the manufacture or design of the product and that such defect was a proximate cause of the injury.")
Proximate cause is part of strict liability, AFAIK, unless there's some split on the understanding of the Restatement that I don't know about.
Even if proximate cause were a negligence-only concept, the Federal Rules of Civil Procedure allow pleading in the alternative; see Fed. R. Civ. P. 8(d)(3): "A party may state as many separate claims or defenses as it has, regardless of consistency" (emphasis mine).
The plaintiff can claim that it was both strict liability and negligence at the pleadings stage, and if those can't both be true at the same time, that's not a problem until summary judgment.
Next, I don't think Plaintiff's attempts to get her kid off of the apps once the kid was addicted tank a failure-to-warn claim on straight "these facts don't prove this case" grounds.
The pleadings imply that the warning should have made the parent keep the kid off the apps basically from the moment they were downloaded -- sort of like an early-1990s TV depiction of one-hit-and-you're-a-junkie drug addiction, where once eyeballs touch the screen you're doomed. The parent's gotta get warned before the eyeballs touch the screen; after that first taste the parent didn't prevent, their kid's a social media junkie and the damage has been done.
There's room for a Twombly/Iqbal plausibility argument there, that it's ridiculous to say that the apps are that dangerous or if they were that there was any sort of warning that would suffice. So depending on how "liberal" the judge is with products cases, it could be dismissed at the motion to dismiss stage.
But that's not what Tim Cushing said, which is that the pleadings themselves doom the claims by contradicting themselves. The pleadings just push the time for the warning far into the early moments of using the app, possibly to before it's even downloaded. And they don't necessarily harm the design defect claim.
Who puts the warning on meatspace for parents about how their kids react to their social environment?
Respectfully, I would have to strongly disagree with this statement. The person's family and loved ones are also victims of the event.
Missing something here ...
This analysis is good, but glosses over an obvious point: Even if the kid were addicted to her various social media accounts, where's the evidence that this directly led to (or even influenced) her decision to kill herself?
I see some mention of correlations, but no attempt at actually showing causation.
Re: Re: This subject is touchy but...
You've presumed that a solution was possible. That isn't always true. You've presumed that because a bad thing happened there has to be someone left behind to blame. That too isn't always true.
TFS quoted that the mother attempted confiscation of the internet devices, which prompted the child to run away and continue.
How is your comment not equivalent to "parent harder"?
Once we went from:
Daily news
Daily meetings with friends at school or after
A daily newspaper
One shared phone in the house
to everyone getting a phone
That's directly connected to the net
That's directly connected to all my friends
That can show me the latest on my fav movie stars, every second of the day.
It used to be called information overload.
It's always the parents' fault when their offspring kill themselves. It's what they've been raising them to do since the day they were born.
[ link to this | view in thread ]
Re: proximate result
Which will fail, because at no time can anyone find/produce any evidence that either Instagram or Snapchat espouse suicide as a good thing to do. And that's before we get to the willful disregard of the TOS, which will go a long way toward sinking all efforts to suck the platforms down the black hole of "think of the grieving mother, she must be compensated". (But do understand, I do feel bad for her. I just don't like seeing the judicial system abused for personal gain. (In this case, the gain of one or more lawyers.))
What's happening here is that lacking the moral courage to accept her loss, the mother wants to lash out at "the boogeyman who took my daughter away from me". This is entirely different from the woman in Kansas (is that correct, I don't remember just now) that told her daughter's classmate, via social media, that she should just kill herself... and the girl did that very thing. That situation did call for some retribution, and it was delivered. Said difference being, one is a neutral platform, the other was a natural-born person using a neutral platform.
Nope, I give this case a -1, Mr. Clark - it ain't got no beat that I can dance to!
Re: Re: Re: This subject is touchy but...
Parent harder..... nice allegory.
After a moment's thought, I can't fault your thinking - "parent harder" is a pretty good simile for 'tough love'. No one said, and I certainly didn't intend to mean, that parenting is easy, but it's all a matter of degree - some children need more attention paid to their behavior than others. Some parents are simply better equipped to provide that attention, and execute corrective action, others aren't up to speed. The evidence is plain for all to see, should one wish to look.
As to my presumptions.... I didn't just presume that a solution is possible, I know that at least one is possible, and likely more than one would've worked out with good results. Experience, history, and the entire "child psychology" field have ample proof of that. But I'm not a psychiatrist, so I'm not going to expound on exactly how to go about finding such a solution, that's better left to the experts.
And why did you presume that I was in favor of finding someone to blame just because a bad thing happened? Did you not read my epistle, wherein I stated that the mother should realize that there can be no blame to assign, because her daughter is not here to provide evidence of where any such blame should be placed? I spoke of "tough love" as being a way to redirect misbehavior, but that's not assigning blame.
The bottom line is a cold hard truth - self-blame after the fact won't bring her daughter back, and neither will assigning blame elsewhere. It's time for the mother to suck it up and administer some tough love for herself - accept reality, and move on. There are no prizes for playing the "if only I had...." game.
Re:
Harsh, but you can't say there isn't a grain of truth to this statement. When you consider how parents compare how well their kids are performing to other parents' offspring and prompt an education/pageant arms race, social media was simply doing what parents and relatives have always been doing for generations.
Re:
Because it is the corporations' responsibility to protect children.
"Defendants have designed Instagram and Snapchat to allow minor users to use, become addicted to, and abuse their products without the consent of the users’ parents, like Tammy Rodriguez."
So Insta & Snapchat gave the CHILD the several hundred dollar phone & paid the monthly bill.
We tried to take it away, but she ran away to use her accounts elsewhere. Well, perhaps you should have told your child's friends' parents "no internet access" & discussed the bad things you were seeing, since the whole peer group could be infected.
I'm sorry your child took her own life, but here is the harsh truth...
The platforms did nothing wrong.
You failed to parent your child over & over. Expecting that the same internet where you are warned about predators luring kids away had some magical protective motivation to look after your kid raises questions about why we don't require a license to have kids.
You will win nothing, you will stay in your tiny room blaming the platforms & the courts for not doing the 'right' thing & never deal with the remorse that will creep into your mind from time to time that you failed as a parent.
You bought the phone, you paid the bill, you never talked about expectations & rules until it was a problem.
This didn't happen in a weekend, & you did the very least, expecting others to do your job of parenting & protecting your child.
Re: Re: Re: Re: This subject is touchy but...
We have no idea how much attention, or what kind, these parents paid to their daughter. Depression is a serious psychiatric illness, and paying more attention to someone doesn't make it go away.
Proof that 100% of suicides could have been prevented if the person's family had done something different? I would be interested in seeing that proof.
If "being an ineffective parent" and "she just didn't try hard enough" were not intended to blame the mother, what did you mean by them?
I have seen phones in both of those cities.
Re:
When someone is murdered, the family and friends are not murder victims. Similarly, these people are not suicide victims, regardless of how much they are suffering.
Re: Re: Re: Re: This subject is touchy but...
I deal with chronic depression. When I get deep, my perception of those around me changes. Genuine concern rapidly becomes harassment in my perception, my mother being the worst at this. Years of idiots like you peddling psychiatric advice have fed her the idea that the thing to do is keep reaching out, keep talking to me. And perhaps for some people it's helpful. But it has driven me further away from my support structures, poisoned my ability to reach out, and done more harm than good. And this was much worse when I was a child without a fully-developed ability to think rationally.
Fucking jackass. Stop providing medical advice in the guise of good parenting you living embodiment of reckless endangerment.
Re: Re: Re: Re: This subject is touchy but...
Both of you have taken this way too far, and in the latter case, in a way too personal a direction. Sorry, but I'm not up for discussing this topic in detail.