Mother's Lawsuit Attempts To Hold Snapchat, Instagram Responsible For Her Daughter's Suicide
from the tragic,-but-not-actionable dept
In the wake of a tragedy, it's human nature to seek some form of justice or closure. The feeling is that someone should be held accountable for a senseless death, even when there's no one to blame directly. This tends to result in misguided lawsuits, like the multiple suits filed by (far too opportunistic) law firms that seek to hold social media platforms accountable for the actions of mass shooters and terrorists.
The desire to respond with litigation remains even when there's a single victim -- one who has taken their own life. That's the case in this lawsuit, which comes to us via Courthouse News Service. Plaintiff Tammy Rodriguez's eleven-year-old daughter committed suicide. Her daughter was allegedly a heavy user of both Snapchat and Instagram. The connection between the platforms and her daughter's suicide is alluded to and alleged, but nothing in the lawsuit [PDF] shows how either of the companies is directly responsible for her death.
Here's how the complaint hopes to achieve these questionable ends:
Defendants have designed Instagram and Snapchat to allow minor users to use, become addicted to, and abuse their products without the consent of the users’ parents, like Tammy Rodriguez.
Defendants have specifically designed Instagram and Snapchat to be attractive nuisances to underage users but failed to exercise ordinary care owed to underage business invitees to prevent the rampant solicitation of underage girls by anonymous older users who do not disclose their real identities, and mass message underage users with the goal of grooming and sexually exploiting minors.
Defendants not only failed to warn Tammy and Selena Rodriguez of the dangers of addiction, sleep deprivation, and problematic use of their applications, but misrepresented the safety, utility, and addictive properties of their products. For example, the head of Instagram falsely testified under oath at a December 8, 2021 Senate Committee hearing that Instagram does not addict its users.
As a result of Selena Rodriguez’s addictive and problematic use of Instagram and Snapchat, she developed numerous mental health conditions including multiple inpatient psychiatric admissions, an eating disorder, self-harm, and physically and mentally abusive behaviors toward her mother and siblings.
As a proximate result of her addiction to Instagram and Snapchat, Selena Rodriguez committed suicide on July 21, 2021. She was eleven years old at the time.
There's the first problem with the lawsuit: "proximate result." While it's possible to show an indirect connection between a person's act and the actions of the services they use, you really can't argue "proximate cause" while also arguing "strict product liability." Either this suicide was the direct result of the alleged design flaws or warning failures, or it wasn't. It can't be both strict and proximate.
Inside those claims are further problems. What's stated in the product liability arguments is sometimes directly contradicted by the narrative of the lawsuit. For instance, under the claims about violations of California's Unfair Competition Law, the plaintiff says this:
Defendants engaged in fraudulent and deceptive business practices in violation of the UCL by promoting products to underage users, including Selena Rodriguez, while concealing critical information regarding the addictive nature and risk of harm these products pose. Defendants knew and should have known that their statements and omissions regarding the addictive and harmful nature of their products were misleading and therefore likely to deceive the members of the public who use Defendants’ products and who permit their underage children to use Defendants’ products. Had Plaintiff known of the dangerous nature of Defendants’ products, she would have taken early and aggressive steps to stop or limit her daughter’s use of Defendants’ products.
But the plaintiff did know, as is clearly stated earlier in the lawsuit:
Plaintiff Tammy Rodriguez, Selena’s mother, attempted multiple times to reduce or limit her daughter’s use of social media, which caused a severe reaction by Selena due to her addiction to Defendants’ products. Because Defendants’ products do not permit parental controls, the only way for Tammy Rodriguez to effectively limit access to Defendants’ products would be to physically confiscate Selena’s internet-enabled devices, which simply caused Selena to run away in order to access her social media accounts on other devices.
Plaintiff Tammy Rodriguez attempted to get Selena mental health treatment on multiple occasions. An outpatient therapist who evaluated Selena remarked that she had never seen a patient as addicted to social media as Selena. In the months leading up to Selena’s suicide, she was experiencing severe sleep deprivation that was caused and aggravated by her addiction to Instagram and Snapchat, and the constant 24-hour stream of notifications and alerts Defendants sent to Selena Rodriguez.
So, the plaintiff was aware -- not only from what she had personally observed but from what she had been told by her daughter's therapist. That awareness not only undercuts the arguments made in the state law claims, but also contradicts the very next paragraph of the lawsuit's narrative.
Throughout the period of Selena’s use of social media, Tammy Rodriguez was unaware of the clinically addictive and mentally harmful effects of Instagram and Snapchat.
Now, I'm not saying any of this should excuse the slot machine feedback loops that permeate so many social media services. But it's a huge stretch to say social media platforms are directly (or proximately) responsible for violent acts by users, whether they're mass shootings or singular suicides. This lawsuit seems to admit there's no direct link (even though it definitely doesn't want to) by undercutting its own claims with factual assertions about the last several months of this child's life.
While it's true the average person may not understand the personal and psychological aspects of social media addiction, most people are familiar with the symptoms of addiction and, if they're a concerned parent like the one filing this suit, will act accordingly. The narrative in this lawsuit attempts to show how dangerous these platforms are, but in doing so, it shows the plaintiff was well aware of the negative side effects. Just because that knowledge failed to prevent a tragedy does not automatically transfer full responsibility to the internet services her daughter used.
A lawsuit like this may prompt the creation of better parental controls or age verification procedures, but it's unlikely to result in Instagram and Snapchat being ordered to compensate this mother for the tragic death of her child. The connections are too tenuous and the allegations too conclusory to survive a motion to dismiss. The lawsuit's internal contradictions aren't going to help. And, despite some concerning comments from the Ninth Circuit about the supposed overbreadth of Section 230 immunity in recent months, this suicide was ultimately the act of someone who used these services, rather than an act encouraged or abetted by the services being sued.
Filed Under: intermediary liability, section 230, selena rodriguez, suicide, tammy rodriguez
Companies: instagram, snapchat