Court Gets An Easy One Right: Section 230 Says Omegle Isn't To Blame For Bad People On Omegle

from the just-the-basics-here dept

Back in 2020, we had a post explaining that Section 230 isn't why Omegle has awful content, and that getting rid of Section 230 wouldn't change that. Omegle, if you don't know, is a service that randomly matches people into video chats. It's basically the same thing as Chatroulette, which was super famous for a very brief period years ago. Both services are somewhat infamous for the unfortunately high likelihood of randomly ending up in a "chat" with some awful dude masturbating on the other side of the screen. But, still, there are a lot of people who like using it just for random chats. I have friends who are entertainers who like to use it to test out material on random people. It has a purpose. But, sure, there are some awful people on the site, as there are on many sites. And content moderation of live video chat is quite a challenge.

For reasons I don't quite understand, some people blame Section 230 for the bad people on Omegle, and there have been a few recent lawsuits that try to get around Section 230 and still hold Omegle liable for the fact that bad people use the site. As others have explained in great detail, if these lawsuits succeed, they would do tremendous harm to online speech. We've discussed all the reasons why in the past: pinning liability on an intermediary for the speech of its users is the best way to stifle all sorts of important speech online.

So, it's good news to see that one of the first such cases against Omegle was recently dismissed on Section 230 grounds -- and rather easily at that (story first noted by Eric Goldman). The case involved a situation which is, quite clearly, terrible. It involved what's apparently known as "a capper." As explained in the ruling:

Omegle, like many websites, is susceptible to hacking.... According to Plaintiffs, sexual predators have taken advantage of the anonymity that Omegle offers to prey on other users, including children.... Among these predators are “cappers,” who trick children into committing sexual acts over live web feeds while simultaneously recording the encounters....

On March 31, 2020, C.H. was randomly placed in a chatroom with a capper during her first time on Omegle.... C.H. — an eleven-year-old girl at the time — accessed the Omegle platform from her laptop.... She was initially placed in a chatroom with other minors for some time.... C.H. later ended the chat with the minors and was placed in another chatroom.... She was met in the next chatroom with a black screen that began displaying text from the other anonymous user, “John Doe.” ... John Doe informed C.H. that he knew where she lived, and he provided specific details of her whereabouts to prove it.... He threatened to hack C.H. and her family’s electronic devices if she did not disrobe and comply with his demands.... After pleading with John Doe without success, C.H. complied.... John Doe captured screenshots and recorded the encounter.... Immediately after this incident, C.H. informed her parents, who then contacted law enforcement.

Now there is no way to describe this as anything but absolutely horrifying. The dude who did this should be thrown away for a long time. But he is the person committing the horrible crime here, not Omegle. And that's what Section 230 helps clarify. So here, the court dismissed the case against Omegle:

First, Omegle is an ICS provider under Section 230. That is, Omegle is a system that allows multiple users to connect to a computer server via the Internet. 47 U.S.C. § 230(f)(3). ICS providers are afforded immunity under the CDA unless they materially augment or develop the unlawful content at issue. See Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1167-68 (9th Cir. 2008) (“a website helps to develop unlawful content, and thus falls within the exception to section 230, if it contributes materially to the alleged illegality of the conduct.”). Indeed, Plaintiffs appear to acknowledge that Omegle is an ICS provider by arguing that “the rapidly evolving legal landscape . . . increasingly holds Internet Service Providers . . . liable for the harms they facilitate and oftentimes create.”...

Nonetheless, a review of the factual allegations confirms that Omegle functions by randomly pairing users in a chatroom and enabling them to communicate in real time. (Doc. # 75 at ¶¶ 33-34). There are no factual allegations suggesting that Omegle authors, publishes, or generates its own information to warrant classifying it as an ICP rather than an ICS provider. Compare Doe v. Mindgeek USA Inc., No. SACV 21-00338-CJC(ADSx), 2021 WL 4167504, at *9 (C.D. Cal. Sept. 9, 2021) (finding that website was an ICP where it actively created programs, curated playlists, and developed private messaging systems to facilitate trafficking of child pornography) with Mezey v. Twitter, Inc., No. 1:18-cv-21069-KMM, 2018 WL 5306769, at *1 (S.D. Fla. July 17, 2018) (granting Twitter CDA immunity where it merely displayed, organized, and hosted user content). Nor are there any factual allegations that Omegle materially contributes to the unlawfulness of the content at issue by developing or augmenting it. See Roommates.com, 521 F.3d at 1167-68. Omegle users are not required to provide or verify user information before being placed in a chatroom with another user. (Doc. # 75 at ¶¶ 37, 50-51). Further, some users, such as hackers and cappers, can circumvent Omegle’s anonymity using the data they themselves collect from other users during their encounters. (Id. at ¶ 38). The Court is persuaded that Omegle’s hosting capabilities for its users, coupled with its lack of material content generation, place it squarely within the definition of an ICS provider under 47 U.S.C. § 230(f)(2).

The plaintiffs tried a bunch of arguments to get around 230, and all of them fail. One key one was arguing that Omegle's design of the platform somehow gives it liability through "negligence", but the court says that doesn't work:

The other claims, Counts V, VII, and VII, confirm that Plaintiffs’ theories of liability against Omegle are rooted in the creation and maintenance of the platform. These claims recognize the distinction between Omegle as an ICS provider and the users, but nonetheless treat Omegle as the publisher responsible for the conduct at issue. Yahoo!, 570 F.3d at 1101-02. This is corroborated in no small part by Count VII, the “ratification/indemnification” claim, where Plaintiffs maintain that child sex trafficking was so pervasive on and known to Omegle that it should be vicariously liable for the damages caused by the cappers and similar criminals.... Through the negligence and public nuisance claims, Plaintiffs allege that Omegle knew or should have known about the dangers that the platform posed to minor children, and that Omegle failed to ensure that minor children did not fall prey to child predators that may use the website....

The CDA bars such claims as they seek to redirect liability onto Omegle for the ultimate actions of their users. See, e.g., Bauer v. Armslist, LLC, No. 20-cv-215-pp, 2021 WL 5416017, at **25-26 (E.D. Wis. Nov. 19, 2021) (dismissing, among others, negligence, public nuisance, aiding and abetting tortious conduct, and civil conspiracy claims, against ICS provider website that was used to facilitate unlawful firearm sales); Kik, 482 F. Supp. 3d at 1249-50 (website where users solicited plaintiff for sexual photographs was immune from sex trafficking, negligence, and strict lability claims where website only enabled user communication); Poole v. Tumblr, Inc., 404 F. Supp. 3d 637, 642-43 (D. Conn. 2019) (content hosting website entitled to immunity from invasion of privacy and negligent infliction of emotional distress claims); Saponaro v. Grindr, LLC, 93 F. Supp. 3d 319, 325 (D. N.J. 2015) (dismissing “failure to police” claim against ICS provider under Section 230). Regardless of form, each of Plaintiffs’ claims ultimately seek to treat Omegle as a publisher or speaker, which are encompassed within Section 230 immunity.

As the court notes, the person who did the wrong thing here was "John Doe," not Omegle:

John Doe’s video feed, his brandishing of C.H.’s personal identifying information, and the threats he subjected her to were not provided by Omegle in any sense.... Merely providing the forum where harmful conduct took place cannot otherwise serve to impose liability onto Omegle.

There was, of course, also a FOSTA claim in the lawsuit. As you'll recall, FOSTA created a new Section 230 exemption for sex trafficking. But, even with that, Omegle is not liable here, as the court notes that a site would need specific knowledge of sex trafficking, not "generalized knowledge" that the platform is sometimes used for sex trafficking.

As analyzed in the recent decision of Doe v. Kik Interactive, Inc., the legislative history of the CDA confirms that generalized knowledge that sex trafficking occurs on a website is insufficient to maintain a plausible 18 U.S.C. § 1591 claim that survives CDA immunity. 482 F. Supp. 3d 1242, 1250 n. 6 (S.D. Fla. 2020). The plaintiff in Kik alleged that multiple users on the Kik website solicited her for sexually explicit photographs. Id. at 1244. She then brought claims against Kik for violations of 18 U.S.C. §§ 1591, 1595, negligence, and strict liability. Id. at 1245-46, 1251. The Kik court found that Kik would not be immune from suit only if it were alleged that Kik had actual knowledge of the underlying incident and had some degree of active participation in the alleged sex trafficking venture. Id. at 1250-51. The Kik plaintiff did not assert actual knowledge or overt participation on behalf of Kik, and instead asserted that Kik had general knowledge of other sex trafficking incidents on the website. Id. at 1251. Thus, the Kik court found that Kik was entitled to Section 230 immunity because plaintiff had not plausibly alleged a claim that would surmount Section 230 immunity. Id.; see also Reddit, 2021 WL 5860904, at *8 (dismissing 18 U.S.C. § 1591 claim for failure to plead that ICS provider knowingly participated in a sex trafficking venture).

The requirement for actual knowledge, as opposed to generalized knowledge, seems to annoy some people, but it's the only reasonable standard. If generalized knowledge were enough to create liability, how would a site respond? It would shut down all sorts of speech, overblocking for fear of any liability. Expecting a website to magically figure out how to stop bad people from using it is an impossible task. And it distracts from the simple fact that you should hold the people who committed the criminal acts liable for those acts, not the providers of the tools they use.

Other such cases should face a similar end. I understand that people are upset that there are bad people on these platforms doing bad things -- and that some kids use these platforms. But there are better ways to deal with that: namely (1) holding those people who actually violate the law responsible for their own criminal acts, and (2) better educating our children on how to use the internet and what to do if they come across a dangerous situation like this.

Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: bad people, intermediary liability, section 230
Companies: omegle


Reader Comments



  • Anonymous Coward, 31 Jan 2022 @ 2:30pm

    i think the goal, rather than somewhat particularized suppression of speech, is more along the lines of "thou shalt not exist" (at least after giving me money).

    Of course that is still suppression of all sorts of speech and expression, but some people can't be arsed until it affects them directly.


    • Tanner Andrews (profile), 1 Feb 2022 @ 2:49am

      Re: A Good Start, Perhaps

      [goal is to eliminate service provider]

      Sure, and it is a start. But let us follow through on the rule. Eliminating the tools of evil does have the advantage of appearing to discourage evil, and that appearance is valuable in many cases.

We just have to keep going. Many people use pointed sticks to threaten or injure other people. If we cut down all the trees, there will be no way for these bad persons to obtain the tools to commit their assaults and batteries.


  • Anonymous Coward, 31 Jan 2022 @ 2:44pm

    Pretty sure, like many sites, there is a part of the ToS that says minors may not use the site (or, as in some cases, only with parental supervision).

    I wonder how these parents would respond to being prosecuted for CFAA violations (something that should absolutely not be encouraged), or perhaps having their own parental guidance called into question due to their own negligence.

    It feels like what they are trying to prosecute over could easily come down to them allowing their child to violate a site's ToS, or to them not educating their child about online safety. Either way, if they want to hold Omegle liable, perhaps they should look in a mirror.

    I find it interesting that the kid apparently complied only to THEN go to the cops. Surely it would have been more sensible to ignore it, move on, then report it.

    It is disappointing, if a bit understandable out of awkwardness, that the parents would never consider telling their child not to do something like that. But trying to prosecute the service for their failings is disgustingly hypocritical.


    • TaboToka (profile), 31 Jan 2022 @ 4:11pm

      Re: Eh, you missed it and assume too much

      I find it interesting that the kid apparently complied only to THEN go to the cops.

      She went to her parents, not the cops.

      Surely it would have been more sensible to ignore it, move on, then report it.

      You assume the kid's parents have rightfully instructed her on what to do, although many an adult would fall for the same thing:

      1) The scumbag said it knew where the kid lived. "and [it] provided specific details of her whereabouts to prove it"

      2) The scumbag said it would hack the kid's "electronic devices" (phones, tablets, laptops, TVs, whatever) unless she did what it said.

      Anyone who was confronted with #1 would freak the fork out, as they should.

      Most folks who are techno-illiterate would panic at #2. A kid of 11 would have no idea how to deal with this.

      I would put more of the onus on the PARENTS, as they need to instruct the kid on how to deal with scum, bullying and threats. They need to take an active role in verifying what sites she's using, as best they can. No way would I let my 11 year old use Omegle unsupervised--that's what allowlists are for, after all.


      • Anonymous Coward, 31 Jan 2022 @ 5:37pm

        Re: Re: Eh, you missed it and assume too much

        Unfortunately, with how the internet is, that kinda makes it nigh impossible.


    • Anonymous Coward, 31 Jan 2022 @ 5:36pm

      Re:

      I wonder how these parents would respond to being prosecuted for CFAA violations (something that should absolutely not be encouraged), or perhaps having their own parental guidance being called into question due to their own negligence.

      Probably state they weren't aware, or likely sue them for having CCFA.


    • Anonymous Coward, 22 Feb 2022 @ 8:08pm

      Re:

      You're an idiot


  • Anonymous Coward, 31 Jan 2022 @ 5:42pm

    Now there is no way to describe this as anything but absolutely horrifying. The dude who did this should be thrown away for a long time. But he is the person committing the horrible crime here, not Omegle. And that's what Section 230 helps clarify.

    Problem is, how would you expect to fix the problem when it's much easier to sue a company?


    • Anonymous Coward, 31 Jan 2022 @ 5:47pm

      Re:

      Plus, the other problem is, with how some courts tend to act with bad people (giving them plea deals or deciding they're not guilty), that leads to an increase in suing companies.

      I know this is rare, but it tends to look like a majority.


      • nasch (profile), 1 Feb 2022 @ 8:56am

        Re: Re:

        I know this is rare, but it tends to look like a majority.

        It's not rare at all. Estimates are that 90+ percent of convictions are plea deals.


  • Anonymous Coward, 31 Jan 2022 @ 8:27pm

    In my opinion, there are a few things which might prevent this sort of thing, and I agree it is vile:

    1) Geographical data associated with IP Addresses should be much more limited than they are now. This is the most common way of scaring someone into thinking they've been "hacked" on the Internet.

    2) Everyone should use VPNs.

    3) You should be taught at an early age not to give your data to anyone on the Internet. Even providers.

    4) Providers should avoid asking for sensitive data. They can't leak what they don't have.

    5) Parents.

    None of these ideas involve poking a hole in Section 230.


  • ECA (profile), 1 Feb 2022 @ 11:35am

    I wonder

    If as a USA citizen, I would goto another country and Hurt someone,
    Would the USA Gov be held liable?
    OR the Company I would work for.
    OR my mother and father.


  • Anonymous Coward, 1 Feb 2022 @ 4:19pm

    Paid creators and section 230

    Here might be a compromise. If creators are being paid for their work, then the companies hosting that content and distributing the payments should be liable for the content. This would remove a HUGE incentive for spreading misinformation, and likely only impacts a small percentage of the content hosted by the company. I dunno, probably a terrible idea; please shoot holes in this.


    • nasch (profile), 1 Feb 2022 @ 5:04pm

      Re: Paid creators and section 230

      please shoot holes in this.

      There are probably many issues, but the most obvious one to me is that even Google couldn't afford to take on that kind of liability. So the day after that law passed YouTube would stop paying video creators, and all the channels that rely on that income would shut down.


