Appeals Court Doubles Down On Dangerous Ruling: Says Website Can Be Blamed For Failing To Warn Of Rapists

from the bad-cases-make-bad-law dept

Back in late 2014, we wrote about a case where the somewhat horrifying details were likely leading to a bad result that would undermine Section 230 of the CDA (the most important law on the internet). Again, the details here are appalling. It involves two guys who used other people's accounts on a website called "Model Mayhem" to reach out to aspiring models, lure them to their location in South Florida, drug them, and then film themselves having sex with the drugged women, footage they then offered as online porn. Yes, absolutely everything about this is horrifying and disgusting. But here's where the case went weird. A victim of this awful crime decided to sue the large company Internet Brands, which had purchased Model Mayhem, arguing that it knew about these creeps and had failed to warn users of the service. Internet Brands argued that it was not liable under Section 230, but the appeals court disagreed. The case was then reheard en banc (with a large slate of 9th Circuit judges) and the court has now, once again, said that Section 230 does not apply.

This case has been a favorite of those looking to undermine Section 230, so those folks will be thrilled by the result, but everyone who supports an open internet should be worried. The rule here is basically that sites are protected from being held liable for the actions of their users... unless those users do something really horrible. Then things change. It's also important to note that the two sick creeps who pulled off this scam, Lavont Flanders and Emerson Callum, weren't actually members of the Model Mayhem site. They would just use the accounts of others to reach out to people, so the site had even less control.

To get around the plain language and caselaw history of Section 230, the court has to parse its words quite carefully. It starts out by noting that Internet Brands clearly qualifies for the safe harbors as an internet platform. However, it bends over backwards to reinterpret a key part of CDA 230, which says you cannot treat such a platform "as a publisher or speaker" of information posted by users. Here, the court decides that a law requiring services to warn of potential danger does no such thing:
Jane Doe’s claim is different, however. She does not seek to hold Internet Brands liable as a “publisher or speaker” of content someone posted on the Model Mayhem website, or for Internet Brands’ failure to remove content posted on the website. Jane Doe herself posted her profile, but she does not seek to hold Internet Brands liable for its content. Nor does she allege that Flanders and Callum posted anything to the website. The Complaint alleges only that “JANE DOE was contacted by Lavont Flanders through MODELMAYHEM.COM using a fake identity.” Jane Doe does not claim to have been lured by any posting that Internet Brands failed to remove. Internet Brands is also not alleged to have learned of the predators’ activity from any monitoring of postings on the website, nor is its failure to monitor postings at issue.

Instead, Jane Doe attempts to hold Internet Brands liable for failing to warn her about information it obtained from an outside source about how third parties targeted and lured victims through Model Mayhem. The duty to warn allegedly imposed by California law would not require Internet Brands to remove any user content or otherwise affect how it publishes or monitors such content.
In other words, because the law only compels a form of speech -- i.e., a duty to warn people about creeps on your service -- rather than a duty to suppress speech, Section 230 doesn't apply here. Bizarrely, the court points to the so-called "Good Samaritan" clause of CDA 230 (CDA 230(c)(2)), which notes that any action a site takes to moderate content cannot be used to create liability around other content on the site, as further proof for its position:
Jane Doe’s failure to warn claim has nothing to do with Internet Brands’ efforts, or lack thereof, to edit, monitor, or remove user generated content. Plaintiff’s theory is that Internet Brands should be held liable, based on its knowledge of the rape scheme and its “special relationship” with users like Jane Doe, for failing to generate its own warning. Thus, liability would not discourage the core policy of section 230(c), “Good Samaritan” filtering of third party content.
The court also rejects the idea that this ruling might chill free speech by leading to greater monitoring and censorship, basically tossing that concern aside as unlikely to be a big deal:
It may be true that imposing any tort liability on Internet Brands for its role as an interactive computer service could be said to have a “chilling effect” on the internet, if only because such liability would make operating an internet business marginally more expensive. But such a broad policy argument does not persuade us that the CDA should bar the failure to warn claim. We have already held that the CDA does not declare “a general immunity from liability deriving from third-party content.” Barnes, 570 F.3d at 1100. “[T]he Communications Decency Act was not meant to create a lawless no-man’s-land on the Internet.” Roommates.Com, 521 F.3d at 1164. Congress has not provided an all purpose get-out-of-jail-free card for businesses that publish user content on the internet, though any claims might have a marginal chilling effect on internet publishing businesses. Moreover, the argument that our holding will have a chilling effect presupposes that Jane Doe has alleged a viable failure to warn claim under California law. That question is not before us and remains to be answered.
Some will, undoubtedly, argue that this limiting of Section 230 is a good thing, either because they already dislike 230, or because they believe that the behavior described above was so beyond the pale that it's fine to punish the platform for it. This is problematic. No one denies that the two individuals who committed these acts deserve to be in jail (for a long time). But blaming the platform they used for not posting a warning is extreme, and it confuses how Section 230 is supposed to work. The key point is accurately placing liability on the parties who caused the harm. That wasn't the website, and it shouldn't be blamed.

You can now expect lots of plaintiffs to cite this ruling as they look for any way to get past Section 230's protections.


Filed Under: 9th circuit, rape, section 230, warnings
Companies: internet brands, model mayhem


Reader Comments




  1. Anonymous Coward, 2 Jun 2016 @ 9:50am

    Warnings...

    School campuses should have posted warnings about the potential for shootings and convenience stores should post warnings about armed robberies, etc.


  2. Robert Beckman (profile), 2 Jun 2016 @ 9:54am

Isn't this just saying that a website TOS just needs to add a section, a la Apple, noting that bad people may use the site too even though they shouldn't, and that those people might commit any of the crimes in any jurisdiction in the world?

    Wouldn't they be covered then? Of course, it would be a useless warning, but that seems to be the claim.


  3. Anonymous Coward, 2 Jun 2016 @ 10:01am

    Re:

That is true of every site, thus the reason for 230: to ensure that blame is directed toward the user and not toward the tool maker.


  4. Anonymous Coward, 2 Jun 2016 @ 10:02am

    Re: Re:

And when I say user, I mean the user that is committing the crime, not the tools that they use to commit the crime.


  5. Anonymous Coward, 2 Jun 2016 @ 10:05am

    Re:

I'd be surprised if some form of "take care of yourself" warning didn't already exist somewhere on the site, probably buried in the ToS. It seems to me the plaintiff would have wanted some sort of banner on the site saying "well yeah, you are welcome to use our site if you insist, but just so you know, some of our members are getting raped..."


  6. Whatever (profile), 2 Jun 2016 @ 10:08am

    I think the courts are pretty much bang on with this one, because they didn't buy into the 230 red herring at all.

    This isn't about free speech or safe harbors or anything like that, rather it's about what the company knew was happening (as a general concept) and failed to communicate those concerns to the models on their site.

    I know they could not be specific as the guys were using hijacked accounts, but they could have communicated that such a scam was going on and that models should be extremely careful in this regard.

    Knowing that this was going on (generally and not specifically) and doing nothing to inform their users is negligent IMHO. It's not a section 230 issue, it's general common sense.


  7. Anonymous Coward, 2 Jun 2016 @ 10:08am

    should be easy enough to get a ruling that opposes this. Just make a government run website the focus and the courts will fall over themselves to say it doesn't apply.


  8. Anonymous Coward, 2 Jun 2016 @ 10:14am

    Telephone Scams

    There are currently a large number of telephone scams underway targeting unsuspecting subscribers, particularly the elderly. Maybe they can use the reasoning enshrined in this ruling to hold the telephone companies responsible for their losses.


  9. Anonymous Coward, 2 Jun 2016 @ 10:16am

    Re: Warnings...

    This opinion isn't about the underlying duty to warn. It's about immunity from such an obligation as a website. This only says that the case will move forward to determine whether the facts were sufficient to trigger that duty to warn.


  10. Anonymous Coward, 2 Jun 2016 @ 10:20am

    Warning

    The inherent dangers of living can lead to death. This site is not liable for such dangers encountered in the real world.
    As such, your continued use of this site will be considered acceptance of the consequences of your own actions.


  11. Anonymous Coward, 2 Jun 2016 @ 10:23am

    So, horrible things neuter Section 230 now?

    ...unless those users do something really horrible. Then things change.

    And what could be more horrible than copyright infringement?


  12. Anonymous Coward, 2 Jun 2016 @ 10:32am

The US legal system is going further and further down the crapper! What the hell is the matter with these 'judges'? They'll be the first ones to complain if other countries follow this lead and they get caught up in something. Absolutely crazy ruling that needs revoking!!


  13. Anonymous Coward, 2 Jun 2016 @ 10:46am

So because she wasn't smart enough to do a Google search about Model Mayhem or incidents related to drugging and rape, Section 230 doesn't apply?

I guess in a world where McDonald's coffee must carry a label saying "caution: hot," websites must put up warnings about misuse too.

    So why didn't she sue her ISP for allowing the website to come to her device?

    It is sad she was victimized but it is only the two jerks that should be punished.


  14. Anonymous Coward, 2 Jun 2016 @ 10:58am

    Re:

    Did she drive a car to meet them? If so, did the car maker warn her of the danger of driving to meet strangers?


  15. Anonmylous, 2 Jun 2016 @ 11:13am

    INTERNET WARNING!!!

    Welcome to The Internet! We'd like to take a moment to caution you before you begin that just like in the real world, we have rapists, thugs, thieves, con artists, grifters, junkies, pushers, hookers, murderers, pedophiles, and all manner of people who will seek to take advantage of your naivete and ignorance and trust.

    Please keep this in mind in all dealings and interactions you may have with others while visiting. Thank you and look out behind you, he's got a knife!


  16. Uriel-238 (profile), 2 Jun 2016 @ 11:20am

    "User" = The user committing the crime.

    It works just as well to blame the end user who is a victim of the crime, given that he or she was informed of the risk.

    People who use crosswalks sometimes get run over by reckless drivers, and shouldn't cross the street if they're uninformed of the risk.

    Also: Guns, motor vehicles and power tools.


  17. radarmonkey (profile), 2 Jun 2016 @ 11:27am

    Useless warnings

I was born/raised in California, and I've seen a lot of stupid, knee-jerk reactionary laws passed. We have Proposition 65 warnings that must be posted if a building contains a business that uses chemicals that are known to cause cancer. Noble enough, right? However, EVERY building uses cleaning agents that contain some chemical that fits the mandated warning. Therefore, EVERY building has the warning, making the warning USELESS.

    People have been warning of 'creeps on the internet' for 20+ years, but people will still fall victim to predators. ANY social network that seeks to connect people, from Model Mayhem to LinkedIn, will have to have a warning to satisfy a ruling like this. Every. Single. Website. People will click past them faster than an EULA, rendering the warnings USELESS.


  18. Anonymous Coward, 2 Jun 2016 @ 11:30am

Wait, I'm confused. So if I steal the cell phone of someone from a modeling agency, use it to call someone and tell them I'm going to make them a supermodel, then take them somewhere and rape them, is it the modeling agency or the person who was assigned the phone that is at fault? Or both?


  19. Anonymous Coward, 2 Jun 2016 @ 2:16pm

    The website is absolutely responsible for protecting its members. There are limits to Section 230. For instance, I run a popular anime and manga website and community. If one of my members informed me that a member was harassing, intimidating or threatening another member ... if I didn't take precautions against the member who was engaging in this behavior, I would have some responsibility for not acting.

When I've discovered members acting against the best interests of my community, I've banned those members' accounts along with their email addresses and IP addresses, and also added such restrictions to my site's control panel that prevent those members from accessing my website in the future.

I'm sure Techdirt has similar policies where it concerns users harassing, intimidating or threatening other users who post on this site. If someone were to warn Mike Masnick that a registered member is behaving in a manner that is not permissible, and Techdirt and/or Mike Masnick ignored the warnings and allowed the harassment to continue, then the owners of this website could be held liable for that conduct and Section 230 wouldn't protect this website.

    Section 230 just protects websites from content posted by their users. It doesn't protect websites from criminal activity conducted by their users, especially if they have been made aware of it and have knowledge of it.


  20. Anonymous Coward, 2 Jun 2016 @ 2:27pm

    A warning that people are horrible seems pretty redundant for anyone that is already on the internet.


  21. Anonymous Coward, 2 Jun 2016 @ 2:45pm

AC, the problem with this is that the website is trying to claim protection under Section 230 of the CDA. However, the website owner was aware that there were rapists prowling its website and failed to adequately warn its own membership that this was happening.

    This has more to do with criminal liability than it does with the CDA. The website owner incorrectly offered the wrong argument and that's why the court ruled against it.

It's no different than Techdirt getting warned that someone was using its website to conduct illegal activities. Techdirt couldn't then argue that it's protected under Section 230 of the CDA, because that would be the wrong argument. It would involve Techdirt's knowledge that its website was being used to engage in criminal conduct, and it would have no protection against that.


  22. Anonymous Coward, 2 Jun 2016 @ 3:21pm

Whatever happened to going after and punishing the people that actually do the crimes? That's the way it was when I was a kid, but now, with the internet, it seems like everyone just wants a payday and thinks the platforms/websites they use should know all and be able to keep all the bad people out, so they go after the platforms that were used instead.


  23. That One Guy (profile), 2 Jun 2016 @ 4:49pm

    General knowledge vs Specific

It's no different than Techdirt getting warned that someone was using its website to conduct illegal activities. Techdirt couldn't then argue that it's protected under Section 230 of the CDA, because that would be the wrong argument.

    Pretty much that exact same argument has been pushed by those looking to expand copyright law in the form of expansions to third-party liability, and the reason it fails there is the reason it doesn't work so well here.

There's a notable difference between general knowledge ('Someone is using the site for illegal actions') and specific knowledge ('This particular person is using the site for this particular illegal act'). One carries an obligation on the part of the site if they want to continue to be covered under the law; the other does not. I'll let you figure out which is which.

Given I really doubt the individuals in this case were polite enough to notify the site, 'Hey, we're going to be using your site in order to facilitate the drugging and raping of people', the best the site could have done upon learning that someone had used it for that (how exactly they'd learn that it was the site, as opposed to any number of other possible causes, is a bit of a problem) would be a general warning to be careful, and a reminder that people don't always match how they present themselves online, which likely wouldn't have accomplished much other than causing a witch hunt and general distrust of the service.

Blame the two individuals responsible for their actions all you want, they certainly deserve it, but expecting sites to know things that they have no reasonable way of knowing, and punishing them for not responding 'properly' to knowledge they don't have, doesn't make sense and helps no one.


  24. Anonymous Coward, 2 Jun 2016 @ 5:59pm

    Re: General knowledge vs Specific

    Exactly!

    I posted the hypothetical above about stealing a cell phone and using it for rape.

If that modeling agency had specific knowledge that that phone was going to be used for rape, then they would be liable if they didn't let someone know.

However, if they just knew that some of the phones that got stolen sometimes got used for rape, would they have to alert every girl who might use their agency (which would have to be every girl on the planet, just in case), or have to tell every person they call, "Hello, I'm from a big modeling agency, by the way sometimes people steal our phones and use them to lure girls into rape, would you like to work for us?"

That said, I'm sure there are some circumstances where the proper thing to do would have been to send out a notice, but that may not even work in this case. Looking at it from the point of view of the model who got the emails, she may never have even gone to the website. The model may have assumed the email was legitimate and continued all correspondence through that email. In that case the website may not even be able to do anything.

On the other hand, if the situation was something where the website knew this was happening and got informed of an account that was hacked, they probably should have sent a warning email to all the people that got emails from that hacked account recently. I still don't think I would find liability in this case.

Just to finish this off: if the website knew that the account was hacked by the same person who had previously used it to rape someone, or saw that there were emails sent out that matched the ones used for rape, and did nothing, then there may be a case for liability.


  25. Anonymous Coward, 2 Jun 2016 @ 6:04pm

    Re:

    --what the hell is the matter with these 'judges'

    Elections/emotions

They should use logic like a Vulcan, but when it comes to subjects like rape, child porn, school shootings, etc., they have to consider how the public, with its irrational emotions, will perceive their rulings.

Being known as the judge who found an Internet rape website not guilty makes it kinda difficult to get reelected.


  26. Anonymous Coward, 2 Jun 2016 @ 6:43pm

    Re: Useless warnings

    Proposition 65 was a good idea that was quickly turned into a joke by the chemical company lobbies. Instead of trying to fight the notion, they just made it so that everything is known to the state of California to cause cancer. So everything has a warning and the warning is meaningless as you say.

    I'm sure it was far cheaper than campaigning against the proposition.


  27. BJC (profile), 2 Jun 2016 @ 7:51pm

    Re: General knowledge vs Specific

    I get your argument between general and specific, but according to the statement of the facts in the complaint from the opinion (helpfully posted above), the case is in fact that specific. The plaintiff alleged that the website knew about a year before plaintiff was raped that the two particular guys from Florida who raped her were using the site to rape women, and had been doing so since 2008.

The complaint goes so far as to allege that the current owners of the website sued the prior owners for not telling them about these particular rapists being serial users of the website before selling the site.

    I think the lawyer who drafted the plaintiff's complaint was a pretty smart guy; this was the best-pled case to argue that this was a "post a picture of the known dangerous person in your establishment" case rather than a "moderate comments" case.


  28. Ruby, 2 Jun 2016 @ 8:01pm

    arguing that it knew about these creeps and had failed to warn users of the service

The key word there is "knew". She said the site knew.

She would need to prove this, but if it is true (and, considering the court ruled in her favor, it likely was), if this or any other site became aware that its site was being used to commit multiple serious, dangerous felonies, why shouldn't it have a duty to warn at-risk users?!

Again, the argument isn't that it was a hypothetical possibility that something bad could happen, but that it was a known fact to the owners that accounts were being fraudulently accessed and used to lure victims into horrible situations... and the site couldn't be bothered to do so much as send out a single email to warn users of the risk.

    And, frankly, if they did know, the reason they didn't say anything was most likely to avoid negative publicity and/or loss of revenue, due to people avoiding the site on the grounds that it was dangerous.

    People who put their own personal profits above other people's safety deserve to be shot. But I'll settle for legally liable if it's probable that that was what they did.


  29. Ruby, 2 Jun 2016 @ 8:08pm

    Re: General knowledge vs Specific

Well, it looks like there were multiple victims, but it took a while to ID the suspects, likely because the victims were drugged.

In the meantime, it wouldn't take much investigation to determine that all the victims were lured to the place they were abducted from by someone on the same site (and to follow up with the people who allegedly contacted them and discover that they never did).

So, then it would be known that the same predator(s) was accessing other people's accounts on the same site to lure victims.

    Now, if the site was notified that this was happening and didn't warn users and someone else was victimized...yeah, I can see them being reasonably held liable for not bothering to send out a damn email.


  30. freedomfan (profile), 2 Jun 2016 @ 11:35pm

    Re: Re: General knowledge vs Specific

I am not seeing how that puts this case into the "specific" category. Yes, the site owners knew that there were a couple of scumwads visiting the site using other people's accounts to cause mayhem. Even if the site knew it was the same two perps each time (and I have no idea if they did), the site still doesn't have a specific suspect to warn about, since the perps contacted the models from different accounts each time. So, all that existed was general knowledge that the site could be used for the perps' scheme and no warning to give anyone except that "Someone who contacts you about modeling might not be who they claim to be." Doesn't seem very useful.


  31. Whatever (profile), 2 Jun 2016 @ 11:53pm

    Re: "User" = The user committing the crime.

Actually, the crosswalk is the admission of risk: this is a spot where you can more safely cross the road, but not completely safely, because you know it's inherently dangerous; you have been warned all your life.


  32. That One Guy (profile), 3 Jun 2016 @ 12:39am

    Re: Re: General knowledge vs Specific

    But did they know who, and more importantly which accounts? Without at least the latter bit of information the knowledge is still general, not specific. 'Someone is using our site for illegal activity' doesn't exactly give them anything to go by beyond a general warning that some scum are using the service but they have no idea who they are or what accounts they use.

    Should they have provided a general warning? Probably, but like I said the information they had seemed to have been so scarce that it would have been general to the point of being near useless.

Unless I'm missing something significant, they had no real information to go on beyond the fact that someone was using their service for something illegal, and if that's enough to bypass 230, then the 'protection' it affords sites is going to take a beating, as you can be sure others will argue that general knowledge, which was not considered sufficient to trigger liability before, now is.


  33. BJC (profile), 3 Jun 2016 @ 5:15am

    Re: Re: Re: General knowledge vs Specific

    I think we're arguing two different things.

    I'm arguing that the complaint was specific enough to invoke California's tort of "failing to warn of a harm you have some foreknowledge of," which is somewhat particular to California law.

    I get the sense you're arguing that Cal. law is dumb. This may be true; I have often disagreed with the choices of the Cal. Supreme Court and legislature.

    But the "dumbness" of the law doesn't matter for CDA 230. Only whether the liability comes from being a "publisher" of material on the internet. And if it's a super-general "there might be crime" warning requirement, that doesn't really intersect with CDA 230.


  34. BJC (profile), 3 Jun 2016 @ 5:21am

    Re: Re: Re: General knowledge vs Specific

    I don't think this case is that broad with regards to CDA 230, because it involves some really particular facts.

    Here, the complaint says the website owners knew that someone was trolling the website to commit actual violent crimes against the users, and in fact knew enough to provide specific details about the scheme (like, "be really careful about randos who want you to go to Florida"). Furthermore, this is under California law, which is the most liberal of jurisdictions on this particular issue, and unless every case brought in California gets Cal. law (not likely for NY residents, etc.), this is limited.


  35. Wendy Cockcroft, 3 Jun 2016 @ 7:44am

    Re: "User" = The user committing the crime.

    People who use crosswalks are generally aware of the risk, however slight, that this might be the day a drunk (or otherwise confused) driver might put the pedal to the metal and run them over.

    Similarly, the modelling world is fraught with danger from creeps and perverts luring naive women and girls who hope to develop a career into their evil clutches. I've heard loads of horror stories about it, which makes me wonder why the so-called respectable modelling agencies don't get held responsible for exposing the women and girls in their care to the possibility that some perve might take advantage of them.

    Women need to be a bit cynical and take precautions when availing themselves of potential opportunities, i.e. bring a friend, etc. We shouldn't have to, but perves will be perves. This applies to anyone who goes to places they don't know to transact business with total strangers. It might be legit, but be aware that it might not be.


  36. Ninja (profile), 3 Jun 2016 @ 8:59am

    Re: INTERNET WARNING!!!

    just like in the real world

    Should be emphasized. You don't see road operators being blamed for drugs going through their network. Same with transportation companies. You don't close markets because a girl was raped in their toilet. You only do that when the owners actually assisted in the crime.

But hey, it's on the Internet, so let's just throw logic and reason out the window, yes?


  37. Ninja (profile), 3 Jun 2016 @ 9:02am

    Re:

    You seem to be mistaking different situations:

    "Hey, site! Some rapists seem to be using your site to do rape-y stuff."

    OR

    "Hey, site! John Doe has used and still uses your site to do rape-y stuff, here's evidence."

    OR

    "hey, site! You are actively assisting rapists in their rape-y stuff or engaging in rape-y stuff yourself!"

    Can you see the difference? Even the second case has nothing to do with the site. The authorities are the ones that must act. The site must assist but it's not their job to be the police.


  38. Ninja (profile), 3 Jun 2016 @ 9:03am

    Re: Re:

    mistaking = misunderstanding. My bad.


  39. BJC (profile), 3 Jun 2016 @ 9:37am

    Re: Re:

I'm pretty sure, given the way the 9th Cir. wrote the decision, that California imposes civil liability on people in the second instance.

    And so I'm wondering whether we're arguing about what the law is, or what the law should be.

    If it's what the law should be, carry on -- I think California tort law is too plaintiff-friendly generally. But arguing about what the underlying law should be doesn't really help when talking about how the law as it currently is interfaces with CDA 230.

    And here's why I think the CDA 230 defense doesn't quite work here.

Let's say that California had a law saying that all providers of a service, whether in the real world or electronic, paid or unpaid, etc., had to provide a puppy to every 1000th customer free of charge or face civil liability from the customer who wanted a puppy and didn't get one. Let's further say, to get all our facts in a row, that it was challenged on all relevant state and federal grounds by the United States Chamber of Commerce, and both the California Supreme Court and the United States Supreme Court upheld the law, so that, no matter how dumb and counterproductive this puppy-providing law is, only legislative action could change it.

    Someone signs on to a CDA-covered service in California as the 1000th user. Doesn't get a puppy. Immunity?

    I would say not, because the duty exists independently of the fact of the service.


  40. Ninja (profile), 3 Jun 2016 @ 10:29am

    Re: Re: Re:

    "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider" (47 U.S.C. § 230)"

The law says specifically that the provider is not responsible for what the users do, so you do have a point, but it doesn't apply to this case. The law specifically exempts the provider from liability for what their users do. Go after the rapists, not the service.


  41. Griffdog (profile), 3 Jun 2016 @ 10:55am

    and then they twist it against you

    So, Kim Dotcom provided a search tool to enable copyright owners to find pirated files and submit take-down requests. But the DOJ argues that the presence of the tool also aids people looking to pirate the same material. Therefore Kim must be liable for the copyright violations because he provided a tool that made file sharing easier. Damned if you do, and damned if you don’t. Put up a warning on your website, and see if some non-Californian prosecutor doesn’t try to argue that since you’re aware of crime on your site you must be complicit in its commission. No warning? Then you're liable for not providing a warning.

    What do you mean, you didn't know there was a problem? Everyone knows that there are dangers on the Internet. Yes, that same Everyone that you need to provide the warning to.


  42. Uriel-238 (profile), 3 Jun 2016 @ 11:08am

    Informed consent

    It's a rant crusade of mine how often young people (and other vulnerables, such as newly naturalized immigrants) are pushed into making big decisions (join the military, have babies, etc.) before they have the time and motive to get themselves informed.

Looking at the whole click-wrap phenomenon, and the contracts we're expected to sign for essentials like cell phones or internet service, it appears to me our entire economy is based on forcing people into odious contracts before they know what they're doing and then exploiting their predicament once they're locked in.

Rather than facilitate our customers' lives, we instead seek to entrap them and bleed them like parasites.

    (Regarding the modeling world, the creepiest stories I heard were the recruiters outside anorexia hospitals. Not all exploitation of these poor girls is sexual.)


  43. John85851 (profile), 3 Jun 2016 @ 11:58am

    Re: Useless warnings

    The warnings aren't useless. Sure, users ignore and click-through them, but the real purpose is legal: it's so the site's lawyers can say that the warning was in the TOS that the user agreed to, so the user can't sue.


  44. BJC (profile), 3 Jun 2016 @ 12:47pm

    Re: Re: Re: Re:

    I think there's a reasonable argument that prohibiting treatment "as the publisher or speaker of any information provided by another information content provider" is different than prohibiting all liability for the actions of such publisher or speaker.

    The critical words here are "treated as" -- if I say that you have an independent duty that the original party could never be held liable for, then I'm not treating you as that party.

    I've seen this in other liability contexts, such as where a hospital that can't legally be sued for "medical malpractice" is sued for "negligent supervision" of the staffers that committed the malpractice.


  45. Bergman (profile), 3 Jun 2016 @ 3:38pm

    Re: Warnings...

Doesn't this mean that if someone gets raped in the building the court meets in, and the court fails to post warning signs, the judges could be sued for failing to do so?


  46. Anonymous Coward, 3 Jun 2016 @ 4:31pm

    Re: INTERNET WARNING!!!

    That's a perfect description of lawyers and politicians.


  47. Anonymous Coward, 3 Jun 2016 @ 8:13pm

    Re:

    what the hell is the matter with these 'judges'?

    One of the primary unwritten jobs of a judge (because it would be politically incorrect to write it) is to enforce political correctness.


  48. Anonymous Coward, 3 Jun 2016 @ 8:42pm

    Re:

    If one of my members informed me that a member was... blah blah blah

    Learn to read. These guys weren't members.


  49. Wendy Cockcroft, 4 Jun 2016 @ 3:43am

    Re: Informed consent

    That's what happens in a world where hyper-individualism is the norm. How do you promote traditional values (I'm conservative), which were created by and for the community, when "community" only has value as long as it's considered useful? I'm of the opinion that individuals do best in a healthy community environment; excessive individualism stops us caring about each other's wellbeing, and that's ultimately harmful for everyone.


