Another Day, Another Horrible Ruling That Undermines The First Amendment And Section 230
from the and-wtf-california dept
Not sure what's going on in California, but its courts have suddenly been issuing a bunch of really bad rulings concerning Section 230 of the CDA (the most important law on the internet). As we've explained many times, Section 230 says that online services cannot be held liable for the actions of their users (and also, importantly, that if those platforms do decide to moderate content in any way, that doesn't impact their protections from liability). This is massively important for protecting free speech online, because it means that platforms don't have to proactively monitor user behavior out of fear of legal liability, and they don't feel the need to over-aggressively take down content to avoid being sued.

Over and over again, the courts have interpreted Section 230 quite broadly to protect internet platforms. This has been good for free speech and good for the internet overall (and, yes, good for online companies, which is why some are so against Section 230). But, as we've been noting, Section 230 has been under attack in the past year or so, and all of a sudden courts seem to be chipping away at its protections. Last week we wrote about a bad appeals court ruling that said Section 230 did not protect a website from being sued over failing to warn users of potential harm that could come from some users on the site. Then, earlier this week, we wrote about an even worse ruling in San Mateo Superior Court (just a block away from my office...) exempting publicity rights from Section 230.
And now, Eric Goldman points our attention to an even worse ruling, this one from the California Court of Appeal for the First Appellate District. In this ruling, the court determines that Yelp can be forced to delete reviews that a trial court found defamatory (based entirely on a default judgment, in which the defendant didn't show up in court). In previous cases, most courts have found that even if content is found to be defamatory, a third-party website cannot be forced to delete it, because of the pesky First Amendment.
In this case, the court doesn't care. The background: a lawyer, Dawn Hassell, sued a former client, Ava Bird, who allegedly posted negative reviews of Hassell's work. Bird ignored the lawsuit, and the court ruled for Hassell via a default judgment. As part of that judgment, the court also ordered Yelp to remove the reviews. Yelp protested. The appeals court then twists itself into all kinds of questionable knots to ignore both Section 230 and the First Amendment. It first questions whether or not Yelp can even make the First Amendment argument, seeing as Yelp also claims that it's not the author of the content in question. Of course, that totally misses the point: it's not necessarily just about the content in the review, but also about Yelp's own First Amendment rights in presenting content on its website.
In order to claim a First Amendment stake in this case, Yelp characterizes itself as a publisher or distributor. But, at other times Yelp portrays itself as more akin to an Internet bulletin board—a host to speakers, but in no way a speaker itself. Of course, Yelp may play different roles depending on the context. However, in this context it appears to us that the removal order does not treat Yelp as a publisher of Bird’s speech, but rather as the administrator of the forum that Bird utilized to publish her defamatory reviews.

But, uh, the administrator of a forum still has separate First Amendment rights in determining how they present things in their forum. That's kind of how it works. As Eric Goldman notes:
What the hell is an “administrator of the forum,” and what legal consequences attach to that status? We’re not talking about the free speech rights of a janitor with a mop. This case involves a curator of speech–and even if the curator is just “administrating,” telling a curator how to administrate raises significant speech interests that deserve more respect than this court gave it.

The court then suggests that the First Amendment doesn't apply because Yelp has no right to question a court.
To the extent Yelp has ever meant to contend that an injunction requiring Bird to remove defamatory statements from the Internet injuriously affects Yelp, we disagree. Yelp’s claimed interest in maintaining its Web site as it deems appropriate does not include the right to second-guess a final court judgment which establishes that statements by a third party are defamatory and thus unprotected by the First Amendment.

Yikes! That, of course, ignores the actual issue at play -- especially the fact that the finding of defamation was on default, rather than through an actual adversarial process.
But the really scary part is how the court gets around Section 230. Goldman refers to it as "jujitsu" and that's a pretty apt analogy:
Yelp argues the authority summarized above establishes that the removal order is void. We disagree. The removal order does not violate section 230 because it does not impose any liability on Yelp. In this defamation action, Hassell filed their complaint against Bird, not Yelp; obtained a default judgment against Bird, not Yelp; and was awarded damages and injunctive relief against Bird, not Yelp.

Okay... but then the court is ordering Yelp, a non-party, to remove the reviews. And if Yelp does not remove the reviews, it's in contempt of court, which means that, yes, the court absolutely is imposing liability. But, no, says the court, because [reasons].
If an injunction is itself a form of liability, that liability was imposed on Bird, not Yelp. Violating the injunction or the removal order associated with it could potentially trigger a different type of liability which implicates the contempt power of the court.

Got that? It's not liability because it's "a different type of liability." WHAT?!? Where in the law does it say that "a different type of liability" (with no clear definition) is allowed? The court clarifies by muddying the waters some more:
In our opinion, sanctioning Yelp for violating a court order would not implicate section 230 at all; it would not impose liability on Yelp as a publisher or distributor of third party content.

This makes no sense at all.
Separately, the court keeps relying on the fact that Yelp itself was not sued by Hassell, and that all other cases involved service providers that were parties to the case. But that leads to ridiculous results:
As we have pointed out, Hassell did not allege any cause of action seeking to hold Yelp liable for Bird’s tort. The removal order simply sought to control the perpetuation of judicially declared defamatory statements. For this reason, Yelp seriously understates the significance of the fact that Hassell obtained a judgment which establishes that three reviews Bird posted on Yelp.com are defamatory as a matter of law, and which includes an injunction enjoining Bird from repeating those three reviews on Yelp.com. Indeed, that injunction is a key distinction between this case and the CDA cases that Yelp has cited, all of which involved allegations of defamatory conduct by a third party, and not a judicial determination that defamatory statements had, in fact, been made by such third party on the Internet service provider’s Web site.

But under that standard, the court has just offered up a huge loophole for avoiding Section 230: just don't name the service provider, and then you can force the service provider to take down the content. If that stands, very bad things will happen as a result. As Goldman points out in response to this, the court is simply wrong:
So the court is flat-out wrong. While I believe it’s correct that none of the cases were posed as contempt proceedings, the actions in both Blockowicz and Giordano also came after lower court findings of defamation. And in any case, WTF? Is the court saying that Section 230 preempts a direct lawsuit against a UGC site seeking injunctive relief, but it’s totally OK to reach the same result by not naming the UGC site in the lawsuit and then enforcing an injunction via contempt proceedings?

Goldman goes on to note how this ruling will create all kinds of mischief opportunities:
Step 1: sue the content poster for defamation in California state court. Do not sue the UGC site because (a) they are immune under Section 230, or (b) they might decide to fight substantively.

Step 2: take advantage of loose service of process rules and/or otherwise hope the poster doesn’t appear in the case. For example, non-California residents aren’t likely to fight in a California court even if they get notice.

Step 3: get a default judgment finding defamation. If the user does make an appearance, a stipulated judgment with the user could reach the same result.

Step 4: seek an injunction requiring removal by the UGC site. Once the judge accepts the service of process and concludes the defendant didn’t show, the judge will probably do just about whatever the plaintiff asks. With the default judgment, the plaintiff can then use the coercive effect of contempt to force the UGC site to remove the content so long as the UGC site is under California’s jurisdictional reach–which most UGC sites are.

Voila! A right to be forgotten in the US, despite the First Amendment and Section 230.

As an added bonus, in the same lawsuit, the plaintiff can target multiple items of unwanted content by claiming it’s also written by the defendant or someone working in concert with the defendant. For example, I don’t believe it was ever confirmed that Birdzeye and JD are the same person, but consistent with the less-stringent approach deployed by judges when faced with default proceedings, the court treats both reviews as if the author(s) of the opinions was in court. If, in fact, JD is a different person, then Hassell successfully scrubbed JD’s content without ever suing the actual author or serving proper notice on the author. As you can see, there’s a great collateral damage potential here.

Goldman also warns that this ruling may not be easy to overturn. Yelp can (and should) appeal to the state Supreme Court, but there's no guarantee it will take the case. There are legislative solutions, but those are unlikely as well. But for the time being, this ruling is a ticking time bomb. It can and will be abused. We see so many attempts to censor content by abusing copyright law, and now California has given people a playbook for how to abuse defamation law to do the same thing.
Filed Under: ava bird, california, cda 230, censorship, dawn hassell, defamation, default, first amendment, prior restraint, reviews, section 230
Companies: yelp
Reader Comments
Re:
If that court order leads to liability based on the actions of a user, then, yeah, actually, it was designed to do exactly that.
"Section 230's over and done with."

Many other courts disagree with you.

"Now the question is whether the court has the authority to order defamatory material removed, which there's no argument it does."

There's PLENTY of argument on that front. Not all courts agree, but many have found that an injunction is an inappropriate remedy for defamation due to 1A issues.

"If you look back at case law, there's a long string of decisions saying that yes the courts can order a non-party to remove (to the best of their ability) material that's been found to be defamatory by the court."

There's a long string of cases finding the exact opposite.
Re: Re:
There's certainly a long string of cases saying you can't sue the site to get the material taken down, but that's a different question from having the site take the material down after the poster's been sued and the plaintiff won a ruling in their favor.
Re: Re: Re:
However, the judge in this case can't find Yelp accountable or make it part of any punishment. I assume Yelp allows posters to delete their own reviews. So even accepting that the case was decided by default, the judge can only order the reviewer to remove the post and hold him/her in contempt if they refuse to do it. Otherwise, why didn't the judge just order all ISPs to filter that one post for everyone visiting the site too? I mean, if Yelp won't do it and by his own twisted logic he can force them to, then why can't he force other companies that deal with traffic to and from that listing to comply?

Because they are not responsible for the poster's article; see CDA 230 for more info.
Re: Re: Re: Re: Re:
But they can and will be punished if they leave it up, so the idea that there's no liability involved is ridiculous. If there really is no liability involved, then they could leave the comment up and not a thing would happen to them, but given that is not the case, they are very much being held liable for a comment posted by another, thanks to a court case that they weren't involved in until the very end.
Re: Re:
A) it was due to a default judgment
or
B) the user should be compelled to take down the post, not Yelp
or
C) all of the above?
I don't see the problem with A - a default judgment is just as valid as any other judgment as far as I know; it's not the plaintiff's fault if the defendant didn't show up. I agree with B, which leads to an interesting question - what if a platform doesn't provide a mechanism to delete posts? Techdirt, for example. I can't be compelled to delete a defamatory comment on Techdirt because I simply have no way to do it. Could Techdirt be compelled to remove it in that case? Or should I just be jailed for contempt of court until I can convince you to do it for me?
Or is it totally inappropriate to compel anyone to remove the post, and the sanction should be monetary damages instead? What about ongoing damages due to the defamatory speech remaining there, is that a valid legal theory? Could I be fined $X per day in perpetuity?
Absolutely, TKnarr
But it is unremarkable that if a court finds something defamatory, the courts can request that a publisher (in this case, Yelp) cease publishing it.
We aren't talking a preliminary injunction here (that would be pretty bad), we are talking about a final ruling in a court case finding content to be defamatory. I don't see 1st-amendment implications here at all. Defamatory content is not protected speech.
I agree that Yelp could argue that it needs to receive the request from the user instead of the court, but that is a distinction with very little difference.
Let's apply this to a physical case: a billboard publisher accepts a billboard saying, "SirWired Stole $1M from starving orphans and spent it on hookers and blow." Now, I'm certainly going to sue whoever wrote such a thing. When I win, I'm absolutely going to get an order to have the billboard taken down! The information isn't merely painful (like a news story reporting on the existence of the billboard); it's just a flat-out lie. There are no 1st-amendment implications here. It would be unremarkable for a judge to order the billboard company to take it down, even if there were months left on the display contract.
Person A and person B have some type of argument in a store parking lot. The store isn't sued, but was just the location of the argument.
Can a judge then tell the store it has to spend millions rebuilding its parking lot to prevent arguments in it?
Re:
The only way your scenario could realistically come about is if the judge found that the local codes required or encouraged a parking lot design that directly contributed to arguments in a way that wasn't lawfully allowed, and ordered the locality to change its codes to remove the unlawful aspect. The store would then have to rebuild the parking lot to comply with the new local codes as a result, but that wouldn't, as far as the law is concerned, be directly connected to that case.
Its cali...
Just drive down to your nearest liberal college and get a few signatures to amend the 1st so that no one can say anything nasty! whooooooooooo!
Yeah, ain't it great?
Seriously, it "protested"?
You had the perfect opportunity to write that Yelp yelped, and didn't take it?
"The removal order does not violate section 230 because it does not impose any liability on Yelp."
Since "The removal order does not violate section 230 because it does not impose any liability on Yelp," Yelp respectfully rejects the order and encourages the court to follow through on it's promise to "not impose any liability on Yelp".
Re: Re: "The removal order does not violate section 230 because it does not impose any liability on Yelp."
How is it a liability for Yelp to have to remove something that was legally deemed defamatory? If they can't withstand something like that, then they need to find a new business model.
Re: Re: Re: "The removal order does not violate section 230 because it does not impose any liability on Yelp."
Triggered by a comment created by the original poster, which they are now being held accountable for in the form of pending punishment if they don't do something about it. You know, the very thing that 230 was designed to address, holding sites accountable for the content posted there by those that use the site, rather than holding the original poster responsible.
"How is it a liability for Yelp to have to remove something that was legally deemed defamatory?"

Via a default judgement, which is rather like winning a race because no-one else showed up. Anyone can win a case, no matter how weak it is, if the other side is a no-show for whatever reason, so you'll excuse me if I'm not too impressed by the fact that they won here, and I don't put much weight behind the idea that just because they won, it means the comments are actually defamatory.

As for how it's a liability: they're threatened with punishment for something posted by a user on their service, making them liable for what someone else has done. As noted in the article, if the ruling stands then it becomes trivial to bypass the protections afforded sites under 230 simply by suing the original poster, not naming the service or involving them in the lawsuit, and then holding the site responsible for the content only at the very end. Something that wasn't allowed under the law before (suing the site to get them to remove content posted by another) is now not only allowed, it's laughably easy, since the site only gets to object once everything is done with on the legal end and its objections are easy to dismiss as a result.
Re: Re: Re: Re: Re: "The removal order does not violate section 230 because it does not impose any liability on Yelp."
If someone posts content that is found to be defamatory by a court, then the court can order them to remove it, but the entire point of 230 is to keep sites/services from being punished or otherwise held liable for the actions of their users.

If you're trying to hold a site liable for comments or content posted by its users, then yeah, preventing that is kind of the entire point of the law. If all it takes to bypass the law is simply not naming the site/service in question until after the trial is over with, then you essentially gut 230, as what wasn't allowed before (suing a site to get content posted by another removed, holding it liable for the actions of its users) is now perfectly fine.
Re: Re: Re: Re: Re: "The removal order does not violate section 230 because it does not impose any liability on Yelp."
Can't find or contact them? Too bad, guess you'd better go back to court to charge them with contempt.
Starting from the finish and working backwards
They'd decided ahead of time that yes, Yelp absolutely could be held liable for content posted on their service, and the rest was just an attempt to explain why the law that protects Yelp doesn't, including such gems as 'We're not forcing any liability on you, but you'll be punished if you don't remove the content.'
Speaking of Defamatory Content
And, as Mike said:
"...suddenly issuing a bunch of really bad rulings concerning Section 230 of the CDA..."
Incompetent lawyer, Dawn Hassell, and some willfully ignorant justice (does this asshat have a name?) for the First Appellate district of California will deserve all of the public mocking and derision they elicit.
I can only hope to hear (with glee!) some wails of butt-hurt over Goldman's and Masnick's remarks.
Re: What do you expect...
It's so funny how these a-holes in SillyCon Valley think they should have their own special set of lenient laws.
Rulings, judgements, and crying faces
FTFY
Let's see. A default judgement is still an enforceable judgement. Your line of logic here seems to be that a site that MIGHT be protected by section 230 can somehow trump a court's judgement. Yelp isn't somehow special and able to ignore the courts.
Think of this as the revenge against anonymous posters. If you want to post anonymously, then you basically lose your right to appear in court, which leads to default judgements, which leads to this. When you push your free speech rights to their very limits, you pull the legal blanket off of other things. If you want to be entirely anonymous, the legal system has only one way to deal with you: default judgements.
This is also a clear result of section 230. Going after the site directly (or trying to get user information from the site) is a losing battle. The law has created the ultimate legal safe harbor that permits all sorts of slanderous, unsubstantiated claims to go unchecked, and these "service providers" (who profit from those posts) can smirk and give everyone the legal middle finger. So the only way to address it is to do what has been done in this case: get a judgement, and then apply the judgement.
(Oh, and this post was made right after "Anonymous Coward, Jun 9th, 2016 @ 1:50pm, Speaking of Defamatory Content" but will likely be censored until June 11th or so. Techdirt engages in their own form of prior restraint.)
Step 1: Sue deceased person you "found" to be responsible for comment you don't like.
Step 2: Deceased person doesn't show.
Step 3: Default judgement awarded.
Step 4: Force site to remove comment.
Cue Streisand Effect
Reading this hurt my brain.