Anti-Vaxxer Sues Facebook, In The Middle Of A Pandemic, For 'In Excess' Of $5 Billion For Shutting Down His Account
from the that's-not-how-any-of-this-works dept
When you read about this new lawsuit, filed on behalf of "retired MMA fighter" Nick Catone against Facebook for removing his account over his anti-vaccine posts, you might expect that it was filed pro se. Somewhat shockingly, however, there's an actual lawyer, James Mermigis, who filed this dumpster fire of a complaint. Mermigis does not appear to have any experience in internet law, and boy does it show. His various profiles online list his experience in divorce law, real estate law, and personal injury law. His own Twitter feed is basically all wacky anti-vax nonsense, and, late last year, he was quoted as representing people trying to block a NY law removing a religious exemption for vaccines. We've gone over this many times before, but spewing junk science and angry rants that literally put tons of people in danger is no way to go through life, and it's certainly no way to file a lawsuit. Especially not in the midst of a pandemic where a vaccine sure would be nice.
But, alas.
The filing is bad, and Catone and Mermigis should feel bad about it. It will be quickly dismissed under CDA 230, even though (hilariously) it claims that Facebook's moderation of Catone's account "violates" CDA 230 (which is not a thing, as you cannot "violate" CDA 230). This lawsuit is like a collection of misunderstood tropes about internet law. It starts with this:
As the United States Supreme Court noted in Packingham v. North Carolina.... Facebook is part of the "vast democratic forum of the Internet." Packingham extended the concept of a quintessential public forum from parks and physical spaces to cyberspace.
Packingham is kind of the go-to citation for bad lawyers trying to argue that having your content moderated on Facebook violates the 1st Amendment. It's been tried many, many times, and it has always failed, because Packingham does not say what these people want it to say. Packingham said that the state cannot pass a law that kicks people off of the internet. It says nothing about private social media companies removing idiots spewing misinformation from their own sites.
Indeed, an even more recent Supreme Court ruling, in Manhattan Community Access v. Halleck, not only shoots down the idea that content moderation on private social media websites is subject to the 1st Amendment, it spells it out in big flashing letters that it's a bad idea to even try to make that argument because private companies are not the state. Packingham only applies to the state.
But that's not going to stop Mermigis. He goes on for a while about how big Facebook is, then rewrites history to suggest Facebook really only started doing content moderation after people were upset about... Cambridge Analytica and the 2016 election? Of course, the Cambridge Analytica issue wasn't a content moderation issue so much as a privacy and data-sharing issue, but hey, someone's trying to make a case out of very, very little. I'll just include this paragraph and point out that Facebook's community standards and its content policy team date back many, many years before 2016:
To assuage an angry public and ultimately to protect its own financial interests, Facebook announced plans to create and enforce so-called "community standards" for content published on its site. These standards are directed toward speech that Facebook regards as inimical to a "safe environment."
So, again, that's not when or why Facebook put in place community standards. Also, the final sentence of this paragraph basically admits that Facebook's moderation efforts are in good faith, which makes this an even easier CDA 230 dismissal than most.
Even more hilarious, the complaint whines that Facebook's community standards are too vague. But, uh, yeah. That's the point. When you have multiple billions of people posting content on your site, the rules need to be vague, because every day there are millions of "edge" cases that require someone to look at the content and decide whether it's appropriate. That's why CDA 230 lets sites decide for themselves how to moderate. The complaint is literally making the case for why it should be thrown out on 230 grounds.
Among the content that Facebook finds "objectionable" is bullying and harassment. Facebook does not provide a definition for what bullying or harassment is. However it does provide a broad definition that may cover almost anything: "Bullying and harassment happen in many places and come in many different forms, from making threats to releasing personally identifiable information, to sending threatening messages, and making unwanted malicious contact."
[....] The standards is [sic] hopelessly vague. As Facebook itself notes "[c]ontext and intent matter, and we allow people to share and reshare posts if its clear that something was shared in order to condemn or draw attention to bullying and harassment."
Facebook reserves the right to remove the "offensive" posts without notifying the user or giving the user an opportunity to clarify or edit his post. Moreover, Facebook reserves the right either temporarily or permanently to disable an account for violation of its "community standards" policy.
Uh, yeah. It reserves that right. And it has every right to, and if you don't like it, don't use Facebook. But not only did Catone use Facebook, it appears that he tried to build a local gym business based entirely on Facebook. There is a bit of a tragic backstory here, in that Catone lost an infant son, seems to believe that vaccines had something to do with it, and thus sometimes posts typical anti-vax content. That's what appears to have led to the suspension of his account -- especially since Facebook has ramped up its removals of anti-vax nonsense in the last few months.
The problem here is that Catone (1) seemed to rely solely on Facebook to build up business for his new gym, and (2) mixed that account with his anti-vax screeds. So now he's blaming his ban from the site (which was reasonable, and well within Facebook's rights) for his trouble drumming up business for the gym.
Plaintiff, Nick Catone MMA & Fitness, has used Facebook as the main way to grow and advertise the fitness center. In 2019, Plaintiff spent $15,564.17 in advertising. Plaintiff is currently spending $1800-2000 per month advertising with Facebook.
Plaintiff purchased a 32,000 square foot building for his fitness center in 2018 and Facebook has been a huge part of his financial growth. Plaintiff needs Facebook to showcase his fitness center.
Uh, yeah, that's not how any of this works. I need Facebook to give me a pony, but the pony just ain't showing up. Unless Catone signed some sort of contract with Facebook in which Facebook promised to "showcase" his fitness center, he has no rights to speak of here. Catone, it appears, made the poor business decision to focus exclusively on Facebook to build his business, with no alternative means of running it, according to the lawsuit:
As a direct and proximate result of the acts and omissions of the Defendants, Plaintiff can no longer operate his business. Plaintiff cannot check messages, reply to posts or access his business page. The censorship threatens his livelihood as he invests $30,000 per month to run his business and has no access to run his business as he runs it through Facebook....
If Facebook does not immediately reinstate Plaintiff's account and access to this account, Plaintiff stands to lose an unconscionable amount of money and may lose his business that he has invested millions of dollars in.
Nick, I think I see the root of your problem, and it ain't Facebook. If you set up your entire business there, didn't set up your own website or email or alternative way to get in touch with you... that seems to be indicative of your own bad business decisions. And you don't get to sue others over those. That's not how any of this works.
Also fun is the way in which Catone's posts that got his account in trouble are described:
Like many of his fellow citizens, the Plaintiff, Nick Catone is a thinker who, regardless of whether he is right or wrong, loves to share his thoughts and hear the thoughts of others. He regularly posts on Facebook about his deceased infant son and the vaccines that contributed to the death of his son, seeking to engage in debate the community of friends whose respect he has gained.
The Plaintiff, Nick Catone, used Facebook for open discussion regarding the safety and effectiveness of vaccines. Plaintiff felt that should be and [sic] open discussion to debating the merits of this serious public question.
Nick may think that, but that doesn't mean Facebook needs to host it.
Also, Nick, Mark Zuckerberg didn't personally decide to censor you to "deflect attention" from Facebook scandals, even if your lawsuit claims that's what's going on:
Upon information and belief, Mr. Zuckerberg harbors political ambitions beyond his role as principal of Facebook. His decision to categorically censor the speech of concerned citizens including that of Nick Catone is intentional and is inspired by ill-will, malice, and a desire to deflect attention from himself and Facebook's practice of surreptitiously mining data for profit from consumers who believe they are receiving a free service devoted primarily to their welfare.
I'm sitting here trying to figure out the galaxy-brain explanation for how "censoring" "thinkers" as part of an apparently malicious campaign deflects attention from totally unrelated Facebook scandals or somehow helps Zuckerberg's apparent political ambitions. I guess I'm not a "thinker," because I just don't see it.
Anyway, claims. We've got 'em. They're not good, but they exist. According to the lawsuit, it violates Section 230 of the CDA if you moderate:
The Communications Decency Act provides immunity from civil liability for materials published on interactive computer service sites. The provision of immunity was intended to avoid "content-based" chilling of freedom of speech in the "new and burgeoning Internet medium." Section 230 was enacted, in part, to preserve the robust nature of speech on the Internet. These principles were clear [sic] articulated in Zeran v. America Online....
Yeah, Mermigis, you gotta keep reading beyond that, because the way in which CDA 230 protects free speech online is by not allowing you to sue websites over their moderation choices, since such dumb lawsuits would chill the ability to host any content online. I mean, dammit, you're a lawyer; at least read part (c)(2) of CDA 230, where it outright explains that you can't sue an internet company over its moderation choices -- which you admitted earlier were clearly in good faith.
The next bit is just nonsense. I know there are other lawsuits out there (mostly those stupidly claiming "bias" in takedowns) but they all fail and this one will too, because this is not the law. It's the opposite of what the law says and no court has ever come close to this interpretation in dozens upon dozens of cases involving CDA 230.
Facebook enjoys immunity from suit under Section 230 of the CDA as a Congressionally mandated means of ensuring free and robust speech on the Internet. The privileged status necessary [sic] entails a corresponding responsibility to achieve the very goal for which Congress granted immunity: to wit, the preservation of free speech on a quintessential public forum.
No. That's not what the law says, not what it intended, and not what it means, and no court has ever interpreted it that way, because the law actually explicitly states the reverse -- that, in order to support family-friendly spaces on the internet, platforms face no liability for the moderation choices they make, including booting off people spewing pseudo-science hogwash that puts people in harm's way.
Facebook's enjoyment of immunity from civil liability for the material it transmits on the Internet transforms its editorial decision-making process into management of a constructive public trust.
The manner and means by which the defendants have banned the Plaintiff from engaging in free speech on Facebook are a violation of the CDA and constitute a willful and wanton violation of the terms of the constructive public trust.
That's gibberish. It is not what the law says. And, again, dude, CDA 230 is an immunity provision. You can't "violate" it.
From there, we get into more gibberish: claiming that Facebook moderation violates the 1st Amendment. Again, this argument has been rejected numerous times, and many of those times the argument was made more competently than it was made here (and it's never been made competently, since it's legally nonsense). Facebook is a private company. It's not the government. Its actions around moderation literally cannot violate the 1st Amendment. To try to get around this, the complaint actually tries to argue that the CDA turns Facebook into a state actor. I only wish I were kidding.
The CDA's grant of immunity is integral to the government's purpose of promoting freedom of speech on the Internet. As such, the symbiosis between Facebook and the United States government transforms Facebook's action into state action under the doctrine enunciated in Burton v. Wilmington Parking Authority....
Uh, no. The Burton case is not even remotely analogous (it involved a government-owned parking garage with retail space that the government leased out, including to a coffee shop, to help generate revenue to pay for the garage). That, uh, is nothing like a private company moderating its own space. And, honestly, we've got the Manhattan Community Access case from literally last year that seems a hell of a lot more on point. Let's quote from that Supreme Court ruling from last summer:
when a private entity provides a forum for speech, the private entity is not ordinarily constrained by the First Amendment because the private entity is not a state actor. The private entity may thus exercise editorial discretion over the speech and speakers in the forum. This Court so ruled in its 1976 decision in Hudgens v. NLRB. There, the Court held that a shopping center owner is not a state actor subject to First Amendment requirements such as the public forum doctrine....
The Hudgens decision reflects a commonsense principle: Providing some kind of forum for speech is not an activity that only governmental entities have traditionally performed. Therefore, a private entity who provides a forum for speech is not transformed by that fact alone into a state actor. After all, private property owners and private lessees often open their property for speech. Grocery stores put up community bulletin boards. Comedy clubs host open mic nights. As Judge Jacobs persuasively explained, it “is not at all a near-exclusive function of the state to provide the forums for public expression, politics, information, or entertainment.”
To argue, against all of that, that the CDA by itself automatically turns any internet forum into a state actor is laughable beyond belief. It's not an argument a lawyer should be making.
From there we get some silly add-on claims about "fraud," "implied warranty," "intentional and malicious interference," etc. These are the kind of extra claims one tosses in to try to add heft to an already weak complaint. They are not well argued and are barely comprehensible.
And how could I leave out the damages request for "in excess" of $5 billion? Even that is argued in a weird, roundabout way. Rather than just asking for $5 billion like your average complaint, this one spends a bunch of paragraphs talking about totally unrelated things:
In April 2019, Facebook set aside a sum of $5 billion to use to pay an anticipated fine by the Federal Trade Commission involving systematic breaches of consumer privacy. Even so, the Defendants forecast significant profits.
Apparently Mermigis' research failed to turn up the fact that, after Facebook set aside that sum in April, the FTC went ahead and issued the fine in July, and Facebook paid up. Crack research there, buddy. Either way, what does that have to do with anything? Apparently, it now sets the floor for any damages for one Mr. Nick Catone:
Punitive damages in a sum sufficient to punish and deter Facebook for violating the First Amendment, the Communications Decency Act, for engaging in fraud, unfair or deceptive trade practices, intentional and malicious interference with prospective economic advantage and breaching the implied warranty of fair dealing. Because a sum of $5 billion appears to be insufficient to deter Facebook, the plaintiffs ask the jury for a sum significantly in excess of that amount.
Good luck, champ. Oh, and for what it's worth, I see that the punitive damages statement includes a sum for "unfair or deceptive trade practices," but as far as I can tell, the complaint never actually pleads that claim -- which is a bold strategy. Anyway, this complaint should be a case study in how not to internet law. I assume the courts may be a bit slow to act, seeing as we're dealing with a pandemic, and may not have time for a guy demanding that private internet sites host his speech about evil vaccines, but this case will be dismissed in time. Perhaps by then we'll have a vaccine for COVID-19. That would be nice.
Filed Under: 1st amendment, anti-vax, cda 230, content moderation, james mermigis, nick catone, section 230, vaccines
Companies: facebook