Why SESTA Is Such A Bad Bill
from the so-much-damage dept
We've been talking quite a bit about SESTA -- the Stop Enabling Sex Traffickers Act -- and why it's so problematic, but with hearings today, I wanted to dig in a bit more closely with the actual text of the bill. There are a large number of problems with it, so let's discuss them one by one.
Undermines the incentives to moderate content and to work with law enforcement:
This remains the biggest issue for me: the fact that the bill is clearly counterproductive to its own stated goals. When people talk about CDA 230, they often (mistakenly) only talk about CDA 230(c)(1) -- which is the part that says sites are immune from liability. This leads many people to (again, mistakenly) claim that the only thing CDA 230 is good for is absolving platforms from doing any moderation at all. But this ignores the equally important part of the same section: CDA 230(c)(2), which explicitly encourages platforms to moderate "objectionable" content, by noting that good-faith efforts to moderate and police that content have no impact on the protection from liability in part (1).
In other words: as currently written, CDA 230 says that you're encouraged to moderate your platform and take down bad content, because there's no increase in legal liability if you do so. Indeed, it's difficult to find a single internet platform that does zero moderation. Most platforms do quite a bit of moderation, because otherwise they would be overrun by spam. And, if they want people to actually use their services, nearly every site (even those like 4chan) tends to do significant moderation out of public pressure to keep certain content off. Yet, under SESTA you now face liability if you are shown to have any "knowledge" of violations of federal sex trafficking laws. But what do they mean by "knowledge"? It's not at all clear, as the bill just says "knowledge." Thus, if a site, for example, discovers someone using its platform for trafficking and alerts authorities, that's evidence of "knowledge" and can be used against it both in criminal charges and in civil lawsuits.
In other words, somewhat incredibly, the incentive here is for platforms to stop looking for any illegal activity on their sites, out of fear of creating knowledge which would make them liable. How does that help? Indeed, platforms will be incentivized not to do any moderation at all, and that will create a mess on many sites.
The vague "knowledge" standard will be abused:
This is sort of a corollary to the first point. The problematic language in the bill is this:
The term ‘participation in a venture’ means knowing conduct by an individual or entity, by any means, that assists, supports, or facilitates a violation...
But what do they mean by "knowing conduct"? Who the hell knows. We already know that this is going to get litigated in court, probably for decades. We have similar problems with the DMCA's safe harbors, where legal battles have gone on for many years over whether the standard is "general knowledge" vs. "specific knowledge" and what is meant by "red flag knowledge." And in SESTA the language is even less clear. When people have attempted to pin down SESTA's sponsors on what the standard for knowledge is, they've received wildly varying answers, which just means there is no standard, and we'll be litigating for probably decades before it's established what is meant by "knowledge." For companies, again, the best way to deal with this is to not bother doing any moderation of their platforms whatsoever, so they can avoid any claim of knowledge. That doesn't help at all.
The even vaguer "facilitation" language will be massively abused:
In that same definition of "participation in a venture," what may be even more problematic than the vague "knowledge" standard is the vaguer claim that an entity that, "by any means," "assists, supports, or facilitates a violation" of sex trafficking laws meets the standard of "participation in a venture." All three of those terms have potential problems. Assisting sounds like it requires proactive action -- but how do you define it here? Is correcting typos "assisting"? Is having an automated system suggest keywords "assisting"? Is autocompleting a search "assisting"? Because lots of sites do things like that, and it doesn't give them any actual knowledge of legal violations. How about "supporting"? Again, perfectly benign activities can be seen as "supporting" criminal behavior without the platform being aware of it. Maybe certain features are used in a way that can be seen as supporting. We've pointed out that Airbnb could be a target under SESTA if someone uses an Airbnb rental for sex trafficking. Would the fact that Airbnb handles payments and reviews be seen as "supporting"?
But the broadest of all is the term "facilitating." That covers basically anything. It's flat out saying "blame the tool for how it's used." Almost any online service can be used to "facilitate" sex trafficking in the hands of sex traffickers. I already discussed Airbnb above, but what if someone uses Dropbox to host sex trafficking flyers? Or what if a sex trafficker creates advertisements in Google Docs? Or what if a pimp creates a blog on WordPress? What if they use Skype for phone calls? What if they use Stripe or Square for payments? All of those things can be facilitation under this law, and the companies would have no actual knowledge of what's going on, but would face not only criminal liability but also civil suits from victims, who could target them rather than the actual traffickers.
This is the core problem: this bill targets the tools rather than the law breakers.
Punching a hole in CDA 230 will be abused:
This is one that seems to confuse people who don't spend much time looking at intermediary liability protections -- how they work and how they're abused. It's completely normal for people in that situation not to recognize how widely intermediary liability is used to stifle perfectly legitimate speech and activity. However, we know damn well from looking at the DMCA, in particular, that when you set up a process by which a platform might face liability, it gets regularly abused by people angry about content online to demand censorship. Indeed, we've seen people regularly admit that if they see content they dislike, even if there's no legitimate copyright claim, they'll "DMCA it" to get it taken down.
Here, the potential problems are much, much worse. Within the DMCA context, you at least have relatively limited damages compared to SESTA -- the monetary damages in the DMCA can add up quickly, but they're only monetary and they're capped at $150,000 per work infringed. With SESTA, criminal penalties are much more stringent (obviously), which will create massive incentives for platforms to cave immediately, rather than face the risk of criminal prosecution. Similarly, the civil penalties have no upper bound under the law -- meaning the potential monetary penalty may be significantly higher.
The chilling effects of criminal charges:
Combine all of this and you create massive chilling effects for any online platform -- big or small. I already explained earlier why the new incentives will not be to help law enforcement or to moderate content at all, for fear of creating "knowledge," but it's even worse than that. For many would-be platforms, the massive potential liability under SESTA will mean they never get built at all. A comment feature on a website would become a huge liability. Any service that might conceivably be used by anyone to "facilitate" sex trafficking creates the potential for serious criminal and civil liability, which should be of great concern. For platforms that already exist, some may shutter, and others may greatly curtail what they allow.
State Attorneys General have a terrible track record on these issues:
In response to the previous point, some may point out (correctly!) that the existing federal law already exempts federal criminal law from CDA 230's protections -- meaning that the DOJ can already go after platforms if it finds that they're actively participating in sex trafficking. But, for as much as we rag on the DOJ, it tends not to be in the business of going after platforms just for the headlines. State AGs, on the other hand, have a fairly long history of doing exactly that -- including directly at the behest of companies looking to strangle competitors.
Back in 2010 we wrote about a fairly stunning and eye-opening account by Topix CEO Chris Tolles about what happened when a group of state Attorneys General decided that Topix was behaving badly. Despite the fact that they had no legal basis for doing so, they completely ran Topix through the wringer, because it got them good headlines. Here's just a snippet:
The call with these guys was actually pretty cordial. We walked them through how we ran feedback at Topix, that how in January 2010, we posted 3.6M comments, had our Artificial Intelligence systems remove 390k worth before they were ever even put up, and how we had over 28k feedback emails and 210k user flags, resulting in over 45k posts being removed from the system. When we went through the various issues with them, we ended up coming to what I thought was a set of offers to resolve the issues at hand. The folks on the phone indicated that these were good steps, and that they would circle back with their respective Attorneys’ General and get back to us.
No good deed goes unpunished
So, after opening the kimono and giving these guys a whole lot of info on how we ran things, how big we were and that we dedicated 20% of our staff on these issues, what was the response? (You could probably see this one coming.)
That’s right. Another press release. This time from 23 states’ Attorneys General.
This pile-on took much of what we had told them, and turned it against us. We had mentioned that we required three separate people to flag something before we would take action (mainly to prevent individuals from easily spiking things that they didn’t like). That was called out as a particular sin to be cleansed from our site. They also asked us to drop the priority review program in its entirety, drop the time it takes us to review posts from 7 days to 3 and “immediately revamp our AI technology to block more violative posts” amongst other things.
And, remember, this was done when the AGs had no legal leverage against Topix. Imagine what they would do if they could hold the threat of criminal and civil penalties over the company.
Similarly, remember how leaked Sony emails revealed that the MPAA deliberately set up Mississippi Attorney General Jim Hood with a plan to attack Google (with the letter Hood sent actually being written by the MPAA's outside lawyers). If you don't recall, Hood claimed that, because he was able to find illegal stuff via Google, he could go on a total fishing expedition into how it handled much of its business.
In the Sony leak, it was revealed that the MPAA viewed a NY Times article about the value of lobbying state AGs as a sort of playbook for cultivating "anti-Google" Attorneys General, whom it could then use to target and take down companies the MPAA didn't like (remember, this was what the MPAA referred to, unsubtly, as "Project Goliath").
Do we really want to empower that same group of AGs with the ability to drag down lots of other platforms with crazy fishing expeditions, just because some angry Hollywood (or other) companies say so?
Opening up civil lawsuits will be abused over and over again:
One of the big problems with SESTA is that it will open up internet companies to getting sued a lot. We already see a bunch of cases every year in which people who are upset about certain content online target lawsuits at the hosting sites out of anger. The lawsuits tend to get thrown out, thanks to CDA 230, but lawyers keep trying creative ways to get around CDA 230, adding in all sorts of frivolous claims. So, for example, after the decision in the Roommates case -- in which Roommates.com got dinged for activity not protected by CDA 230 (specifically, its own actions that violated fair housing laws) -- lots of people now cite the Roommates case as an example of why their own argument isn't killed off by CDA 230.
In other words, if you give private litigants a small loophole to get around CDA 230, they will try to jump in and expand it to cover everything. So if SESTA becomes law, you can expect lots of lawsuits in which people go to great lengths to argue that just about any claim is not protected by 230, because of supposed sex trafficking occurring via the site.
Small companies will be hurt most of all:
There's this weird talking point making the rounds that the only one really resisting SESTA is Google. We've discussed a few times why this is wrong, but let's face it: of all the companies out there, Google is probably best positioned (along with Facebook) to weather any of this. Both Google and Facebook are used to doing massive moderation on their platforms. Both companies have built very expensive tools for moderating and filtering content, and both have built strong relationships with politicians and law enforcement. That's not true for just about everyone else. That means SESTA would do the most damage to smaller companies and startups, which simply cannot invest the resources to deal with constant monitoring and/or threats arising from how people use their platforms.
Given all of these reasons, it's immensely troubling that SESTA supporters keep running around insisting that the bill is narrowly tailored and won't really impact many sites at all. It suggests either willful blindness to the way the internet actually works (and how people abuse these systems for censorship) or a fairly scary level of ignorance, with little interest in getting educated.
Filed Under: cda 230, congress, grandstanding, intermediary liability, knowledge, moderation, sesta
Reader Comments
Good luck when 'coded messages' enter the mix
Scenario 1:
Let's say that this creep was instead pimping out his kid, got caught, and then told the police what he was doing on your site to meet customers.
At that moment, the police come knocking...
Scenario 2:
Let's say that some rival business/crazy person on the internet (remember the sex ring Clinton had in a pizza place?) decided it didn't like your site and leaked a 'tip' to cops about a potential pimp selling his wares on your site... At that moment, the police come knocking...
In either scenario, you would not have had knowledge. NOW you have knowledge. You have two options: shutter your website or figure out who the hell is selling TVs and who is a pimp... All I have to say is good luck... Just a shame you have to send your passion for used-electronics swapping down the river because of sex crimes you had nothing to do with.
Why do you so distrust the US Senate/Congress to make rational decisions on SESTA?
Why do you see things so differently than many highly experienced legislators?
Are not CDA issues a critical responsibility of the Federal Government?
Re:
"Why do you see things so differently than many highly experienced legislators?"
Good joke! Have my funny vote.
Re: Re:
Sarcasm rarely works in simple text formats like blog comments, but it seems fun to attempt occasionally.
(P.S. -- has Masnick ever clearly stated his overall political viewpoint anywhere?
Where do you think he sits on the political spectrum?)
Re:
Because of their history
"Why do you see things so differently than many highly experienced legislators"
Because they are legislators, highly experienced in various nefarious endeavors.
"Are not CDA issues a critical responsibility of the Federal Government?"
Is CDA exclusive to federal court proceedings?
Civil remedies
From Sec. 3(a)(2)(B) of the bill, amending CDA Section 230(e)
(Emphasis.)
Here's a convenient link for—
18 USC § 1595 - Civil remedy
It would be awesome to see coordinated responses from tech companies laying off all their American workforces and moving abroad.
Heartless scumbag.
Re:
Stupid troll.
Re:
And the abusers should be prosecuted to the fullest extent of the law. No one has said otherwise.
But that's not what this bill is about. If you look beyond the grandstanding of "save the children", this bill is about blaming the wrong parties.
It's like trying to blame Ford for making windowless white vans or Nestle for making candy because they sometimes are used by pedophiles looking for victims.
Re: Re:
Actually, under this law you could.
If Ford knowingly builds a van that is then used by a pedo, it is liable. The definition does not say "with intent to"; it is the act of building the van that assists the pedo in abducting a kid that triggers the liability.
Any narrowing of these vague terms after the bill is signed into law would require multiple court cases over the course of several years. That will cost the government, companies, and average Joes millions to fix.
Pretty clear what the REAL goal is
And there it is - the real goal: to shut down the disruptive services enabled by the internet. The old gatekeepers are behind this bill in a clear last-ditch effort to bring down the new industries putting them out of business.
Fire in a theatre
"You can't yell fire in a theatre."
—California Attorney-General Xavier Becerra, in the S.1693 hearing, just a few minutes ago.
Re: Fire in a theatre
People that "think" they know shit, but don't.
Or... TD in general. We have the same problem that Congress does: political bias that keeps us all stuck on stupid, because survival is greater if you are an attached, lying, corrupt bastard, while death is far more imminent if you are an unattached honest person.
DOOM! It's teh end of teh internets, I tells ya!
My, you're hot on this.
"the fact that the bill is clearly counterproductive to its own stated goals." -- 1) It's not a "fact" because you say so, simply feeble propaganda there. 2) It's false notion that removing privileges from liability for "internet corporations" and letting them be like all others is bad. Those privileges would not exist except that corporations simply BOUGHT "law" tailored for their gain.
"the only thing CDA 230 is good for is absolving platforms from doing any moderation at all." -- Yup. You know it, that's why you deny early. Corporations have made tons of money without responsibility. I hope that era is over.
Re: DOOM! It's teh end of teh internets, I tells ya!
No, your notion is the one that is false.
CDA 230 puts internet companies on equal footing with "all others" by not making the tool provider responsible for how the tool gets used.
We don't make Ford responsible when someone speeds. We don't make McDonald's responsible because people get fat. We don't make Smith & Wesson responsible when someone gets shot. We don't make Exxon responsible when an arsonist uses gasoline to start a fire. Etc, etc, ad nauseam.
You are the one who thinks that because it's an "internet corporation" it should be treated differently than "all others".
Re: Re: DOOM! It's teh end of teh internets, I tells ya!
Actually... we did try to make McDonald's responsible for people getting fat. A couple of different times. Just saying. Stupid things do happen.
The rest is your usual tendentious and repetitive.
BTW: the length limit (at least from TOR) has been lowered, so now I must shorten and make more comments!
typo
(As in: weather the storm)
Re: Re: typo
Wayback Machine's Sep 19, 2017 15:25:25 UTC capture has “whether”.
Latest Google cache (currently Sep 20, 2017 00:56:48 “GMT”) has “weather”.