Lindsey Graham's Sneak Attack On Section 230 And Encryption: A Backdoor To A Backdoor?
from the if-it-aint-broke dept
Both Republicans and Democrats have been talking about amending Section 230, the law that made today’s Internet possible. Most politicians are foggy on the details, complaining generally about “Big Tech” being biased against them (Republicans), “not doing enough” about harmful content (Democrats, usually), or just being too powerful (populists on both sides). Some have promised legislation to amend Section 230, while others hope to revoke it entirely. And more bills will doubtless follow.
Rather than get mired in the specifics about how tinkering with Section 230 could backfire, Sen. Lindsey Graham is circulating a draft bill called the “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019” — the “EARN IT Act of 2019,” leaked by Bloomberg yesterday. Democratic Sen. Richard Blumenthal has apparently been involved in drafting.
At first blush, the bill may seem uncontroversial: it would create a presidential commission of experts to “develop recommended best practices for providers of interactive computer services regarding the prevention of online child exploitation conduct.” Who could argue with that? Indeed, given how little lawmakers understand online content moderation, getting analysis and recommendations from real experts about Section 230 is probably the only way out of the increasingly intractable, empty debate over the law.
But what Graham’s bill would actually do is give the Attorney General a blank check to bypass Congress in cracking down on Internet services in ways that may have little to do with child sexual abuse material (CSAM). Specifically, the bill would:
- Amend Criminal Law & Section 230: Section 230 has never shielded operators of websites and Internet services from federal criminal prosecution for CSAM. But the Graham bill would create broad new legal risks by lowering the (actual) knowledge requirement from “knowingly” to “recklessly” (which would include an after-the-fact assessment of what the company “should have known”) and amending Section 230 to authorize both criminal prosecution and civil suits under state law. For the first time, operators could be sued by plaintiffs’ lawyers in class-action suits for “reckless” decisions in designing or operating their sites/services.
- Condition Section 230 Immunity: The commission’s (a) recommended “best practices” would quickly become (b) conditions for invoking Section 230 immunity against greatly expanded liability for CSAM — immunity so vital to the operation of many online services that (c) the conditions would be tantamount to legal mandates.
As drafted, Graham’s bill entails a shocking abandonment of the most basic principles of how administrative agencies make rules — based on the fiction that the “best practices” wouldn’t be effectively mandatory — by allowing the AG to bypass Congress on other controversial issues like mandatory age verification or even encryption. As I told Bloomberg: “The absolute worst-case scenario could easily become reality: DOJ could effectively ban end-to-end encryption.” Signal, Telegram and WhatsApp all could no longer exist in their current form. All would be required to build in backdoors for law enforcement because all could be accused of “recklessly” designing their products to make it impossible for the operators or law enforcement to stop CSAM sharing. The same could happen for age verification mechanisms. It’s the worst kind of indirect regulation. And because of the crazy way it’s done, it could be hard to challenge in court.
The rhetorical premise of the “EARN IT” Act — that Section 230 was a special favor that tech companies must continually “earn” — is false. Republicans have repeatedly made this claim in arguing that only “neutral” platforms “deserve” Section 230’s protections, and Democrats likewise argue that website operators should lose Section 230’s protections if they don’t “do more” to combat disinformation or other forms of problematic speech by users.
Congress has never conditioned Section 230 in the way Graham’s bill would do. Section 230, far from being a special favor or subsidy to tech companies, was crafted because, without its protections, website operators would have been discouraged from taking active measures to moderate user content — or from hosting user-generated content altogether — a problem often referred to as the “moderator’s dilemma.”
Here’s how Graham’s monstrous, Rube-Goldberg-esque legal contraption would work in practice. To understand which services will be affected and why they’d feel compelled to do whatever DOJ commands to retain their Section 230 immunity, we’ll unpack the changes to criminal law first.
Step #1: Expanding Legal Liability
Graham’s bill would amend existing law in a variety of ways, mostly paralleling SESTA-FOSTA: while the 2018 law expanded the federal prostitution law (18 U.S.C. §§ 1591, 2421A), the Graham bill focuses on “child exploitation” imagery (child porn). (Note: To help prosecutors prosecute sex trafficking, without the need for any amendment to Section 230, TechFreedom supported toughening 18 U.S.C. §§ 1591, 2421A to cover trafficking of minors when FOSTA was a stand-alone bill — but opposed marrying FOSTA with SESTA, the Senate bill, which unwisely amended Section 230.) Specifically, the Graham bill would:
- Create a new civil remedy under 18 U.S.C. § 2255 that extends to suits brought against an “interactive computer service” for reckless § 2252 violations;
- Amend Section 230(e) to exclude immunity for state criminal prosecution for crimes coextensive with § 2252; and
- Amend Section 230(e) to exclude immunity for civil causes of action against an “interactive computer service” pursuant to other state laws if the underlying claim constitutes a violation of § 2252 (or by operation of § 2255(a)(1)). Most notably, this would open the door to states to authorize class-action lawsuits brought by entrepreneurial trial lawyers — which may even be a greater threat than criminal prosecution since the burden of proof would be lower (even though, in principle, a civil plaintiff would have to establish that a violation of criminal law had occurred under § 2252).
The Graham bill goes further than SESTA-FOSTA in two key respects:
- It would lower the mens rea (knowledge) requirement from “knowingly” to “recklessly,” making it considerably easier to prosecute or sue operators; and
- It would allow state criminal prosecution and state civil suits for hosting child exploitation imagery that violates § 2252.
In a ploy to make their bill seem less draconian, SESTA-FOSTA’s sponsors loudly proclaimed that they preserved “core” parts of Section 230’s immunity. Graham will no doubt do the same thing. Both bills leave untouched Section 230(c)(2)(A)’s immunity for “good faith” content removal decisions. But this protection is essentially useless against prosecutions for either sex trafficking or CSAM. In either case, the relevant immunity would be Section 230(c)(1), which ensures that ICS operators are not held responsible as “publishers” for user content. The overwhelming majority of cases turn on that provision — and that is the provision that Graham’s bill conditions on compliance with the AG’s “best practices.”
Step #2: How a “Recommendation” Becomes a Condition to 230
The bill seems to provide an important procedural safeguard by requiring consensus — at least 10 of the 15 commissioners — for each recommended “best practice.” But the chairman (the FTC chairman or his proxy) could issue his own “alternative best practices” with no minimum level of support. The criteria for membership ensure that he’d be able to command at least a majority of the commission, with the FTC, DOJ and Department of Homeland Security each getting one seat, law enforcement getting two, prosecutors getting two more — that’s seven just for government actors — plus two more for those with “experience in providing victims services for victims of child exploitation” — which makes nine reliable votes for “getting tough.” The remaining six commissioners would include two technical experts (who could turn out to be just as hawkish) plus two commissioners with “experience in child safety” at a big company and two more from small companies. So the “alternative” recommendations would almost certainly command a majority anyway.
More importantly, it doesn’t really matter what the Commissioners recommend: the Attorney General (AG) could issue a radically different set of “best practices” — without public comment. He need only explain why he modified the Commission’s recommendations.
What the AG ultimately issues would not just be recommendations. No, Graham’s bill would empower the AG to enact requirements for enjoying Section 230’s protections against a range of new civil lawsuits and criminal prosecutions related to “child exploitation” or “child abuse” — two terms that the bill never defines.
Step #3: How Conditioning 230 Eligibility Amounts to a Mandate
Most websites and services, especially the smallest ones, but even the largest ones, simply couldn’t exist if their operators could be held civilly liable for what their users do and say — or if they could be prosecuted under an endless array of state laws. But it’s important to stress at the outset that Section 230 immunity isn’t anywhere near as “absolute” or “sweeping” as its critics claim. Despite the panic over online sex trafficking that finally led Congress, in 2018, to pass SESTA-FOSTA, Section 230 never hindered federal criminal prosecutions. In fact, the CEO of Backpage.com — the company at the center of the controversy over Section 230 — pled guilty to facilitating prostitution (and money laundering) the day after SESTA-FOSTA became law in April 2018. Prosecutors didn’t need a new law, as we stressed at the time.
Just as SESTA-FOSTA created considerable new legal liability for websites for sex trafficking, Graham’s bill does so for CSAM (discussed below) — which makes Section 230 an even more critical legal shield and, in turn, makes companies more willing to follow whatever requirements might be attached to that legal shield.
How Broad Could the Bill’s Effects Be?
Understanding the bill’s real-world effects depends on three separate questions:
- What counts as “child exploitation” and “child abuse?”
- Which companies would really need Section 230 protection against new, expanded liability for CSAM?
- What could be the scope of the AG’s conditions on Section 230 immunity? Must they be related to conduct covered by Section 230?
What Do We Mean by “Child Exploitation” and “Child Abuse?”
The bill’s title focuses on “child exploitation” but the bill also repeatedly talks about “child abuse” — without defining either term. The former comes from the title of 18 U.S.C. § 2252, which covers any “visual depiction [that] involves the use of a minor engaging in sexually explicit conduct” (CSAM). The bill directly invokes that bedrock law, so one might assume that’s what Graham had in mind. There is a federal child abuse law, but it’s never mentioned in the bill.
This lack of clarity becomes a significant problem because, as discussed below, the bill is so broadly drafted that the AG could mandate just about anything as a condition of Section 230 immunity.
Which Websites & Services Are We Talking About?
Today, every website and Internet service operator faces some legal risk for CSAM. At greatest risk are those services that allow users to communicate with each other in private messaging or groups, or to share images or videos, because this is how CSAM is most likely to be exchanged. Those who traffic in CSAM are known to be highly creative in finding unexpected places to interact online — just as terrorist groups may use chat rooms in video games to hold staff meetings.
It’s hard to anticipate all the services that might be affected by the Graham bill, but it’s safe to bet that any messaging, photo-sharing, video-hosting or file-sharing tool would consider the bill a real threat. At greatest risk would be services that cannot see what their users do because they offer end-to-end encryption. They risk being accused of making a “reckless” design decision if it turns out that their users share CSAM with each other.
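To make that blindness concrete, here is a minimal sketch of end-to-end encryption using the PyNaCl library — purely illustrative, with hypothetical names; real messengers such as Signal layer far more on top (e.g., the double ratchet):

```python
# Minimal sketch of E2EE messaging, assuming the PyNaCl library
# (pip install pynacl). Illustrative only; not any real service's code.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys
# never leave the device, so the operator never holds them.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"hello, bob")

# The operator's server relays -- and can only retain -- this opaque blob.
server_copy = bytes(ciphertext)  # no key, no plaintext, nothing to scan

# Only Bob, holding his private key, can decrypt.
assert Box(bob_key, alice_key.public_key).decrypt(ciphertext) == b"hello, bob"
```

The problem is structural: the operator holds only ciphertext, so the only ways to “see” user content are to escrow keys, scan on the user’s device, or abandon E2EE — precisely the design changes a “recklessness” theory would pressure operators into.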
What De Facto Requirements Are We Talking About?
Again, Graham’s bill claims a narrow scope: “The purpose of the Commission is to develop recommended best practices for providers of interactive computer services regarding the prevention of online child exploitation conduct.”
“Interactive computer service” (ICS) is the term Section 230 uses to refer to covered operators: a service, system or software that “provides or enables computer access by multiple users to a computer server.” You might think the Graham bill’s use of this term means the bill couldn’t be used to force Apple to change how it puts E2EE on iPhones — because the iPhone, unlike iMessage, is not an ICS. You might also think that the bill couldn’t be used to regulate things that seem unrelated to CSAM — like requiring “fairness” or “neutrality” in content moderation practices, as Sen. Hawley has proposed and Graham has mentioned repeatedly.
But the bill won’t actually stop the AG from doing either. The reason is the same in both cases: this is not how legislation normally works. In a normal bill, Congress might authorize the Federal Communications Commission to do something — say, require accessibility features for disabled users of communications services. The FCC could then issue regulations that would have to be reasonably related to that purpose and within its jurisdiction over “communications.” As we know from the 2005 American Library decision, the FCC can’t regulate after the process of “communications” has ended — and thus had no authority to require television manufacturers to build “broadcast flag” technology into their devices to ensure that, once a device received a broadcast signal, it could not make copies of the content unless authorized by the copyright holder.
But that’s not how Graham’s bill would work. A company that only makes devices, or installs firmware or operating systems on them, may not feel compelled to follow the AG’s “best practices” because it does not operate an ICS and, as such, could not claim Section 230 immunity (and is highly unlikely to be sued for what its users do anyway). But Apple and Google, in addition to doing these things, also operate multiple ICSes. Nothing in the Graham bill would stop the AG from saying that Apple would lose its Section 230 immunity for iMessage, iCloud or any other ICS if it does not build a backdoor into iPhones for law enforcement. Apple would likely comply. And even if Apple resisted, smaller companies with fewer legal resources would likely cave under pressure.
In fact, the Graham bill specifically includes, among ten “matters addressed,” the “retention of evidence and attribution or user identification data relating to child exploitation or child sexual abuse, including such retention by subcontractors” — plus two other prongs relating to identifying such material. While these may appear to be limited to CSAM, the government has long argued that E2EE makes it impossible for operators either to identify or retain CSAM — and thus that law enforcement must have a backdoor and/or that operators must be able to see everything their users do (the opposite of E2EE).
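For context, the “retain” and “identify” prongs presuppose the kind of server-side scanning sketched below — a hypothetical Python illustration using an exact SHA-256 match where real systems use perceptual hashes like PhotoDNA. Note what the function needs: the plaintext.

```python
# Hypothetical sketch of server-side hash matching against known
# material. A plain SHA-256 stands in for a perceptual hash such as
# PhotoDNA; the hash list below is a placeholder, not real data.
import hashlib

KNOWN_HASHES = {"<hex digest of known image #1>", "<hex digest of known image #2>"}

def scan_upload(blob: bytes) -> bool:
    """Return True if the uploaded bytes match a known-bad hash."""
    return hashlib.sha256(blob).hexdigest() in KNOWN_HASHES
```

Under E2EE, the operator’s servers hold only ciphertext, which matches nothing on any hash list — which is how a “retain and identify” best practice quietly becomes a design mandate against E2EE.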
Most of the “matters addressed” pertain to child exploitation (at least in theory) but one other stands out: “employing age limits and age verification systems.” Congress tried to mandate minimum age limits and age verification systems for adult materials back in the Child Online Protection Act (COPA) of 1998. Fortunately, that law was blocked in court in a protracted legal battle because adults have a right to access sensitive content without being subjected to age verification — which generally requires submitting a credit card, and thus necessarily entails identifying oneself. (The court also recognized publishers’ rights to reach privacy-sensitive users.)
Rep. Bart Stupak’s (D-MI) “Online Age Verification and Child Safety Act” of 2009 attempted to revive age verification mandates, but died amidst a howl of protest from civil libertarians. But, like banning E2EE, this is precisely the kind of thing the AG might try to mandate under Graham’s bill. And, critically, the government would argue that the bill does not present the same constitutional questions because it is not a mandate, but rather merely a condition of special immunity bestowed upon operators as a kind of subsidy. Courts should protect us from “unconstitutional conditions,” but given the state of the law and the difficulty of getting the right parties to sue, don’t count on it.
These “matters addressed” need not be the only things the Commission recommends. The bill merely says that “[t]he matters addressed by the recommended best practices developed and submitted by the Commission … shall include [the ten things outlined in the bill].” The Commission could “recommend” more — and the AG could create whatever conditions on Section 230 immunity he felt he could get away with, politically. His sense of shame, even more than the courts or Congress, would determine how far the law could stretch.
It wouldn’t be hard to imagine this AG (or AGs of, sadly, either party) using the bill to reshape moderation practices more generally. Republicans increasingly argue that social media are “public fora” to which people like Alex Jones or pseudo-journalistic outlets like Gateway Pundit have First Amendment rights of access. Under the same crazy pseudo-logic, the AG might argue that, the more involved the government becomes in content moderation through whatever conditions he imposes on Section 230 immunity, the more essential it is that website operators “respect the free speech rights” of users. Ultimately, the Commission would operate as a censorship board, with murky but enormous powers — and the AG would be the ultimate censor.
If this sounds like a crazy way to make law, it is! It’s free-form lawmaking — not “we tell you what you must do” (and you can raise constitutional objections in court) but rather “we’re not gonna tell you what to do, but if you don’t want to be sued or prosecuted under vague new child exploitation laws, you’d better do what we tell you.” Once the Commission or the AG strays from “best practice” recommendations strictly related to CSAM, the floodgates are open to politically motivated back-door rulemaking that leaves platforms with no input and virtually no avenue for appeal. And even if the best practices are related to CSAM, the way the Commission makes what amounts to law would still be unprecedented, secretive, arbitrary and difficult to challenge in court.
Other Key Aspects of How the Bill Would Work
The bill would operate as follows:
- The bill allows 90 days for Commissioners to be appointed, 60 days until the Commission’s first meeting, and 18 months for its first set of recommendations — roughly 23 months in total. The leaked draft leaves blank the window in which the AG must issue his “best practices.”
- Those de facto requirements would not become legally effective until publication in the Federal Register — which usually takes a month but which sometimes drags out indefinitely.
- Operators would have 1 year to submit a written certification of their compliance.
- If, say, the next administration drags its feet and the AG never issues “best practices,” the bill’s amendments to Section 230 and criminal law go into effect four years after enactment — creating sweeping new liability for CSAM and removing Section 230’s protections.
- The Commission and AG would go through the whole farce again at least every two years.
The bill also grants DOJ broad subpoena power to determine whether operators are, in fact, living up to their certification of compliance with the AG’s “best practices.” Expect this power to be used aggressively to turn tech companies inside out.
Conclusion
In the end, one must ask: what problem is the Graham bill trying to solve? Section 230 has never prevented federal criminal prosecution of those who traffic in CSAM — more than 36,000 individuals were prosecuted federally between 2004 and 2017. Website operators themselves already face enormous legal liability for CSAM — and can be prosecuted by the Department of Justice for failing to cooperate with law enforcement, just as Backpage executives were prosecuted under federal sex trafficking law before SESTA-FOSTA (and pleaded guilty).
The Graham bill seems to be designed for one overarching purpose: to make services that offer end-to-end encryption effectively illegal, and ensure that law enforcement (and the intelligence agencies) has a backdoor into every major communications platform.
That would be outrageous enough if it were done through a direct mandate, but doing it in the roundabout way Graham’s bill proposes is effectively a backdoor to a backdoor. Unfortunately, none of that means the bill won’t suddenly move quickly through Congress, just as SESTA did. Be ready: the “Cryptowars” may finally turn very, very hot.
Filed Under: encryption, fosta, lindsey graham, privacy, section 230, sesta