The EARN IT Act Creates A New Moderator's Dilemma
from the moderate-perfectly-or-else dept
Last month, a bipartisan group of U.S. senators unveiled the much-discussed EARN IT Act, which would require tech platforms to comply with recommended best practices designed to combat the spread of child sexual abuse material (CSAM) or no longer avail themselves of Section 230 protections. While the goal of combating CSAM is commendable, the bill would cause significant problems.
Most notably, the legislation would create a commission, led by the Attorney General, with the authority to draw up a list of recommended best practices. Many have rightly explained that AG Barr will likely use this new authority to make abandoning end-to-end encryption a "best practice." Less discussed, however, is the recklessness standard the bill adopts. The bill would drastically reduce free speech online because it undoes Section 230's resolution of the traditional moderator's dilemma and creates a new one: either comply with the recommended best practices, or open the legal floodgates.
Prior to the passage of the Communications Decency Act in 1996, platforms operated under common law intermediary liability: a platform that exercised editorial control over user content risked being treated as a publisher, liable for everything its users posted, while a platform that left content untouched was treated as a mere distributor, liable only if it had knowledge of the infringing content. This meant that a platform that couldn't survive litigation costs could simply choose not to moderate at all. While not always a desirable outcome, this did provide legal certainty for smaller companies and start-ups that they wouldn't be litigated into bankruptcy. This dilemma was eventually resolved by Section 230's protections, which prevent companies from having to make that choice.
However, the EARN IT Act changes that equation in two key ways. First, it amends Section 230 to allow civil suits and state criminal prosecutions against companies that do not adhere to the recommended best practices. Second, for the underlying federal crime (which Section 230 doesn't affect), the bill would change the scienter requirement from actual knowledge to recklessness.

What does this mean in practice? Currently, under federal law, platforms must have actual knowledge of CSAM on their service before any legal requirement kicks in. So if, for example, a user posts material that could be considered CSAM but the platform is not aware of it, the platform cannot be guilty of illegally transporting CSAM. Platforms must remove and report content when it is identified to them, but they are not held liable for any and all content on the website. A recklessness standard, however, turns this dynamic on its head.
What actions are “reckless” is ultimately up to the jurisdiction, but the Model Penal Code provides a general idea of what the standard entails: a person acts recklessly when he or she “consciously disregards a substantial and unjustifiable risk that the material element exists or will result from his conduct.” Worse, the bill also opens platforms' actions to civil suits. Federal criminal enforcement normally targets the truly bad actors, and companies that comply with reporting requirements are generally immune from liability. With these changes, however, if a user posts material that could potentially be considered CSAM, despite no knowledge on the part of the platform, civil litigants could argue that the company's moderation and detection practices, or lack thereof, constituted a conscious disregard of the risk that CSAM would be shared by users.
When the law introduces ambiguity into liability, companies tend to err on the side of caution. In this case, that means removing potentially infringing content to ensure they cannot be hauled into court. In the copyright context, for example, a Digital Millennium Copyright Act safe harbor exists for internet service providers (ISPs) that “reasonably implement” policies for terminating repeat infringers in “appropriate circumstances.” Courts have refused to apply that safe harbor when a company didn't terminate enough subscribers, and that uncertainty about whether the safe harbor applies leads ISPs to act on more complaints to make sure they cannot be liable for the infringement. A recklessness standard invites the same open-ended questions: Is it "reckless" for a company not to investigate postings from an IP address if other postings from that IP address were CSAM? What if the IP address belongs to a public library with hundreds of daily users?
This ambiguity will likely force platforms to moderate user content and over-remove legitimate content to ensure they cannot be held liable. Large firms that have the resources to moderate more heavily, and that can survive an increase in lawsuits, may pour the majority of their moderation resources into CSAM out of an abundance of caution, leaving fewer resources to target and remove other problematic content such as terrorist recruitment or hate speech. Mid-sized firms may over-remove any user content that features a child, or limit posting to trusted sources, to insulate themselves from lawsuits that could cripple the business. And small firms, which likely can't survive an increase in litigation, could ban user content entirely, ensuring nothing appears on the website without vetting. These consequences, and the general burden on the First Amendment, are exactly the type of harms that drove courts to adopt a knowledge standard for online intermediary liability, ensuring that the free flow of information was not unduly limited.
Yet the EARN IT Act ignores this. Instead, the bill assumes that companies will simply adhere to the best practices and therefore retain Section 230 immunity, avoiding these bad outcomes. After all, who wouldn't want to comply with best practices? In reality, the bill could force companies to choose between vital privacy protections like end-to-end encryption and litigation. The fact is there are better ways to combat the spread of CSAM online that don't require platforms to remove key privacy features for users.
As it stands now, the EARN IT Act solves the moderator’s dilemma by creating a new one: comply, or else.
Jeffrey Westling is a technology and innovation policy fellow at the R Street Institute, a free-market think tank based in Washington, D.C.
Filed Under: attorney general, best practices, cda 230, earn it, earn it act, encryption, moderator's dilemma, recklessness, section 230
Reader Comments
Shouldn't that be "comply, or refuse to accept user content," even if it means shutting up shop? How do sites like GitLab or Thingiverse survive if they have to examine everything posted, including the contents of zip files, etc.?
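To make that burden concrete, here is a minimal sketch in Python of what examining a single upload, including the contents of a zip file, would involve. The KNOWN_BAD_HASHES blocklist is purely hypothetical, and real detection systems match perceptual hashes (e.g., PhotoDNA) against curated lists rather than exact cryptographic digests, so treat this as an illustration of the per-upload cost, not an implementation:

```python
import hashlib
import zipfile
from pathlib import Path

# Hypothetical blocklist of SHA-256 digests of known-bad files. Production
# systems use perceptual hashes (e.g., PhotoDNA) so near-duplicates match too;
# an exact cryptographic hash is only a stand-in for illustration.
KNOWN_BAD_HASHES: set[str] = set()


def scan_upload(path: Path) -> list[str]:
    """Return the names of any files in an upload whose digest is blocklisted."""
    matches = []
    if zipfile.is_zipfile(path):
        with zipfile.ZipFile(path) as archive:
            # Every member of the archive must be read and hashed in full.
            for name in archive.namelist():
                if hashlib.sha256(archive.read(name)).hexdigest() in KNOWN_BAD_HASHES:
                    matches.append(name)
    elif hashlib.sha256(path.read_bytes()).hexdigest() in KNOWN_BAD_HASHES:
        matches.append(path.name)
    return matches
```

Even this toy version has to read every byte of every member file, and nested archives would require recursion on top of that; doing this for every commit, attachment, and archive a site receives is the scale of work "examine everything posted" implies.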
“My fellow senators, we must be seen doing something!”
“But what if the thing we do ends up destroying the Internet?”
“Good! Without the Internet, those assholes on Twitter can’t call me names for trying to destroy the Internet.”
“I don’t think—”
“Nor should you.”
Corporatism at its Finest
Large corporations love regulations like this because they are the only ones who can comply with them. One of the awesome features of the internet is that small companies can compete against large ones based on the merits: the Google of 2001 could compete against Microsoft and win. Large corporations hate that, so they will support things like the EARN IT Act. Only the big boys like Facebook and Google will survive.
Re: Corporatism at its Finest
I thought large corporations wrote regulations like this ...
Re: Corporatism at its Finest
Could they survive an assault by the attorneys general of many states, coordinated by Hollywood?
Re: Re: Corporatism at its Finest
I think you are missing the point: Google and Facebook are so large that they are the only ones with a possibility of surviving. If they can't, then everyone smaller is guaranteed to be unable to.
Re: Re: Corporatism at its Finest
coordinated by Hollywood
Proof or GTFO!
Just kidding: we know how that propaganda monstrosity works.
They distribute the child pornography, then they try to get it back, much like the good old days of J. Edgar Hoover's prime, blackmailing gays over dick pics and love letters, but with the new and improved anti-hetero gay mafia at the helm, flipping the script.
They haven't been able to repeal 230 directly, so they're trying to use this as an end run around 230.
Has the senate judiciary committee voted on it yet
I looked at the Senate Judiciary Committee's website. No vote on this yet. Just wondering: does Techdirt know something we do not? And when is the Easter recess?
Re: Has the senate judiciary committee voted on it yet
Right now COVID-19 is sucking all the air out of the room, plus the Senate has left for a month-long recess and is due to return on April 23rd, though that may not happen and could extend into May.
The bill hasn't garnered a lot of co-sponsors yet, aside from the ones who are usually behind bills such as these, but once this pandemic passes and things return to semi-normalcy, it will be there waiting.
Re: Re: Has the senate judiciary committee voted on it yet
Though that may take a while. It's also an election year, so it's not likely to pass before the election, but they may try to push it through during the lame duck session.
Can something still be a "recommended best practice" if there are legal ramifications for not following it? Is every law really just a "recommended best practice" since you can choose the legal ramifications instead if you want?
Re:
That is an interesting question. Even though they are 'enacted' by a law, could those best practices be challenged as not law? When presented in court, can the defense argue at length that those so-called 'best practices' are not in fact law and therefore cannot be applied as law? How about the defense showing that a 'best practice' is not in fact a best practice? Or showing that 'best practices' imposed by a single individual such as the Attorney General, who actually knows nothing about 'best practices,' are gifts to the organizations that paid to get his boss elected?
I could go on, but unlike regulatory agencies, which are required to have comment periods prior to rulemaking, it appears that these 'best practices' will be imposed by fiat.
Re: Re:
And no one may dare question what best practices really are, in regards to "best practices" imposed by morons with an agenda.
Re:
No, but it would be much harder to get these recommended best practices passed as laws, as the people wouldn't stand for it. But if they pass a bill designed to combat CP that allows non-elected persons to state what can and cannot be done, it's less likely that people will notice they are losing rights.
Re: Re:
Well, that was changed in the introduced version, which states that once the commission settles on a set of "best practices," they must be drafted as a bill and passed in the normal, albeit truncated, manner, akin to Trade Promotion Authority for negotiating trade deals.
So given that these "best practices" will be codified by an act of Congress, I'd say they could be.
Re:
Ah, but there's where the gross dishonesty and weasel words come into play (with appropriate apologies to actual weasels for being lumped in with politicians). For you see, it's not that sites are punished if they don't follow the 'best practices'; they simply aren't EARNing the privilege of 230 protections, hence the bill's name.
Thanks to the constant refrain of spinning 230 protections as an extra privilege that online platforms get versus those poor offline ones (rather than the truth that 230 simply codifies that online platforms have the same protections as offline ones), you can be sure it will be argued that no punishment is being handed out, sites are simply being treated equally, and if they want to EARN that special protection back, all they have to do is follow a few simple rules, which of course will be trivial if they really want to follow them.
Re: Re:
The legal punishment doesn't come from removing the Section 230 protections; it comes from what 230 protects against. Any way you slice it, those who would be punished without Section 230 are punished legally if they ignore the recommendations and not punished if they follow them.
Are construction companies liable for criminals using roads to drive to their targets, or to transport contraband? What about the auto manufacturers who build the cars that the criminals use on said roads? What about the gas stations that provide fuel for said vehicles? What about the oil companies that sell gas to the various stations? What about the geologists who sell their services to help the oil companies locate resources to extract? What about the colleges that educate the geologists? What about literally everyone else on Earth, who in some way indirectly affects all of these processes?
Re:
What about the politicians that accept lobbying cash (bribes) that allow all of the above to happen?
Re:
And that is the kicker, and the direct refutation of the idea that sites should have to earn 230 protection: for any offline company, it is already well understood that you can't sue the platform/company for what a third party uses its service/product for. The only thing 230 does is make clear that that same protection against liability applies to online platforms as well.
Hmm. It actually may be worse than that, because it appears to apply beyond what you'd think of as "platforms".
The recklessness and "best practices" requirements are applied to all providers of "interactive computer services". The definition of "interactive computer service" is imported by reference from 230. That definition is:

"any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions"
The part about "system... that enables computer access" sweeps in all ISPs and telecommunication carriers, as well as operators of things like Tor nodes. And "access software provider" brings in all software tools and many non-software tools, including open source projects.
Under 230, those broad definitions are innocuous, because they're only used to provide a safe harbor. An ISP or software provider is immunized if it doesn't actually know about or facilitate specific content. No ISP and almost no software provider has any actual knowledge of what passes through or uses its service, let alone edits the content or facilitates its creation, so they get the full safe harbor at minimal or no actual cost to them. And anyway, nobody has been going after them on 230-ish issues, so including them doesn't hurt.
Under EARN-IT, those same definitions would be used to impose liability, so now those parties actually get burdens from being inside the definition. That's worse than a repeal of 230. It doesn't just remove a safe harbor; it opens an avenue for positive attack.
This commission could decide that it's a "best practice" for ISPs to block all traffic they can't decrypt. Or it could decide that it's a "best practice" not to provide any non-back-doored encryption software to the public, period.
Or, since those might generate too much political backlash at the start, it could start boiling the frog on the slippery slope by, say, deciding that it's a "best practice" not to facilitate meaningfully anonymous communication, effectively outlawing Tor, I2P, and many standard VPN practices.
Then it could start slowly expanding the scope of that, possibly even managing to creep into banning all non-back-doored encryption, without ever making any sudden jump that might cause a sharp public reaction.
Back on the platform side, over time the rules could easily slide from the expected (and unacceptable) "best practice" of not building any strong encryption into your own product, to the even worse "best practice" of trying to identify and refuse to carry anything that might be encrypted. Start by applying it to messaging, then audio/video conferencing, then file storage... and then you have precedents giving you another avenue to push it all the way to ISPs.
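As a technical aside on that last point: the usual heuristic for spotting "anything that might be encrypted" is a byte-entropy check, sketched below in Python (a generic illustration of the technique, not anything specified in the bill). The catch is that compressed formats like zip and JPEG look just as statistically random as ciphertext, so a carrier applying this rule would misclassify enormous amounts of ordinary traffic:

```python
import math
from collections import Counter


def shannon_entropy(data: bytes) -> float:
    """Estimate entropy in bits per byte; 8.0 is the maximum (uniform random bytes)."""
    if not data:
        return 0.0
    total = len(data)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(data).values())


# Ciphertext sits near 8 bits/byte, but so does a .zip or .jpg payload,
# while plain HTML sits much lower. A carrier flagging "high entropy"
# traffic would therefore block most compressed media along with encryption.
```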
Re:
... oh, and even if you weren't a company with any significant infrastructure, they could also come after you for providing software for P2P or other decentralized solutions. "Protocols, not platforms" only works if somebody's allowed to provide the software to speak the protocol...
Re:
It occurs to me that Congress itself has in-house email systems, and all it would take to give someone standing to sue Congress as a whole is for someone on their system to send or receive email containing content that might be illegal.
As for best practices, what would stop the commission from defining 'extremism' to be something that cannot be transmitted under best practices, with 'extremism' defined as holding political beliefs different than the currently-elected political majority party?
CSAM
I'm not a troll, I just play one online sometimes...
Republicans and certain Christian groups notwithstanding, is there any actual evidence that CSAM online is a serious problem? I've never run across any, and I lurk in some pretty sketchy areas of the web. The current porn tubes' fascination with "incest" storylines is as fake as the rest of porn storylines. Even back when Instagram was for porn, I didn't see any obvious CSAM. Is this really a problem that needs new laws, or is it just government and churches looking to censor the Internet?
"The fact is there are better ways to combat the spread of CSAM online which don’t require platforms to remove key privacy features for user."
FTFY: The fact is there aren't better ways to push the false encryption restrictions online without requiring corporations to forgo possible privacy revenue from users.
'If you ignore all the evidence my argument is great!'
Last month, a bipartisan group of U.S. senators unveiled the much-discussed EARN IT Act, which would require tech platforms to comply with recommended best practices designed to combat the spread of child sexual abuse material (CSAM) or no longer avail themselves of Section 230 protections. While the goal of combating CSAM is commendable, the bill would cause significant problems.
And if you believe any of the above, I've got some bridges to sell you. As noted in previous articles on this trainwreck of a bill, the tools to deal with CSAM already exist; they simply aren't used.
This, much like FOSTA, is both a PR stunt and a way to undercut 230/encryption (though FOSTA was only aimed at 230), since going after those directly has so far failed to work. Treating it as an honest attempt to combat CSAM is already giving it far more legitimacy than it deserves.
Platforms outside the United States would not be subject to this.
If the coming depression results in certain parts of the United States breaking away, platforms in these new countries will not have to comply with the laws of the remaining United States.