Signal Speaks Out About The Evils Of The EARN IT Act
from the speak-out,-in-encrypted-fashion dept
Signal, the end-to-end encrypted app maker, doesn't really need Section 230 of the Communications Decency Act. It can't see what anyone's saying via its offering anyway, so there's little in the way of moderation for it to do. But, still, it's good to see it come out with a strong condemnation of the EARN IT Act, which has been put forth by Senators Lindsey Graham, Richard Blumenthal, Dianne Feinstein, and Josh Hawley as a way to undermine both Section 230 of the CDA and end-to-end encryption in the same bill. The idea is to effectively use one as a wedge against the other. Under the bill, companies will have to "earn" their 230 protections by putting in place a bunch of recommended "best practices" that can effectively be dictated by the US Attorney General -- the current holder of that office, Bill Barr, has made clear that he hates end-to-end encryption and thinks it's a shame the DOJ can't spy on everyone. And this isn't just this administration: law enforcement officials, such as James Comey under Obama, were pushing this ridiculous line of thinking as well.
To be clear, the EARN IT Act might not have a huge direct impact on a company like Signal -- since it doesn't really rely much on 230 protections (though it might at the margins). But it's good to see that Signal recognizes what a terrible threat the EARN IT Act would be:
It is as though the Big Bad Wolf, after years of unsuccessfully trying to blow the brick house down, has instead introduced a legal framework that allows him to hold the three little pigs criminally responsible for being delicious and destroy the house anyway. When he is asked about this behavior, the Big Bad Wolf can credibly claim that nothing in the bill mentions “huffing” or “puffing” or “the application of forceful breath to a brick-based domicile” at all, but the end goal is still pretty clear to any outside observer.
However, as Signal makes clear, getting rid of end-to-end encryption is much more likely to harm everyone, without providing much help to law enforcement in the first place:
Bad people will always be motivated to go the extra mile to do bad things. If easy-to-use software like Signal somehow became inaccessible, the security of millions of Americans (including elected officials and members of the armed forces) would be negatively affected. Meanwhile, criminals would just continue to use widely available (but less convenient) software to jump through hoops and keep having encrypted conversations.
There is still time to make your voice heard. We encourage US citizens to reach out to their elected officials and express their opposition to the EARN IT bill. You can find contact information for your representatives using The Electronic Frontier Foundation’s Action Center.
Stay safe. Stay inside. Stay encrypted.
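Signal's point about "widely available (but less convenient) software" is worth underlining: the building blocks of end-to-end encryption are open-source libraries that anyone can download today, and no bill can un-publish them. As a purely illustrative sketch (this is not Signal's actual protocol, and it assumes Python with the open-source cryptography package), here is roughly how little code it takes for two parties to exchange a message that no intermediary can read:

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each party generates a key pair; only the public halves ever cross the wire.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides compute the same shared secret from their own private key and the
# other side's public key (Diffie-Hellman over Curve25519).
shared = alice_private.exchange(bob_private.public_key())
assert shared == bob_private.exchange(alice_private.public_key())

# Derive a symmetric key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"illustrative e2e demo").derive(shared)

# Alice encrypts; any server, ISP, or wiretap relaying this sees only ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)

# Only someone holding the derived key can decrypt.
print(AESGCM(key).decrypt(nonce, ciphertext, None))  # b'meet at noon'

A real messenger layers authentication, forward secrecy, and a lot more on top of this, but the primitives themselves amount to a few dozen lines over freely available libraries -- which is exactly why pressuring platforms to drop encryption only disarms ordinary users, not the "bad people" Signal describes.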
Filed Under: communications, earn it, encryption, intermediary liability, secrecy, section 230
Companies: signal
Reader Comments
Hey, look over here, this is actually important
The difficulty will be getting the electorate to focus on this issue rather than a certain pandemic that is wandering around. On the plus side, folks sequestered at home might have time to deal with it. On the other hand, getting their attention might be a problem.
Re: Hey, look over here, this is actually important
I can actually see a scenario where the pandemic could help in the fight against weakened encryption.
The increase in remote working requires strong encryption to protect company information. Weakening VPN encryption will put those companies at risk of industrial espionage, so this could lead to more companies coming out against this bill if they realize how much their risk level will increase.
I think our politicians are the ones that need to earn it.
So basically criminals would just use "unauthorized" software that's end-to-end.
Does this crazy old man, who really should retire ASAP, think that criminals will go "welp, this application isn't legal anymore... better uninstall it"?
He needs to go to an old folks' home and stay there.
Re: Re: Hey, look over here, this is actually important
And also, right now it seems the bill may not come up for a vote or even pass for a while, since Congress is preoccupied with the coronavirus, so it's not likely to pass before the election.
I don't see how you can ban encryption from a physical standpoint still.
I also don't see which provisions of the bill are specifically being protested against.
Re:
On the off chance you're being serious...
I don't see how you can ban encryption from a physical standpoint still.
... maybe because that's a nonsensical phrasing of what's actually going on, much like saying that you don't see how someone can tell the temperature of math.
On the encryption front, what they can do, however, is issue some 'best practices' that, while they don't flat-out say 'no encryption', have the effect of punishing a platform for using encryption via removal of 230 protections, heavily incentivizing platforms not to implement non-broken encryption. Given a choice between 'working encryption, but liability for everything posted on the platform' and 'no real encryption, but liability only for what those running the platform post themselves', the cost/benefit analysis is heavily skewed towards the latter.
I also don't see which provisions of the bill are specifically being protested against.
See above, but more generally all of it. Attempts to attack 230 and encryption directly have so far failed, so dishonest individuals are now trying to do so indirectly via this bill, hoping that if they pretend it's about 'protecting the children!' (it's not), the majority of politicians will be too spineless to vote against it.
There's also the problem that the core premise of the bill, that companies need to earn protection against liability regarding the actions of third parties, is massively flawed.
Ford doesn't have to EARN protection against liability for someone using one of their vehicles in a hit and run.
Walmart doesn't have to EARN protection against liability if someone buys a knife in one of their stores and goes and stabs someone with it.
The NYT doesn't have to EARN protection against liability if someone buys/finds a copy of their paper and writes defamatory content in it.
... and online platforms should not have to EARN protection against liability if someone posts something illegal/objectionable, entirely without the involvement of the platform beyond simply offering a platform to post on.
Re: Re:
That's not quite true.
Ford has to have seat belts and airbags, in theory.
Walmart isn't supposed to sell certain knives, in theory.
The NYT might have to issue a public retraction if they forged it well enough, in theory.
If they can manage to protect the children then it's still fine, but I don't see laws being enforced very well.
That is precisely the point. Nobody intends to enforce this law at all well. It will be enforced selectively: to suit some prosecutor's need for publicity, or to avenge some unfortunately-elected politician's feelings over something mean someone wrote about him on the intertubes.
You might say this law is designed to be enforced badly.
Re: Re: Re:
... I honestly can't tell if you missed the point on purpose or if somehow all three examples went right over your head, so let me condense them all into one point that will hopefully get it across better: offline companies are not held responsible if someone uses their product for illegal/objectionable actions that they had no involvement with other than providing the product in the first place, and as such it's absurd to act as though online companies should need to earn the right to the very same protections.
Re: Re: Hey, look over here, this is actually important
That's actually a very important point to make. Even once COVID-19 is in the rearview, teleconferencing and working from home will only increase, because now companies are seeing they don't HAVE to pay to fly out their employees or have them come in when it could be done on the cheap, and we know businesses will cut whatever corners they can to make the maximum amount of profit.
Re: Re: Re:
All your examples are things that involve deliberate decisions by the companies. They can therefore be held responsible for their own actions. However, web sites that allow user-generated content do not decide what users post, and can only react to a post after it has been made. Therefore they should not be held liable for user-generated content unless they fail to act on a court order.
They can voluntarily react to posted content by taking down anything that they object to for any reason, but that should not create any liability for them for anything they miss, or fail to see a reason to take down.
Making sites responsible for user-generated content if they fail to follow arbitrary guidelines is politicians introducing censorship by the back door, and it should be especially worrying in a country with a strong religious fundamentalist contingent amongst its politicians.