As we've discussed for many years, Homeland Security and the Justice Department have convinced too many courts that there is some sort of 4th Amendment "exception" at the border, whereby Customs and Border Protection (CBP) agents are somehow allowed to search through your laptops, phones, tablets and more just because, fuck it, they can. Now bipartisan pairs in both the Senate and the House have introduced a new bill that would require that CBP get a warrant to search the devices of Americans at the border. On the Senate side, the bill is sponsored by Senators Ron Wyden and Rand Paul, and in the House, it's Reps. Blake Farenthold and Jared Polis. Honestly, it's absolutely ridiculous that this kind of bill is even needed in the first place, because the 4th Amendment should just take care of it. But with DHS and the courts not properly appreciating the 4th Amendment's requirement of a warrant for a search, here we are. Here's a short summary of the bill, which notes:
The government has asserted broad authority to search or seize digital devices at the border
without any level of suspicion due to legal precedent referred to as the “border search
exception” to the Fourth Amendment’s requirement for probable cause or a warrant.
Until 2014, the government claimed it did not need a warrant to search a device if a person had
been arrested. In a landmark unanimous decision, the Supreme Court (in Riley v. California)
ruled that digital data is different and that law enforcement needed a warrant to search an
electronic device when a person has been arrested.
This bill recognizes the principles from that decision extend to searches of digital devices at the
border. In addition, this bill requires that U.S. persons are aware of their rights before they
consent to giving up online account information (like social media account names or passwords)
or before they consent to give law enforcement access to their devices.
That last part is especially important, given how eager Homeland Security has been to start demanding social media passwords as you deplane. Unfortunately, the bill as written only applies to "US Persons" as defined here, meaning that it may not be of much help for a new DHS proposal, also revealed this week, to more aggressively pursue phone and social media searches of foreigners. This is a bad idea for a whole host of reasons we've already discussed, but the short version is that it's bad for security, it's bad for tourism, it's bad for Americans' safety (because other countries will reciprocate). It's just a bad, bad idea.
At the very least, this new bill would block this from happening to American citizens and lawful permanent residents, but it should go much further. And, of course, who knows if this bill will get any traction, or get signed by the President.
The FBI announced (without going into verifiable detail) that it had implemented new minimization procedures for handling information tipped to it by the NSA's Prism dragnet. Oddly, this announcement arrived nearly simultaneously with the administration's announcement that it was expanding the FBI's intake of unminimized domestic communications collected by the NSA.
So, which was it? Was the FBI applying more minimization or was it gaining more raw access? The parties involved have so far refused to offer any further details on either of the contradictory plans, save for vague assurances about the lawfulness of both options.
We respectfully request you confirm whether the NSA intends to routinely provide intelligence information -- collected without a warrant -- to domestic law enforcement agencies. If the NSA intends to go down this uncharted path, we request that you stop. The proposed shift in the relationship between our intelligence agencies and the American people should not be done in secret. The American people deserve a public debate. The United States has a long standing principle of keeping our intelligence and military spy apparatus focused on foreign adversaries and not the American people.
The letter points out that while Congress has granted the NSA "extraordinary authority" to conduct warrantless surveillance and harvest massive amounts of data, it has not done so for domestic intelligence and law enforcement agencies. But that deliberate limitation of powers has been undone by the administration's expansion. It may be indirect -- requiring the assistance of the NSA -- but it accomplishes the same purpose: giving warrantless surveillance and bulk collection powers to domestic agencies by proxy.
The letter -- sent to the heads of a variety of Congressional committees -- pulls no punches in its comparative depiction of this overreach.
We believe allowing the NSA to be used as an arm of domestic law enforcement is unconstitutional. Our country has always drawn a line between our military and intelligence services, and domestic policing and spying. We do not -- and should not -- use U.S. Army Apache helicopters to quell domestic riots; Navy Seal Teams to take down counterfeiting rings; or the NSA to conduct surveillance on domestic street gangs.
What's most amazing about the administration's move is that it followed -- directly -- two and a half years of NSA document leaks, their accompanying protests, lawsuits and backlash, the passage of the USA Freedom Act and an intense debate over the lawfulness of the PATRIOT Act. Add to that the fact that it was dropped right in the middle of a heated legal battle that has shown the FBI to be both grasping for power and incapable of telling the truth -- and it clearly shows the administration is so insulated from the collateral damage of a decade-plus of constantly expanding surveillance powers as to be completely unable to detect shifts in tone.
Legislators in two states have proposed (largely unworkable) bans on the sale of encrypted phones, citing (of course) concerns about all the criminals who might get away with something if law enforcement can't have near immediate access to the entire contents of their phones.
Congressmen Ted Lieu (D-Calif.) and Blake Farenthold (R-Texas) have introduced what they call the Ensuring National Constitutional Rights of Your Private Telecommunications (ENCRYPT) Act of 2016. It’s an attempt, Lieu and Farenthold wrote in a letter to their colleagues, to address “[c]oncerns over the privacy, security and technological feasibility of a ‘backdoor’ into encrypted devices for the government and law enforcement” by making encryption a federal issue and keeping individual states from trying to ban it.
Update: We've been informed that it's not just Lieu and Farenthold, but also Reps. Suzan DelBene and Mike Bishop.
Not only would such bans/backdoors make device usage less safe for users, but the lack of unified stance on phone encryption would turn phone sales in the US into a logistical nightmare, to the detriment of all involved.
“We are deeply concerned,” Lieu told the Daily Dot in a phone interview, “that a patchwork system with different encryption requirements in every state would not only undermine national security—it would also threaten the competitiveness of American companies and dampen innovation.”
Whether this will go anywhere remains to be seen. It would appear few legislators are willing -- at least at this point -- to tell the FBI to stop asking for backdoors or bans. Alarmingly, even as the ongoing discussion brings more evidence to the surface that such actions are not only bad ideas, but pretty much impossible to implement without doing away with encryption entirely, it seems like more legislators are moving towards the FBI's line of thinking.
Unfortunately, that is often the nature of the political business, where fear nearly always trumps rational thinking. For too many, it's preferable that thousands of phone users be left open to attack than that one criminal suspect go free.
For many years now, we've talked about the importance of a federal anti-SLAPP law that would protect the First Amendment. As we've explained, it is not uncommon for people to abuse our judicial system by filing a lawsuit against someone for saying things that they don't like, knowing that no matter how frivolous, the threat (and cost) of the lawsuit is often enough to get them to shut up. That's why such "Strategic Lawsuits Against Public Participation" (SLAPPs) are so popular. As it stands, anti-SLAPP laws are a complete hodgepodge of state laws. Some states have no anti-SLAPP laws. Others have weak ones. And a few have strong ones (though even some of those are under attack).
While there have been some attempts in the past, it appears that some in Congress are trying, once again, to create a federal anti-SLAPP law. This one has been introduced by Reps. Blake Farenthold and Anna Eshoo (with co-sponsorship from Reps. Darrell Issa, Jared Polis and Trent Franks).
The SPEAK FREE Act of 2015 will protect citizens from frivolous lawsuits that target their First Amendment Rights. Based on the Texas Citizens Participation Act, this bill will prevent bad actors from using a lawsuit to silence public opinion simply because they don’t agree with it. These lawsuits, known as SLAPPs (Strategic Lawsuits Against Public Participation), pose a threat not only to free speech, but to the modern information economy. Protecting our right to free speech drives economic opportunity by paving the way to new forums for expression, like YouTube, or by facilitating the rise and fall of products or services through competition and honest buyer feedback.
The SPEAK FREE Act will provide a federal backstop to state Anti-SLAPP laws by creating a process similar to that in Texas and California, where expensive court proceedings are delayed and claims can be dismissed if the defendant can show that a SLAPP suit cannot succeed on the merits.
The full text of the bill can be seen at that link (or below), and it does appear to be similar to the ones in Texas and California, making it much easier to dismiss bogus SLAPP suits, to halt discovery and to get awarded attorneys' fees in such SLAPP suits. Also, unlike some state laws, it is not limited to just speech about the government, which is important. While there may be some specifics within the bill that are worth tweaking, overall, it seems clearly modeled on the very successful, well-thought-out bills already in place in Texas and California. It would be a huge boost to freedom of expression to have this become law.
Seeing as how some rather wealthy folks have been trying to kill off anti-SLAPP laws in states already, expect to see a lot of FUD come out about this attempt to put in place an anti-SLAPP law that protects free expression across the entire country.
Non-disparagement clauses are one of the stupidest things any company can enact. In most cases, it's almost impossible to enforce them, no matter how artfully crafted. Most aren't. Most non-disparagement clauses found lying around the internet have been lazily copied and pasted from pre-existing bad ideas instituted by other companies.
On top of the legally-dubious aspects, there's the potential for severe backlash -- something that completely underscores the futility of these half-assed gag orders. Instead of heading off criticism, these clauses tend to invite negative reviews across multiple review sites, often from internet denizens who've never even patronized the company they're bashing.
But still, these clauses persist. Up until recently, the court of public opinion has had to do most of the heavy lifting when it came to the enforcement of these clauses. Last September, California became the first state in the nation to ban non-disparagement clauses, threatening violators with fines up to $10,000 (for repeated violations).
Today, Representatives Darrell Issa, Eric Swalwell, Blake Farenthold, and Brad Sherman (two Democrats and two Republicans) jointly proposed in Congress the Consumer Review Freedom Act of 2015, which would ban non-disparagement clauses nationally. We at Public Citizen have litigated cases against the use of such clauses (for instance in the KlearGear case, as well as the Cox case). The proposed bill, which is similar to one introduced last session (by Democrats only), also prohibits a business from imposing a clause requiring consumers to sign away their intellectual property rights in communications about the business. We've challenged that type of clause, too. Today's bill authorizes enforcement by the Justice Department and by state attorneys general.
The proposed law includes fines that could add up to real money fairly quickly.
The Attorney General shall bring an action against any business who violates subsection (d) for a civil penalty of not more than $16,000 for each day that the business so requires the use of such a contract by a distinct person.
No company will be penalized for existing clauses but will be expected to remove them as soon as possible. One year after the bill's enactment, any clauses still in existence (or new clauses enacted past this point) will subject companies to daily fines.
As noted above, the bill would also prevent IP landgrabs by companies that allow them to issue DMCA takedown notices targeting critical reviews they now "own" thanks to the fine print on Terms and Conditions pages.
How effective a law like this will be in keeping the KlearGears of the nation in line is open to debate. Some shady companies may maintain their US "presence" for only as long as it remains beneficial to them. In the wake of the default judgment against KlearGear for its bogus non-disparagement clause, the company -- which had claimed to be located at multiple locations throughout the US over the past several years -- suddenly revealed itself to be a very French corporation and thus out of reach of the $350,000 judgment.
Hopefully, the law -- if implemented -- will deter future companies from ambushing unhappy customers with egregious fees and damaged credit records. The fines mooted here should act as a deterrent, especially when pursuing these fines doesn't appear to hinge on the enforcement of these questionable clauses, but rather their mere existence.
Yesterday, the House Oversight Committee held a hearing over this whole stupid kerfuffle about mobile encryption. If you don't recall, back in the fall, both Apple and Google said they would start encrypting data on mobile devices by default, leading to an immediate freakout by law enforcement types, launching a near exact replica of the cryptowars of the 1990s.
While many who lived through the first round had hoped this would die a quick death, every week or so, we see someone else in law enforcement demonizing encryption, without seeming to recognize how ridiculous they sound. There was quite a bit of that in the hearing yesterday, which you can sit and watch in its entirety if you'd like:
Thankfully, there were folks like cryptographer Matt Blaze and cybersecurity policy expert Kevin Bankston on hand to make it clear how ridiculous all of this is -- but it didn't stop law enforcement from making their usual claims. The most ridiculous, without a doubt, was Daniel Conley, the District Attorney from Suffolk County, Massachusetts, whose opening remarks were so ridiculous that it's tough to read them without loudly guffawing. It's full of the usual "but bad guys -- terrorists, kidnappers, child porn people -- use this" arguments, along with the usual "law enforcement needs access" stuff. And he blames Apple and Google for using a "hypothetical" situation as reason to encrypt:
Apple and Google are using an unreasonable, hypothetical narrative of government intrusion as the rationale for the new encryption software, ignoring altogether the facts as I’ve just explained them. And taking it to a dangerous extreme in these new operating systems, they’ve made legitimate evidence stored on handheld devices inaccessible to anyone, even with a warrant issued by an impartial judge. For over 200 years, American jurisprudence has refined the balancing test that weighs the individual’s rights against those of society, and with one fell swoop Apple and Google has upended it. They have created spaces not merely beyond the reach of law enforcement agencies, but beyond the reach of our courts and our laws, and therefore our society.
The idea that anything in mobile encryption "upends" anything is ridiculous. First, we've had encryption tools for both computers and mobile devices for quite some time. Apple and Google making them more explicit hardly upends anything. Second, note the implicit (and totally incorrect) assumption that historically law enforcement has always had access to all your communications. That's not true. People have always been able to talk in person, or they've been able to communicate in code. Or destroy communications after making them. There have always been "spaces" that are "beyond the reach of law enforcement."
But to someone so blind as to be unaware of all of this, Conley thinks this is somehow "new":
I can think of no other example of a tool or technology that is specifically designed and allowed to exist completely beyond the legitimate reach of law enforcement, our courts, our Congress, and thus, the people. Not safe deposit boxes, not telephones, not automobiles, not homes. Even if the technology existed, would we allow architects to design buildings that would keep police and firefighters out under any and all circumstances? The inherent risk of such a thing is obvious so the answer is no. So too are the inherent risks of what Apple and Google have devised with these operating systems that will provide no means of access to anyone, anywhere, anytime, under any circumstance.
As Chris Soghoian pointed out, just because Conley can't think of any such technology, it doesn't mean it doesn't exist. Take the shredder for example. Or fire.
During the hearing, Conley continued to show just how far out of his depth he was. Rep. Blake Farenthold -- right after quizzing the FBI on why it removed its recommendation on mobile encryption from its website, using the screenshot and highlighting I made -- asked the entire panel:
Is there anybody on the panel believes we can build a technically secure backdoor with a golden key -- raise your hand?
No one did -- neither DA Conley nor the FBI's Amy Hess:
But, just a few minutes later, Conley underscored his near absolute cluelessness by effectively arguing "if we can put a man on the moon, we can make backdoor encryption that doesn't put people at risk." Farenthold catalogs a variety of reasons why backdoor encryption is ridiculously stupid -- and even highlights how every other country is going to demand their own backdoors as well -- and asks if anyone on the panel has any solutions. Conley then raises his hand and volunteers the following bit of insanity:
I'm no expert. I'm probably the least technologically savvy guy in this room, maybe. But, there are a lot of great minds in the United States. I'm trying to figure out a way to balance the interests here. It's not an either/or situation. Dr. Blaze said he's a computer scientist. I'm sure he's brilliant. But, geeze, I hate to hear talk like 'that cannot be done.' I mean, think about if Jack Kennedy said 'we can't go to the moon. That cannot be done.' [smirks] He said something else. 'We're gonna get there in the next decade.' So I would say to the computer science community, let's get the best minds in the United States on this. We can balance the interests here.
No, really. Watch it here:
As Julian Sanchez notes, this response is "all the technical experts are wrong because AMERICA FUCK YEAH."
This is why it's kind of ridiculous that we continue to let technologically clueless people lead these debates. There are things that are difficult (getting to the moon) and things that are impossible (letting only "good people" go to the moon). There are reasons for that. This isn't about technologists not working hard enough on this problem. It's a fundamental reality: creating backdoors weakens the infrastructure, absolutely. That's a fact, not a result of poor engineering practices.
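To see why, in purely mechanical terms, a "golden key" is a single point of failure rather than a balance of interests, consider a toy sketch. Everything here is hypothetical and deliberately simplified -- the XOR-based cipher is illustrative only, not real cryptography, and the key names and messages are made up -- but the structural point holds for any escrow scheme: whatever master key lets law enforcement unwrap a user's key also lets anyone who steals that master key unwrap every user's key.

```python
import hashlib
from itertools import cycle

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with a SHA-256-derived keystream.
    Applying it twice with the same key recovers the original.
    Illustrative only -- NOT secure for real use."""
    stream = cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(data, stream))

# The hypothetical escrowed "golden key" held for law enforcement access.
GOLDEN_KEY = b"escrow-master-key"

def encrypt_with_escrow(msg: bytes, user_key: bytes):
    """Encrypt a message under the user's key, and attach an escrow copy
    of that key wrapped under the master key."""
    ciphertext = keystream_xor(msg, user_key)
    wrapped_key = keystream_xor(user_key, GOLDEN_KEY)
    return ciphertext, wrapped_key

ct, wrapped = encrypt_with_escrow(b"meet at noon", b"alice-device-key")

# Anyone holding the master key -- an agent with a warrant, an insider,
# or an attacker who stole it -- can recover the user's key and decrypt:
recovered_key = keystream_xor(wrapped, GOLDEN_KEY)
print(keystream_xor(ct, recovered_key))  # b'meet at noon'
```

The math doesn't distinguish between "legitimate" and "illegitimate" holders of the golden key, which is the point the panel's technologists kept making: the weakness is structural, not an engineering detail to be solved with more effort.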
And, really, as for this idea of "getting the best minds" in the computer science community to work on this: please don't. That's like asking the best minds in increasing food production to stop all their work and spend months researching how to make it rain apples from the clouds. It's not just counterproductive and impossible; it takes away from the very real and important work they do on a daily basis, including protecting us from people who actually are trying to do us harm. That a law enforcement official is actively asking computer scientists and cybersecurity experts to stop focusing on protecting people and, instead, to help undermine the safety of the public is quite incredible. How does someone like Conley stay in his job while publicly advocating for putting the American people in more danger like that?
Last year, we wrote about Rep. Blake Farenthold introducing a small, but important piece of copyright legislation, the You Own Devices Act (YODA), which just says that if you buy some piece of computerized equipment, you can sell it with any included software, without having to get permission from the software provider. As we noted, the reality is that this is just making it clear that the first sale doctrine applies to computer equipment too -- which shouldn't need a new law, but some tech companies (especially in the networking space) feel otherwise.
Farenthold has now reintroduced YODA, this time with Rep. Jared Polis as a sponsor as well (giving the bill that necessary "bipartisan" shine). It's unfortunate that these kinds of bills are even necessary, but such is the state of copyright laws today that they often mean you don't even really own the devices you buy.
Also, kudos to Farenthold for playing on the YODA name in his tweet announcing the new version of the bill:
We just wrote about an audio equipment manufacturer trying to argue that it was criminal for someone to resell its products. While this was obviously crazy, never underestimate the lengths some companies will go to these days to try to block people from selling products they (thought they had) legally bought. And guess what tool they're using to block you from actually owning the products you bought? Why, copyright, of course. It's yet another example of how copyright is often used to block property rights rather than to create them.
This has become especially popular among telco/networking equipment manufacturers. These companies ship hardware with software included -- and then argue that you can't actually do anything with that hardware -- such as fix it or sell it -- without their approval, because doing so would violate their copyright on the software. Earlier this year, there was a big lawsuit in which Avaya had sued a company for copyright infringement for merely servicing Avaya equipment. Many other equipment manufacturers have terms of service or "transfer" policies that either effectively block such sales, or (more commonly) include a bunch of hoops that everyone has to jump through just to sell the products you thought you owned. All because of the software that comes with the hardware. While this has mostly been focused on big enterprise systems, it's not much of a stretch to think about how it might eventually apply elsewhere. With so many products being computerized these days, there will be software in lots of different hardware products -- and imagine the havoc those companies could create if they tried to block the sale of these products based on copyright.
Of course, as we've discussed for years, in copyright there's the right of first sale, which is supposed to let you sell your individual copy of a copyrighted work (it's why you can resell a copy of a book you own, for example). But many companies have been trying to chip away at that right, and at least some in Congress want to stop this practice. Rep. Blake Farenthold -- who I only just found out is an EFF member! -- has now introduced a new bill called the You Own Devices Act, or YODA. While I tend to hate silly names for bills, this simple bill is an important reminder that when you buy a product, even if it has copyrighted software included in it, you should own it. The key part of the bill:
...if a computer program enables any part of a machine or other product to operate, the owner of the machine or other product is entitled to transfer an authorized copy of the computer program, or the right to obtain such copy, when the owner sells, leases, or otherwise transfers the machine or other product to another person. The right to transfer provided under this subsection may not be waived by any agreement.
Realistically, this is just reinforcing the first sale doctrine, and it's ridiculous that it needs to be reinforced, but hopefully it can block out some of the questionable shenanigans by some companies.
The bill further makes sure that even if someone sells or transfers such equipment, that the new owners are still allowed to receive updates and security patches:
Any right to receive modifications to the computer program... relating in whole or in part to security or error correction that applied to the owner of the machine or other product... shall apply to the person to whom the machine or product and the copy of the computer program are transferred.
While it's ridiculous enough that this bill is even needed, it's nice to see at least some good copyright reforms popping up.