Report On Device Encryption Suggests A Few Ways Forward For Law Enforcement
from the time-to-dial-back-the-apocalyptic-narrative dept
Another paper has been released, adding to the current encryption discussion. The FBI and DOJ want access to the contents of locked devices. They call encryption that can be bypassed by law enforcement "responsible encryption." It isn't. A recent paper by cryptography expert Riana Pfefferkorn explained in detail how irresponsible these suggestions for broken or weakened encryption are.
This new paper [PDF] was put together by the National Academies of Sciences, Engineering, and Medicine. (h/t Lawfare) It covers a lot of ground others have already covered, rehashing the history of encryption along with many of the pro/con arguments. That said, it's still worth reading. It raises some good questions and spends a great deal of time discussing the multitude of options law enforcement has available -- options FBI officials ignore when discussing the backdoors/key escrow/weakened encryption they'd rather have.
The paper points out law enforcement now has access to much more potential evidence than it's ever had. But that might not always be a good thing.
The widespread use of cloud storage means that law enforcement has another potential source of evidence to turn to when they do not have access to the data on devices, either because the device is unavailable or the data on the device is encrypted. Not all of this digital information will be useful, however. Because storage is cheap or even free, people keep all sorts of non-noteworthy electronic documents forever.
What's unsaid here is law enforcement should be careful what it wishes for. Encryption that allows government on-demand access may drown it in useless data and documents. If time is of the essence in cases where law enforcement is seeking to prevent further criminal activity, having a golden key may not move things along any faster. I'm sure the FBI and others would prefer access all the same, but this does point to a potential negative side effect of cheap storage and endless data generation.
And the more access law enforcement has, the more chances there are for something to go horribly wrong on the provider's end.
How frequently might vendors be asked to unlock phones? It is difficult to predict the volume of requests to vendors, but a figure in the tens of thousands per year seems reasonable, given the number of criminal wiretaps per year in the United States and the number of inaccessible devices reported by just the FBI and Manhattan District Attorney’s Office. As a result, each vendor, depending on its market share, needs to be able to handle thousands to tens of thousands of domestic requests per year.
Such a change in scale, as compared to the software update process, would necessitate a change in process and may require a larger number of people authorized to release an unlock code than are authorized to release a software update, which would increase the insider risk.
The paper also runs down stats provided by the FBI and the Manhattan DA's office. It notes the overall number of locked phones law enforcement can't access has continued to rise, but points out these numbers aren't all that meaningful without context.
In November 11, 2016, testimony to this committee, then-Federal Bureau of Investigation (FBI) General Counsel James Baker reported that for fiscal year 2016, the FBI had encountered passcodes on 2,095 of the 6,814 mobile devices examined by its forensic laboratories. They were able to break into 1,210 of the locked phones, leaving 885 that could not be accessed. The information Baker presented did not address the nature of the crimes involved nor whether the crimes were solved using other techniques.
[...]
Although existing data clearly show that encryption is being encountered with increasing frequency, the figures above do not give a clear picture of how frequently an inability to access information seriously hinders investigations and prosecutions.
It goes on to note that we may never see this contextual information. Any attempt to collect this data would be hindered by law enforcement's reluctance to provide it, and there are currently no visible efforts by agencies to determine just how often encryption stymies investigations. Whatever was reported would be tainted by subjective assessments of encryption's role in each investigation. Without that context, though, the endless parade of locked-device figures is nothing more than showmanship in service of the greater goal of undermining encryption.
The paper helpfully lists several options law enforcement can pursue, including approaching cloud services for content stored outside of locked devices. It also points out the uncomfortable fact that law enforcement doesn't appear to be making use of tools it's always had available. One of these options is compelled production of passwords or biometric data to unlock phones. While the Fifth Amendment implications of compelled password production are still under debate, it's pretty clear fingerprints or retinas aren't going to receive as much Constitutional protection.
On top of that, there's the fact that a number of device owners have already voluntarily provided copies of encryption keys, and these can likely be accessed by law enforcement using a standard warrant or an All Writs Act order.
[M]any storage encryption products today offer key escrow-like features to avoid data loss or support business record management requirements. For example, Apple’s full disk encryption for the Mac gives the user the option to, in effect, escrow the encryption key. Microsoft Windows’ BitLocker feature escrows the key by default but allows users to request that the escrowed key be deleted. Some point to the existence of such products as evidence that key recovery for stored data can be implemented in a way that sensibly balances risks and benefits at least in certain contexts and against certain threats. In any case, data that is recoverable by a vendor without the user’s passcode can be recovered by the vendor for law enforcement as well. Key escrow-type systems are especially prevalent and useful where the user, or some other authorized person such as the employer, needs access to stored data.
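To make the escrow concept a little more concrete, here's a minimal sketch of the general pattern the report describes: the data is encrypted once under a random data-encryption key, and that key is then wrapped separately under the user's key and under a recovery/escrow key, so either party can decrypt. This is an illustration only -- it is not how FileVault or BitLocker actually implement key recovery -- and it assumes Python with the third-party cryptography library; the function names are hypothetical.

```python
# Minimal key-escrow sketch (illustrative only, not any vendor's real design).
# The data-encryption key (DEK) is wrapped twice: once for the user, once for
# the escrow holder. Either wrapped copy is enough to recover the data.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def encrypt_with_escrow(plaintext: bytes, user_key: bytes, escrow_key: bytes) -> dict:
    dek = AESGCM.generate_key(bit_length=256)   # random data-encryption key
    data_nonce = os.urandom(12)
    ciphertext = AESGCM(dek).encrypt(data_nonce, plaintext, None)

    # Wrap the DEK under both keys.
    user_nonce, escrow_nonce = os.urandom(12), os.urandom(12)
    wrapped_for_user = AESGCM(user_key).encrypt(user_nonce, dek, None)
    wrapped_for_escrow = AESGCM(escrow_key).encrypt(escrow_nonce, dek, None)

    return {
        "ciphertext": ciphertext, "data_nonce": data_nonce,
        "user": (user_nonce, wrapped_for_user),
        "escrow": (escrow_nonce, wrapped_for_escrow),
    }


def decrypt(blob: dict, key: bytes, who: str) -> bytes:
    # 'who' is "user" or "escrow"; either key path recovers the same DEK.
    nonce, wrapped = blob[who]
    dek = AESGCM(key).decrypt(nonce, wrapped, None)
    return AESGCM(dek).decrypt(blob["data_nonce"], blob["ciphertext"], None)
```

The obvious catch is baked into the design: whoever holds the escrow key can decrypt everything, which is exactly the concentration of risk the rest of the report spends so much time worrying about.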
The report also claims law enforcement "had not kept pace" with the increase of digital evidence. It posits the problem is a lack of funding and training. Training is almost certainly a problem, but very few law enforcement agencies -- especially those at the federal level -- are short on funding or expertise. The lag is more likely due to bad assumptions: officials believed they would always have full access to device contents (minus the occasional end user taking the initiative on encryption). When it became clear they wouldn't, they began seeking solutions, which put them a few steps behind. Then there are those, like Manhattan DA Cy Vance and FBI Director Chris Wray, who are putting law enforcement even further behind by pushing for legislation rather than focusing their efforts on keeping officers and agents well-supplied and well-trained.
While the report does suggest vendors and law enforcement work together to solve this access "problem," the suggestions place the burden on vendors. One suggested fix is one-way information sharing where vendors make law enforcement aware of unpatched exploits, allowing the government (and anyone else who discovers it) to use these vulnerabilities to gain access to communications and data. It's a horrible suggestion -- one that puts vendors in the liability line of fire and encourages continued weakening of device and software security.
The report also points out the calls for harder nerding have been at least partially answered. The proposed solutions aren't great. In fact, one of them (running lawful access keys and software update keys through the same pipeline) is terrible. But it's not as though no one on the tech side is trying to come up with a solution.
Several individuals with backgrounds in security and systems have begun to explore possible technical mechanisms to provide government exceptional access. Three individuals presented their ideas to the committee.
• Ernie Brickell, former chief security architect, Intel Corporation, described ways that protected partitions, a security feature provided by future microprocessor architectures, could be used to provide law enforcement access to devices in their physical possession, provide remote access by law enforcement, or provide key escrowed cryptography for use by applications and nonescrowed cryptography for a set of “allowed” applications.
• Ray Ozzie, former chief technical officer and former chief software architect, Microsoft Corporation, argued that if a user trusts a vendor to update software, the user should be able to trust the vendor to manage keys that can provide exceptional access. He proposed that this extension of the trust model used for software updates could be used to provide government exceptional access to unlock mobile devices. Ozzie also provided the committee with materials describing how this approach could be extended to real-time communications such as messaging.
• Stefan Savage, professor of computer science and engineering, University of California, San Diego, described how phone unlock keys could be stored in hardware and made available via an internal hardware interface together with a “proof-of-effort” lock that together would require physical possession and a time delay before law enforcement could unlock a device.
The report points out these are only suggestions and have yet to be rigorously examined by security professionals. But their existence belies the narrative pushed by the FBI in its search for a federal statutory mandate. There are experts trying to help. Unfortunately, every solution proposed is going to require a sacrifice in device security.
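For a sense of what a "proof-of-effort" time delay might look like in practice, here's a minimal sketch of one common way such delays are modeled: the key needed to unwrap the device's unlock key is derived from a deliberately sequential computation, so even an actor with physical possession has to spend a tunable amount of wall-clock time per attempt. This is not Savage's actual design -- the report only summarizes it -- and the parameter values and names below are hypothetical.

```python
# Toy "proof-of-effort" time-delay sketch (illustrative only).
# The escrowed unlock key is masked with the output of an iterated hash chain;
# each step depends on the previous one, so the work cannot be parallelized
# and recovery always costs roughly the same amount of wall-clock time.
import hashlib
import os

ITERATIONS = 50_000_000  # illustrative; a real design would tune this to the desired delay


def sequential_work(seed: bytes, iterations: int = ITERATIONS) -> bytes:
    digest = seed
    for _ in range(iterations):
        digest = hashlib.sha256(digest).digest()
    return digest


def provision(unlock_key: bytes) -> tuple:
    # Assumes a 32-byte unlock key. Both values would live only inside the
    # device's secure hardware in a real system.
    seed = os.urandom(32)
    mask = sequential_work(seed)
    blob = bytes(a ^ b for a, b in zip(unlock_key, mask))
    return seed, blob


def recover(seed: bytes, blob: bytes) -> bytes:
    # Recovery with physical possession: redo the sequential work to unmask the key.
    mask = sequential_work(seed)
    return bytes(a ^ b for a, b in zip(blob, mask))
```

Even in this toy version, the trade-off the report flags is visible: the delay only slows access down; it does nothing to restrict who can attempt it.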
The problem is complex, if you choose to believe it's a problem. It may be troublesome that law enforcement can't have access to device contents as easily as they could five years ago, but it's not the threat to public safety anti-encryption enthusiasts like Chris Wray and Cy Vance make it out to be. Encryption use has gone up while crime rates have remained steady or decreased. The emphasis on cellphones as the ultimate investigative goldmine is misplaced. Plenty of options remain and law enforcement spent years solving crimes without having one-stop access to communications and personal documents. An ancient discovery known as "fire" has put evidence out of reach for hundreds of years, but no one's asking the smart guys at Big Match to come up with a solution. Things are harder but they're not impossible. What is impossible is what Wray and others are asking for: secure compromised encryption.
Filed Under: doj, encryption, fbi, going dark, responsible encryption
Reader Comments
So yet another claim of LEO ignorance justifying breach of rights?
Cell phone data encourages guilt by association. It's like reading your daughter's journal: you learn things, but maybe you kill someone you should not have.
Um....no.
This is a false equivalence. Doubly so given Microsoft's history of "updating software" in ways that compromise user security, invade user privacy, disable functionality, and expose the user to attacks.
> "The widespread use of cloud storage means that law enforcement has another potential source of evidence to turn to when they do not have access to the data on devices"
Many cloud providers are also unable to decrypt stored data (or claim to be unable to do so).
> "Because storage is cheap or even free, people keep all sorts of non-noteworthy electronic documents forever."
E.g. the NSA?
> "How frequently might vendors be asked to unlock phones?"
Why are you asking vendors in the first place? Their priority is their customers, not their customers' adversaries. If a vendor can unlock a phone, so can criminals. If you want to unlock it, do it yourself.
> "Ernie Brickell, former chief security architect..."
> "Ray Ozzie, former chief technical officer and former chief software architect..."
Keyword: former
> "...could be used to provide law enforcement access to devices in their physical possession, provide remote access by law enforcement..."
Replace law enforcement with Russian government / criminal gang / bored hacker and the statement is still true.
> "...if a user trusts a vendor to update software, the user should be able to trust the vendor to manage keys that can provide exceptional access"
There's a big difference between 'keep my software working' and 'let anyone in the world access my data'. Also: Don't mess with the software update process. It's already hard enough keeping people up to date.
> "...phone unlock keys could be stored in hardware and made available..."
...made available to anyone who wants them.
> "Unfortunately, every solution proposed is going to require a sacrifice in device security."
By which you mean "a complete sacrifice in device security".
> "An ancient discovery known as "fire" has put evidence out of reach for hundreds of years, but no one's asking the smart guys at Big Match to come up with a solution."
I wonder how many people saw this little tidbit? :)
Big Match are in league with big Anarchism trying to stop the war
This is an example of how Big Match distorts free speech to get what it wants... no, wait, that is how the state distorts free speech.
Police Harder
Guess what: there are no shortcuts in life. Do your job properly or there will be bad consequences down the road.
but they don't say
They really do appear to think we are that dumb, and looking at social media, I'd have to agree: most are that dumb.
Good one
Re: Police Harder
Police are trained to lie and dominate all situations
We have been here far too long. Stop thinking any party in any country will represent you.
START OVER
Re: Um....no.
Re: Re: Police Harder
RESISTANCE IS FUTILE!
All these solutions ignore the elephant in the room: if these measures are introduced, there is nothing stopping Russia, China, Iran, or any other country from also demanding the right to decrypt any phone.
Americans really need to stop viewing the world from an American-only perspective.
I'll say it again
Encryption is either:
1. Secure
2. Insecure
It's a binary choice. Not a sliding scale. Like being pregnant. You are or you are not. There is no try.
If encryption is secure, then hackers cannot break it -- but neither can government.
If encryption is insecure, then government can break it -- but so can hackers.
Re: Re: Um....no.
Users don't "trust" MS to update their computers, they reluctantly or unknowingly tolerate it, weighing the consequences of an unpatched Windows vs the malware out there. There's NO trust involved.
Re: Re: Re: Um....no.
Re:
That doesn't mean anything prevents them from gaining the ability to decrypt it. If they wrote the encryption software Alice is using, they can make the next version leak Alice's key. (This isn't entirely hypothetical -- Hushmail somehow delivered decrypted messages to the US government.)
Re: Re: Re: Re: Um....no.
Wikipedia gives the definition: "a trusted system is a system that is relied upon to a specified extent to enforce a specified security policy. This is equivalent to saying that a trusted system is one whose failure would break a security policy".
Replace "system" with "person" or "company", as needed. If Microsoft can defeat your security, you're trusting Microsoft, whether they're trustworthy or not.
(If you never update your Microsoft OS, that's not strictly true; you'd be trusting the code they wrote in the past, rather than the company.)
Dangerous lack of nuance.
To put it more simply: it's not the fault of your window bars (encryption) when someone smashes in a weak front door (the device). The window bars have nothing to do with the strength of the front door; each is responsible for its own part of the overall security.
Failing to note this distinction confuses the underlying issues and grossly misinforms people who make false assumptions based on true but poorly contextualized statements like the one you've just made.
Real-world security is not binary -- that's absurd. It's a metric shit-ton harder to break into an expertly set up, security-focused computer than an old unpatched WinXP box, but neither one is immune, or 'secure' in an absolute, binary sense.
Newsflash: the device you're using to read this is vulnerable to multiple 0days -- it would be utterly ridiculous to assume that's not the case. Spend some time reading CVEs: no matter what you run, it's always been the case, and even if an OS's entire codebase were somehow magically scrubbed to secure perfection, the presence of un-auditable ring -3 hardware means the potential for backdoors can NEVER be eliminated, until consumers demand an end to this practice.
Go to all the people talking about "responsible" encryption and ask them to design a hypothetical "responsible" door that can only be opened by the homeowner and law enforcement, but that will keep out criminals. Most people understand doors and locks, so ask them to describe how a secure door would work that will be impervious to bad guys but that cops can open if they need to.
Re: Re: Re: Re: Re: Um....no.
Just force all of the vendors to support this "exceptional access requirement" or the software is illegal to possess or distribute.
At that point, even if you trust the vendor (for the sake of argument, let's say you got the OS from kernel.org), your trust is still being violated by the government, which has forced itself between the vendor and you and inserted what many would refer to as a rootkit into the software.
Now you might say that being able to swoop into your house on Black Hawks in the middle of the night, armed to the teeth and ready to force compliance out of you, means you "trust" the government as well, but some would say that's tyranny just the same.
Eventually you have to ask yourself: At what point does correlation without causation mean something to you?