MIT Tech Review Tries To Blame Apple Encryption For Wrongful Arrest
from the innocent-people-getting-locked-up-because-of-encryption,-etc. dept
Brian Bergstein should know better. As the executive editor of the MIT Technology Review with fifteen years of technology journalism under his belt, he really shouldn't be asking "What if Apple is Wrong?" -- at least not in the way he does.
Bergstein glosses over the security implications of requiring phone manufacturers to hold the decryption keys for devices and services and instead presents his argument as an appeal to emotion. Those on Apple's side -- including Apple CEO Tim Cook -- are given only the briefest of nods before alarmists like Manhattan District Attorney Cy Vance are given the stage.
Bergstein does at least ask an interesting question: what if exonerating evidence is locked up in a phone? But his test case for "What if Apple is wrong?" doesn't apply as well as he seems to hope it does.
Devon Godfrey was killed in his apartment in 2010 -- and police arrested the wrong person. Somehow, Bergstein wants to blame the police's screwup on Apple. Investigators had only a week to pull evidence together to present to a grand jury. Some of that evidence happened to be located on a passcode-locked iPhone. But the evidence ultimately compiled and used had nearly nothing to do with that locked phone.
Cell phones had been found in Godfrey’s apartment, including an iPhone that was locked by its passcode. Arnold recalls doing what he always did in homicides back then: he obtained a search warrant for the phone and put a detective on a plane to Cupertino, California. The detective would wait in Apple’s headquarters and return with the data Arnold needed. Meanwhile, investigators looked more closely at the apartment building’s surveillance video, and Arnold examined records sent by Godfrey’s wireless carrier of when calls and texts were last made on the phones.

With this new evidence in hand, the case suddenly looked quite different. From the wireless carrier, Arnold saw that someone—presumably Godfrey—had sent a text from the iPhone at a certain time. But the recipient of that text had used a disposable “burner” phone not registered under a true name. So who was it? The iPhone itself had the crucial clue. Arnold could see that Godfrey referred to the person by a nickname. People who knew Godfrey helped police identify the man who went by that nickname. It was not the man who was originally arrested. It was Rafael Rosario—who also appeared in the apartment surveillance footage. Rosario confessed and later pleaded guilty.

What actually broke the case? A text message and a contact list, both of which are usually backed up to cloud storage, where they can be accessed without cracking the phone or breaking its encryption. As James Comey himself has pointed out while making an argument against Apple's stance in several ongoing All Writs-involved cases, law enforcement can access iCloud contents without breaking phone encryption.

“Today, Apple encrypts the iCloud but decrypts it in response to court orders,” he said. “So are they materially insecure because of that?”

Comey later reiterated this point, saying, “I see Apple today encrypting the iCloud and decrypting it in response to court orders. Is there a hole in their code?”

The frequency of the backups will vary from person to person, but this still gives investigators access to plenty of information supposedly "stored" in an uncrackable phone.
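The technical distinction buried in Comey's own framing is worth spelling out: Apple can hand over iCloud backups because Apple holds the keys to them, while a locked phone's data is protected by a key derived from a passcode only the user knows. Below is a minimal sketch of that difference, assuming a generic provider and using PBKDF2 and Fernet purely for illustration -- it is not Apple's actual implementation, and all names and parameters are invented for the example.

```python
# Illustrative sketch only -- contrasts a provider-held backup key with a
# device key derived from the user's passcode. Not Apple's real design;
# names and parameters here are invented for the example.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

# --- Cloud-backup model: the provider generates and stores the key. ---
# Because the provider keeps backup_key on its servers, it can decrypt the
# backup in response to a court order without ever touching the phone.
backup_key = Fernet.generate_key()                      # held server-side
cloud_backup = Fernet(backup_key).encrypt(b"contact list, text messages")
print(Fernet(backup_key).decrypt(cloud_backup))         # provider can always do this

# --- Device model: the key is derived from the user's passcode. ---
# The provider never sees the passcode, so nothing stored server-side can
# unlock the data; only the passcode (or brute force) re-derives the key.
def derive_device_key(passcode: bytes, salt: bytes) -> bytes:
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode))

salt = os.urandom(16)                                   # stored on the device
device_key = derive_device_key(b"1234", salt)
device_data = Fernet(device_key).encrypt(b"data that never leaves the phone")

# Only someone who knows the passcode can recover the plaintext:
print(Fernet(derive_device_key(b"1234", salt)).decrypt(device_data))
```

The first half of that sketch is all Comey's point requires: as long as backups keep landing on servers where the provider holds the key, a court order still gets investigators the contacts and messages without anyone touching the phone's encryption.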
From there, the case against Apple only gets worse, as the arguments are sourced from the sort of people who'd rather see insecure devices than face obstacles when prosecuting suspects. Cy Vance, of course, has argued for outright encryption bans.
Vance also loves a good appeal to emotion.
Vance makes no dramatic claims about “going dark,” preferring a measured, lawyerly form of argument. When I tell him that his statistics on inaccessible iPhones don’t yet impress many computer scientists, he makes a facial expression equivalent to a shrug. “Some people have made the determination that not being able to do the kinds of work we do is an acceptable collateral damage,” he says. “I’m not sure how the individual would respond if someone close to him or her were the victim of a crime and the case might depend on the ability to access a phone. Easy to say, unless it’s you. We deal with a lot of victims. We talk to the people it’s actually happened to.”

The assumption is that everyone loves locking cops out of phones until they're a crime victim. But that assertion is just as false as Comey's exaggerated laments about "going dark." Even in the most famous case involving a locked iPhone -- one involving an apparent act of terrorism that manifested itself as a mass shooting -- the victims' relatives were far from unanimous in their support of the FBI's efforts. Two people who lost close relatives in the shooting -- including a mother who lost her son -- spoke out against the FBI's push to undermine cell phone security.
Her son was killed in the San Bernardino, Calif., massacre — but Carole Adams agrees with Apple that personal privacy trumps the feds’ demands for new software to break into iPhones, including the phone of her son’s killer.

The mom of Robert Adams — a 40-year-old environmental health specialist who was shot dead by Syed Rizwan Farook and his wife — told The Post on Thursday that the constitutional right to privacy “is what makes America great to begin with.”

Then there's the belief -- offered by Vance, Comey, and others -- that law enforcement should have access to communications simply because it has a warrant. But what isn't acknowledged is that this is unprecedented access. Texting and messaging have largely replaced telephone calls and face-to-face conversations.

Prior to the advent of texting, those conversations could not have been recorded without a wiretap warrant -- a last-resort tool that has to be deployed in real time, and one that would be almost completely useless to investigators after a criminal act like a murder has been committed. What law enforcement has access to now -- if not walled off by encryption -- is hundreds or thousands of conversations it never would have had access to before, even with a search warrant, which does not cover the interception of communications. The fact that a murder victim had a phone in the house would have prompted detectives to look at call records -- something they can still do without breaking a phone's encryption. What was said during those phone calls would still remain a mystery, warrant or no. So law enforcement isn't as far behind technology as it likes to pretend it is.
Bergstein and Lawfare's Susan Hennessey (whom Bergstein quotes) both claim that a corporation can't possibly decide what's best for Americans.
So is Apple ultimately fighting to uphold personal privacy and civil liberties? Or is it fighting for the right to sell any kind of phone it thinks its customers want while other people deal with the negative consequences? If it’s the latter, that’s understandable; like any public company, Apple is obligated to maximize its value to its shareholders. But society is not necessarily best served by letting Apple make whatever phones are optimal for its chosen business strategy, which is to create a shiny mobile vault that people will trust with every aspect of their lives.

But somehow they both feel it's perfectly acceptable for another party with a vested interest in total access to make that same decision for Americans.
Filed Under: brian bergstein, cyrus vance, encryption, going dark, law enforcement, police
Companies: apple
Reader Comments
Why in the fuck are the headlines not saying...
"Apple's Encryption is such a joke it only keeps the honest people out!"
-or-
"Due to Apple's flawed encryption ..."
-or-
"Apple's joke of an encryption system is allowing...."
You get the idea!
Suppose our Jane Doe has her phone stolen. It is not encrypted, and the criminal decides to unlock it by force. He now has access to a plethora of information about Ms. Jane and decides to put it to good use. Because I want to overuse emotions, he first uses her personal data to commit all sorts of fraud in her name while wiping out her finances. Then he decides she's cute, so he starts stalking her using the geo info on her phone, finds a spot where she seems to be alone regularly, and goes and rapes her. Not content, he discovers she has a nice, pretty sister, and he proceeds to stalk her and do the same.

We could also have Mr. John Doe lose his phone and get kidnapped for ransom because the criminal finds out he has a good amount of money, thanks to his unencrypted phone.

I could go on and add dead kittens, but I guess my point is clear. Now, how many phones are stolen or lost every day? Do we want to protect millions of innocents, or screw them all to catch all the criminals? Note I haven't even added totalitarian governments. In some places the entire family of a dissident would be tortured. But an encrypted phone could save them all, including the dissident. Imagine if Snowden and other whistleblowers the US govt is actively harassing could have remained fully anonymous through encryption. Snowden, for one, could be at home and not exiled in Russia.

Appealing to emotions is easy. The hard part is thinking, seeing through those emotions, and realizing that not all ends justify the means. Some people who had loved ones killed by Farook could see that.
Re: Easy to say, unless it’s you
But he's not correct. If something happened to any one of my immediate family, I would certainly still have a serious problem with all of that. So, given the sample size of the two of us, there is no consensus.
WHAT???
"Remember the guy we arrested for that murder?"
"Oh yeah. I wonder how he enjoyed a year in Rikers?"
"Looks like he's innocent, according to the witness recanting his testimony."
"Oh, well, sucks to be him."
Those are the reactions they want to base laws on?
I know it is not the same, but because someone they know or love has been killed, they should want to hurt everyone else?

No! I would not change my mind if it happened to me, because using my family's or friends' memory to do something so incredibly damaging to everyone else would be a disgrace.
Commercial consequences of escrowed or banned encryption
It means the death of open source software. If anyone can compile a program, then an encryption ban is unenforceable, as is a mandated backdoor -- so the only way to enforce either one is to outlaw the code itself. No more Linux, BSD, or Plan 9, suckers! Well, at least in the USA, land of the free and home of the brave.

Say goodbye to rapid updates, competition, and downward pressure on software prices, too.
Re: Commercial consequences of escrowed or banned encryption
It most certainly does not. The worst-case scenario is that some open source development simply stops happening in the US. It will absolutely continue to happen in the rest of the world, though.

We know what that looks like: prior to the last crypto wars, it was illegal to export strong crypto from the US. The result was that the US stopped being as capable of producing strong crypto, as the real development work simply moved overseas.
All this proceeds from the presumption that law enforcement wants to distinguish real, major criminals from innocent civilians and bring them to justice.

The police are not your friends. They're a gang that gets its jollies from imprisoning or killing normal people whose looks they detest.

They don't even recognize that the rest of us are essential if they want their precious gang to continue to exist. They'd lock us all up -- or confine us to our homes -- if they could.
I think this misses the point. It assumes people are guilty until proven innocent. The law shouldn't assume guilt until exonerating evidence shows up; it needs to prove guilt before declaring anyone guilty. A lack of exonerating evidence does not prove guilt. If the law declares people guilty when they are not, the problem isn't that it was unable to find exonerating evidence -- the problem is that it didn't require sufficient evidence of guilt in the first place.
Exonerating evidence
Yes, this is a difference between how our justice system should work and how it actually does.

Let's say that D dies with that video locked up on his phone. That would be good cause for C to request that someone unlock D's phone in order to get that evidence, but since doing so goes against some comfortable presumptions (e.g., that police officers are honest), few courts are going to want to unlock D's phone to get evidence that dispels them.

So yes, the situation is possible. But when the state wants backdoors for evidence, it's so it can convict whomever it wants. It's not at all interested in determining who actually committed a crime.
Re: Exonerating evidence
While there may be cases where this is true, in general I would argue it is an exaggeration. But that still doesn't detract from the fact that it is guilt that needs to be proven, not the other way around. To argue that encryption would result in a lack of exonerating evidence is to argue that the law is wrongfully requiring that innocence be proven, not the other way around.
Re: Re: Exonerating evidence
I would be pleased if that were currently the norm, or ever had been.

But that is not what I was taught when I studied criminal investigation. What I was taught was that a suspect is presumed guilty by police, and it's only a matter of determining how to nail them.

And if a suspect is proven innocent by circumstance, there is no resumption of the hunt to find someone else. It's just assumed that he (the original suspect) got away from us this time.

Hollywood and old whodunits like to imagine differently, but those were always a generous portrayal of how police detection actually works.
alleged presumption of innocence
If you are saying that we need to restore the presumption of innocence to the justice system, and restore the function of our law enforcement agencies to detecting perpetrators among the innocent, I agree with you.

Feel free to try to implement such a change through common activism. I am skeptical it can be done without major disruption of society.

In the meantime, suspects are commonly railroaded into guilty verdicts or plea bargains, and the police will resort to tampering or false testimony even sooner than they will rely on detective work. It is a lot easier.

Of course we would like it to be different. But pretending that the courts are usually fair is not going to change that.
Going Dark, more like going dumb.
Why he decided to puke this out onto the web is a mystery. Much like not knowing that Going Dark is an old, old solution. No phone, no mail, no email and now no brains required at the MIT Technology Review!
No one is blaming Apple for a wrongful arrest. It is quite the opposite. As the story explains, Apple's help was essential in finding the actual killer. The point of my piece is to show how Apple has typically been part of the investigative procedure, and ask what will be lost as Apple reduces that help, not only by being unable to unlock phones but perhaps by also cutting off access to data in the cloud.
As my story goes on to explain, I'm well aware that phone metadata today would also have exonerated the guy. So no one is proposing that the jails will be full of (more) innocent people because of Apple. All I'm asking is, what will the effects on the criminal justice system be? Shouldn't we at least hear the cops and prosecutors out rather than merely dismissing them out of hand?
So far a very small percentage of cases have been held up by the inability to carry out search warrants on phones. But isn't it possible that at some point some threshold will be crossed where we look up and find that has changed quickly--especially if the cloud access is cut off?
I'm not saying we need to ban encryption--the Burr-Feinstein bill, for example, is obviously a bad joke. But if you want to argue that we should make no changes in the law, it seems you should make an honest appraisal of the costs and benefits. Every day I read computer security people blowing off the idea that the cops are affected very much in their ability to pursue serious crimes. All I'm doing is showing you the argument for the costs. It's not surprising that you think the benefits outweigh them. But don't get into high dudgeon under preposterous headlines just because someone is reminding you that we should always revisit the trade-offs.
Re:
I think we have been hearing them. The problem is that they aren't really supporting anything they're saying, and decades of history have already demonstrated that they aren't exactly honest and open in their rhetoric, so it's a mistake to assume that because they said it, there must be truth to it.
Re:
No one is blaming Apple for a wrongful arrest. It is quite the opposite. As the story explains, Apple's help was essential in finding the actual killer. The point of my piece is to show how Apple has typically been part of the investigative procedure, and ask what will be lost as Apple reduces that help, not only by being unable to unlock phones but perhaps by also cutting off access to data in the cloud.
You're basically saying "if Apple doesn't help there will be wrongful arrests." It seems perfectly bizarre to suggest that bad police work is somehow Apple's fault.
All I'm asking is, what will the effects on the criminal justice system be? Shouldn't we at least hear the cops and prosecutors out rather than merely dismissing them out of hand?
That suggests that no one is taking the concerns seriously or going through them. But that's not true. Every time the police or others in law enforcement have raised concerns, people HAVE looked closely at them, and basically every time they've been massively exaggerated.
Remember this, for example? https://www.techdirt.com/articles/20141019/07115528878/everybody-knows-fbi-director-james-comey-is-wrong-about-encryption-even-fbi.shtml James Comey listed out some similar examples... and so the press checked them out and none of the stories checked out.
The point is that people DO listen to the police and the FBI and they go through every story and they usually find that encryption was not actually a problem.
In fact, it's nearly impossible to find an example where encryption was even a marginal issue.
And that's the fact of basic police work. You never get EVERY bit of evidence, but you build a case. And sometimes guilty people aren't caught. That's life.
Even more important, a big study on the issue by Harvard made it quite clear that police and law enforcement have MUCH MORE access to MUCH MORE information than ever before: http://cyber.law.harvard.edu/pubrelease/dont-panic/
The idea that "going dark" is a real problem doesn't seem supported by reality. Giving it credence based purely on anecdotes seems like a weak response, which is why Tim wrote his article.
So far a very small percentage of cases have been held up by the inability to carry out search warrants on phones. But isn't it possible that at some point some threshold will be crossed where we look up and find that has changed quickly--especially if the cloud access is cut off?
That seems far fetched based on the simple fact that most of the information from mobile phones that is now available WASN'T available just a few years ago... and police did just fine using other detective and investigation work. To argue that this will magically create some dark crime wave doesn't make much sense.
Every day I read computer security people blowing off the idea that the cops are affected very much in their ability to pursue serious crimes. All I'm doing is showing you the argument for the costs. It's not surprising that you think the benefits outweigh them. But don't get into high dudgeon under preposterous headlines just because someone is reminding you that we should always revisit the trade-offs.
To be fair here, you can blame me for the headline. Cushing had a different one that I scrapped in favor of this headline.
We absolutely recognize the tradeoffs. Hell, I did a whole video discussing the tradeoffs (https://www.techdirt.com/articles/20160321/16175933972/mike-masnick-explains-apple-versus-fbi.shtml). But our basic concern here is that if you're going to discuss the tradeoffs, you should present them accurately, which I felt that article did not fairly do.
2 birds, one stone? Or is that the same bird?
Privacy & Civil Liberties vs. Giving customers what they want
And the common civilians who enjoy those rights often buy things, becoming customers.
I hear giving customers what they want is a pretty strong business model.