MIT Tech Review Tries To Blame Apple Encryption For Wrongful Arrest

from the innocent-people-getting-locked-up-because-of-encryption,-etc. dept

Brian Bergstein should know better. As the executive editor of the MIT Technology Review with fifteen years of technology journalism under his belt, he really shouldn't be asking "What if Apple is Wrong?" -- at least not in the way he does.

Bergstein glosses over the security implications of requiring phone manufacturers to hold the decryption keys for devices and services and instead presents his argument as an appeal to emotion. Those on Apple's side -- including Apple CEO Tim Cook -- are given only the briefest of nods before alarmists like Manhattan District Attorney Cy Vance are given the stage.

Bergstein does at least ask an interesting question: what if exonerating evidence is locked up in a phone? But his test case for "What if Apple is wrong?" doesn't apply as well as he seems to hope it does.

Devon Godfrey was killed in his apartment in 2010 -- and police arrested the wrong person. Somehow, Bergstein wants to pin that police screw-up on Apple. Investigators had only a week to pull evidence together to present to a grand jury. Some of that evidence happened to be located on a passcode-locked iPhone. But the evidence ultimately compiled and used had almost nothing to do with that locked phone.

Cell phones had been found in Godfrey’s apartment, including an iPhone that was locked by its passcode. Arnold recalls doing what he always did in homicides back then: he obtained a search warrant for the phone and put a detective on a plane to Cupertino, California. The detective would wait in Apple’s headquarters and return with the data Arnold needed. Meanwhile, investigators looked more closely at the apartment building’s surveillance video, and Arnold examined records sent by Godfrey’s wireless carrier of when calls and texts were last made on the phones.

With this new evidence in hand, the case suddenly looked quite different. From the wireless carrier, Arnold saw that someone—presumably Godfrey—had sent a text from the iPhone at a certain time. But the recipient of that text had used a disposable “burner” phone not registered under a true name. So who was it? The iPhone itself had the crucial clue. Arnold could see that Godfrey referred to the person by a nickname. People who knew Godfrey helped police identify the man who went by that nickname. It was not the man who was originally arrested. It was Rafael Rosario—who also appeared in the apartment surveillance footage. Rosario confessed and later pleaded guilty.
A text message and a contact list, both of which are usually backed up to cloud storage, where they can be accessed without cracking the phone or breaking its encryption. As James Comey himself has pointed out while arguing against Apple's stance in several ongoing All Writs Act cases, law enforcement can access iCloud contents without breaking phone encryption.
“Today, Apple encrypts the iCloud but decrypts it in response to court orders,” he said. “So are they materially insecure because of that?”

The frequency of the backups will vary from person to person, but this still gives investigators access to plenty of information supposedly "stored" in an uncrackable phone.

From there, the case against Apple only gets worse, as the arguments are sourced from the sort of people who'd rather see insecure devices than face obstacles when prosecuting suspects. Cy Vance, of course, has argued for outright encryption bans.

Vance also loves a good appeal to emotion.
Vance makes no dramatic claims about “going dark,” preferring a measured, lawyerly form of argument. When I tell him that his statistics on inaccessible iPhones don’t yet impress many computer scientists, he makes a facial expression equivalent to a shrug. “Some people have made the determination that not being able to do the kinds of work we do is an acceptable collateral damage,” he says. “I’m not sure how the individual would respond if someone close to him or her were the victim of a crime and the case might depend on the ability to access a phone. Easy to say, unless it’s you. We deal with a lot of victims. We talk to the people it’s actually happened to.”
The assumption is that everyone loves locking cops out of phones until they're a crime victim. But this assertion is just as false as Comey's exaggerated laments about "going dark." Even in the most famous case involving a locked iPhone -- one involving an apparent act of terrorism that manifested itself as a mass shooting -- the relatives of victims were far from unanimous in their support of the FBI's efforts. Two people who lost close relations in the shooting -- including a mother who lost her son -- spoke out against the FBI's attempt to undermine cell phone security.
Her son was killed in the San Bernardino, Calif., massacre — but Carole Adams agrees with Apple that personal privacy trumps the feds’ demands for new software to break into iPhones, including the phone of her son’s killer.

The mom of Robert Adams — a 40-year-old environmental health specialist who was shot dead by Syed Rizwan Farook and his wife — told The Post on Thursday that the constitutional right to privacy “is what makes America great to begin with.”
Then there's the belief -- offered by Vance, Comey and others -- that law enforcement should have access to communications simply because it has a warrant. But what isn't acknowledged is that this is unprecedented access. Texting/messaging has largely replaced telephone calls and face-to-face conversations.

Prior to the advent of texting, these conversations could not have been recorded without a wiretap warrant -- a last-resort effort that has to be carried out in real time. What law enforcement has access to now -- if not walled off by encryption -- are hundreds or thousands of conversations it never would have had access to before, even with a search warrant, which does not cover the interception of communications. And wiretapping is a technique that would be almost completely useless to investigators after a criminal act like a murder has been committed. The fact that a murder victim had a phone in the house would have prompted detectives to look at call records -- something they can still do without breaking a phone's encryption. What was said during those phone calls would still remain a mystery, warrant or no. So, law enforcement isn't as far behind technology as it likes to pretend it is.

Bergstein and Lawfare's Susan Hennessey (whom Bergstein quotes) both claim a corporation can't possibly decide what's best for Americans.
So is Apple ultimately fighting to uphold personal privacy and civil liberties? Or is it fighting for the right to sell any kind of phone it thinks its customers want while other people deal with the negative consequences? If it’s the latter, that’s understandable; like any public company, Apple is obligated to maximize its value to its shareholders. But society is not necessarily best served by letting Apple make whatever phones are optimal for its chosen business strategy, which is to create a shiny mobile vault that people will trust with every aspect of their lives.
But somehow they both feel it's perfectly acceptable for another party with a vested interest in total access to make that same decision for Americans.


Filed Under: brian bergstein, cyrus vance, encryption, going dark, law enforcement, police
Companies: apple


Reader Comments



  1. Anonymous Coward, 11 Apr 2016 @ 9:43am

    Law enforcement is suffering from an encroaching darkness because they are burying themselves under a giant haystack.

  2. Anonymous Coward, 11 Apr 2016 @ 9:51am

    Re:

    That only applies to organizations like the NSA that try to get everything from everyone. When they're getting one phone at a time, the haystack problem doesn't really apply so much.

  3. Anonymous Coward, 11 Apr 2016 @ 10:00am

    Re: Re:

    Use of Stingrays and ALPR cameras etc. says they are building their own haystacks. There is also the problem of focusing on an immediate and obvious problem, getting into a phone, distracting from more productive police work, like talking to people.

  4. Anonymous Coward, 11 Apr 2016 @ 10:26am

    So, the fact that Apple barely does encryption well just keeps getting glossed over yet again.

    Why in the fuck are the headlines not saying...

    "Apple's Encryption is such a joke it only keeps the honest people out!"

    -or-

    "Due to Apple's flawed encryption ..."

    -or-

    "Apple's joke of an encryption system is allowing...."

    You get the idea!

  5. jim, 11 Apr 2016 @ 10:33am

    Re: Re:yup

    Any court will tell you, you cannot use NSA. It's not evidence. The phone itself, is the evidence. There has to be an assumption, that visa courts, are illegal, also. That's why to convict someone legally, the right path has to be followed. Wrong path, eventually, the prosecutor, police and even Apple would get a black eye. That's why a public court asked Apple, a legal court, not the handmaiden of terror groups, the visa court. Apple said no, publicly saying, it has to be secret courts or else. I hope they choke on it.

  6. Anonymous Coward, 11 Apr 2016 @ 10:34am

    Re:

    What?

  7. Ninja (profile), 11 Apr 2016 @ 11:04am

    I love how they appeal to emotions. Let's have our turn, shall we?

    Suppose our Jane Doe has her phone stolen. It is not encrypted and the criminal decides to unlock it by force. He now has access to a plethora of information from Ms Jane and decides to put it to good use. Because I want to overuse emotions, he first uses her personal data to commit all sorts of fraud in her name while wiping out her finances. Then he decides she's cute, so he starts stalking her using the geo info on her phone, finds a good spot where she seems to be alone regularly, and rapes her. Not content, he discovers she has a nice, pretty sister and proceeds to stalk her and do the same.

    We could also have Mr John Doe lose his phone and be kidnapped for ransom because the criminal finds out he has a good amount of money due to his unencrypted phone.

    I could go on and add dead kittens but I guess my point is clear. Now, how many phones are stolen or lost every day? Do we want to protect millions of innocents, or screw them all to catch all the criminals? Note I haven't even added totalitarian governments. In some places the entire family of a dissident would be tortured. But an encrypted phone could save them all, including the dissident. Imagine if Snowden and other whistleblowers the US govt is actively harassing could have remained fully anonymous through encryption. Snowden, for one, could be at home and not exiled in Russia.

    Appealing to emotions is easy. The hard part is thinking, seeing through those emotions, and realizing that not all ends justify the means. Some people who had loved ones killed by Farook could see it.

  8. DSchneider (profile), 11 Apr 2016 @ 12:11pm

    Easy to say, unless it’s you

    He's actually correct here. If something has happened to one of my immediate family, I would have no problem breaking any law, doing anything I can think of if it might save my family member. Torture? Sure. Breaking and entering? Sure. I won't even have to be convinced of your guilt. If it was even remotely possible I'd be doing it. And that is EXACTLY why we don't let people in those situations make the rules. You aren't thinking correctly. You're not thinking of the long-term consequences of your actions or the damage you might be doing to yourself and others. That's why we're supposed to have elected officials who are supposed to think these things through, consider the short-term and long-term consequences, how things affect people's rights, and make the appropriate laws to govern how these things are implemented. I know that last line seems like a joke, but that's how it's supposed to go.

  9. Anon, 11 Apr 2016 @ 1:59pm

    WHAT???

    Only a week to present evidence to a Grand Jury? What bizarre land is this prosecutor living in, where the courts operate at a reasonable pace? I'm still waiting for the realistic episode of *Law and Order* where the court portion of the crime drama happens in an episode a year or two after the arrest part.

    "Remember the guy we arrested for that murder?"

    "Oh yeah. I wonder how he enjoyed a year in Rikers?"

    "Looks like he's innocent, according to the witness recanting his testimony."

    "Oh, well, sucks to be him."

  10. Anonymous Coward, 11 Apr 2016 @ 2:23pm

    So let me get this straight: if one of my loved ones is killed by an act of violence, then the normal thing to do is to go out and punch everyone I see in the face, no matter if it is a 90-year-old lady or a small child?
    Those are the reactions they want to base laws on?
    I know that it is not the same, but because someone they know or loved has been killed, they should want to hurt everyone else?

    No! I would not change my mind if it happened to me, because to use my family's or friends' memory to do something so incredibly damaging to everyone else would be a disgrace.

  11. John Fenderson (profile), 11 Apr 2016 @ 3:25pm

    Re: Easy to say, unless it’s you

    "He's actually correct here. If something has happened to one of my immediate family, I would have no problem breaking any law, doing anything I can think of if it might save my family member. "

    But he's not correct. If something happened to any one of my immediate family, I would certainly still have a serious problem with all of that. So, given the sample size of the two of us, there is no consensus.

  12. Median Wilfred, 11 Apr 2016 @ 5:56pm

    Commercial consequences of escrowed or banned encryption

    We all realize the consequences of outright banning of encryption, or mandating backdoors or mandating key escrow or some Golden Key, don't we?

    It means the death of open source software. If anyone can compile a program, then an encryption ban is unenforceable, as is a mandated backdoor. No more Linux, BSD or Plan 9, suckers! Well, at least in the USA, land of the free and home of the brave.

    Say goodbye to rapid updates, competition, and downward pressure on software prices, too.

  13. Uriel-238 (profile), 11 Apr 2016 @ 5:59pm

    All this proceeds from the presumption that law enforcement wants to sort real criminals from innocent civilians and bring them to justice.

    But that isn't what they want to do at all. They want to choose people they don't like and find something to pin on them to justify a prison sentence. That those who commit heinous crimes may turn up on that list is incidental.

    The police are not your friends. They're a gang that gets its jollies from imprisoning or killing normal people whose looks they detest.

    They don't even have any recognition that the rest of us are essential if they want their precious gang to continue to exist. They'd lock us all up -- or confine us to our homes -- if they could.

  14. Anonymous Coward, 11 Apr 2016 @ 8:26pm

    So at this point is it safe to assume that MIT is the TLA's research institution and marketing firm? Wonder when we're going to stop the government from co-opting public institutions for their own wars against the same public that funds them.

  15. Anonymous Coward, 11 Apr 2016 @ 9:10pm

    "what if exonerating evidence is locked up in a phone?"

    I think this misses the point. This assumes people are guilty until proven innocent. The law shouldn't assume guilt until exonerating evidence shows up. The law needs to first prove guilt before declaring anyone as guilty. A lack of exonerating evidence does not prove guilt. If the law declares people guilty and they are not guilty then the problem is not that the law was unable to find exonerating evidence. The problem is that the law is not requiring sufficient evidence of guilt to prove guilt in the first place.

  16. TechDescartes (profile), 11 Apr 2016 @ 9:33pm

    Reading Between the Lines

    I’m not sure how the individual would respond if someone close to him or her were the victim of a crime and the case might depend on the ability to access a phone. Easy to say, unless it’s you. We deal with a lot of victims. We talk to the people it’s actually happened to.
    Now, Mike, it would be a shame if anything bad happened to the writers. Or their families. Or their pets. We wouldn't want that, now, would we?

  17. Kal Zekdor (profile), 11 Apr 2016 @ 10:18pm

    Re: Re: Re:yup

    FISA court?

  18. Anonymous Coward, 12 Apr 2016 @ 12:33am

    Re: Commercial consequences of escrowed or banned encryption

    Open Source could easily comply with such requirements, with a comment in the source code to indicate the parts that should not be removed to keep the software in compliance with the law.

  19. PaulT (profile), 12 Apr 2016 @ 1:13am

    Re: Re:

    In a nutshell? The story mentions Apple, so some people have to make up ways to attack them, for some reason, even if those reasons are the complete opposite of the reality being discussed.

  20. Uriel-238 (profile), 12 Apr 2016 @ 1:31am

    Exonerating evidence

    If the police A decide that suspect B is guilty, and then lie about what they've seen, and Judge C believes them, though witness D has captured video that would show what really happened, proving A to be liars and B innocent, then yes, D has exonerating evidence on his phone.

    Yes, this is a difference between how our justice system should work and how it does.

    Let's say that D dies with that video locked up on his phone. That would be great cause for C to request someone unlock D's phone in order to get that evidence, but since it goes against some comfortable presumptions (e.g. that police officers are honest), few courts are going to want to unlock D's phone to get evidence that dispels them.

    So yes, the situation is possible. But when the state wants backdoors for evidence, it's so they can convict who they want. They're not at all interested in detecting who actually committed a crime.

  21. David (profile), 12 Apr 2016 @ 6:13am

    Going Dark, more like going dumb.

    What is this fascination with Going Dark? That term has been around for longer than iPhones. It means dropping off the grid in some fashion. I have a friend that has used the phrase for years when he and his buddies go hunting. For elk.

    Why he decided to puke this out onto the web is a mystery. Much like not knowing that Going Dark is an old, old solution. No phone, no mail, no email and now no brains required at the MIT Technology Review!

  22. Brian Bergstein, 12 Apr 2016 @ 6:47am

    Tim Cushing purports to be making a close analysis of my piece and puts it under a headline that is obviously inaccurate.

    No one is blaming Apple for a wrongful arrest. It is quite the opposite. As the story explains, Apple's help was essential in finding the actual killer. The point of my piece is to show how Apple has typically been part of the investigative procedure, and ask what will be lost as Apple reduces that help, not only by being unable to unlock phones but perhaps by also cutting off access to data in the cloud.

    As my story goes on to explain, I'm well aware that phone metadata today would also have exonerated the guy. So no one is proposing that the jails will be full of (more) innocent people because of Apple. All I'm asking is, what will the effects on the criminal justice system be? Shouldn't we at least hear the cops and prosecutors out rather than merely dismissing them out of hand?

    So far a very small percentage of cases have been held up by the inability to carry out search warrants on phones. But isn't it possible that at some point some threshold will be crossed where we look up and find that has changed quickly--especially if the cloud access is cut off?

    I'm not saying we need to ban encryption--the Burr-Feinstein bill, for example, is obviously a bad joke. But if you want to argue that we should make no changes in the law, it seems you should make an honest appraisal of the costs and benefits. Every day I read computer security people blowing off the idea that the cops are affected very much in their ability to pursue serious crimes. All I'm doing is showing you the argument for the costs. It's not surprising that you think the benefits outweigh them. But don't get into high dudgeon under preposterous headlines just because someone is reminding you that we should always revisit the trade-offs.

  23. John Fenderson (profile), 12 Apr 2016 @ 6:52am

    Re:

    "Shouldn't we at least hear the cops and prosecutors out rather than merely dismissing them out of hand?"

    I think we have been hearing them. The problem is that they aren't really supporting anything that they're saying, and decades of history have already demonstrated that they aren't exactly honest and open in their rhetoric, so it's a mistake to assume that because they said it there must be truth to it.

  24. Brian Bergstein, 12 Apr 2016 @ 6:57am

    Tim Cook must believe there are times when they are telling the truth, or else why would he suggest the radical (for this country) step of a key disclosure law?

  25. John Fenderson (profile), 12 Apr 2016 @ 6:58am

    Re: Commercial consequences of escrowed or banned encryption

    "It means the death of open source software."

    It most certainly does not. The worst-case situation will be that some open source development simply stops happening in the US. It will absolutely continue to happen in the rest of the world, though.

    We know what that looks like: prior to the last crypto wars, it was illegal to export strong crypto from the US. The result was that the US stopped being as capable of producing strong crypto, as the real development work simply moved overseas.

  26. Mike Masnick (profile), 12 Apr 2016 @ 7:09am

    Re:

    Brian, thanks for responding. I can let Tim respond himself, but on a few points I think your argument is unfair as well:

    No one is blaming Apple for a wrongful arrest. It is quite the opposite. As the story explains, Apple's help was essential in finding the actual killer. The point of my piece is to show how Apple has typically been part of the investigative procedure, and ask what will be lost as Apple reduces that help, not only by being unable to unlock phones but perhaps by also cutting off access to data in the cloud.


    You're basically saying "if Apple doesn't help there will be wrongful arrests." It seems perfectly bizarre to suggest that bad police work is somehow Apple's fault.

    All I'm asking is, what will the effects on the criminal justice system be? Shouldn't we at least hear the cops and prosecutors out rather than merely dismissing them out of hand?

    That suggests that no one is taking the concerns seriously or going through them. But that's not true. Every time the police or others in law enforcement have raised concerns, people HAVE looked closely at them, and basically every time they've been massively exaggerated.

    Remember this, for example? https://www.techdirt.com/articles/20141019/07115528878/everybody-knows-fbi-director-james-comey-is-wrong-about-encryption-even-fbi.shtml James Comey listed out some similar examples... and so the press checked them out and none of the stories checked out.

    The point is that people DO listen to the police and the FBI and they go through every story and they usually find that encryption was not actually a problem.

    In fact, it's nearly impossible to find an example where encryption was even a marginal issue.

    And that's the fact of basic police work. You never get EVERY bit of evidence, but you build a case. And sometimes guilty people aren't caught. That's life.

    Even more important, a big study on the issue by Harvard made it quite clear that police and law enforcement have MUCH MORE access to MUCH MORE information than ever before: http://cyber.law.harvard.edu/pubrelease/dont-panic/

    The idea that "going dark" is a real problem doesn't seem supported by reality. Giving it credence based purely on anecdotes seems like a weak response, which is why Tim wrote his article.

    So far a very small percentage of cases have been held up by the inability to carry out search warrants on phones. But isn't it possible that at some point some threshold will be crossed where we look up and find that has changed quickly--especially if the cloud access is cut off?

    That seems far fetched based on the simple fact that most of the information from mobile phones that is now available WASN'T available just a few years ago... and police did just fine using other detective and investigation work. To argue that this will magically create some dark crime wave doesn't make much sense.

    Every day I read computer security people blowing off the idea that the cops are affected very much in their ability to pursue serious crimes. All I'm doing is showing you the argument for the costs. It's not surprising that you think the benefits outweigh them. But don't get into high dudgeon under preposterous headlines just because someone is reminding you that we should always revisit the trade-offs.

    To be fair here, you can blame me for the headline. Cushing had a different one that I scrapped in favor of this headline.

    We absolutely recognize the tradeoffs. Hell, I did a whole video discussing the tradeoffs (https://www.techdirt.com/articles/20160321/16175933972/mike-masnick-explains-apple-versus-fbi.shtml). But our basic concern here is that if you're going to discuss the tradeoffs you should present them accurately, which I felt that article did not fairly do.

  27. Brian Bergstein, 12 Apr 2016 @ 7:26am

    Re: Re:

    Mike, if you read my piece I think you'll see I agree with much of what you're saying here. I say Comey's claims of going dark sound like hyperbole because these claims have been hyperbole for 20 years. I'm well aware of the Berkman report and the golden age of surveillance calculus. All I'm saying is, we're about to find out whether that calculus holds.

  28. Anonymous Coward, 12 Apr 2016 @ 12:16pm

    Re: Exonerating evidence

    "They're not at all interested in detecting who actually committed a crime."

    While there may be cases where this is true, in general, I would argue this is an exaggeration. But that still doesn't detract from the fact that it is guilt that needs to be proven and not the other way around. To argue that encryption would result in a lack of exonerating evidence is to argue that the law is wrongfully requiring that innocence is proven and not the other way around.

  29. Uriel-238 (profile), 12 Apr 2016 @ 1:41pm

    Re: Re: Exonerating evidence

    [Citation Needed], specifically instances when the police, after settling on a suspect or suspects, continued to investigate further for more suspects, or worked to rule their current batch out.

    I would be pleased if this is currently the norm, or ever was.

    But having studied criminal investigation, that is not what I was taught. What I was taught was that a suspect is presumed guilty by police, and it's only a matter of determining how to nail them.

    And if they are proven innocent by circumstance, there is no resumption of the hunt to find someone else. It's just assumed that he (the original suspect) got away from us this time.

    Hollywood and old whodunits like to imagine differently, but these were always a generous presumption of how police detection actually worked.

  30. crade (profile), 12 Apr 2016 @ 2:59pm

    So is Apple ultimately fighting to uphold personal privacy and civil liberties? Or is it fighting for the right to sell any kind of phone it thinks its customers want

    2 birds, one stone? Or is that the same bird?

  31. Uriel-238 (profile), 12 Apr 2016 @ 3:20pm

    Privacy & Civil Liberties vs. Giving customers what they want

    I'm pretty sure that privacy and civil liberties are things that people want. Or rather, things people miss when they're taken away, and things that people who don't have them get really angry about.

    And common civilians who enjoy rights often buy things, becoming customers.

    I hear giving customers what they want is a pretty strong business model.

  32. Anonymous Coward, 12 Apr 2016 @ 4:26pm

    Re: Re: Re: Exonerating evidence

    Which still misses the point. In your case what needs to change is not access to exonerating evidence but the fact that guilt needs to be proven and not the other way around.

  33. Uriel-238 (profile), 12 Apr 2016 @ 7:16pm

    alleged presumption of innocence

    what needs to change is not access to exonerating evidence but the fact that guilt needs to be proven and not the other way around.

    If you are saying that we need to restore the presumption of innocence to the justice system, and return our law enforcement agencies to the function of detecting perpetrators among the innocent, I agree with you.

    Feel free to try to implement such change through common activism. I am skeptical it can be done without major disruption of the society.

    In the meantime suspects are commonly railroaded into guilty convictions or guilty plea bargains, and the police will resort to tampering or false testimony even sooner than they will rely on detective work. It is a lot easier.

    Of course we would like it to be different. But pretending that the courts are usually fair is not going to change that.

  34. Brian Bergstein, 14 Apr 2016 @ 8:53am

    Re: Privacy & Civil Liberties vs. Giving customers what they want

    There is not necessarily a contradiction between supporting civil liberties and giving customers what they want. But if you read my piece, I am posing this question in the context of Tim Cook suggesting that a key disclosure law might need to be passed. Given the negative civil liberties implications of such a law, particularly in light of the Fifth Amendment, it seems fair to ask just what the primary goal of the company is.
