On the post: Settlement Reached In Class Action Lawsuit Against Rightscorp For Robocalls
Re: Lawyer fees
Lawyers normally take 1/3 on contingency, and an increase if it goes to trial is common.
For a class action, where the entire point is that it's not worth it for individual members to sue, a disproportionately higher percentage is reasonable, as it makes it more likely that the public benefit of the lawsuit happens at all.
On the post: AT&T Is Happy To Remove Wireless Broadband Caps, But Only If You Sign Up For Its TV Services
Re:
For instance this plan requires users pay $40 a month to add a tablet to the plan, which is only $10 a month if you remain on AT&T's metered data plans.
I tried to parse that and my brain threw an exception. Either this is a typo or I'm missing some important context.
------------------------------------------------------
AT&T's current model works like this:
You pay for a certain level of base service (minutes, text, data). Based on the tier of base service, adding additional lines costs a fixed fee per month to use that base service. Assuming unlimited minutes/texts and a fixed amount of data, adding a phone costs ~$15/month and a data-only device (e.g. a tablet) costs $10/month.
So if you stay on a metered plan and pay $100/month for your phone, and want to add your tablet, it'll cost you an additional $10.
If you have an unlimited plan and pay $150/month for your phone, and want to add your tablet, it'll cost you an additional $40.
In short, you pay more to have the option of unlimited* data on a device, and then $40 (as opposed to $10) per additional device.
Make sense now?
*where unlimited != unlimited
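To put numbers on it, here's a quick back-of-the-envelope comparison in Python (purely illustrative - the prices are just the example figures above, not AT&T's actual rate card):

# Example figures from the comparison above - not an actual rate card.
metered_base, metered_tablet = 100, 10      # metered phone plan + tablet add-on
unlimited_base, unlimited_tablet = 150, 40  # unlimited phone plan + tablet add-on

print("metered total:  ", metered_base + metered_tablet)      # 110/month
print("unlimited total:", unlimited_base + unlimited_tablet)  # 190/month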
On the post: Federal Judge Finds NYPD Engaged In Evidence Spoliation By Destroying Documents Related To Summons Quota Lawsuit
Re: Equitable solution
Accepting the plaintiffs' allegation as true, given that the NYPD intentionally destroyed the evidence of its truth, would be equitable.
If that results in the NYPD agreeing that they broke the law: well, too bad.
This is what an adverse inference is for.
FIFY :)
On the post: Judge Not Impressed With Government's Warrantless 921-Page 'Peek' Into A Suspect's Cellphone
Re: Shaking my head at the bias
No one is demanding that those who didn't get the necessary warrants be punished per se.
Instead, we're demanding that those who violate the law at a minimum not benefit from it. Since the law is clearly established that a warrant is required under these circumstances, failure to get one means that the Agent of the people didn't have authority to seize and search the phone.
Taking another person's property without authority to do so is theft, or in this case probably armed robbery.
We're demanding that we, the people, be held accountable for the actions of our agent. The minimal level of accountability is that we lose our advantage (the information we obtained without authority), so suppression is a minimalist first step. Some of us are also demanding that we be accountable for damages caused by our agent, under simple agency principles.
It appears that some people think theft is ok as long as it's done by the agent of the people against a single person.
On the post: Hillary Clinton Wants A 'Manhattan Project' For Encryption... But Not A Back Door. That Makes No Sense
Re: Re: Delayed-Escrow Encryption
That's essentially it. I'm a data scientist rather than a cryptographer, so I didn't have the term of art (ephemeral keys). I've implemented a similar system for data processing, but what I envision is (essentially) a set of keys that time out, where each section of storage gets slowly migrated from key to key, so that any live system has a reasonably fresh key, but when the device is taken offline the keys become static.
This would necessarily mean slightly higher overhead on the device (since it would always be encrypting a new volume), but it could also use smaller keys tied to generally available compute power - similar to how Bitcoin mining gets harder over time.
This sort of escalating encryption would obviously be harder to implement than static-key encryption, and harder to verify that no one planted a back door in the scheme itself, but it would have the advantage of maintaining the same relative level of protection over time for current devices.
The non-absurd argument for security is that sometimes they really do need to decrypt things, but as we've seen it's far too often used now as an easy way to bypass other protections, rather than reserved for extraordinary situations. Since we've been shown that we can't trust the watchers on their own when there aren't technical barriers, practical barriers (the total compute available) may be the better alternative - the kind we had until recently, thanks to scalability problems.
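For what it's worth, a very rough sketch of the migration idea (Python, using symmetric Fernet keys purely for illustration - the names and rotation details here are my own assumptions, not a real design):

from cryptography.fernet import Fernet, MultiFernet

# Newest key first; older keys are kept only until every section has migrated.
keys = [Fernet(Fernet.generate_key())]

def begin_new_epoch():
    # Introduce a fresh key; everything written from now on uses it.
    keys.insert(0, Fernet(Fernet.generate_key()))

def migrate_section(ciphertext):
    # Re-encrypt one storage section under the newest key. On a live device
    # this runs slowly in the background, so the volume always has a
    # reasonably fresh key; take the device offline and the migration stops,
    # leaving the keys static.
    return MultiFernet(keys).rotate(ciphertext)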
On the post: Hillary Clinton Wants A 'Manhattan Project' For Encryption... But Not A Back Door. That Makes No Sense
Delayed-Escrow Encryption
There may actually be a way to get both a secure(ish) device and a way to decrypt it.
We've seen recently that there's a way to break PGP by factoring the very large numbers its keys are built from (which is what some people think the NSA's Utah data center is for), but it takes a huge amount of compute time.
If your iPhone uses a rolling set of encryption keys, where the rolling re-keying stops once someone has physical possession of the device, then a nation-state could seize the phone and eventually decrypt it, since the rolling key would stop rolling.
Now the catch, of course, is that you'd need to keep the key size growing with Moore's Law, so that even with physical possession it would still be a significant effort to break, essentially making it so that only in rare circumstances would it be worth breaking the encryption.
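Very roughly, the key-size schedule I'm picturing would look something like this (Python; the baseline year, starting size, and growth rate are made-up assumptions, not recommendations):

import datetime
from cryptography.hazmat.primitives.asymmetric import rsa

BASE_YEAR = 2015            # assumed baseline for the schedule
BASE_KEY_BITS = 2048        # assumed starting key size
GROWTH_BITS_PER_YEAR = 128  # assumed growth rate tracking available compute

def current_key_bits(today=None):
    # Pick a key size that grows with the assumed compute curve, so factoring
    # stays expensive-but-possible for whoever physically holds the device.
    today = today or datetime.date.today()
    return BASE_KEY_BITS + (today.year - BASE_YEAR) * GROWTH_BITS_PER_YEAR

def roll_key():
    # Generate the next key in the rolling series. Once the device is seized
    # and powered off, the rolling stops and the last key's size fixes the
    # work factor for whoever wants to break it.
    return rsa.generate_private_key(public_exponent=65537,
                                    key_size=current_key_bits())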
We used this same paradigm for years with location information - the law evolved that having the police "tail" someone wasn't an invasion of privacy, because anything you do in public isn't private. But the paradigm in place meant that mass surveillance was impracticably expensive, so it was only used when it was really worth it. Now that mass surveillance is cheap, we're stuck with a legal landscape that no longer yields the same relative privacy as before - where you were private simply due to the cost of breaking your privacy.
Professor Kerr explains this in his Equilibrium-Adjustment theory of the 4th Amendment, but the same principle could be applied to computer encryption - grow the keys steadily to make it hard to decrypt a phone you have physical possession of, but possible if it's worth it.
This gets trickier with stored data (suck up everything, sit on it for 10 years until it's easy to break, and then charge anyone you find with an ongoing conspiracy for whatever violation you find), but there may be solutions to this (extremely large keys on transmitted data, smaller rolling keys locally).
Of course, this would necessarily mean that older data could be decrypted, so the US Government would need to think long and hard about whether it wants it to be practical to break US encryption standards for older data.
On the post: Presidential Hopeful Marco Rubio Defends AT&T's Right To Write Bad State Broadband Law
Market Failure
This isn't a case of market failure. This is a case of regulatory failure.
Just because there's a (heavily regulated) market that doesn't deliver doesn't mean that the market failed - it usually means the opposite: the market was prevented from working by regulation, which led to the failure.
Just because the regulation was by one entity (the state) and it's being overridden by another entity (the FCC) doesn't mean it's not caused by a regulatory failure.
To see this, consider this thought experiment: If there were no regulations preventing competition, would we see competitors who couldn't enter the market? Of course not - the regulatory restrictions are the problem, not the market.
On the post: Manhattan DA's Office Serves Up Craptastic White Paper Asking For A Ban On Encryption
Preemption
Won't this law be preempted by the TPP (if passed), since it would change the regulatory landscape that foreign-held companies (e.g. Samsung) have to operate under, to their detriment?
Could at least be some silver lining there....
On the post: Law Professor Pens Ridiculous, Nearly Fact-Free, Misleading Attack On The Most Important Law On The Internet
The Professor is right
On the other hand, who spends so much time on 4chan that they think that's all the internet is....?