"The fact is that Mr. Snowden committed very serious crimes,"
This is telling. He is not a suspect. He is a criminal, already.
"and the U.S. government and Department of Justice believe that he should face them."
Meaning, I guess, "the US government and DoJ will do everything possible to have a court of law rubber-stamp our condemnation."
"And that’s why we believe that Mr. Snowden should return to the United States where he will face due process, and he’ll have the opportunity -- if he returned to the United States -- to make that case in a court of law."
Translation: "given we cannot ourselves be DA, jury, judge, and executioner we will leave the judge part for others."
"But obviously our view on this is that he committed and is accused of very serious crimes."
No matter what, he is going to go down. He *committed* the (rather foggy) crimes. We are also going to accuse him of anything else we can find, just in case.
OK.
Now, let me see if I really understand it:
* he is considered a criminal, not a suspect;
* he is going to be charged with espionage and treason, among many other things;
* he will not be able to have a public trial (see the Guantanamo trials);
* he will not have access to any documentation collected and used against him (see Guantanamo again);
* he is risking almost certain death as a result.
Now, why the hell would he put himself at the so-called mercy of the government? What am I missing here?
Technically valid. Morally and legally... that is a different discussion. Whether it is legally valid will have to wait until a number of courts of law weigh in on it. Right now I cannot even state that it is illegal (my opinion; IANAL).
For the moral and ethical parts... all I can say is that -- and, again, in my opinion -- this is ethically wrong: certificates are used to provide one with a *private* conversation between parties. HTTPS inspection breaks this expectation of privacy. Since security very often depends on privacy, HTTPS inspection implies a break in security as well.
This is even more critical if one thinks of how we have been promoting HTTPS usage -- which pretty much boils down to "use HTTPS and you will be secure". Add to that the fact that all browsers allow one to bypass the security warnings and proceed -- which the majority of users do -- and this is a recipe for disaster (I *like* the ability to bypass the security warning, but I have a pretty good idea of what to do, and of the risks).
On the other hand, a similar process has been in use for quite a few years to allow compartmentalisation and *increase* security. Picture a site that uses an internal CA to generate certificates that are used internally, and has a gateway to the external world ("protected" by a publicly acquired certificate). By using software that requires all certificate roots to be present (and refuses to accept new roots over the wire), this site can guarantee that internal data will not be mistakenly sent out, and that external data will not be accepted unless it comes in through the gateway. In this usage, the internal servers only have the roots for the internal CA, and the gateway has *only* the internal root for the internal-facing, ah, listener, and the external root for the external-facing one.
This uses functionally similar software. Also, I am simplifying this a *lot*.
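To make the compartmentalisation idea concrete, here is a minimal sketch (in Python; the file name internal-root.pem and the host internal.example are made up for illustration) of a client that trusts *only* the internal root -- anything signed under any other root, public CAs included, is simply refused:

```python
# Minimal sketch: a client that trusts ONLY the internal root CA.
# "internal-root.pem" and "internal.example" are made-up names.
import socket
import ssl

def connect_internal(host: str, port: int = 443) -> ssl.SSLSocket:
    # Start from an empty trust store and load only the internal root;
    # certificates chaining to any other root (public CAs included) fail.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    ctx.load_verify_locations(cafile="internal-root.pem")
    raw = socket.create_connection((host, port))
    return ctx.wrap_socket(raw, server_hostname=host)

if __name__ == "__main__":
    try:
        with connect_internal("internal.example") as tls:
            print("verified against internal root:", tls.getpeercert()["subject"])
    except ssl.SSLCertVerificationError as exc:
        # Anything not issued under the internal root ends up here --
        # the "refuse new roots over the wire" behaviour.
        print("refused:", exc)
```

That is the "refuses new roots over the wire" part; the gateway does the mirror image on its two listeners, each loaded with only one root.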
So. As usual, it is not the technology that is bad, but the use one makes of it. But, frankly, X.509 is a dated technology: it does not scale well, it does not really guarantee provenance, etc., etc., ad nauseam.
I did NOT state it is OK. I do not agree with it, given that this completely breaks the already flimsy trust I have in X.509 certificate usage.
But it is, still, a valid use of X.509. Welcome to the marvelous world of standards.
The whole point here is that it IS used. One can buy (er, license) commercial software to do exactly that. The only thing we can do is loudly complain about services using this. And be *very* careful when accessing HTTPS sites *anywhere*.
In summary: the new reality is that this will become even more common. Many companies already deploy HTTPS inspection, and many more will (perhaps for liability containment).
The fact that this is stupid has no impact on it being deployed.
I doubt it. They did not steal a real certificate; they just created another one signed by their own root. This is, even if not exactly candid, an accepted use of the protocol.
Yes, they probably have (I do not know, because I have *never* used their service); chances are they use a root from a known CA, so that they can dynamically issue a new "server" certificate on the fly (one that will probably have the same common name, or even the same distinguished name, as the real site certificate).
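For illustration only, here is a rough sketch of what that on-the-fly issuance looks like, using the third-party cryptography package; every name below (the proxy root, www.example.com) is invented, and a real interception appliance keeps all of this inside the proxy itself:

```python
# Rough sketch of on-the-fly issuance as an interception proxy might do it:
# mint a leaf with the same common name as the real site, signed by the
# proxy's own root.  All names here are invented for illustration.
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa

def self_signed_root(common_name: str):
    """The root the proxy vendor ships (and asks you to install)."""
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, common_name)])
    now = datetime.datetime.utcnow()
    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)                      # self-signed
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=3650))
        .add_extension(x509.BasicConstraints(ca=True, path_length=None),
                       critical=True)
        .sign(key, hashes.SHA256())
    )
    return key, cert

def forge_leaf(root_key, root_cert, hostname: str):
    """Issue a look-alike leaf carrying the target's common name."""
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    now = datetime.datetime.utcnow()
    cert = (
        x509.CertificateBuilder()
        .subject_name(x509.Name(
            [x509.NameAttribute(NameOID.COMMON_NAME, hostname)]))
        .issuer_name(root_cert.subject)         # chains to the proxy root
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=90))
        .add_extension(x509.SubjectAlternativeName([x509.DNSName(hostname)]),
                       critical=False)
        .sign(root_key, hashes.SHA256())
    )
    return key, cert

root_key, root_cert = self_signed_root("Example Proxy Root CA")
_, leaf = forge_leaf(root_key, root_cert, "www.example.com")
print(leaf.subject.rfc4514_string(), "issued by", leaf.issuer.rfc4514_string())
```

Any client that has had the proxy's root installed will then accept that leaf as if it came from the real site.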
You could blacklist it, but this will probably cause all HTTPS connections to fail. This is not a bad idea, all in all, but it is sort of pointless: you already know they do MITM, so all you need to do is *NOT* use their service. As I do...
Perhaps a better approach would be to use SSH2 tunnelling, or VPNs (as long as the VPN software uses certificate and root pinning, so that it will fail if a different cert is received).
No matter what, get used to checking the server certificate whenever you use a different provider. HTTPS inspection (a.k.a. MITM) is getting to be commonplace.
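A quick sketch of that check, with Python's standard ssl module (example.com is just a placeholder; the point is to look at who actually issued the certificate you are being shown):

```python
# Sketch: print who actually issued the certificate a host presents.
# If the issuer is your provider's "inspection" root instead of a public
# CA, you are being intercepted.  The host name is a placeholder.
import socket
import ssl

def show_issuer(host: str, port: int = 443) -> None:
    ctx = ssl.create_default_context()          # system trust store
    with socket.create_connection((host, port)) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            cert = tls.getpeercert()
            # "issuer" is a tuple of RDN tuples, e.g. (("organizationName", ...),)
            issuer = {k: v for rdn in cert["issuer"] for (k, v) in rdn}
            print(f"{host}: issued by {issuer.get('organizationName', issuer)}")

show_issuer("example.com")
```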
Look no further than offerings to monitor HTTPS -- the easy way to do it is to intercept the user's request, replace the web host's certificate with your own (and install the roots for it), and then act as a MITM between the user and the target server.
You end up with a user-to-monitor connection and a monitor-to-target-web-server connection. Easy. Privacy? Gone. Security? Gone as well.
And yes, available from many of the "security" vendors.
You can minimise the risk with certificate pinning, I guess. Easily done if the user is running a dedicated program, but very difficult to do for web browsers.
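For a program you control, pinning can be as simple as comparing the SHA-256 fingerprint of the certificate the server presents against one you recorded out-of-band; a minimal sketch (the host and the EXPECTED_PIN value are placeholders):

```python
# Minimal certificate-pinning sketch: refuse to talk unless the server's
# certificate matches a fingerprint recorded out-of-band.  The host and
# the EXPECTED_PIN value are placeholders for illustration.
import hashlib
import ssl

EXPECTED_PIN = "0000000000000000000000000000000000000000000000000000000000000000"

def observed_fingerprint(host: str, port: int = 443) -> str:
    pem = ssl.get_server_certificate((host, port))
    der = ssl.PEM_cert_to_DER_cert(pem)
    return hashlib.sha256(der).hexdigest()

def check_pin(host: str) -> None:
    seen = observed_fingerprint(host)
    if seen != EXPECTED_PIN:
        raise RuntimeError(f"certificate for {host} does not match the pin: {seen}")
    print(f"{host}: pin OK")

if __name__ == "__main__":
    # First run: print the fingerprint and record it out-of-band as the pin.
    print("observed fingerprint:", observed_fingerprint("example.com"))
    # Later runs: enforce it (this raises until EXPECTED_PIN is filled in).
    check_pin("example.com")
```

Pinning the public key rather than the whole certificate survives routine renewals better, but the idea is the same.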
Keep in mind that internal policies may dictate that Exchange only keeps the last 'n' months of email; of course, users may archive older emails on their local machines (which could crash and lose them). I have seen this limit set at 6 months, for example. Limits on storage are also common.
Of course, it is not known whether the IRS has such a policy. But many companies have one in place (it keeps a lid on storage *and* provides a legally valid excuse if faced with discovery: "we have a policy of only keeping emails for a set period").
Why only Google?
Are they compliant with the French interpretation of international law? No? Never been asked to be?
Searched email after the leak? Why?
Why after? I would expect the search to come _before_ the leak. It seems Snowden leaked only after he tried to raise concerns.
But perhaps this is just a journalistic mistake.
Re: MITM is for your security. Right?
But, for the common user, it will be swallowed as-is.
Policies, and laws
On the other hand, laws do not stop the criminals, only the law-abiding.
So what?
Re: There is a distinction between the address and the content
For example, if you go to a blogging site, it is not the site itself that is important so much as which URI you went to.
So, just guessing, I would regard his statements as "grabbing the whole nine yards"; he simply did not know enough to say that. Or not to say it.