Senate Given The Go-Ahead To Use Encrypted Messaging App Signal
from the feinstein,-burr-will-continue-to-use-AOL-chatrooms dept
Certain senators have repeatedly pushed for encryption bans or encryption backdoors, sacrificing personal security for national security in a move that will definitively result in less of both. Former FBI Director James Comey's incessant beating of his "Going Dark" drum didn't help. Several legislators always managed to get sucked in by his narrative of thousands of unsearched phones presumably being tied to thousands of unsolved crimes and free-roaming criminals.
It will be interesting to see whether the anti-encryption narratives advanced by Sens. Feinstein and Burr (in particular -- although others are equally sympathetic) continue now that senators can officially begin using an encrypted messaging system for their own communications.
Without any fanfare, the Senate Sergeant at Arms recently told Senate staffers that Signal, widely considered by security researchers and experts to be the most secure encrypted messaging app, has been approved for use.
The news was revealed in a letter Tuesday by Sen. Ron Wyden (D-OR), a staunch privacy and encryption advocate, who recognized the effort to allow the encrypted messaging app as one of many "important defensive cybersecurity" measures introduced in the chamber.
ZDNet has learned the policy change went into effect in March.
If this isn't the end of CryptoWar 2.0, then it's at least a significant ceasefire. Senators are going to find it very hard to argue against encrypted communications when they're allowed to use encrypted messaging apps. It's not that legislators are above hypocrisy. It's just that they usually allow a certain amount of time to pass before they commence openly hypocritical activity.
This doesn't mean the rest of the government is allowed to use encrypted chat apps for official communications. Federal agencies fall under a different set of rules -- ones that provide for more comprehensive retention of communications under FOIA law. Congressional communications, however, generally can't be FOIA'ed. It usually takes a backdoor search at federal agencies to cut these loose. So, members of Congress using an encrypted chat app with self-destructing messages may seem like the perfect way to avoid transparency, but it's the law itself that provides most of the opacity.
If encryption's good for the Senate, it's good for the public. There's no other way to spin this. Even Trump's pro-law enforcement enthusiasm is unlikely to be enough to sell Congress on encryption backdoors. With this power in the palm of their hands, they're more apt to see the benefits of leaving encryption un-fucked with.
Filed Under: encryption, end to end encryption, messaging, senate, signal
Reader Comments
Re: Re: Re: Re: Why do people believe that AES is secure?
Yes, this was well known about DES.
Not so for AES.
Re: Re: Re: Re: Why do people believe that AES is secure?
Everyone knew it was too short from the beginning, including IBM (which invented the scheme and proposed a 128-bit key length).
Re: Re: Re: Re: Re: Re: Why do people believe that AES is secure?
The DES standard was proposed by IBM with a large key length, but it was shortened by the government (at least that's the rumor). IBM had (secretly, I heard) already developed both linear and differential cryptanalysis, which were still unknown to the public. Doesn't it seem pretty clear that the government weakened the key in order to be able to secretly crack DES, and that this was easier with a shorter key? It seems to follow in my way of thinking.
Do you think the government really wants a public standard that foreign governments can use and that it can't break? That seems unlikely. It seems more likely that both DES and AES were carefully selected and positioned to be acceptable to the "masses" while already cracked by the NSA. It happened before, and I would expect that it will happen again and again.
Re: Re: Re: Why do people believe that AES is secure?
the gummint doesnt have to have the smarts to do anything, all they have to do is extort, bribe, or otherwise threaten the companies or individuals involved to put in the backdoors or else that kiddie pron The They planted on their computer is found...
kampers, you naive propaganda victims need to get this straight : the puppetmasters are eee-vil, they DO SHIT like that ALL THE TIME, it shapes your world, but you cant believe it... the psychopaths count on your ignorance...
Re: Re: Re: Re: Why do people believe that AES is secure?
AES is open source and has an international community behind it. So, if, as with the elliptic curves, they can find some subtle weakness to exploit, they may have partial success in weakening the system. A deliberate backdoor, on the other hand, is going to be almost impossible to get in, as they cannot find out who all the expert watchers of the development are, never mind subvert them. Besides, good luck with subverting someone like Moxie Marlinspike.
Re: Re: Re: Re: Re: Re: Why do people believe that AES is secure?
Moxie Marlinspike is a real cryptographer and security expert, and is the man behind the Signal protocol and Open Whisper Systems, which are open source.
As to quantum computers, all the experts know that they are a theoretical threat, but, other than the one-time pad, they are also required to build systems that will withstand their use to attack systems.
Re: Re: Re: Re: Re: Re: Re: Re: Why do people believe that AES is secure?
Fact: IBM and the US Government have been decades ahead of ANYTHING ever done by the Open Source community in the area of encryption. How do we know that? Documented history. DES. Walt Tuchman. IBM. Common sense.
Fact: The US Government is serving the interests of the US Government in endorsing (and shaping) encryption standards. How do we know that? History and common sense.
Fact: There is only so much work ANYONE is willing to do for FREE. Yes, religion, including Open Source religion, does inspire some people to work for free (at least a little). Sometimes even incredibly smart people (Linus is a Saint). But EVEN THEY are either (a) independently wealthy or (b) actually use their time to make money. There really are no other choices. So, where does more work get done: in the religious community, or the commercial community? The Commercial Community. How do we know that? History and common sense.
How the argument that Open Source encryption is BETTER protection for your data survives is another one of the great FAKE NEWS stories of our time. Both IBM and the Government have a long documented history of LEADING in this area with SECRET SOLUTIONS. Leading by a lot. Because they pay for it. How do we know? History.
Free/Open Source Encryption is Weaker than EVERY Commercial Alternative, by public demonstration and, think about it, common sense. Is it really IMPOSSIBLE to pay for a better scheme? Of course not. They're all better (provably), or they wouldn't sell (billions of $).
Re: Re: Re: Re: Re: Re: Re: Re: Re: Why do people believe that AES is secure?
There is a very good reason why the cryptology community has gone for standard and open source encryption: there is a long history of individuals and small groups designing encryption systems that were all too easily broken.
As the old adage goes, it is easy to design an encryption system that you cannot break, but much harder to design a system that others cannot break.
Re: Re: Re: Re: Re: Re: Re: Re: Why do people believe that AES is secure?
Eh? That doesn't make sense.
Whether the system is open or closed, once a way to break in through it has been found, anyone using the now-broken system is vulnerable.
Assuming the fact of the vulnerability isn't disclosed somehow, the odds of its being found by the people who have ability and access to fix it would presumably correspond roughly to the number of such people who exist - which probably would mean that the edge would go to the open system.
Once the vulnerability is known, the odds of a fix actually being created depend on how many people with the ability and access to fix it actually care to do so. There are different factors affecting that in open and closed contexts, so this one could be argued case-by-case, and may be a wash.
But once a fix has been created, it has to be gotten out to the users.
With open software, the users can (generally speaking) get the fix for free, the same way they (generally speaking) got the original software. That means there's little obstacle to their getting it.
With closed software, the users may very well need to pay to get the fix - especially if "being paid" is one of the reasons the providers of the closed software bothered to create a fix in the first place. That means there is an obstacle in the way, which makes users less likely to actually get the fixed version.
Even if the providers of the closed software make the fix available for free to anyone who already has the unfixed software (including people who pirated it?), there may still be other obstacles; consider the number of people who turn off Windows Update because they don't trust Microsoft not to break things they like, much less the number of organizations which turn it off because they know updating will break things. The same consideration does apply with open software to some extent, but IMO less so, since in the worst case the users can still avoid any undesired changes by forking.
This does not necessarily hold. Although I do not fully understand the details or recall my source for this just offhand, I am given to understand that in some cases, adding additional mathematical manipulation to the math which constitutes a given form of encryption can actually make it easier to reverse the process and extract the original cleartext from the ciphertext.
(Using the same data twice in the process is one thing which can have this result; for example, while using the cleartext itself as the seed for your RNG to produce an encryption key might seem like a good idea, it means that the number which the cleartext represents has been used twice in producing the ciphertext, and that in turn may make the net mathematical transformation less complex.)
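A concrete, if milder, instance of the "same data used twice" hazard described above (a sketch only; the helper name bad_encrypt and the use of Python's cryptography package are illustrative assumptions, not anyone's actual system): if the nonce is derived from the plaintext, encryption becomes deterministic, and an eavesdropper can tell when the same message is sent twice without touching the cipher's math at all.

```python
# Sketch of the pitfall: reusing the plaintext to derive the nonce.
# Identical messages now produce identical ciphertexts, leaking repeats.
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)

def bad_encrypt(msg: bytes) -> bytes:
    nonce = hashlib.sha256(msg).digest()[:12]   # plaintext reused as the nonce source
    return AESGCM(key).encrypt(nonce, msg, None)

# Deterministic output: an observer learns the same order was sent twice.
assert bad_encrypt(b"sell everything") == bad_encrypt(b"sell everything")
```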
Re: Re: Re: Re: Re: Re: Re: Re: Re: Why do people believe that AES is secure?
Obscurity being "a thing that is unclear or difficult to understand".
So, translating the statement: making your encoded message difficult to understand is a bad thing. This could only be promoted and accepted by the Open Source community. I think even a casual observer would see this presumption as obviously ridiculous. The comment about "pimple faced kids" just makes it more ridiculous.
Not that you said this; I am just venting about the logic often (and publicly) displayed by the open source community about encryption.
You can think of every component of your encryption machine as an attack surface. The more you expose, the more opportunity you give the attacker. EVERY nuance of your encryption machine is exposed in open source; you have the largest possible attack surface. This is precisely why people serious about encryption do not expose the details of their encryption machines, and do not publish their source. By serious, I mean highly paid professionals, not open source saints like Linus (and he is a legitimate saint).
I can agree with you about the economy of open source. It is, after all, free, pretty economical. I could also agree that someone without cryptographic training would probably produce a solution weaker than AES. So, I could agree that open source solutions are better than most amateur solutions. But assuming that the choice is either public open source or amateur hour is a false choice.
"You get what you pay for" is usually more true than false. Obscuring your encryption technique, by hiding it in closed source, is more secure, not less secure. You might have to pay to analyze it and verify it, but money will work. This is demonstrated publicly and repeatedly. Just ask the pros, not the priests.
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Why do people believe that AES is secure?
No - in this context, "obscurity" means "being little-known". I.e., if your security relies on not many people knowing about you, you're not really very secure.
It's the difference between "everyone knows there's a combination lock here, but not many people know the combination, and it's hard to figure out" and "the combination to this lock is easy to figure out, but not very many people know that this combination lock exists in the first place". The latter is "security by obscurity"; the former is not.
In simple analogy, an encryption algorithm is like a lock, and an encryption key is like the combination to that lock. Keeping the combination secret is not security by obscurity; keeping the algorithm secret is.
Both can increase security, technically (just as having a hidden combination lock with a hard-to-figure-out combination is technically more secure than a non-hidden lock with the same combination) - but keeping the algorithm secret is short-term security at best (just as the hidden combination lock will eventually be discovered), and because of all the ways a privately-devised encryption algorithm could have unknown weaknesses, is more likely to reduce net security (vs. using a known and well-studied one) than increase it.
That depends on what you mean by "expose".
If you mean "put in a place which is accessible to be attacked", then sure; that's true of any software. However, if there's a hole somewhere else in the software, you may unexpectedly find that an interface which you thought was internal-only may suddenly be reachable by an external attacker - and is therefore exposed, for this purpose.
If you mean "make known to the attacker", then no - because you cannot guarantee that the attacker will never know a given detail; even in the absolute best-case scenario, much less a real-world plausible scenario, binary disassembly and decompilation are things which exist.
Re:
But the chances that the government knows how to do it but the public doesn't are pretty low.
One of the things about cryptography is that no encryption algorithm should be created and used in house without public scrutiny. All algorithms should go through a long period of public scrutiny before being approved for use. Standard algorithms, not non-standard in-house algorithms, are considered safer exactly because they went through a much more thorough testing process that involves a whole lot more very intelligent people before they got approved. It's why the government, IIRC, now uses encryption standards as opposed to stuff that they made in house. Exactly because their in-house ciphers later turned out to be garbage.
Given the fact that the public can much more thoroughly scrutinize a cipher than the small group of people working for the government (and, remember, it's not like the government is composed of the most intelligent, meritorious people. They're the government, a classic example of lazy people that take your money and don't have the merit to make their own money by actually working. It's the private sector of individuals that are much more intelligent), all it takes is for one person to find a flaw in the cipher and publicly present it, and everyone will know its weakness. Then new ciphers will be worked on, and that's exactly how cryptography advances. Older ciphers become obsolete and get replaced by newer, better ciphers that don't have the same weaknesses as the older ones. One day AES may also get replaced as weaknesses are found but, in the meantime, it's unlikely that there is a secret esoteric weakness that only our dumb government knows about but the many very smart people that scrutinize these ciphers can't yet figure out.
Re: Re: Re:
For computational problems where a loosely coupled system is useful, anybody who can build a community of supporters can gain use of more computing power than the largest supercomputer. In the Internet age, inspirational leadership and a willingness to work in a very open fashion are the key to obtaining massive computational and even human resources.
Re: Re: Re:
If you are referring to something like Stuxnet, it should be noted that there is a huge difference between being able to find a specific software vulnerability and exploit it to install a worm and being able to crack what underlies a huge percentage of all computer security.
That specific vulnerabilities exist here and there is no surprise. Big deal. There isn't a huge widespread effort to crack every little vulnerability that every computer system may have and expose it, and computer systems are constantly evolving as software changes, so new vulnerabilities are always being introduced.
On the other hand being able to crack AES or RSA would be huge as it would render so many of our security systems vulnerable. So there is a much larger widespread effort trying to crack these and to ensure that they are secure.
Re: Re: Re: Re: Re:
As stated, this is strongly discouraged. Private, secret encryption ciphers have a high probability of being weak because they didn't undergo the test of time and the test of widespread public scrutiny. If they want to use their own private encryption ciphers they can do so at their own risk, but I would much rather stick with something tried and true. Trust me, the government would be wise to do the same, and they very well know it. They learned from their mistake when it came to DES, which is why they now adopt AES, a tried and true cipher.
Will weaknesses later appear? Possibly. But because it's a widespread standard, chances are that when a weakness does appear someone will notice it and publicize it, and it will be widely known that it's time to upgrade to something new. The fact that there are many eyeballs scrutinizing it is what makes it ideal.
Re: Re: Re: Re: I just find it an incredible argument to go to your adversary (the government) to define the encryption scheme (AES) to protect yourself from that same government.
If you knew enough about cryptography to be able to make comments worth paying attention to, you wouldn't be making the comments that you are making.
Re: Re: Re: Re: Re: Re: I just find it an incredible argument to go to your adversary (the government) to define the encryption scheme (AES) to protect yourself from that same government.
Unless you're a world-class crypto expert (and maybe even then), you can't possibly come up with a scheme that's more secure than one that has been vetted by dozens of true crypto experts (many of whom do *not* work for your adversary).
There are lots of techniques for cracking crypto, which, unless you're an expert, you've never heard of.
Re: Re: Re: Re: Re: Re: Re: Re: I just find it an incredible argument to go to your adversary (the government) to define the encryption scheme (AES) to protect yourself from that same government.
There's no reason to have to 'trust' anyone. The AES encryption specification is completely and fully public; it was chosen from among many candidates in an open competition some years ago whose goal was to choose the next encryption standard.
There's nothing stopping you from learning how AES works - there are tons of resources on the 'Net, nothing stopping you from learning the math, from learning how 8-, 16-, 32-, and 64-bit CPUs generally work and how they differ, and why this was one of the reasons AES was chosen.
And there's nothing stopping you from learning and (hopefully) understanding why pretty much all cryptographers (even those who are criminals and anarchists) think that AES is a good scheme. Cryptographers have been pounding on, digging into, scratching, and trying to break AES for years.
Not using AES because people who wear suits and ties use it is a really poor basis for decision-making about, well, anything.
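For what it's worth, using the public, vetted standard is also the easy path in practice. A minimal sketch (assuming Python's widely reviewed cryptography package; not anything Signal or the Senate actually runs): the algorithm, the mode, and this entire snippet can be public, and none of its security depends on hiding how it works.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 32 random bytes; the only secret
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # unique per message, never reused with this key

ciphertext = aesgcm.encrypt(nonce, b"meet at noon", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"meet at noon"
```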
Re: Re: I just find it an incredible argument to go to your adversary (the government) to define the encryption scheme (AES) to protect yourself from that same government.
Do you really think they could secure all "their" infrastructure with something "more secure" without the whole world knowing?
Here's how it's done:
https://en.wikipedia.org/wiki/Kerckhoffs%27s_principle
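In code, Kerckhoffs's principle looks roughly like this (a sketch assuming Python's cryptography package; the key names are made up): the cipher and the whole program are public, and an attacker with everything except the key still gets nowhere -- a wrong key is rejected outright.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.exceptions import InvalidTag

real_key = AESGCM.generate_key(bit_length=128)    # the only secret in the system
wrong_key = AESGCM.generate_key(bit_length=128)   # what an attacker might guess
nonce = os.urandom(12)

ct = AESGCM(real_key).encrypt(nonce, b"attack at dawn", None)

try:
    AESGCM(wrong_key).decrypt(nonce, ct, None)
except InvalidTag:
    print("wrong key: decryption refused; tampering would also be detected")
```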
It's all a show, no matter which perspective you have on it.
Re:
For what it's worth, here is my vision of a secure world:
Pretty much every processor now has a SIMD unit, even tiny little processors on cheap phones and such.
These SIMD units can encrypt and protect data INSIDE the CPU (before it travels anywhere) and only write ENCRYPTED DATA and ECC to memory. Then, this encrypted and protected data chunk can travel wherever it likes. It can be used, abused, corrupted, whatever. However, in the future, when you need it again, you retrieve whatever you get, decrypt it, validate it, and use it, knowing it is correct data with a verifiable measure of certainty.
Encryption for everyone, everywhere, all the time, for almost no cost. Well programmed, these SIMD units, inside the CPU, burn almost no resources, because they are so inherently parallel and optimized to do just this.
A protected world.
Amen. :)
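The in-CPU, SIMD part of that vision is beyond a short sketch, but the encrypt-before-it-travels, validate-on-retrieval flow can be approximated in ordinary software. A rough analogue (hypothetical seal/unseal helpers, Python's cryptography package assumed; the GCM tag plays the detection role the comment assigns to ECC, though it detects corruption rather than correcting it):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

KEY = AESGCM.generate_key(bit_length=256)

def seal(block: bytes) -> bytes:
    """Encrypt and integrity-protect a block before it leaves the process."""
    nonce = os.urandom(12)
    return nonce + AESGCM(KEY).encrypt(nonce, block, None)   # nonce || ciphertext || tag

def unseal(sealed: bytes) -> bytes:
    """Decrypt a block returning from untrusted memory/storage; raises InvalidTag if corrupted."""
    return AESGCM(KEY).decrypt(sealed[:12], sealed[12:], None)

stored = seal(b"page of sensitive data")   # only this opaque blob is ever written out
assert unseal(stored) == b"page of sensitive data"
```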
Re: Re:
Say, for example, that GWiz and I whipped up a kernel driver for Linux that essentially encrypted and protected both the DRAM memory system and the external storage, all the time, with no appreciable performance impact. That is, you would gain the benefits of ECC memory and Erasure Coded RAID using standard memory and standard storage on everything from cell phones to servers.
The question is: Do you think there is some type of hybrid Open Source + Pay for Something mode that could work in this market segment? For example, offering weaker encryption or protection for free systems, and stronger encryption and protection for pay for systems? Or something like that?
I really am interested in your opinion, and could well consider lunch with you in the future.
Re: Re: Re: Re: Re: Re: Why do you need one?
Some people like this because it goes back to the proofs of the encryption scheme to prove the pseudorandomness. Others prefer to base their PRNGs on proofs specifically made for random number generators.
Don't forget that you still need to seed any PRNG from a good source of initial randomness.
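A minimal sketch of the idea (the class name AesCtrPrng is made up, and this is an illustration rather than a vetted DRBG such as NIST's CTR_DRBG): run AES in CTR mode over zeros to get a keystream, and seed the key from a real entropy source such as os.urandom.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

class AesCtrPrng:
    def __init__(self, seed=None):
        key = seed if seed is not None else os.urandom(32)    # the seed is the secret
        cipher = Cipher(algorithms.AES(key), modes.CTR(b"\x00" * 16))
        self._keystream = cipher.encryptor()

    def random_bytes(self, n: int) -> bytes:
        # Encrypting zeros under CTR yields the raw keystream, which is
        # pseudorandom as long as AES behaves like a secure block cipher.
        return self._keystream.update(b"\x00" * n)

prng = AesCtrPrng()
print(prng.random_bytes(16).hex())
```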
Re7: Why do you need one?
Of course -- exactly right! And here you go:
https://xkcd.com/221/
http://dilbert.com/strip/2001-10-25
And there's also this gem:
Anyone who attempts to generate random numbers by deterministic means is, of course, living in a state of sin.
-- John von Neumann
Re: Re: Re: Re: Re: These SIMD units can encrypt and protect data INSIDE the CPU (before it travels anywhere) and only write ENCRYPTED DATA and ECC to memory.
"Sounds good. Do you have a trustworthy source of random numbers?"
To which you replied:
"Why do you need one?"
This is why I said you lack knowledge about how present day encryption is performed. One needs a random number to seed whatever algorithm is being used.
The Senate may not grasp encryption issues, but I'm sure they can still master intrinsic angular momentum.
But but but.....we are special
Are you kidding? They will have no difficulty at all in creating a special exemption for themselves.
Re: Open Source Security Question
The basic adage of encryption is to assume that the attacker knows all details of the system, and that only the key is secret. Unless you can ensure that the attacker cannot get access to a working system, and/or exfiltrate the source code, they will have details of how the system works.
In particular with respect to encryption, peer review is essential, and open source gets more peer review than closed source because the peers choose themselves. This usually means that there are people pounding on the code long before it gets widespread use, while with closed source, this pounding usually takes place after it gets into widespread use.
While, as ever, no approach is perfect, the open source approach increases the chances of flaws being found before there is widespread use. Further, when a live exploit is in use, finding the bug being exploited is the hard part, and within open source there are many more people available to go looking for it. This is part of the reason why open source reaction times to exploits are measured in hours, rather than months.
Re: Re: Re: Open Source Security Question
With modern open source encryption systems, you do not need to worry too much about the encryption, but rather much more about keeping software up to date, and managing your keys in a secure fashion. Currently the biggest threat is not a compromised encryption system, but rather a compromised operating system letting spyware in.
Re: Re: Re: Re: Open Source Security Question
Meanwhile, over in Building 5300, the NSA succeeded in building an even faster supercomputer. “They made a big breakthrough,” says another former senior intelligence official, who helped oversee the program. The NSA’s machine was likely similar to the unclassified Jaguar, but it was much faster out of the gate, modified specifically for cryptanalysis and targeted against one or more specific algorithms, like the AES. In other words, they were moving from the research and development phase to actually attacking extremely difficult encryption systems. The code-breaking effort was up and running.
The breakthrough was enormous, says the former official, and soon afterward the agency pulled the shade down tight on the project, even within the intelligence community and Congress. “Only the chairman and vice chairman and the two staff directors of each intelligence committee were told about it,” he says. The reason? “They were thinking that this computing breakthrough was going to give them the ability to crack current public encryption.”
In addition to giving the NSA access to a tremendous amount of Americans’ personal data, such an advance would also open a window on a trove of foreign secrets. While today most sensitive communications use the strongest encryption, much of the older data stored by the NSA, including a great deal of what will be transferred to Bluffdale once the center is complete, is encrypted with more vulnerable ciphers. “Remember,” says the former intelligence official, “a lot of foreign government stuff we’ve never been able to break is 128 or less. Break all that and you’ll find out a lot more of what you didn’t know—stuff we’ve already stored—so there’s an enormous amount of information still in there.”
Re: Re: Re: Re: Re: Open Source Security Question
Could another key-weakening trick, like the promotion of selected elliptic curves, happen? Well, yes, of course it could, but specific suggestions like that will be viewed with more suspicion going forward. Elliptic curve cryptography is still used; it is now known that some curves make it easier to attack, but then all cryptography based on more complex maths may turn out to have such a weakness. Such attacks, however, are hard to find, and so only turn up rarely. Also, they tend to be of limited use, bringing the time to decode a message down to a level where it is useful for selected messages, but nowhere near fast enough for general surveillance.
Is open source encryption invulnerable to introduced weaknesses? No, but they will have to be subtle and hard to find in the mathematical sense, and found by someone who will keep them secret rather than publishing for academic glory. Also, code bugs will occur, but here the open source community can usually respond with a patch to fix the issue within hours.
With a proprietary binary software model, even if you can examine the source under an NDA, there is no way to check that it is the code running on your system.
Re: Re: Re: Open Source Security Question
Of course not, but that is a silly thing to say.
"Even a slight obfuscation of data on top of a known good system makes it stronger and not weaker, right?"
Wrong. There have been many papers written on this subject; you don't need me telling you this.
Security by obscurity is not very good security at all; it might stop pimple-faced kids in mommy's basement, but it will not stop knowledgeable and motivated personnel.
Re: Re: Re: Re: Re: Open Source Security Question
It was not stated that a closed system would be "easier to attack".
Stating that a closed system is not more "secure" than an open one does not imply that it is easier to attack. There are many vectors and tools with which code can be compromised, being able to peruse the source might be interesting but it does not make cracking encryption of modern systems any easier.
Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Open Source Security Question
I think this speaks more directly to what you meant, right?
Better to be direct when speaking with others in public and in writing; it saves everyone time.
:)
Re: Re: Re: Well, access to the source code does not INCREASE the level of security, right?
Closed source can possibly contain exploits, both intentional and unintentional, in addition to the much coveted backdoors. All this without your knowledge, whereas with open source one has a community with various goals who routinely review the source code and make updates. The exploits and backdoors would be soon found and eliminated ... or so the story goes. There have been attempts, some successful, to circumvent this but they were soon found and stopped - hopefully. Anyways - closed systems do not have this feature and in fact, some profit driven closed source systems have been found to contain intentional exploits, these were not put there for your benefit.
Not sure how your so-called experts benefit from open source over closed; perhaps you could expound.
Re: Re: Re: Re: Re: Well, access to the source code does not INCREASE the level of security, right?
Like I said previously, there have been many papers on this topic already - I doubt I can add anything to them, as it has been beaten to death.
My only advice is to keep an open mind.
Many who offer software for your benefit do not actually give a shit about you, your well-being, or even whether said software provides a valuable service - all they care about is ripping you off, and that has become far too easy.
Re: Open Source Security Question
If you want to toss something completely new into the market, though, open source doesn't make it magically more secure out of the gate, any more than big-money closed source development does. Many eyes, especially the more qualified ones, over time, are what help secure your source. Which also goes for your algorithms / novel theory.
Then you (or rather vendors using your system) have to make sure they don't bork it in their implementation of your implementation. Which was the weak spot several times with quantum crypto tools.
If this is all happening ultralocally inside a processor or device, it is less likely to be cracked until the attacker has possession. And you had been mentioning governments...
Many eyes, good eyes, over time. That is the security point of open source. It is only theoretical unless that happens, though. But a truly secure system should be secure regardless of who has the source. With closed systems, you don't know how well it was done in the first place, you certainly don't know that many people checked it, you don't know who may have gotten hold of the source, and... since closed source counts on being closed for security, that is a huge weakness. It should be negligible for security reasons whether the source is closed or not; it certainly should not be counted on as a security factor. (And some seem to depend on that as the main bit of security, sadly.)
Re: Re: Re: Open Source Security Question
Really?
Based upon what?
Re: Re: Re: Open Source Security Question
Indeed, they are so sophisticated that they have overplayed their hand and convinced everybody to use encryption for ordinary tasks, like reading the news. Also, they are so sophisticated that all the data that they collect only provides them with the information to work out what happened after an attack has taken place.
So tell me again how sophisticated governments are, when they are trying to use the law to get private companies to invent and install backdoors for them.
Re: Re: Re: Re: Re: Open Source Security Question
Also remember that an arrogant person can believe that they or the organization they lead knows everything of relevance for assessing the strength of a cryptography system. A sophisticated person knows that there are huge gaps in their knowledge, and that the way to deal with that is to open the encryption system to examination by anybody who cares to look at it.
Also, classified does not equal kept secret from those you most wish to keep it secret from, as spies exist. This is the second reason why obscurity does not do anything for security, other than instill a false sense of confidence.
Hypocrites often do not see themselves as such and frequently project.
Re: Re: A question for you enryptions theoreticians
Same cycles, 3 tasks that all contribute to the protection, encryption and compression of the data.
That's the basic idea - what do you think?
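As a sketch of that "same pass" idea (hypothetical pack/unpack helpers; zlib and Python's cryptography package assumed): compress first, since ciphertext doesn't compress, then encrypt; the authentication tag supplies tamper detection, though not forward error correction.

```python
import os, zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)

def pack(data: bytes) -> bytes:
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, zlib.compress(data), None)

def unpack(blob: bytes) -> bytes:
    return zlib.decompress(AESGCM(key).decrypt(blob[:12], blob[12:], None))

blob = pack(b"log line " * 1000)            # compressed, encrypted, integrity-tagged
assert unpack(blob) == b"log line " * 1000
```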
Re: Re: Re: Re: A question for you enryptions theoreticians
My suggestion is that the opposite needs to happen. If the application has the ECC encoder and decoder, nothing else actually needs one. The result is simpler, more configurable, more measurable, and as open source, more verified (hard drive and flash vendors guard their ECC techniques). All good, right?
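A toy example of application-level protection in that spirit (entirely illustrative helper names; real systems would use Reed-Solomon or another proper code): RAID-4-style XOR parity over fixed-size chunks, which can rebuild any single lost chunk.

```python
from functools import reduce

def xor_all(blocks):
    # Byte-wise XOR of equal-length byte strings.
    return bytes(reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks))

def add_parity(chunks):
    return xor_all(chunks)                        # parity = XOR of all data chunks

def recover(chunks, parity):
    missing = [i for i, c in enumerate(chunks) if c is None]
    assert len(missing) == 1, "XOR parity can only rebuild a single missing chunk"
    survivors = [c for c in chunks if c is not None] + [parity]
    chunks[missing[0]] = xor_all(survivors)       # XOR of survivors restores the lost chunk
    return chunks

data = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]
parity = add_parity(data)
damaged = [b"AAAA", None, b"CCCC", b"DDDD"]       # one chunk lost in storage/transit
assert recover(damaged, parity)[1] == b"BBBB"
```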
Re: A question for you enryptions theoreticians
Such systems are not easily cracked but it requires both ends to have the same "matrix".
Why do you reply to yourself all the time?
Senator says..
For *other* people, that is.
NSA Recommends against adopting AES (Suite B)
"Until this new suite is developed and products are available implementing the quantum resistant suite, we will rely on current algorithms," officials wrote. "For those partners and vendors that have not yet made the transition to Suite B algorithms, we recommend not making a significant expenditure to do so at this point but instead to prepare for the upcoming quantum resistant algorithm transition."
https://arstechnica.com/security/2015/08/nsa-preps-quantum-resistant-algorithms-to-head-off-crypto-apocolypse/
Re: NSA Recommends against adopting AES (Suite B)
"Out of all the programs that have been leaked by Snowden, the Bullrun Decryption Program is by far the most expensive. Snowden claims that since 2011, expenses devoted to Bullrun amount to $800 million. The leaked documents reveal that Bullrun seeks to "defeat the encryption used in specific network communication technologies".[6]"
So, the government does not use or even recommend publicly disclosed algorithms for very sensitive information (like their own). And they are spending (likely) billions to break publicly known standards.
Doesn't this argue pretty firmly that open standards are weaker than closed ones, even according to the NSA? I know there are "many papers" that multiple posters have referred to, but how about considering the facts of the matter?
Closed source encryption is better, right? This is a "by demonstration" example by the US Government. The real question is "can you afford it", which they obviously can.
Is this a well known argument against public encryption standards?
Re: Is this a well known argument against public encryption standards?
Making obscurity part of a cryptographic system's security is a fool's errand, as sooner or later attackers will get hold of the details of how the system works. Just look at how quickly DRM is broken, and it relies on obscurity as part of its security measures.
Re:
Actual value of free (open) encryption software < paid (closed) encryption software
This is demonstrated publicly by the US Government (NSA Suite A cryptography vs. NSA Suite B cryptography). They keep the important one secret. I think your argument is that the government is a fool. I dunno, maybe not.
You get what you pay for. Usually more true than not. :)
Re: Re:
Type A crypto-system (osos) is subject to automated attacks; that is, if the underlying crypto is ever cracked, everyone who used it is exposed, and everything they ever transferred over the public network may be revealed.
Type B crypto-system (nosnos) requires a crypto-analyst to focus on this one code, which he would never do unless paid a lot of money. There is no public cred for breaking a private crypto-system. Money is the only way to do it.
Type A: Free to use, big financing in place to break.
Type B: Pay to use, no financing in place to break.
Pay now or pay later.
Re: Re: Re:
Type A (OSS) - you know for a fact someone, somewhere is trying to break it.
Type B (Proprietary) - you have no idea whether someone is trying to break it.
Proprietary "crypto systems" are only useful for internal messages (think company e-mail). As soon as you use it for public-facing applications it's like issuing hackers a challenge invitation.
Re: Re: Re:
1) an individual or small group of people can only know a limited amount of advanced mathematics, and so are unable to get close to saying there are no known attack vectors.
2) Closed source, or even chip, implementations of crypto can be reverse engineered, and so attackers know how the system works.
Also note that it is all too easy to design a crypto system where message analysis alone will enable it to be broken.
Obscurity is a bad approach to cryptography, as the assessment of the system is based on the limited knowledge of one or two people, and they just cannot know enough to carry out that assessment. Also, the way the system works cannot be kept secret when an implementation is made available to the public.
The failure of obscurity is made abundantly clear by the failure of any DRM system to hold up for more than a few months of attacks by amateurs.
Re: Re: Re: Re:
Open Source is Free Crypto, but for a Limited Time. Little doubt it will be cracked; the real question is when, and whether that date is in the future or the past.
It will be interesting to see whether the anti-encryption narratives advanced by Sens. Feinstein and Burr (in particular -- although others are equally sympathetic) continue now that senators can officially begin using an encrypted messaging system for their own communications.
Of course they will. After all, the unwashed masses don't deserve privacy. :)