Hacker News

How do you conclude that any TrueCrypt version is secure in light of this disclosure?


Because this happened to 7.2. Chances are, this is a warrant canary, and people will take the last known good 7.1 (i.e. the one that is being audited) and build their forks from there.


1. 7.2 just disabled things. It didn't introduce a vulnerability.

2. Warrant canary makes no sense. There is nothing for a warrant to grab.

3. Forking is legally troublesome. Just because you can see the source doesn't mean you can distribute the source.


>1. 7.2 just disabled things. It didn't introduce a vulnerability.

I had a look at the diff; I have neither the time nor crypto-specific skill to do an audit, but there are plenty of code changes that aren't warnings in there.

>2. Warrant canary makes no sense. There is nothing for a warrant to grab.

No, but they could have theoretically been asked to introduce a backdoor from a Lavabit-style order.

>3. Forking is legally troublesome. Just because you can see the source doesn't mean you can distribute the source.

Do you think that will stop people? Even if there is some legal issue, the Streisand Effect always overcomes it. So what if a TrueCrypt fork can't use Github because of that issue? Is the world suddenly lacking good hosting providers, perhaps in Switzerland or similar? Has everyone forgotten how to set up a public git repo themselves? Somehow, I think not. So what if many devs will probably have to contribute anonymously? With a product such as TrueCrypt, they probably should anyway.
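For what it's worth, self-hosting a public read-only clone really is a few commands. A minimal sketch using git's built-in mechanisms (the "dumb" HTTP transport and the git daemon); the paths and the repo name are made up for illustration:

```shell
# Hypothetical setup: host a public, read-only git repo yourself.
mkdir -p /tmp/srv && cd /tmp/srv
git init --bare truecrypt-fork.git   # bare repo to serve from
cd truecrypt-fork.git
git update-server-info               # write info/refs for dumb-HTTP cloning
# Put any static web server in front of /tmp/srv and people can:
#   git clone https://example.org/truecrypt-fork.git
# Or use the native protocol (read-only by default):
#   git daemon --base-path=/tmp/srv --export-all
```

The point being that hosting is the easy part; a static file server is enough for the dumb transport.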


> been asked to introduce a backdoor from a Lavabit-style order

This makes no sense. Lavabit was compelled to turn over evidence it told the government it had, which is straightforward law. There is nothing "Lavabit-style" about "distribute a back door or else." You would need explicit legislation to allow that. If the developer's cat were kidnapped to force him to put in a backdoor, there would be nothing "Lavabit-style" about that either.

You do hedge with the word "theoretically," but "theoretically" this could be a message from the aliens.

> Do you think that will stop people?

It will stop the smart people, and you need smart people to work on it. Otherwise you would only be allowed to distribute it by illicit means, and you could never trust it. It would be as trustworthy as warez sites. Why even bother with that nonsense?

Much saner to just build something new, having learned from TrueCrypt's experience. For example, something that starts from the command line and then gets a GUI on top, instead of the reverse.


> "There is nothing "Lavabit-style" about "distribute a back door or else." You would need explicit legislation to allow that."

http://en.wikipedia.org/wiki/Bullrun_(decryption_program)

No legislation authorizes that program, among many others. I think what the parent meant by "Lavabit-style" was the "compromise your users or face legal action" move. Turning over customer data and introducing a backdoor are both means to the same end, as far as the NSA is concerned.


Is there something at that Bullrun link that shows the USG uses legal methods (or even illegal methods) to force product makers to insert backdoors into their equipment against the product makers' will?[1]

The NSA inserting backdoors into a product without the cooperation, or perhaps even knowledge, of the vendor -- while troubling for any number of reasons -- is vastly different, especially when giving advice to developers about how to stay within the bounds of the law. If the metagame becomes "the government can legally force you to insert backdoors into your product," any developer faced with this threat might believe it, when he should know it's bunk.

[1] RSA allegedly got paid $10 million to make a change the NSA wanted. RSA customers should demand an answer, but that's not forcing RSA. People get paid to do things all the time.


Not there, but in the historical record for sure.

>Hushmail stated that the Java version is also vulnerable, in that they may be compelled to deliver a compromised java applet to a user.

>http://en.wikipedia.org/wiki/Hushmail#Compromises_to_email_p...

>Hushmail turned over cleartext copies of private email messages associated with several addresses at the request of law enforcement agencies under a Mutual Legal Assistance Treaty with the United States; e.g., in the case of U.S. v. Tyler Stumbo.

>if a court order has been issued by the Supreme Court of British Columbia compelling us to reveal the content of your encrypted email, the "attacker" could be Hush Communications, the actual service provider.


Wow, Hushmail was pretty silly with their claims:

>the company that provides Hushmail, states that it will not release any user data without a court order from the Supreme Court of British Columbia,

This is malarkey. Someone in the US could say "I'll fight all the way to the Supreme Court!!!" but you would be a fool to trust your business to their determination. Especially if they say it about a subpoena, which means they haven't even retained a lawyer to ask about this. (If your business plan depends on being able to wage a legal battle, you really shouldn't be scrambling through the yellow pages for a lawyer when you get your first subpoena.)

Back to the topic, I'll have to point out that this still isn't evidence of a company being required to backdoor a product. Hushmail, the same company that thinks it can fight a subpoena for third-party data all the way to the Supreme Court, said "well, we might be compelled to backdoor our product." This is just more repeating of the meme without evidence. It's unfortunate because some developer who remembers Hushmail might take their ill-informed legal opinion as reality.

Of course, Hushmail had access to cleartext copies of the messages. That's the killer. The government has the right to evidence about third parties in your possession. (Canada derives from British law tradition like the US. The government's right to all evidence is a concept that goes back centuries. If you can show that Canada broke from this tradition I would be most interested.)


> No legislation authorizes that program, among many others.

No legislation forbids it either. If the NSA is legally able to search or seize a given piece of information, why would you think they should not be allowed to forensically analyze it?



