
This stuff happens more than anyone in infosec wants to admit; it's (ironically) what got me into professional software security to begin with, after being upset by what a commercial network monitoring tool would have allowed us to do to our customers at an ISP I helped run.

It's especially funny to see a government-sponsored telecom reaching out to Moxie Marlinspike. Also: this isn't like that time a random Microsoft recruiter accidentally hit ESR and he wrote the "Your Worst Nightmare" post about it. Like Moxie says, money buys technology, and they will eventually find someone to rig up a workable solution for what they're trying to do.

Moxie: one thing that would be hugely helpful is a quick list of the things you did that make you confident in Twitter's TLS code (which: thanks for doing).



In this case I was mostly referring to the inclusion of certificate pinning (e.g. https://github.com/moxie0/AndroidPinning) in the mobile apps, which would theoretically prevent them from using a UAE- or Saudi-controlled CA to do the interception. In addition to iOS and Android, we also refused to compromise with low-end platforms like MediaTek, and made sure those clients were also all-TLS and that they employed certificate pinning. We also did common-sense stuff like including assertions in all the platforms that would prevent accidental HTTP leakage.
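To make the mechanism concrete, here is a minimal sketch of the idea behind certificate pinning (illustrative only, not the AndroidPinning API; the key bytes and helper names are stand-ins): the client ships with a hash of the expected public key, and a chain is accepted only if one of its keys matches a pin, so a certificate minted by a coerced CA fails even though it would pass ordinary chain validation.

```python
import hashlib

def spki_pin(spki_der: bytes) -> str:
    """A pin is the hex SHA-256 of the DER-encoded SubjectPublicKeyInfo."""
    return hashlib.sha256(spki_der).hexdigest()

def chain_is_pinned(chain_spkis, trusted_pins) -> bool:
    """Accept only if some key in the presented chain matches a baked-in pin.
    A coerced CA can mint a chain that passes normal validation, but it
    cannot produce a certificate containing the pinned key."""
    return any(spki_pin(spki) in trusted_pins for spki in chain_spkis)

# Stand-in key material; real code would extract the SPKI bytes from the
# X.509 certificates presented in the TLS handshake.
server_key = b"stand-in server SPKI"
rogue_key = b"stand-in rogue-CA SPKI"
pins = {spki_pin(server_key)}

assert chain_is_pinned([server_key], pins)      # genuine chain accepted
assert not chain_is_pinned([rogue_key], pins)   # intercepting chain rejected
```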

More generally, a lot of common sense effort was put into the general TLS posture of the website. From certificate pinning baked into the browsers, to HSTS headers, to making sure the links in search results are https.
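For instance, the HSTS piece of that posture is just a response header telling browsers to refuse plain-HTTP connections to the host for a given period. A hypothetical sketch of emitting it (the header name and directives are real; the helper function is made up):

```python
def hsts_header(max_age_days: int = 365, include_subdomains: bool = True) -> str:
    """Build a Strict-Transport-Security value: browsers that have seen it
    will upgrade or refuse non-HTTPS requests for max-age seconds."""
    value = f"max-age={max_age_days * 86400}"
    if include_subdomains:
        value += "; includeSubDomains"
    return value

# e.g. attached to every HTTPS response the site serves:
headers = {"Strict-Transport-Security": hsts_header()}
```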

There was a bunch of generally nice TLS stuff in the pipe as well, but I'm not sure if it shipped yet.


If you really want the world to be a more secure place, can I please ask that you relicense the AndroidPinning code as BSD or something less viral than GPLv3?

I don't see Instagram, Facebook, etc. using that code to secure their apps; they won't license their Android clients as GPLv3 just to use the pinning library. While it would be easy enough to re-create your code (though I have not looked at it), given that we're talking about encryption libraries, it's always nicest to have secure, vetted libs that just work.

(As a matter of fact, I'm sure you had to relicense it for Twitter to use it in their app.)


From the README:

>Please contact me if this license doesn't work for you.

I see no reason why Moxie should give Facebook and Instagram this valuable feature for free. When did open-source hackers become the unpaid laborers of Silicon Valley?

If they want it, they can either release the source code for their applications and liberate their users, or they can pay (hopefully) through the nose for it. Maybe that'll buy a few more months of TextSecure development, or whatever other cool things Moxie is doing now.


Facebook or Instagram will just reimplement it themselves if they care. Smaller developers will just remain insecure. GPLv3 harms adoption of something like this.


If you think that, go implement a MIT-licensed variant.


It's not that simple. You have often stated that normal programmers shouldn't be near security, and now you are stating that they should go implement something that is specifically meant to enhance the security of the web.

The GP isn't asking for a change of license because he hates the GPL; he is (probably correctly) predicting what will happen if that license isn't changed: specifically, the thing that Moxie is trying to prevent won't be prevented.


I don't say normal developers shouldn't be near "security"; I say they shouldn't be implementing cryptographic primitives.


Note: AndroidPinning is not a cryptographic primitive.


No, it isn't.


Facebook or Instagram could PAY to get a license other than GPLv3.

That something is GPL does not mean it cannot also be licensed as proprietary, for a fee, to those who don't want the limitations of the GPL.


We live in a world where, if the cost is too high for something like this, it will be written off as unnecessary. Unless there is someone really pushing for this from within, Facebook/Instagram/etc. probably won't implement something like this, or will just create their own (possibly poor) substitute.

I get the idea that people should be paid for their work, and it's his choice how he licenses it. On the other hand, if the point is to make sure this spreads as far as possible and gets used everywhere, then maybe a very permissive license is called for.


If he hadn't taken the time to publish this code, you wouldn't have even known to try to zing him for using the "wrong" license. Perhaps the most rational solution for people like Moxie would simply be to never publish their code, and simply continue to write forcefully and effectively about technical controls and privacy.

Then they wouldn't have to jump through silly hoops to prove whether they "really want the world to be a more secure place".

Or, how about this: if you really want the world to be a more secure place, why don't you take the time to learn how to implement certificate pinning for Android apps and publish your own MIT-licensed implementation? I'm sure Moxie would join the rest of us in cheering you on.


You don't have to be such an ass. I asked nicely enough.

I fully acknowledge Moxie is better at security than I will ever dream of being. I just hoped he might see the value in releasing it under a more amicable license. I don't have the numbers, but more liberal licenses are by a wide margin the choice for open-source crypto.

I'm not speaking from the armchair; I've released open-source code under BSD/MIT myself. I don't have Moxie's skill for security, but is it so wrong to point out the obstacle the license represents? He did release it to help secure the web, did he not? Why don't you let him reply.


Basically all of the software that I write for projects like this is GPL by default, but I generally include a note (as in this case) that developers should contact me if the license doesn't work for them.

I find this to be a good balance: those who wish to take my work and openly contribute their own work are free to do so, and those that don't need to contact the copyright holder. I don't think it's a lot to ask in this case, and I definitely don't think that licensing issues are what's holding back internet security here or otherwise.


I hardly think that calling Moxie's choice of license "viral" and questioning whether he "really wants the world to be a more secure place" is asking "nicely enough". Religious wars aside, I also would like to hear from one of the cryptography gods about licensing cryptography software, since ComputerGuru does have a good point when talking about software such as OpenSSH.


There's more value in forcing vendors to work with Free Software licenses than in compromising the ideals of open source to allow vendors to benefit without contributing back.

You should be asking yourself how you can change your project so that GPL3 licensed code will be acceptable, rather than asking others to relicense their code.


I humbly contend that forcing people to do anything in the name of preserving the purity of an ideology is a Bad Idea.


Not to mention, you can't really force them to do anything: they'll just avoid the GPL code, create it themselves, or find something similar under another license.


Authors are giving their work away, subject to restrictions of their choosing. There's no coercion involved.


I'm responding to the idea that ideological purity is the goal, not the author's right.


Isn't it amazing how the people who come out of the woodwork to point out the force inherent in the GPL never say, "oh, by the way, thanks for publishing a reference spec I'm free to use to develop my own code."

As Thomas said, people would be less bitchy, and less holier-than-thou (cough), had Moxie not written any code, or written it and charged an arm and a leg for it.

It's sort of what patio11 talks about: the cheaper the service, the worse people treat you.


I didn't say anything about Moxie.


> Why don't you let him reply.

The internet doesn't work that way.

> You don't have to be such an ass. I asked nicely enough.

No, not really. Would you have asked the creator of a closed source crypto library to give it away?

I used to agree with you that security software should be BSD-licensed to encourage use, but now I see it just encourages more low-end closed-source software.

If that software were open, users could know what they were using and could, with some work, actually be safe. But by trusting a closed-source app, especially one that can't afford to spend anything on security, they'll never be secure (see this article for proof), and thus are worse off than if they were knowingly only partially secure.

It sounds rough, but better that the mob steal some money because you used an insecure app, causing you to learn and audit your security requirements, than for you to feel secure until someone shows up and shoots you.


Fully agree. Securing an application is just part of the overhead of creating it. To expect people to hand these bits and pieces out seems a bit overboard, if not somewhat entitled. This stuff costs time, money, and effort to make. The author released it under GPL3. If you can't afford to shell out for it, you can use the code to reroll your own. There's plenty of documentation on the topic as well.


Agreed. The onus shouldn't just be on Moxie. We could easily flip the question around and put the onus on the companies mentioned: why don't Facebook and Instagram relicense their code as GPL to be compatible? Do they not want the web to be safe? Will they put "not having a GPL app" before "our users are safe"? Etc.


> I don't see Instagram, Facebook, etc. using that code to secure their apps, they won't license their Android clients as GPLv3

They might license the code under a non-exclusive license with different terms. I.e. the copyright holder is free to license the same source code under various licenses.

So, e.g. I could license some code to the community under GPL, but I could also license it closed-source to a corp for a fee.


> money buys technology

I don't think it matters. He quickly noticed that the problem is cultural, that

>> I’d much rather think about the question of exploit sales in terms of who we welcome to our conferences, who we choose to associate with, and who we choose to exclude, than in terms of legal regulations. I think the contextual shift we’ve seen over the past few years requires that we think critically about what’s still cool and what’s not.

But the problem with/in Saudi Arabia is also cultural, or social, not technological. It doesn't really matter that they can buy exploits or intercept communications. What matters is that those in power can stay in power while doing all that.

Mao and Stalin built some of the most repressive regimes the world has seen with 1930s technology, and even then they were behind the times. Do you think those would have been rocked by secure Twitter? On the other hand, Greeks ran fairly decent democracies when the closest thing to mass communications was shouting in a place with good acoustics.

I'm not saying the West should just provide the scum of the world with access to modern technology. Let's not kid ourselves, though. Whether we do or not, it won't change much.


The problem here in Saudi is indifference. People take it as a given: "Of course it's being intercepted" or "they know everything, don't even try." It's a debilitating indifference, to the extent that people around me are mystified as to why I keep a VPN connection up 24/7 on my desktop and mobile phone; why do I even bother? Even many techies around me think I am naive to be taking all these precautions. Resistance is futile.

PS: Mobily is my carrier. Discomforting. Maybe resistance is futile after all. Sigh.


It is possible to believe both things at the same time: that dictatorships will inevitably acquire exploits, backdoors, and monitoring tools, and that it's unconscionable for companies to sell these things to dictatorships.

The story is perhaps clearer on exploit markets. The alternative to markets is publication, which burns the vulnerability by hastening its patch deployment. Dictatorships will inevitably acquire more exploits, but they are in a race against everyone else discovering the same vulnerabilities.


I think there's a defensible case that in Saudi Arabia there are other parties operating in the country who are much worse for the rights of both Saudi citizens and humans elsewhere, and that selling the digital equivalent of arms to the Saudi government isn't inherently evil.

I'd sure rather deal with the current Saudi Government than with Al Qaeda. Yes, there are fairly bad elements within the government, and it is at best one of the more restrictive regimes in the world, but there are some alternatives that are worse.


> dictatorships will inevitably acquire exploits

That's not what I mean. Even if you somehow stop them from acquiring exploits, they will remain in power because it's not derived from subtle technological advantages.


Perhaps the way to phrase it is that the Soviet surveillance apparatus was an expression of power, just as the modern technological surveillance apparatus is an expression of modern power.

I think that this stuff matters, to the extent that I'd like to be in solidarity with those everywhere who are in a tension against authority. Not selling exploits is one small way that I can do that, and writing a blog post about it is one small contribution (I can hope) to creating a culture of doing that.


If the technological advantages do not help them stay in power, why do you think they would pursue them? And what 'advantage' would they be?


Why do people eat themselves into morbid obesity? Why did the Soviets reverse rivers? Why are American prosecutors trying to jail kids for sexting?

Is eating not obviously beneficial? Is there something wrong with large-scale engineering? Shouldn't we fight child pornography?

Drives, rules, and organisations outlive and outgrow their usefulness all the time. Why would surveillance be an exception?


It'll change plenty if we do. Oppressive regimes are taken down by conspiracies and secret communication. If they eliminate this ability to associate, with our assistance, there will never be any space for revolution, or even reform.

This logic reeks of the law of averages: "I might as well swim over Niagara Falls because I could die any day, even from crossing the street. If I die today, it was just my day to die."

Of course, I have less of a reply to "Even if I don't sell it to them, someone will." The middle class finds it very easy to rationalize behavior that will keep the consumption flowing.


> Like Moxie says, money buys technology, and they will eventually find someone to rig up a workable solution for what they're trying to do.

Governments are in a unique position here. They can always just move up the stack. Can't break the crypto? That's fine. They can just require the mobile phone companies to sell phones with spyware already included.


That problem is, I think, a showstopper for "anti-circumvention" tools like whatever-the-next-generation-of-Tor will be. Dictatorships have little to lose by backdooring or rootkitting devices; they'll laugh off any outrage stirred up by the discovery of these methods.

But the economics flip around in Europe, Japan, the US, &c: governments there do have something to lose by surreptitiously backdooring huge numbers of devices, and the odds are good that any efforts to do so will be detected (the state of the art for reverse engineering now includes decapsulation and imaging of electronics packages).


> That problem is, I think, a showstopper for "anti-circumvention" tools like whatever-the-next-generation-of-Tor will be. Dictatorships have little to lose by backdooring or rootkitting devices; they'll laugh off any outrage stirred up by the discovery of these methods.

Well until something like the DIY Cellphone gets more traction to deal with backdooring/rootkitting: https://webcache.googleusercontent.com/search?q=cache:http:/... (MIT Media Lab)


The US government is a special case again. Since most of the companies mentioned here are headquartered in the US, the US government can resort to the no-tech solution of just asking for the data and presenting a subpoena (or so was my experience working for a large US telecom carrier).


The difference is that a subpoena doesn't decrypt an EDH TLS session.
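The reason is forward secrecy: with ephemeral Diffie-Hellman, the session key is derived from per-connection secrets that both sides throw away, so neither long-term keys nor stored records — the things a subpoena can compel — recover a past session. A toy sketch of the exchange (illustration only; the prime and generator here are nowhere near real TLS parameters):

```python
import secrets

# Toy ephemeral Diffie-Hellman. p is a (Mersenne) prime, g a small base.
p = 2**127 - 1
g = 3

a = secrets.randbelow(p - 2) + 1   # client's ephemeral secret, fresh per session
b = secrets.randbelow(p - 2) + 1   # server's ephemeral secret

A = pow(g, a, p)                   # only these public values cross the wire
B = pow(g, b, p)

client_key = pow(B, a, p)          # both sides derive the same session key
server_key = pow(A, b, p)
assert client_key == server_key

# After the handshake, a and b are wiped. A recorded transcript (A, B,
# ciphertext) plus the server's long-term key yields nothing to hand over.
del a, b
```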


But it does decrypt the data at rest.


It could be argued that the scandal in Germany proves even European governments don't have a lot to lose by backdooring devices beyond what the law permits. Admittedly, the number of backdoored devices was probably low, but the government did seem to act unlawfully.

http://www.spiegel.de/international/germany/the-world-from-b...



