Hacker News | ott2's comments

Thanks for a sensible explanation of the actors' possible rationale. Perhaps a different way to distil your point: the film industry is stuck in a particular local optimum, and one needs to take the actors' positions into account when assessing their actions.


Your first point is false -- in some countries bauxite is processed with power from coal-fired plants. Your other points seem valid.


That's madness or desperation. Sensible people ship bauxite as far as necessary to find cheap energy.


Since you claim not to be a statistician, pardon me if I also ignore your unsubstantiated opinion that the null hypothesis was "so shitty that they practically guaranteed p<.05".


Hey, you don't need to be a statistician to understand the "data mining" fallacy. If you look at 100 different null hypotheses and set p<.05 as your threshold for rejecting, then you're going to find ~5 that you can reject at p<.05 purely by chance. Duh. Monkeys typing Shakespeare and all that.
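The multiple-comparisons point is easy to demonstrate: under a true null hypothesis a p-value is uniformly distributed on [0, 1], so a quick simulation (a sketch, not tied to any particular study) shows roughly alpha * n "significant" results appearing by chance alone.

```python
import random

def false_positives(n_tests=100, alpha=0.05, seed=42):
    """Simulate testing n_tests hypotheses that are ALL truly null.

    Under the null, each p-value is uniform on [0, 1], so on average
    alpha * n_tests of them fall below the threshold by chance alone.
    """
    rng = random.Random(seed)
    p_values = [rng.random() for _ in range(n_tests)]
    return sum(p < alpha for p in p_values)
```

Averaged over many runs, the count hovers around 5 out of 100, exactly as the comment says.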


The takeaway of the article seems to be: when fighting limited-liability companies, being irrational is a good tactic. It is sad if guerrilla warfare is the only option left when fighting corporate actions one disagrees with.


Worryingly accurate metaphor: Unix is Christianity, complete with Linus as Luther.


TL;DR: the "creativity" label has been grabbed by the ruling classes to justify their own social position, and has little to do with what used to be called creativity. (This may be an interesting insight, but I'm not sure this argument is well supported by the article.)


There are at least three poles pulling against each other WRT stealing the creativity label. You've identified one; of the other two, one is masquerading technical incompetence as creativity, and the other (related?) is masquerading a fundamental lack of taste in style as creativity.

There probably are people/situations trying to do more than one at the same time.


There are a number of points in the article. Another is that books about creativity are simply another kind of management self-help "power of positive thinking" tripe.


You cannot know whether the date the content was published will be relevant for a reader or not. It is not your assessment of timeliness that matters, but the reader's. So in the interest of doing the right thing, don't obscure the date.


Your summary is worth a thousand web sites.


Can we make this saying a thing? I want it to be a thing.


Check the date -- the proposal is a month old. So Bruce Schneier's comments may be relevant, but one can't really accuse Brian Smith of ignoring them.


Good point; I missed that on the first pass. I look forward to an update from him or a comment on this from Schneier.


So, Mr Smith wants to add a bunch of elliptic curve options (using the NIST curves, weakest P-256 first), while removing the widely used TLS_RSA_WITH_AES_256_CBC_SHA256 due to "concerns" about "performance" and an unsubstantiated argument that ephemeral key exchange is somehow always better. Hmm.


Mr. Smith joins pretty much the entire mainstream of cryptography in urging people to switch away from RSA and towards ECC. The NIST P-256 curve is the most common ECC curve used. It was generated by picking a prime that is fast to compute with and hashing a string with SHA-1.


As djb pointed out in "Security dangers of the NIST curves", the SHA-1 step does not prove much. If the NSA knows a weak class of curves, it can try as many strings as it wants until the SHA-1 of some string hits a weak curve.
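djb's argument can be sketched as a simple search loop. Here `looks_weak` is a hypothetical predicate standing in for an attacker's secret knowledge of a weak curve class; the point is only that "the seed was hashed" does not constrain an attacker who can recognise weakness in the output.

```python
import hashlib

def find_malicious_seed(looks_weak, max_tries=10**6):
    """Illustrative sketch of djb's argument: if an attacker can
    recognise a weak curve class, a hashed seed proves nothing, because
    the attacker just searches seeds until the hash lands on a weak
    curve. `looks_weak` stands in for that secret knowledge."""
    for i in range(max_tries):
        seed = i.to_bytes(8, "big")
        digest = hashlib.sha1(seed).digest()  # NIST hashed the seed with SHA-1
        if looks_weak(digest):
            return seed
    return None
```

With even a moderately rare weak class (say 1 in 2^40 curves), a few CPU-years of hashing suffices, which is well within a large agency's budget.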


It doesn't prove much; you can always just generate a new P-256-alike with the instructions NIST provided for you in that document.


Who can generate them? Aren't the values fixed by the standards? Mustn't both clients and servers use the same parameters as long as they support the given standard?


The standard provides a standard set of curve parameters and a NIST-sanctioned way of generating new curve parameters using the exact same method.


How can current browsers and servers use any parameters other than the standard ones? I think they can't, am I right?

And how can browsers start to use any other parameters before those are standardised? I think they can't?


This is true but not particularly meaningful to me, because you can't really do anything new with crypto at all without some kind of software update. For instance, the primes and generator for conventional number theoretic DH are also pre-generated and baked into a standard.


So maybe we should generate our own curves. I propose something as follows:

1. Locate a public string. A tweet or a quote should suffice.

2. SHA-512 the string to obtain a seed.

3. Use that seed to generate the curve coefficient b, calculate the group order N = #E(Fp) = n * h (with n a large prime and h a small cofactor), and choose a base point P. Of course we need to ensure that these parameters are safe against known attacks.

4. Mandate that the new set of parameters MUST be supported wherever NIST prime curves are supported.

The last step is probably the most difficult. You don't need that if you don't need to interoperate with other implementations though.
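Steps 1 and 2 of the proposal above are easy to sketch; step 3 (deriving b, counting points, and checking the security criteria) needs a point-counting implementation and is only indicated in comments. The public string is a placeholder, and the reduction of the seed to a candidate b is one possible convention, not a standardised one.

```python
import hashlib

# Step 1: a public, hard-to-manipulate string (placeholder text here).
public_string = b"a well-known public quote goes here"

# Step 2: SHA-512 the string to obtain a 64-byte seed.
seed = hashlib.sha512(public_string).digest()

# Step 3 (sketched only): derive a candidate coefficient b from the
# seed, compute N = #E(Fp) with a point-counting algorithm such as SEA,
# and accept the curve only if n is a large prime, the cofactor h is
# small, and the known-attack checks (MOV degree, anomalous curves,
# twist security, ...) all pass.
p = 2**256 - 2**224 + 2**192 + 2**96 - 1   # the P-256 prime, for concreteness
b_candidate = int.from_bytes(seed, "big") % p
```

The virtue of this construction is that nobody can plausibly claim to have chosen the tweet or quote to hit a weak curve, which is exactly what the NIST seeds fail to establish.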


The only thing you want not to happen is for software to start generating and negotiating its own curves, because that then requires all interoperable implementations to parse and validate random curves from attackers.


No, I didn't say that everyone should generate their own curves. I meant that we in the security community should generate our own curves. Somebody should email Thomas Pornin.


There are already several alternative curve sets, satisfying various degrees of paranoia:

- http://certivox.org/display/EXT/CertiVox+Standard+Curves

- http://tools.ietf.org/html/rfc5639

- curve25519 and the other djb et al curves.


Sorry, I didn't mean to imply you were saying that.


TLS_RSA_WITH_AES_256_CBC_SHA256 is not forward secret (if you have the certificate private key you can passively decrypt all past/future sessions) so removing it is a great idea. I'm only responding to what you said and not making a judgement about the rest of the guy's suggestions.
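The forward-secrecy distinction can be sketched with a toy ephemeral Diffie-Hellman exchange (parameters far too small for real use): the session key depends only on exponents that are discarded after the handshake, so there is no long-term secret whose later compromise reveals recorded traffic.

```python
import secrets

# Toy parameters for illustration only -- far too small for real crypto.
P = 0xFFFFFFFFFFFFFFC5   # the largest 64-bit prime, 2**64 - 59
G = 5

def ephemeral_dh_session():
    """One forward-secret session: fresh exponents every time."""
    a = 2 + secrets.randbelow(P - 3)   # client's ephemeral secret
    b = 2 + secrets.randbelow(P - 3)   # server's ephemeral secret
    key_client = pow(pow(G, b, P), a, P)
    key_server = pow(pow(G, a, P), b, P)
    assert key_client == key_server
    return key_client                  # a and b are now discarded

# With TLS_RSA_* suites, by contrast, the client encrypts the premaster
# secret to the server's long-term RSA key; whoever later obtains that
# one key can decrypt every recorded session, past and future.
```

This is exactly why a passive recorder plus a single key compromise breaks every TLS_RSA session ever captured, while ECDHE/DHE sessions each require a separate attack.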


Beware of random cryptographers bearing suggestions? #HUMINT


I really hope TLS crypto standards won't be set (or changes rejected) based on who proposed a particular change.

Changes should stand on their own merits. I don't see why we should trust anyone, random or not. It's not about trusting people, it's about trusting algorithms. So it doesn't really matter who in particular proposes changes.


In an ideal world, yes; but if you get a lot of "contributions", some might slip through. Remember, no one can match the NSA's budget, manpower and maybe brainpower.


Some of the OpenSSH and IETF mailing list archives about SSH also make interesting reading. Exactly why does RFC 4253 mandate Group 1 and Group 14 as the only required key exchange mechanisms? (They were defined by RFC 2412, written by someone who was "assigned to the DARPA Information Technology Office".) However, perhaps it would be better to fork a separate thread for such speculations.

