
No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one. This is a property of the algorithm being specified, not just an individual implementation, and we’ve seen it play out over and over again in cryptography.
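
For a concrete example of this class of problem: the "KyberSlash" timing leaks came from transcribing the spec's compression step, which divides by q, straight into C, where the division instruction's latency depends on its operands on many CPUs. Here's a minimal sketch of the pattern (modeled on the published fix, with ML-KEM's q = 3329; function names are mine, and this isn't the reference code verbatim):

    #include <stdint.h>

    #define Q 3329  /* the ML-KEM modulus */

    /* What the spec asks for: round(2*t/Q) mod 2, for t in [0, Q).
     * The direct transcription divides by Q with a secret-dependent
     * numerator; on many CPUs division latency varies with the
     * operands, so this can leak bits of t through timing. */
    static uint32_t decode_bit_leaky(uint32_t t) {
        return (((t << 1) + Q / 2) / Q) & 1;
    }

    /* Constant-time variant: replace the division by a multiply-and-
     * shift with a precomputed reciprocal (80635 ~= 2^28 / Q), which
     * compiles to data-independent instructions. */
    static uint32_t decode_bit_ct(uint32_t t) {
        t <<= 1;
        t += 1665;   /* rounding offset, ~Q/2 */
        t *= 80635;  /* ~2^28 / Q */
        t >>= 28;
        return t & 1;
    }

The point isn't that the fix is hard once you see it; it's that the natural transcription of the spec is the leaky one.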

I’d actually like to see more (non-cryptographic) standards take this into account. Many web standards are so complicated and/or ill-specified that trillion dollar market cap companies have trouble implementing them correctly/consistently. Standards shouldn’t just be thrown over the wall and have any problems blamed on the implementations.





> No, the argument is that the algorithm (as specified in the standard) is difficult to implement correctly, so we should tweak it/find another one.

This argument is without merit. ML-KEM/Kyber has already been ratified as the PQC KEM standard by NIST. What you are proposing is that the NIST process was fundamentally flawed. That is a claim that requires serious evidence to back it up.


You can't be serious. "The standard was adopted, therefore it must be implementable in any or all systems?"

NIST can adopt and recommend whatever algorithms they might like using whatever criteria they decide they want to use. However, while the amount of expertise and experience on display by NIST in identifying algorithms that are secure or potentially useful is impressive, there is no amount of expertise or experience that guarantees any given implementation is always feasible.

Indeed, this is precisely why elliptic curve algorithms are often not available, in spite of a NIST standard being adopted like 8+ years ago!


I'm having trouble understanding your argument. Elliptic curve algorithms have been the mainstream standard for key establishment for something like 15 years now. The NIST standards for the P-curves are much, much older than 8 years.

> You can't be serious. "The standard was adopted, therefore it must be implementable in any or all systems?"

If we did that we'd all be using Dual_EC...


DJB has specific (technical and non-conspiratorial) bones to pick with the algorithm. He’s as much an expert in cryptographic implementation flaws and misuse resistance as anybody at NIST. Doesn’t mean he’s right all the time, but blowing him off as if he’s just some crackpot isn’t even correctly appealing to authority.

I hate that his more tinfoil hat stuff (which is not totally unjustified, mind you) overshadows his sober technical contributions in these discussions.


There are like 3 cryptographers in all of NIST. NIST was a referee in the process. The bones he's picking are with the entire field of cryptography, not just NIST people.

> The bones he's picking are with the entire field of cryptography

Isn't that how you advance a field, though?

It has been a couple hundred years, but we used to think that disease was primarily caused by "bad humors".

Fields can and do advance. I'm not versed enough to say whether his criticisms are legitimate, but this sounds to me less like a problem and more like part of the process (and his article is documenting how some bureaucrats/illegitimate interests are blocking that advancement).

The "area administrator" being unable or unwilling to do basic math is worrying in itself, and it undermines the idea that the standards being produced are worth anything, which is bad for the entire field.

If the standards are chock full of nonsense, then how does that reflect upon the field?


The standards people have problems with weren't run as open processes the way AES, SHA-3, and ML-KEM were. As for the rest of it: I don't know what to tell you. Sounds like a compelling argument if you think Daniel Bernstein is literally the most competent living cryptographer, or, alternatively, if Bernstein and Schneier are the only cryptographers one can name.

In a lot of ways this seems, from the outside, to be similar to "Planck's principle": physics advances one funeral at a time.

In exactly what sense? Who is the "old guard" you're thinking of here? Peter Schwabe got his doctorate 16 years after Bernstein. Peikert got his 10 years after.

They may not be involved with this process, but ITL has way more than 3 cryptographers.

> I hate that his more tinfoil hat stuff (which is not totally unjustified, mind you) overshadows his sober technical contributions in these discussions.

Currently he argues that NSA is likely to be attacking the standards process to do some unspecified nefarious thing in PQ algorithms, and he's appealing to our memories of Dual_EC. That's not tinfoil hat stuff! It's a serious possibility, and it has happened before (Dual_EC). True, no one knows for a fact that NSA backdoored Dual_EC, but it's very, very likely that they did -- why bother with such a slow DRBG if not for the ability to recover session keys?
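
For anyone who hasn't seen the mechanics: the attack the suspected backdoor enables is simple enough to demo. Below is a toy analogue in the multiplicative group mod a prime rather than on a curve (scalar multiplication becomes modular exponentiation); all the numbers are invented, and the truncation here is 8 bits versus roughly 16 in real Dual_EC, but the state-recovery structure is the same:

    #include <stdio.h>
    #include <stdint.h>

    #define P    2147483647ULL  /* 2^31 - 1; toy group is Z_P^* */
    #define DROP 8              /* output bits the DRBG discards */

    static uint64_t pow_mod(uint64_t b, uint64_t e, uint64_t m) {
        uint64_t r = 1;
        b %= m;
        while (e) {
            if (e & 1) r = r * b % m;
            b = b * b % m;
            e >>= 1;
        }
        return r;
    }

    int main(void) {
        /* Designer picks h and a secret d, publishes g = h^d.
         * Everyone sees (g, h); only the designer knows d. */
        uint64_t h = 7, d = 123456789;
        uint64_t g = pow_mod(h, d, P);

        uint64_t s = 42424242;  /* secret internal DRBG state */

        /* The DRBG emits truncated outputs and steps its state. */
        uint64_t full1 = pow_mod(h, s, P);
        uint64_t out1  = full1 >> DROP;     /* attacker sees this */
        uint64_t s2    = pow_mod(g, s, P);  /* next state */
        uint64_t out2  = pow_mod(h, s2, P) >> DROP;

        /* Attacker: brute-force the DROP missing bits of the first
         * output, use d to step each candidate forward, and check
         * the prediction against the second output. */
        for (uint64_t guess = 0; guess < (1ULL << DROP); guess++) {
            uint64_t cand = (out1 << DROP) | guess;  /* maybe h^s */
            if (cand >= P) continue;
            uint64_t s_next = pow_mod(cand, d, P);   /* (h^s)^d = g^s */
            if ((pow_mod(h, s_next, P) >> DROP) == out2) {
                printf("recovered state %llu (true state %llu)\n",
                       (unsigned long long)s_next,
                       (unsigned long long)s2);
                break;
            }
        }
        return 0;
    }

Once the attacker has the state, every future output of the DRBG (and any session key derived from it) is predictable. The expensive group operation per output is also exactly why Dual_EC is so slow compared to a hash- or cipher-based DRBG.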


NSA wrote Dual EC. A team of (mostly European) academic cryptographers wrote the CRYSTALS constructions. Moreover, the NOBUS mechanism in Dual EC is obvious, and it's not at all clear where you'd do anything like that in Kyber, which goes out of its way not to have the "weird constants" problem that the P-curves (which practitioners generally trust) ended up with.
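
To make "goes out of its way" concrete: Kyber's public matrix A is expanded from a 32-byte public seed with SHAKE-128 and rejection sampling, so there is no fixed constant anywhere for a designer to cook; biasing A would mean breaking the XOF. A simplified sketch (assuming a shake128() helper from any FIPS 202 implementation; the real spec squeezes blocks on demand rather than one big buffer, and orders the indices differently for A versus its transpose):

    #include <stdint.h>
    #include <stddef.h>

    /* Assumed XOF helper; any SHAKE-128 implementation will do. */
    void shake128(uint8_t *out, size_t outlen,
                  const uint8_t *in, size_t inlen);

    #define KYBER_Q 3329
    #define KYBER_N 256

    /* Derive one entry of the public matrix A: hash the public seed
     * plus the matrix indices, then rejection-sample 12-bit values
     * into [0, Q). Two candidates are packed per 3 bytes. */
    void gen_matrix_entry(int16_t poly[KYBER_N],
                          const uint8_t seed[32],
                          uint8_t row, uint8_t col) {
        uint8_t in[34], buf[1024];
        size_t i = 0, pos = 0, k;

        for (k = 0; k < 32; k++) in[k] = seed[k];
        in[32] = col;  /* domain separation per matrix entry */
        in[33] = row;
        shake128(buf, sizeof buf, in, sizeof in);

        while (i < KYBER_N && pos + 3 <= sizeof buf) {
            uint16_t d1 = buf[pos] | ((uint16_t)(buf[pos+1] & 0x0f) << 8);
            uint16_t d2 = (buf[pos+1] >> 4) | ((uint16_t)buf[pos+2] << 4);
            if (d1 < KYBER_Q) poly[i++] = d1;                 /* accept */
            if (i < KYBER_N && d2 < KYBER_Q) poly[i++] = d2;  /* or reject */
            pos += 3;
        }
    }

Contrast that with Dual_EC, where the points P and Q are just constants handed down in the standard with no explanation of how they were generated.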

It took a couple of years for the suspicions about Dual_EC to come out.

No it didn't. The problem with Dual EC was published in a rump session at the next CRYPTO after NIST published it. The widespread assumption was that nobody was actually using it, which was enabled by the fact that the important "target" implementations (most importantly RSA BSAFE, which I think a lot of people also assumed wasn't in common use, but I may just be saying that because it's what I myself assumed) were deeply closed-source.

None of this applies to anything else besides Dual EC.

That aside: I don't know what this has to do with anything I just wrote. Did you mean to respond to some other comment?



