
If the standard had a clear algorithm -> source code mapping, then couldn't everyone copy from there though?




AES is actually a good example of why this doesn’t work in cryptography. Implementing AES without a timing side channel in C is pretty much impossible. Each architecture requires specific and subtle constructions to ensure it executes in constant time. Newer algorithms are designed to not have this problem (DJB was actually the one who popularized this approach).
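To make the timing concern concrete, here's a toy C sketch (not taken from any real library) of the classic table-lookup S-box: the array index is secret data, so which cache line gets pulled in depends on the key and plaintext, and that's observable through timing.

    /* Toy illustration of the cache-timing hazard in table-based AES.
       Only the first few S-box entries are shown; the real table has
       256 of them, and the surrounding round function is omitted. */
    #include <stdint.h>

    static const uint8_t AES_SBOX[256] = {
        0x63, 0x7c, 0x77, 0x7b, /* ... remaining 252 entries ... */
    };

    uint8_t sub_byte_leaky(uint8_t secret_byte) {
        /* The address read here depends on secret data, so the cache
           line that gets loaded (and the latency an attacker can
           measure) leaks information about secret_byte. */
        return AES_SBOX[secret_byte];
    }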

Reconcile this claim with, for instance, aes_ct64 in Thomas Pornin's BearSSL?

I'm familiar with Bernstein's argument about AES, but AES is also the most successful cryptography standard ever created.


Okay, I should've said implementing AES in C without a timing side channel, performantly enough to power TLS for a browser running on a shitty ARMv7 phone, is basically impossible. Also, if only Thomas Pornin can correctly implement your cipher without assembly, that's not a selling point.

I'm not contesting AES's success or saying it doesn't deserve it. I'm not even saying we should move off it (especially now that even most mobile processors have AES instructions). But nobody would put something like an S-box in a cipher created today.
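For anyone curious what "correctly implement without assembly" looks like in practice: the standard plain-C trick is to read the whole table and mask out everything except the entry you want, which is also exactly why it's slow. A rough sketch of the idea (not BearSSL's actual code; aes_ct64 goes further and computes the S-box as a bitsliced boolean circuit rather than scanning a table):

    #include <stdint.h>

    /* Constant-time S-box lookup: touch every table entry so the memory
       access pattern doesn't depend on the secret index. That's roughly
       256x the memory traffic of a single lookup, per byte, per round,
       which is where the performance pain comes from. */
    uint8_t sub_byte_ct(const uint8_t sbox[256], uint8_t secret_byte) {
        uint8_t result = 0;
        for (unsigned i = 0; i < 256; i++) {
            /* mask is 0xFF when i == secret_byte, 0x00 otherwise,
               computed without branching on secret data */
            uint8_t mask = (uint8_t)(((i ^ secret_byte) - 1u) >> 8);
            result |= sbox[i] & mask;
        }
        return result;
    }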


If your point is "reference implementations have never been sufficient for real-world implementations", I agree, strongly, but of course that cuts painfully across several of Bernstein's own arguments about the importance of issues in PQ reference implementations.

Part of this, though, is that it's also kind of an incoherent standard to hold reference implementations to. Science proceeds long after the standard is written! The best/safest possible implementation is bound to change.


I don't think it's incoherent. On one extreme you have web standards, where it's now commonplace to not finalize standards until they're implemented in multiple major browser engines. Some web-adjacent IETF standards also work like this (WebTransport over HTTP3 is one I've been implementing recently).

I'm not saying cryptography should necessarily work this way, but it's not an unworkable policy to have multiple projects implement a draft before settling on a standard.


Look at the timeline for performant non-leaking implementations of Weierstrass curves. How long are you going to wait for these things to settle? I feel like there's also a hindsight bias that slips into a lot of this stuff.

Certainly, if you're going to do standards adoption by open competition the way NIST has done with AES, SHA3, and MLKEM, you're not going to be able to factor multiple major implementations into your process.


This isn’t black and white. There’s a middle ground between:

* Wait for 10 years of cryptanalysis (specific to the final algorithm) before using anything, cryptanalysis that will probably be relatively meager because nobody is using it

* Expect the standardization process itself to produce a blessed artifact, to be set on fire as a false god if it turns out to be imperfect (or more realistically, just cause everybody a bunch of pain for 20 years)

Nothing would stop NIST from adding a post-competition phase where Google, Microsoft, Amazon, whoever the hell is maintaining OpenSSL, and maybe Mozilla implement the algorithm in their respective libraries and kick the tires. Maybe it’s pointless and everything we’d expect to get from cryptographers observing that process for a few months to a year has already been suitably covered, and DJB is just being prissy. I don’t know enough about cryptanalysis to know.

But I do feel very confident that many of the IETF standards I’ve been on the receiving end of could have used a non-reference implementation phase to find practical, you-could-technically-do-it-right-but-you-won’t issues that showed up within the first 6 months of people trying to use the damn thing.


I don't know what you mean by "kick the tires".

If by that you mean "perfect the implementation", we already get that! The MLKEM in Go is not the MLKEM in OpenSSL is not the MLKEM in AWS-LC.

If instead you mean "figure out after some period of implementation whether the standard itself is good", I don't know how that's meant to be workable. It's the publication of the standard itself that is the forcing function for high-quality competing implementations. In particular, part of arriving at high-quality implementations is running them in production, which is something you can't do without solving the coordination problem of getting everyone onto the same standard.

Here it's important to note that nothing we've learned since Kyber was chosen has materially weakened the construction itself. We've had in fact 3 years now of sustained (urgent, in fact) implementation and deployment (after almost 30 years of cryptologic work on lattices). What would have been different had Kyber been a speculative or proposed standard, other than it getting far less attention and deployment?

("Prissy" is not the word I personally would choose here.)


I mean have a bunch of competent teams that (importantly) didn’t design the algorithm read the final draft and write their versions of it. Then they and others can perform practical analysis on each (empirically look for timing side channels on x86 and ARM, fuzz them, etc.).
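For the timing part specifically, even a crude harness gets you surprisingly far: time the operation over two input classes (fixed vs. random) and see whether the distributions differ, which is roughly what tools like dudect do with a proper statistical test behind it. A bare-bones, x86-only sketch with a deliberately leaky stand-in function (all names made up):

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <x86intrin.h>  /* __rdtsc(), x86-only; use CNTVCT_EL0 or perf counters on ARM */

    /* Deliberately leaky stand-in so the harness has something to flag:
       it does extra work when the first input byte is zero. Replace with
       the real operation (e.g. one decapsulation). */
    static void operation_under_test(const uint8_t input[32]) {
        volatile uint32_t acc = 0;
        int rounds = (input[0] == 0) ? 4096 : 1024;  /* secret-dependent branch */
        for (int i = 0; i < rounds; i++) acc += (uint32_t)i;
    }

    static uint64_t time_once(const uint8_t input[32]) {
        uint64_t start = __rdtsc();
        operation_under_test(input);
        return __rdtsc() - start;
    }

    int main(void) {
        enum { RUNS = 100000 };
        uint8_t fixed[32] = {0}, random_in[32];
        double sum_fixed = 0, sum_random = 0;

        for (int i = 0; i < RUNS; i++) {
            for (int j = 0; j < 32; j++) random_in[j] = (uint8_t)rand();
            sum_fixed  += (double)time_once(fixed);
            sum_random += (double)time_once(random_in);
        }
        /* A real harness would run Welch's t-test over the full
           distributions; a big gap in the means is already a red flag. */
        printf("mean cycles: fixed=%.1f random=%.1f\n",
               sum_fixed / RUNS, sum_random / RUNS);
        return 0;
    }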

> If instead you mean "figure out after some period of implementation whether the standard itself is good", I don't know how that's meant to be workable.

The forcing function can potentially be: this final draft is the heir apparent. If nothing serious comes up in the next 6 months, it will be summarily finalized.

It’s possible this won’t get any of the implementers off their ass on a reasonable timeframe - this happens with web standards all the time. It’s also possible that this is very unlikely to uncover anything not already uncovered. Like I said, I’m not totally convinced that in this specific field it makes sense. But your arguments against it are fully general against this kind of phased process at all, and I think it has empirically improved recent W3C and IETF standards (including QUIC and HTTP2/3) a lot compared to the previous method.


Again: that has now happened. What have we learned from it that we needed to know 3 years ago when NIST chose Kyber? That's an important question, because this is a whole giant thread about Bernstein's allegation that the IETF is in the pocket of the NSA (see "part 4" of this series for that charming claim).

Further, the people involved in the NIST PQ key establishment competition are a murderers' row of serious cryptographers and cryptography engineers. All of them had the know-how and incentive to write implementations of their own constructions and, if it was going to showcase some glaring problem, of their competitors'. What makes you think we lacked implementation understanding during this process?


I don’t think the IETF is in the pocket of the NSA. I really wish the US government hadn’t hassled Bernstein so much when he was a grad student; it would make his stuff way more focused on technical details and readable without rolling your eyes.

> Further, the people involved in the NIST PQ key establishment competition are a murderers row of serious cryptographers and cryptography engineers.

That’s actually my point! When you’re trying to figure out whether your standard is difficult to implement correctly, the fact that everyone who worked on the reference implementations is a genius who understands it perfectly is a disadvantage for finding certain problems. It’s classic expert blindness, like you see with C++, where the people working on the standard understand the language so completely they can’t even conceive of what will happen when it’s in the hands of someone who doesn’t sleep with the C++ standard under their pillow.

Like, would anyone who developed ECC algorithms have forgotten to check for invalid curve points when writing an implementation? Meanwhile among mere mortals that’s happened over and over again.
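For anyone who hasn’t run into it: the check that kept getting skipped is just "does the peer’s point actually satisfy the curve equation", which is a handful of lines. A toy sketch over a made-up 64-bit curve, not a real one; a real implementation works over a ~256-bit prime with big-integer arithmetic and also rejects the point at infinity and does subgroup/cofactor checks where relevant:

    #include <stdint.h>
    #include <stdbool.h>

    /* Toy short-Weierstrass curve y^2 = x^3 + a*x + b over F_p.
       Parameters are invented for illustration only. Uses the
       GCC/Clang __uint128_t extension for the modular arithmetic. */
    #define P 0xFFFFFFFFFFFFFFC5ULL   /* 2^64 - 59, the largest 64-bit prime */
    #define A 3ULL
    #define B 7ULL

    static uint64_t mulmod(uint64_t x, uint64_t y) {
        return (uint64_t)(((__uint128_t)x * y) % P);
    }
    static uint64_t addmod(uint64_t x, uint64_t y) {
        return (uint64_t)(((__uint128_t)x + y) % P);
    }

    /* Reject any (x, y) the peer sends that is not on the curve.
       Skipping this check is what enables invalid-curve attacks. */
    bool point_on_curve(uint64_t x, uint64_t y) {
        if (x >= P || y >= P) return false;
        uint64_t lhs = mulmod(y, y);
        uint64_t rhs = addmod(mulmod(mulmod(x, x), x),
                              addmod(mulmod(A, x), B));
        return lhs == rhs;
    }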


I don't think this has much of anything to do with Bernstein's qualms with the US government. For all his concerns about NIST process, he himself had his name on a NIST PQC candidate. Moreover, he's gotten into similar spats elsewhere. This isn't even the first time he's gotten into a heap of shit at IETF/IRTF. This springs to mind:

https://mailarchive.ietf.org/arch/msg/cfrg/qqrtZnjV1oTBHtvZ1...

This wasn't about NSA or the USG! Note the date. Of course, had this happened in 2025, we'd all know about it, because he'd have blogged it.

But I want to circle back to the point I just made: you've said that we'd all be better off if there was a burning-in period for implementors before standards were ratified. We've definitely burnt in MLKEM now! What would we have done differently knowing what we now know?


> What would we have done differently knowing what we now know?

With the MLKEM standard? Probably nothing. Bernstein would have done less rambling in these blog posts if he were aware of something specifically wrong with one of the implementations. My key point here was that establishing an implementation phase during standardization is not an incoherent or categorically unjustifiable idea, whether or not it makes sense for massive cryptographic development efforts. I will note that a proposed process change failing to catch anything is a data point that it’s not needed, but it isn’t dispositive.

I do think there is some baby in the Bernstein bathwater that is this blog post series, though. His strongest specific point in these posts was that the TLS working group adding a cipher suite with an MLKEM-only key exchange this early is an own goal (though that’s of course not the fault of the MLKEM standard itself). That’s an obvious footgun, and I’ll miss the days when you could enable all the standard TLS 1.3 cipher suites and not stress about it. The arguments for keeping it in are legitimately not good, but in the area director’s defense, we’re all prone to motivated reasoning when talking to someone who will inevitably accuse us of colluding with the NSA to bring about 1984.


In what way is adding an MLKEM-only code point an "own goal"? Exercise for the reader: find the place where Bernstein proposed we have hybrid RSA/ECDH ciphersuites.


