Hacker News

zahllos | yesterday at 2:26 PM

In context, this particular issue is that DJB disagrees with the IETF publishing an ML-KEM-only standard for key exchange.

Here's the thing: the existence of a standard does not mean most of the internet has to use it. There will also be hybrid standards, and most of the rest of us can simply ignore the existence of ML-KEM-only. However, NSA's CNSA 2.0 (the Commercial National Security Algorithm Suite, i.e. the cryptography vendors can sell for US national security systems) does not envisage using hybrid schemes, so there's some sense in having a standard for that purpose. Better developed through the IETF than forced on browser vendors directly by the US, I think. There was rough consensus to do this. Should we have a single-cipher kex standard for HQC too? I'd argue yes, and no, the NSA doesn't propose to use it (unless they've updated CNSA).
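
To make "hybrid" concrete, here's a rough sketch of the idea (not the exact combiner any IETF draft specifies, and the ML-KEM shared secret is just a placeholder since I'm not assuming a particular PQ library): the classical and post-quantum secrets are concatenated and fed through a KDF, so an attacker has to break both components to recover the session key.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Classical component: an ordinary X25519 exchange between two parties.
    alice = X25519PrivateKey.generate()
    bob = X25519PrivateKey.generate()
    ecdh_secret = alice.exchange(bob.public_key())

    # Post-quantum component: in a real hybrid this would be the ML-KEM
    # shared secret from encapsulation; here it's a placeholder 32-byte value.
    mlkem_secret = os.urandom(32)

    # Combiner: concatenate both secrets and run them through a KDF. The
    # session key stays safe as long as either input remains unbroken.
    session_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid-kex-demo",
    ).derive(ecdh_secret + mlkem_secret)
    print(session_key.hex())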

The requirement of the NIST competition is that all standardized algorithms resist both classical and quantum attacks. Some have said in this thread that lattice crypto is relatively new, but it actually has quite some history, going back to Ajtai in '97. If you want paranoia, there are always code-based schemes going back to the late '70s (McEliece). We don't know what we don't know, which is why there's HQC (code-based) waiting on standardisation and an additional on-ramp for signatures, plus the expense (size, and sometimes statefulness) of hash-based options. So there's some argument that single-cipher is fine, and we have a whole set of alternative options.

This particular overreaction appears to be yet another in a long-running series of... disagreements with the entire NIST process, including "claims" around the security level of what we then called Kyber, and insults to the NIST team's security-level estimation in the form of suggesting they can't do basic arithmetic (given that we can't factor anything bigger than 15 on a real quantum computer, and we simply don't have hardware anywhere near breaking RSA, estimates are exactly what these are), and so on.


Replies

HelloNurse | yesterday at 2:59 PM

The metaphor near the beginning of the article is a good summary: standardizing cars with seatbelts, but also cars without seatbelts.

Since ML-KEM is supported by the NSA, it should be assumed to have an NSA-known backdoor that they want used as widely as possible: IETF standardization is a great opportunity for a long-term social engineering operation, much like DES, Clipper, the more recent funny elliptic curve, etc.

adgjlsfhk1 | yesterday at 4:11 PM

The problem with standardizing bad crypto options is that you are then exposed to all sorts of downgrade attacks. There's a reason TLS 1.3 removed all of the weak algorithms it had previously supported.
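
As a toy illustration (the option names are invented, not real TLS code points): if a weak option stays on the menu and the negotiation itself isn't authenticated, an active attacker can strip the strong choices and both honest endpoints will happily settle on the weak one. TLS 1.3's answer was partly to authenticate the handshake transcript and partly to delete the weak options outright, so there's nothing left to downgrade to.

    # Toy model of a downgrade attack; names are illustrative only.
    CLIENT_OFFER = ["hybrid-x25519-mlkem768", "x25519", "legacy-weak-kex"]
    SERVER_SUPPORTS = {"hybrid-x25519-mlkem768", "x25519", "legacy-weak-kex"}

    def attacker_strips(offer):
        # An active man-in-the-middle deletes everything but the weakest option.
        return [o for o in offer if o == "legacy-weak-kex"]

    def server_pick(offer):
        # Naive server: take the first mutually supported option, trusting the
        # offer list exactly as received on the wire.
        return next(o for o in offer if o in SERVER_SUPPORTS)

    print(server_pick(CLIENT_OFFER))                    # hybrid-x25519-mlkem768
    print(server_pick(attacker_strips(CLIENT_OFFER)))   # legacy-weak-kex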

vessenes | yesterday at 7:14 PM

My professors at Brown were working on QR lattice cryptography well before 1997, although they may not have been publishing much - NTRU was in active development throughout the mid-1990s when I was there. Heating up by 1997 though, for sure.

crote | yesterday at 4:32 PM

> In context, this particular issue is that DJB disagrees with the IETF publishing an ML-KEM only standard for key exchange.

No, that's background dressing by now. The bigger issue is how the IETF is trying to railroad a standard through by violating its own procedures, ignoring all objections, and banning people who oppose it.

They are literally doing the kind of thing we always accuse China of doing. ML-KEM-only is obviously being pushed for political reasons. If you're not willing to let a standard be discussed on its technical merits, why even pretend to have a technology-first industry working group?

Seeing standards being corrupted like this is sickening. At least have the gall to openly claim it should be standardized because it makes things easier for the NSA - and by extension (arguably) improves national security!

vorpalhex | yesterday at 3:26 PM

The standard will be used, as it was the previous time the IETF allowed the NSA to standardize a known weak algorithm.

Sorry that someone calling out a math error makes the NIST team feel stupid. Instead of dogpiling the person for not stroking their ego, maybe they should correct the error. Last I checked, a quantum computer isn't needed to handle exponents; a whiteboard will do.

aaomidi | yesterday at 2:34 PM

Except when the government then starts mandating a specific algorithm.

And yes, this has happened. There's a reason there are only the NIST P-curves in the WebPKI world.
