Love me some Peter Gutmann. This is classic Gutmann.
A lot of this presentation is mooted by understanding PQC as a scientific question rather than an engineering one. What are the precise natures of quantum-superior attacks on cryptosystems, and what key-establishment and signature schemes resist those attacks? Whatever else you think of quantum cryptanalysis, those are undeniably important theoretical questions.
A few more slides are mooted by the likelihood that any mainstream deployed PQC system is going to be hybridized with a classical cryptosystem.
As an articulation of a threat model for modern computing, it simultaneously makes some sense and proves too much: if you think OWASP-type vulnerabilities are where everyone's head should be at (and I sort of agree), then all of cryptography is a sideshow. I'm a connoisseur of cryptographic vulnerabilities that break real systems the way SQL injection does (a bitflipping attack on an encrypted cookie, a broken load-bearing signature scheme) but even I have to admit there's 1 of those for every 10,000 conventional non-cryptographic attacks.
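As a toy illustration of the bit-flipping point (a hypothetical sketch, not any particular system): with an unauthenticated stream cipher or CTR mode, an attacker who can guess part of the plaintext can XOR a chosen difference into the ciphertext and flip exactly those plaintext bits, no key required.

```python
import os

# Toy keystream "encryption": C = P XOR K (what CTR mode does per block).
# This is NOT a real cookie scheme; it only illustrates ciphertext malleability.
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

keystream = os.urandom(32)                # stand-in for AES-CTR keystream output
cookie    = b"user=alice;role=user;pad"   # hypothetical plaintext cookie
ct        = xor(cookie, keystream)        # what the attacker can see and modify

# Attacker flips "user" -> "root" in the role field without knowing the key.
# The attacker is assumed to know the cookie layout (a common situation).
delta = bytearray(len(cookie))
target_off = cookie.index(b"role=") + len(b"role=")
for i, (old, new) in enumerate(zip(b"user", b"root")):
    delta[target_off + i] = old ^ new

tampered = xor(ct, bytes(delta))
print(xor(tampered, keystream))           # b"user=alice;role=root;pad"
```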
But of course, it also depends on who your adversary is. Ironically, if you're worried about state-level SIGINT, the barrier for OWASP-style attacks may be higher than that of large-scale codebreaking; passive interception and store-now-decrypt-later is the SIGINT love language.
My biggest thing with all of this is a core belief about organizations like NSA: that they exist primarily to secure budget for NSA. Given that, the one thing you absolutely don't want to do is have a system that is breakable only at almost-implausible cost.
(Also, his RSA-1024 analysis is off; it's missing batch attacks).
Reading Gutmann I see a disconnect from reality. Yes a cryptographically relevant quantum computer may never materialize, nor a useful quantum computer at all. These are interesting research topics and interesting discussions. But it doesn't matter. Using that as an argument against PQC is moot, no longer relevant.
The decision to move the world to PQC has already been taken, and we are in full transition mode. Orgs like NATO have transitioned. Many have published time plans to phase out classical algos (NSA, GCHQ, Australia) or will shortly (EU). Orgs like ETSI will transition to PQC in upcoming revisions (5G) and next-gen systems (6G). Just to name a few.
In general, the agencies and nations pushing for PQC algorithms and transition time plans seem to be curiously well aligned. All using the NIST PQC algos, with no real mention of hybrid solutions. Australia's plan to ban the use of SHA-256 is one that sticks out a bit from the others.
Regarding time plans, I find the suddenness, almost haste, quite interesting. I've been searching quite extensively to gather info for PQC talks, and the stated time plans are rarely well motivated. Much hand waving. And what drove the rush? I'm suspecting a bit of crowd panic. But there is a clear change around 2022, 2023. Suddenly the PQ threat became very important. And within 10 years (or less) the world will be using PQC basically everywhere PKI is used, besides legacy systems where lifetimes and update issues make the change unrealistic.
And are we really sure the schemes used will be hybrid? If DJB is right, NIST is pushing for PQC only. The NSA recommendations that were released last year AFAIK do not say anything about hybrid solutions. Details are less clear on what NATO has transitioned to, but it seems to be PQC only. And the ETSI push for PQC in 5G, 6G does not seem to aim for hybrid solutions. I would love to be told I'm wrong about the use of hybrid schemes.
The iMessage protocol, Signal and things happening in the IETF go for hybrid schemes, which I think is the right way (if that counts for anything). So it seems we have the open world and private sector versus governments and orgs fairly close to governments when it comes to hybrid vs PQC only.
But maybe this is actually the reason behind the presentation: he sees this vast effort to transition to PQC, and he's wondering whether it is worth it or not.
And for what it's worth, I'm also a bit skeptical of the maturity of PQ cryptosystems, as they tend to be much more complicated than their classical counterparts and haven't received as much scrutiny. What happened to Rainbow (layered UOV) is a cautionary tale.
Yes, but that is why I say that his argument is moot. It is probably not worth going to PQC, and it will probably expose us to problems. But it is too late to try and get people to understand and agree on this.
The world has decided that the perceived risk makes it worth going to PQC, and we can't change that fact no matter the nice pictures of Schwerer Gustav. We are in full transition mode and will not go back. Even if a CRQC never materializes.
We probably will experience a number of issues and problems that we wouldn't have had by not going to PQC. Driving up costs, making things vulnerable. More work for us I guess.
> I find the suddenness, almost haste to be quite interesting.
> But there is a clear change around 2022, 2023.
I think that's probably because the NIST competition [1] to choose their standard algorithms really started to heat up then.
NIST has a very large gravity well in the academic and industrial cryptographic community, so as soon as it became clear which algorithms NIST would pick (they chose Kyber / ML-KEM and Dilithium / ML-DSA), the (cryptographic) world felt it could start transitioning with much more certainty and haste.
Yes, that is one aspect, and when the drafts were published you could see orgs start running (I've got a nice timeline in my slides). But I still find the haste interesting. There is very little time for the transitions compared to the adoption rate of other crypto standards. The NIST algos are imho still quite immature, which is one big motivation for hybrid schemes.
A bit off topic: as a European, with what is happening with DOGE, slashing funding for CISA, TAA etc, I'm seriously worried about NIST. As you say, NIST is very important in many areas. For the USA, with things like the coordinated universal time standard. But also for federal cybersec standards that have led to cryptographic interop with the rest of the world. Will NIST be slashed, and if so will the crypto department be spared? If not, what would remain? New standards, the validation program? Will Falcon become a standard, or for that matter the new lightweight symmetric algo based on Ascon? (For which I'm eagerly waiting for NIST to publish test vectors so that I'm able to verify that my implementation is compliant.)
I think the haste is probably down to a risk calculation. If practical quantum breaks of classical crypto don't materialise in the next 5-10 years, "all" that's happened is we've cycled onto a new cypher suite sooner than we otherwise would have.
The reverse picture, where they do and we haven't, is so colossally damaging that it doesn't matter if the probability of quantum breaks landing is actually quite small. In expected value terms we still come out ahead.
You don't need to assume that someone in an NSA lab has already demonstrated it for this to work out, and you don't need to assume that there is ever a practical quantum computer deployed for this stuff. All you need is for the probability to be above some small threshold (1%? 5%? I could believe something in that range) to make running for the exits the right move today.
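A back-of-the-envelope version of that expected-value argument (every number below is an invented assumption, purely to show the shape of the calculation):

```python
# Toy expected-cost comparison; every figure here is an assumption for illustration.
p_crqc         = 0.05     # assumed probability a CRQC arrives within the window
cost_migration = 1.0      # cost of migrating early (normalized unit)
cost_break     = 1000.0   # assumed cost of mass decryption of archived traffic

ev_migrate_now = cost_migration        # you pay the migration cost either way
ev_wait        = p_crqc * cost_break   # gamble that no CRQC appears

print(ev_migrate_now, ev_wait)         # 1.0 vs 50.0 -> migrating wins even at small p
```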
> Because the current plans aren't to migrate to just hybrid classical+PQC schemes, the plans are to migrate to PQC fully. Discarding both RSA and ECC.
This isn't true. NIST has been saying that, but everyone else just laughs and implements hybrid since throwing out RSA/ECC is so obviously stupid.
If you have references to nations or governments that state they will transition to hybrid, I would love to see them. The EU transition will not be hybrid. The NSA plan is not hybrid. ETSI is not hybrid.
My view is that the IETF and commercial entities such as Apple, Google and the open source world are the ones going hybrid. In this case I would love to be wrong.
That is a very relevant point. Add a bit of scaremongering, herd mentality and downplaying of the technical effects and risks, and you get the ones setting policies deciding to transition - just like everybody else.
When I have seen time estimates, everyone refers to Mosca's Theorem: the observation that "store now, decrypt later", combined with the estimated time until working quantum cryptanalysis is feasible and the finite transition time for existing crypto standards and technologies (think update times for long-living tokens like ID cards with certificates), makes the available delay until a change must start quite short.
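In its usual form Mosca's inequality says: if x (how long the data must stay confidential) plus y (how long the migration takes) is greater than z (the time until a CRQC exists), you are already too late. A minimal sketch with invented numbers:

```python
# Mosca's inequality: x + y > z means recorded traffic gets exposed too soon.
x = 15   # years the data must remain confidential (assumed)
y = 10   # years the ecosystem migration takes (assumed)
z = 20   # years until a cryptographically relevant quantum computer (assumed)

if x + y > z:
    print("Too late: migration should have started already.")
else:
    print("There is still slack before migration must begin.")
```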
Some additional facts that may aid the discussion:
- Dilithium + Kyber is faster than ECDSA + ECDH on the same hardware. Depending on the platform, it can be up to 33% faster.
- Most commercial entities are implementing hybrid, but in concatenation, not layered, mode. The classical inclusion is mainly for compatibility with "legacy" systems.
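For concreteness, a simplified sketch of what concatenation mode means: run both key exchanges, concatenate the two shared secrets, and feed them through a KDF, so the output stays secret as long as either component holds. This is an illustration only, not the exact TLS 1.3 construction; the helper name and placeholder values are made up.

```python
import hashlib
import hmac

def hybrid_secret(ss_classical: bytes, ss_pq: bytes, context: bytes) -> bytes:
    """Concatenation-mode combiner: secure if either input secret is secure.
    Simplified illustration; real protocols (e.g. the TLS 1.3 hybrid key
    exchange draft) feed the concatenated secrets into their own key schedule."""
    ikm = ss_classical + ss_pq
    return hmac.new(context, ikm, hashlib.sha256).digest()  # HKDF-extract-like step

# Hypothetical example inputs: an X25519 shared secret and an ML-KEM shared secret.
ss_x25519 = bytes(32)   # placeholder for the classical ECDH output
ss_mlkem  = bytes(32)   # placeholder for the ML-KEM decapsulated secret
key = hybrid_secret(ss_x25519, ss_mlkem, b"example-context")
```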
As long as the speed difference isn't orders of magnitude, and unless you are doing many, many session inits, I don't see this as a real argument. For embedded systems, the difference is indeed at least 10x, as we observed running Dilithium on RV32I on the Tillitis TKey (https://tillitis.se/). EdDSA takes about a second, which is ok for a single signature. ML-DSA on the same platform is ~20 seconds. But yes, if you are a web server and don't scale automatically with the number of sessions, the better performance is good.
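If anyone wants to reproduce that kind of comparison on desktop hardware, a rough sketch (assuming the pyca/cryptography package for Ed25519 and the liboqs-python oqs package for ML-DSA; the package, algorithm name and loop count are assumptions, and desktop numbers say nothing about an RV32I microcontroller):

```python
import time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
import oqs  # liboqs-python; assumed to expose ML-DSA under this name

msg = b"benchmark message"

# Classical: Ed25519
ed_key = Ed25519PrivateKey.generate()
t0 = time.perf_counter()
for _ in range(1000):
    ed_key.sign(msg)
ed_time = time.perf_counter() - t0

# Post-quantum: ML-DSA-44 (older liboqs builds may only know it as "Dilithium2")
with oqs.Signature("ML-DSA-44") as signer:
    signer.generate_keypair()
    t0 = time.perf_counter()
    for _ in range(1000):
        signer.sign(msg)
    pq_time = time.perf_counter() - t0

print(f"Ed25519: {ed_time:.3f}s  ML-DSA-44: {pq_time:.3f}s for 1000 signatures")
```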
The negative thing for all systems with the current NIST PQC algorithms is the longer keys and signatures, which get even worse when going hybrid. See for example the experiments by Cloudflare with PQC in TLS, and with adding PQC keys in certificates.
I agree that for certificates, there is a possibility of being compatible with legacy systems by adding the ML-DSA signature as an extension (and the legacy system being able to handle the larger certs). But please show references for the claim that the main reason for hybrid is compatibility. If we look at the motivation for the TLS 1.3 hybrid scheme, it states:
"The primary goal of a hybrid key exchange mechanism is to facilitate the establishment of a shared secret which remains secure as long as as one of the component key exchange mechanisms remains unbroken."
That page states that backwards compatibility may be one of several possible additional goals. But it is not the main goal.
The problem PQC algorithms must solve is not only to be resistant to attacks by future quantum computers but also to attacks on classical computers. Their security must scale with key size roughly as fast as that of classical algorithms, and they must perform about as well as classical algorithms on classical computers.
All ciphers have warts. The ones we use are the ones we think are secure, but also have warts we can live with. RSA scales slowly with number of bits (and has other warts). EC scales faster and becomes faster than RSA for the same strength, but has other warts. McEliece seems like a good, conservative PQC algorithm, but those keys... TDEA was deemed to provide too low a security margin, but it was also too slow (48 rounds) and had too small a block size.
"Reading Gutmann I see a disconnect from reality. Yes a cryptographically relevant quantum computer may never materialize, nor a useful quantum computer at all. [...] But it doesn't matter."
I hear you saying this: Gutmann's factual points may be correct. But he is discussing quantum computing. Quantum computing is so irrelevant to post-quantum cryptography that even mentioning it in that context makes Gutmann seem disconnected from reality.
I don't necessarily disagree. I'm just trying to make sure I understand.
Good question. I may not have explained it very well. Let me give it a try.
1. I read the first part, with the Schwerer Gustav etc., as an argument that a CRQC will never become possible, so moving to PQC is not needed.
2. I read the second part, about the hardness of attacking classical algorithms (for example the DES cracker), as an argument against attacks on crypto being an actual threat vector. This ties in to point 1, as a CRQC would be very expensive and hard to use in practice. It is not a computer but a physics experiment.
3. He then talks about real threats and how they don't change very much. Points to OWASP top ten etc.
I totally agree with him on points 1 and 2. I'm just as skeptical. But what I'm saying is that it is too late. We are quite possibly switching to algorithms that will never add any security, spending huge resources and pushing unneeded changes to systems around the world. Telling the world that it is unnecessary will not change that fact.
Coming to point 3, I don't agree with him. Yes, OWASP top 10 shows that the same more or less trivial attacks are the ones being used. Nobody uses a zero day unless it is needed. If I can become sysadmin through a reused password or a misconfiguration of Teams, why use something more advanced? But I see him using this as an argument against broken ciphers being a real threat vector, and that is a different kind of attack.
The first one is an active attack against a system. It may be a nation state actor that wants to infiltrate, get persistent access, exfiltrate and possibly destroy the system. It may be a ransomware organization. They will do the same thing, but their timeline is much shorter. (And of course the difference between a nation state and organized crime can be very blurry.)
But recording Internet traffic and, over a long time (days, months, years, decades), trying to decrypt it is solely of interest to a nation state. And for that attack and end game, what the OWASP top ten looks like is totally irrelevant. It is not an active attack against a system. It is done in secrecy by entities with a lot of patience and huge resources. For them quantum computers are very interesting and relevant to discuss. But not in public.
I guess that was a waay too long answer. But I hope it explained what I meant.
> NSA: that they exist primarily to secure budget for NSA. Given that, the one thing you absolutely don't want to do is have a system that is breakable only at almost-implausible cost
I’m tempted to draw an analogy to the space race of the 60s, and the current AI panic
I'm with you, but the government (at least in the US and UK) should definitely be spending more time figuring out how to patch reliably, and a little less on PQC.
Perhaps the other side of the coin Peter calls "churn" is "updated software". Government, like every other old organization, has a ton of legacy systems. Even if CRQC is a pipe dream, having an excuse to tell everyone across the board they need to update their stuff sometime in the next decade might be a net win.
... you might be worried about other things like postal interception or physical security of your devices. You can never order any electronics or anything online, or use courier services, if you are a person of interest for state-level spooks like the FBI or CIA. Tech spooks making in-person visits to your home, phone or computers is also possible.
State SIGINT is generally a lot easier than many of those actions, especially if the state you're primarily concerned about isn't the one you live in.
For many foreign states like the US and China, it's easier for them to intercept and alter packages going through borders, both legally and logistically.
It has already been documented how the CIA and FBI have a permanent presence at main courier airport hubs. It's easy for Amazon, FedEx or USPS to divert a packet or letter through a conveyor belt that goes through government areas.
Same for the Chinese of course. Anything ordered from China with your name or address can't be trusted if you believe you might be a sufficiently important person of interest for them.
nice analysis. fully agree with you. all the crypto in the world does nothing if i can stream your framebuffer to myself via some gpu flaw, bring my own broken ass driver onto your platform and take continual screenshots of what you so keenly decrypted for me, or sit in your baseband because the java running on your simcard was well... java (and protected with 0000/2580/1337 ♡).
there are so many roads to rome; intelligence communities, i gather, have also taken to more open roads than breaking any type of crypto (not to say they don't do that anymore). if it's more operationally useful, they will be doing it. that's a given.
Yeah, I don't think anyone told him about WindsorGreen, the RSA-cracking supercomputer that IBM built for the NSA. RSA-1024 probably should be assumed unsafe at this point.