I just want to comment on how clear I find Filippo Valsorda's writing on this kind of thing. Even for an old dunderhead like me, his mathematics and examples were easy to follow. I really appreciate that kind of clarity in technical writing.
Is there any reason to believe that Grover's is as good as it gets? I'm on board here, and I think the article caveats that it's a matter of cost, priority, and assumptions. Cool, cool, I'm already using xaes-256-gcm. But I'm just curious if quantum could have new applications for algorithmic analysis, or take advantage of other weaknesses?
Grover's algorithm is provably optimal [0]. No quantum algorithm will ever find an n-bit key by queries to any reasonable sort of oracle faster than Grover's algorithm, and Grover's algorithm is way too slow to be a serious problem.
But symmetric ciphers are not black boxes. They're mostly built on some variant of a Feistel network, which is a very nice construction for turning a messy function into an invertible function that, in a potentially very strong sense, acts like a cryptographically secure permutation.
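To make the construction concrete, here's a toy Feistel network sketch (not AES, not any real cipher; the hash-based round function is just a stand-in for an arbitrary messy function). The point is that the round function doesn't need to be invertible, yet the network as a whole always is:

```python
import hashlib

def round_fn(half: bytes, round_key: bytes) -> bytes:
    """An arbitrary, non-invertible function of one half-block and a round key."""
    return hashlib.sha256(round_key + half).digest()[:len(half)]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def feistel_encrypt(block: bytes, round_keys: list[bytes]) -> bytes:
    # Each round: (L, R) -> (R, L xor F(R, k)). Invertible no matter what F is.
    n = len(block) // 2
    left, right = block[:n], block[n:]
    for k in round_keys:
        left, right = right, xor(left, round_fn(right, k))
    return left + right

def feistel_decrypt(block: bytes, round_keys: list[bytes]) -> bytes:
    # Undo the rounds in reverse order, recomputing F on the known half.
    n = len(block) // 2
    left, right = block[:n], block[n:]
    for k in reversed(round_keys):
        left, right = xor(right, round_fn(left, k)), left
    return left + right
```

Four rounds of this on a 16-byte block round-trips correctly; it's only meant to show the shape of the construction, not to be secure.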
When I was in grad school, one project I contemplated but never spent any real time on was trying to either generate a real security proof for quantum attacks on Feistel networks or to come up with interesting quantum attacks. And there is indeed an interesting quantum attack against 3-round Feistel networks [1].
This is interesting, because, depending on what sort of security is needed, three or four rounds of Feistel network are sufficient against classical attack [2].
Now ciphers like AES have many more than 3 rounds, so hopefully they're fine. But maybe they're not. My intuition is that there is probably a reasonably small n1 and a reasonably small n2 >= n1 (probably constants, maybe logarithmic in the number of bits) for which there is no quantum algorithm that can break symmetric crypto given classical query access to the round functions (n1) or even quantum query access to the round functions (n2) [3], but I'm not aware of any proof of anything of the sort. And my intuition definitely should not be trusted fully! (Maybe, even if I'm wrong, there is still a number of rounds that is sufficient for security against query access to the entire cipher.)
[3] It would be extremely cool if someone built quantum computers and networks and storage such that two parties that don't trust each other could actually communicate and exchange (interesting [5]) qubits. I've written some fun papers on the possible implications of this. If we ever get the technology, then it might actually be meaningful to consider things like chosen-quantum-ciphertext attacks against a classical symmetric cipher. But that's many, many years away, and, in any case, an attacker will only ever get to do a quantum query attack against a cryptosystem if a victim lets them. [4] Otherwise all queries will be classical.
[4] Or in very complex settings where there is an obfuscated black box, for example. This may be relevant for zk-SNARKs or similar constructions.
[5] I don’t consider the optical qubits exchanged in commercial devices that supposedly implement quantum key distribution to be interesting. To the vendors of such devices, sorry.
The only caveat is that AES is not necessarily a black box. It's possible there may be hidden structure to take advantage of, but if there is, there's no reason to suspect it's one that's amenable to a quantum speedup.
As far as the Grover speedup goes, it's already optimal. Requiring O(sqrt(N)) queries is the proven lower bound for unstructured search.
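For a sense of scale (my own back-of-the-envelope, not from the article): the optimal Grover search over N keys takes about (π/4)·√N sequential oracle iterations, which for a 128-bit key is still roughly 2^63.6 sequential quantum queries.

```python
import math

def grover_iterations(key_bits: int) -> float:
    # Optimal Grover search over N = 2^key_bits candidates takes
    # about (pi/4) * sqrt(N) sequential iterations of the oracle.
    return (math.pi / 4) * math.sqrt(2.0 ** key_bits)

iters = grover_iterations(128)   # ~1.4e19 sequential oracle calls
log2_iters = math.log2(iters)    # ~63.6 "bits" of sequential work remaining
```

So Grover turns a 128-bit search into a ~64-bit one, but the remaining work is sequential quantum computation, which is the crux of why it's impractical.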
If you like this kind of thing: there's a deterministic algorithm for finding minimum spanning trees in a graph that's proven optimal, but no one knows its exact runtime.
Basically, the best bound proven for it is something like O(m · α(m, n)), where m is the number of edges and α is the inverse Ackermann function, but it seems likely the algorithm actually runs in O(m). We also already have a randomised algorithm for this problem that runs in O(m) expected time on worst-case input; the expectation is over the algorithm's random choices, not the input.
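(For the curious: that α term is the same inverse-Ackermann factor that shows up in union-find. The optimal algorithm in question is a different, much more complicated beast, but a plain Kruskal sketch shows where the factor comes from: after sorting, the union-find work is O(m·α).)

```python
def kruskal(n: int, edges: list[tuple[int, int, int]]):
    """Kruskal's MST on n vertices; edges are (weight, u, v) tuples.
    Sorting dominates at O(m log m); the union-find below is O(m * alpha(n))."""
    parent = list(range(n))
    rank = [0] * n

    def find(x: int) -> int:
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two components: keep it
            if rank[ru] < rank[rv]:
                ru, rv = rv, ru
            parent[rv] = ru                # union by rank
            if rank[ru] == rank[rv]:
                rank[ru] += 1
            mst.append((u, v, w))
            total += w
    return mst, total
```

On a small square-with-diagonals graph this picks the three cheapest joining edges, as expected.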
I think quantum may be practically mitigated with aggressive key rotation in some cases. I've been prototyping an OAuth machine-to-machine integration with a banking vendor that has our ECDSA keys rotate every 5 minutes. The keys are scheduled for deletion after 10 minutes. I see no reason I couldn't reduce this to something like 30s/60s. Our counterparty frequently scans our JWKS endpoint for revocation, so in practice an attacker with a quantum computer would need to be very fast if they wanted to break this particular wire agreement the scary way.
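The window logic itself is simple. A minimal sketch of the scheme described above (the 5/10-minute values and names like `KeyRecord` are from my own setup, not any standard):

```python
from dataclasses import dataclass

ROTATE_S = 300   # stop signing with a key after 5 minutes
DELETE_S = 600   # delete (and stop accepting) the key after 10 minutes

@dataclass
class KeyRecord:
    kid: str           # key ID published at the JWKS endpoint
    created_at: float  # unix timestamp

def can_sign(key: KeyRecord, now: float) -> bool:
    # Only the freshest window is valid for producing new signatures.
    return now - key.created_at < ROTATE_S

def can_verify(key: KeyRecord, now: float) -> bool:
    # Verifiers accept slightly older keys to tolerate clock skew and
    # in-flight tokens, up to the deletion deadline.
    return now - key.created_at < DELETE_S
```

An attacker who recovers a private key more than DELETE_S seconds after its creation has nothing left to forge against.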
This wouldn’t help symmetric key encryption, which is what this is talking about. The keys you are rotating are asymmetric keys, which are only used to exchange symmetric keys for the actual encryption. In good setups, those symmetric keys are changed every session anyway.
If an attacker can break the symmetric encryption in a reasonable amount of time, they can capture the output and break it later.
In addition, how are you doing the key rotation? You have to have some way of authenticating with the rotation service, and what is to stop them from breaking THAT key, and getting their own new certificate? Or breaking the trusted root authority and giving themselves a key?
> This wouldn’t help symmetric key encryption, which is what this is talking about.
I agree. The point I am trying to make is that even for asymmetric encryption (which is far more vulnerable), there are still plausible ways to make a quantum break more difficult.
The only thing that could compromise this scheme, aside from breaking the signing keys, would be to have TLS broken to the extent that viewing real-time traffic is possible. Any TLS break delayed by more than 15 minutes would be worthless.
> Any TLS break delayed by more than 15 minutes would be worthless.
It sounds like you’re talking about breaking TLS’s key exchange? Why would this not have the usual issue of being able to decrypt recorded traffic at any time in the future?
Edit: If it’s because the plaintext isn’t useful, as knorker got at in a sibling comment… I sure hope we aren’t still using classical TLS by the time requiring it to be broken in 1 minute instead of 15 is considered a mitigation. Post-quantum TLS already exists and is being deployed…
The problem with key rotation as a defense is it is going to have to happen at EVERY level. You will have to rotate root CA keys at the same rate, or those could just be hacked, and your rotation won’t matter anymore.
This will probably not help enough for asymmetric keys, and is unnecessary for symmetric keys. https://arxiv.org/abs/2603.28846 claims an attack runtime of a few minutes.
There are enough order-of-magnitude breakthroughs between today and scalable quantum error correction that it makes no sense to try to guess exactly the order of magnitude of the attacks that will be feasible.
Either you believe they won't happen, in which case you can keep using long-term ECDSA keys, or you believe they will happen, in which case they are likely to overshoot your rotation period.
Going from breaking a key in a month to breaking a key in 1 second seems trivial compared to the effort of going from where we are now to being able to break a key in a month.
I don't know what the quantum future holds, but if quantum actually happens then I have low faith in your plan.
Sounds like overkill. Quantum is a premature concern, but if there’s really that much paranoia why not use PQC like ML-KEM instead of rolling this strange thing?
I'm not sure what you mean by "this strange thing" as the article promotes AES128 for symmetric encryption and explains why it is dumb to move to "post-quantum" for that use case.
I think there are too many unknowns to bet it all on one horse.
So, if we have to change all of our infrastructure due to a supposed quantum computing threat, I'd go with HybridPQ for asymmetric encryption.
Correct. The keys are only used for signing JWTs. Trust was established with the vendor out of band from this wire protocol (the URL they scan for public keys).
I'm not sure I understand, but haven't you just moved the problem to the out of band layer? And is that layer not secured using the same normal (somewhat) long-lived TLS as most sites?
I don't think I understand the threat model you are using here?
On one hand I hear that quantum computers will crack factorisation and discrete logarithms, on the other that the max number factorised is 15 and that 21 might not even be feasible.
From what I understand, the factoring of 15 was just a stunt and didn't use the actual error-corrected algorithm that needs to be used in general.
I think an analogy would be: imagine you are driving across North America in a car, but your engine is broken. The mechanic is nearby, so you put it in neutral and push it.
If someone said, "well, it took you half an hour to push it to the mechanic, so it will take the rest of your life to get it across North America", that would be the wrong takeaway. If the mechanic actually fixes the engine, you'll go quite fast quite quickly. On the other hand, maybe it's just broken and can't be fixed. Either way, how fast you can push it has no bearing on how fast the mechanic can fix it, or how fast it will go after it's fixed.
Maybe people will figure out quantum computers, maybe they won't, but the timeline of "factoring" 15 is pretty unrelated.
In the context of cryptography, keep in mind it's hard to change algorithms, and cryptographers have to plan for the future. They are interested in questions like: is there a > 1% chance that a quantum computer will break real crypto in the next 15 years? I think the vibe has shifted to that sounding plausible. That doesn't necessarily mean it will happen; it's just become prudent to plan for that eventuality, and now is when you would have to start.
This article, "Factoring is not a good benchmark to track Q-day", was posted this month by one of Cloudflare's lead post-quantum researchers specifically addressing the factoring issue.
It doesn't say much by itself, but it has four very good links on the subject. One of these has a picture of the smallest known factor-21 circuit, which is vastly larger than the factor-15 circuit and comparable to those for much larger numbers. Another is Scott Aaronson's article comparing factoring small numbers to demanding a "small nuclear explosion": being unable to make a small nuclear explosion in 1940 didn't mean a big one was far away.
In the last month there has been a sharp vibe shift among cryptography engineers based on rumors that we may have demonstrations of CRQCs much sooner than anticipated, perhaps within 5 years. You're not going to get satisfactory answers beyond that; everybody understands the "factored 15" thing, the people for whom the vibe has shifted have priced that in.
It’s coming from everywhere all at once. Is there a prediction market on timing yet? (Literally one of the only useful things I can think of for the damnable casinos.)
I’ve seen so much change so fast my assumption is someone did it already and preprints are making the rounds.
The idea seems to be that there will be some sort of cascading effect if we can somehow create physical qubits with sufficient noise performance. It's this "threshold" we keep hearing about. Once we exceed the threshold, there is a possibility that we can use error correction to expand everything without limit.
This assumes that there will not be other problems that arise. I suspect that "error correcting" thousands of qubits entangled with one another will be one of those problems.
To get useful results, a quantum computer needs all of its qubits to stay entangled with each other, until the entire group collapses into the result. With current technology, it is very difficult for a reasonably sized group of qubits to stay coherently entangled, so it can only solve problems that are also relatively easy to solve on classical computers.
If someone today were to figure out how to keep large numbers of qubits entangled, then quantum computing would instantly be able to break any encryption that isn't quantum safe. It's not something that we are slowly working toward; it's a breakthrough that we can't predict when, or even if, it will happen.
Rotation protects against one threat model, not both. A broken signing key five minutes old gives one forged window. Harvested ciphertext in someone's archive does not care when you deleted the session key. Rotate the signer, but put xaes-256-gcm on the payload if you want the bytes safe ten years out.
Very good breakdown. If I'm understanding Grover's algorithm correctly, are you saying essentially that it would require either too much compute or too much time to be feasible, but is still much more realistic than a brute-force attack?
If that’s the case, would the time eventually be basically irrelevant with enough compute? For instance, if what’s now a data center is able to fit in the palm of your hand (comparing early computers that took up rooms to phones nowadays). So if compute is (somehow) eventually able to be incredibly well optimized or if we use something new, like how microprocessors were the next big thing, would that then be a quantum threat to 128-bit symmetric keys?
I am not an expert, but while you are correct that a fast enough traditional computer (or a parallel enough computer) could brute force a 128 bit key, the amount of improvement required would dwarf what we have already experienced over the last 40 years, and is likely physically impossible without some major fundamental change in how computers work.
Compute has seen in the ballpark of a 5-10 orders of magnitude increase over the last 40 years in terms of instructions per second. We would need an additional 20-30 orders of magnitude increase to make it even close to achievable with brute force in a reasonable time frame. That isn’t happening with how we make computers today.
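The rough arithmetic behind that, under generous assumptions of my own choosing (10^18 key trials per second, i.e. granting an exascale machine one trial per operation):

```python
SECONDS_PER_YEAR = 3.15e7

def brute_force_years(key_bits: int, trials_per_sec: float) -> float:
    # Time to sweep the full keyspace; halve it for the expected time,
    # which changes nothing at this order of magnitude.
    return (2.0 ** key_bits) / trials_per_sec / SECONDS_PER_YEAR

years_128 = brute_force_years(128, 1e18)  # on the order of 1e13 years
```

Even at an exascale trial rate, a 128-bit sweep takes roughly ten trillion years, about a thousand times the age of the universe, while 64-bit keys fall in seconds, which is why key length matters so much here.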
> That isn’t happening with how we make computers today.
Keep in mind here that computers today have features approaching the size of a single atom, switching frequencies where the time to cross a single chip from one end to the other is becoming multiple cycles, and power densities that require us to operate at the physical limits of heat transfer for matter that exists at ambient conditions.
We can squeeze it quite a bit further, sure. But anything like 20-30 orders of magnitude is just laughable even with an infinite supply of unobtanium and fairy dust.
You don't need to keep shrinking features. Brute forcing is highly parallel; to break a key within a certain time frame all you need is a large enough quantity of chips. While it's in the realm of science fiction today, in a few centuries we might have nanorobots that can tile the entire surface of mars with processors. That would get you enough orders of magnitude of additional compute to break a 128 bit key. 256 bit would probably still be out though.
Classical brute force is embarrassingly parallel, but Grover's algorithm (the quantum version) isn't. To the extent you parallelize it, you lose the quantum advantage, which means that to speed it up by a factor of N, you need N^2 processors.
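A one-liner makes the contrast concrete (my own illustration of the standard result that M parallel Grover instances only gain a factor of √M):

```python
def machines_for_speedup(speedup: float, quantum: bool) -> float:
    # Classical brute force is embarrassingly parallel: S machines give an
    # S-fold speedup. Parallel Grover over M machines only gains sqrt(M),
    # so an S-fold speedup needs S^2 machines.
    return speedup ** 2 if quantum else speedup
```

So finishing a Grover attack a thousand times sooner costs a million quantum circuits, not a thousand, which is exactly the parallelism penalty the article's numbers reflect.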
The article discusses this in detail, and calculates that "This means we’ll need 140 trillion quantum circuits of 724 logical qubits each operating in parallel for 10 years to break AES-128 with Grover’s."
The power and heat are the issues for that, though. Think about how much energy and heat are used/generated in the chips we have now. If we tiled out those chips to be 20 orders of magnitude larger… where is the heat going to go, and where is the energy coming from?
In my example I had imagined that your nanobots would also create solar panels and radiators for the chips you were tiling the surface of mars with. This is why it needs to be done on the surface instead of underground somewhere.
The calculated DW cost of the quantum attack is 2^104 (with conservative/optimistic assumptions and ignoring the physical cost of a single logical gate), which is "much more realistic than a brute force attack" in the same sense that a 128-bit brute force attack is much more realistic than a 256-bit brute force attack.
None of those are remotely practical, even imagining quantum computers that become as fast (and small! and long-term coherent!) as classical computers.
He mentions "non-existing AES-512", but why not? Why not AES-1024 or AES-4096? Is it that too much processing power is needed to encrypt and decrypt? I am guessing perhaps the algorithm also needs work: you can't just take AES-128 and add bits, and if you could, it would have been done already?
WPA3 was announced in 2018 [0]. I don't think it's reasonable to blame them for not anticipating the next decade of cryptographic research.
...but even if they had, what realistically could they have done about it? ML-KEM was only standardized in 2024 [1].
also, the addition of ECDH in WPA3 was to address an existing, very real, not-theoretical attack [2]:
> WPA and WPA2 do not provide forward secrecy, meaning that once an adverse person discovers the pre-shared key, they can potentially decrypt all packets encrypted using that PSK transmitted in the future and even past, which could be passively and silently collected by the attacker. This also means an attacker can silently capture and decrypt others' packets if a WPA-protected access point is provided free of charge at a public place, because its password is usually shared to anyone in that place.
Does it matter if an attacker can decrypt public wifi traffic? You already have to assume the most likely adversary (e.g. the most likely to sell your information) is the entity running the free wifi, and they can already see everything.
It is precisely because the operator of the wifi is not necessarily the adversary a user is most concerned about. They may be an adversary, but they are the one you know about; they aren't the only one.
> You already have to assume the most likely adversary is the entity running the free wifi
Why do you have to assume that?
You're at Acme Coffeeshop. Their wifi password is "greatcoffee" and it's printed next to the cash register where all customers can see it.
With WPA2 you have to consider N possible adversaries: Acme Coffee themselves, as well as every single other person at the coffeeshop.
...and also anyone else within signal range of their AP. Maybe I live in an apartment above the coffeeshop and think "lol, it'd be fun to collect all that traffic and see if any of it is unencrypted".
With WPA3 you only have to consider a single possible adversary: the coffeeshop themselves.
Because it's a near certainty (at least in the US) that businesses will spy on you to the extent that they can, but it's actually incredibly rare to be around a nerd with Wireshark. Things like Facebook didn't use HTTPS until long after public wifi was ubiquitous and you could easily sniff people, and it basically didn't matter. Now nearly everything uses TLS, so it really doesn't matter. Actually, most public wifi I encounter has no security at all.
Good post. Entirely correct, and well known amongst quantum researchers, but under appreciated in general.
Grover attacks are very blatantly impractical. When someone describes Grover-type attacks in the same breath as Shor-type attacks, without caveats, that's a red flag.
I'm not familiar with their stance, but bear in mind the costs of introducing a new key type, both for the ecosystem and for the maintenance of SSH implementations.
Tangentially related but regarding RSA and ECC... With RSA can't we just say: "Let's use 16 384 bit keys" and be safe for a long while?
And for ECC, I know many are using the "2 exp 255 - 19" / 25519 curve because it's unlikely to be backdoored, but it's only 256 bits... Can't we find, say, a "2 exp 2047 - 19" (just making that one up) and be safe for a while too?
Basically: for RSA and ECC, is there anything preventing us from using keys 10x bigger?
> Tangentially related but regarding RSA and ECC... With RSA can't we just say: "Let's use 16 384 bit keys" and be safe for a long while?
That's correct. The quantum computer needs to be "sufficiently larger" than your RSA key.
> Basically: for RSA and ECC, is there anything preventing us from using keys 10x bigger?
For RSA things get very unwieldy (but not technically infeasible) beyond 8192 bits. For ECC there are different challenges, some of which have nothing to do with the underlying cryptography itself: one good example is how the OpenSSH team still haven't bothered supporting Ed448, because they consider it unnecessary.
Many implementations limit the RSA key size to 8,192 or 16,384 bits (because the maximum bit length determines indirectly how much stack space is required).
Disconcerting opening. If you want to put hash algorithms in the same category as symmetric keys in this particular case, then say so, without referring to them as if they are symmetric keys.
Hashes are symmetric cryptography primitives, and it's even proper to talk about key sizes for e.g. HMAC and HKDF hash-based constructions, to which Grover's algorithm applies analogously to how it applies to cipher keys.
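Concretely (a stdlib sketch of my own, not from the article): HMAC is keyed exactly like a cipher, and recovering the key is the same kind of unstructured search that Grover's algorithm addresses.

```python
import hmac
import hashlib

key = bytes(32)  # 256-bit key; all-zero placeholder, use a random key in practice
tag = hmac.new(key, b"some message", hashlib.sha256).hexdigest()

def try_key(candidate_key: bytes, msg: bytes, expected: str) -> bool:
    # An attacker without the key faces a search over the keyspace:
    # compute a tag with a candidate key and compare. This check is
    # exactly the "oracle" a Grover-style search would query.
    guess = hmac.new(candidate_key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(guess, expected)
```

With a 256-bit HMAC key, even a square-root speedup leaves a 2^128 search, which is why the same "symmetric sizes are fine" argument carries over.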
Assuming a member of the target audience sees the connection between HMAC and symmetric keys as far as usage goes, would you like them to be making leaps like this in their regular use of cryptography? (I really couldn't tell you whether an algorithm that involves being able to look into the box in the middle might have characteristics that make some or all of the primitives involved less quantum-safe than an algorithm that lacks that possibility, yet I'd suspect I have a lot more experience than the average reader drawn in by the title.)
Encryption is not ever to be considered impossible to break.
Every encryption scheme has at least one way to be decrypted.
Fidelity of information is one use of encryption: if you apply the solution and get garbage, something is wrong somewhere.
Occultation of information is another use, one that is commonly abused by extending undue trust. Under the proviso that encryption will eventually be broken, you can't trust encryption to keep a secret forever, but you can keep it secret for long enough that it is no longer applicable to an attack or slightly askew use case; thus aggressive rotation of keys becomes desirable.
> encryption is not ever to be considered impossible to break
One-time pads [0] are actually impossible to break, but they're pretty tricky to use: you must never ever reuse them, they must be truly random, and you need some way to share them between both parties (which isn't easy, since they need to be at least as large as all the data you ever want to transmit).
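A minimal sketch of the mechanics (the hard parts, namely true randomness, pad distribution, and never reusing a pad, are exactly what code can't show):

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    """One-time pad: XOR with a truly random pad at least as long as the
    message, used exactly once. Decryption is the same XOR."""
    assert len(pad) >= len(plaintext), "pad must cover the whole message"
    return bytes(p ^ k for p, k in zip(plaintext, pad))

otp_decrypt = otp_encrypt  # XOR is its own inverse

msg = b"attack at dawn"
pad = secrets.token_bytes(len(msg))
ct = otp_encrypt(msg, pad)
```

Because every pad is equally likely, every plaintext of the same length is equally consistent with the ciphertext, which is the information-theoretic security argument in one sentence.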
Not trying to be obtuse, but there is at least one solution: the one used to decrypt.
If you know something about the content, e.g. that it is for Russians, or Americans,
you can use frequency analysis to identify vowels. That goes for a simple substitution cipher that relies on low frequency of usage [one-time use] and does not keep it brief.
When you further substitute numbers for words, you gain more room for verbosity.
If you have high stakes, your message in the clear should only be useful for a limited time, to the point that it is no longer actionable.
I'm very familiar with one-time pads, random and keyed.
They are a little simple; you can use a triaxial scheme, or a tensor-like scheme, for more legroom and more complexity.
Depending on what you are doing, it may be necessary not to carry any pads, but to have access at some point to agreed-upon keys in order to generate a pad on the spot, or even work in your head if you have the skill, e.g. jackdwlovemybigsphnxfqurtz as a weak example.
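For illustration, the first step of the frequency analysis mentioned above is just counting letters (which gets you nowhere against a proper one-time pad, since every ciphertext letter is uniformly distributed):

```python
from collections import Counter

def letter_frequencies(text: str) -> list[tuple[str, int]]:
    """Rank letters by frequency -- the opening move against a simple
    substitution cipher, where plaintext letter statistics leak through."""
    counts = Counter(c for c in text.lower() if c.isalpha())
    return counts.most_common()
```

Against English substitution ciphertext the top entries typically map to e, t, a, and the vowels; against one-time-pad output the ranking is just noise.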
> not trying to be obtuse, but there is at least one solution, the one used to decrypt
Right, which is why I didn't quote that part :)
> you can use a frequency analysis to identify vowels.
That will help in many cases, but not against a properly used one-time pad.
> but to have access at some point, to agreed upon keys, in order to generate a pad on the spot
That's not really a one-time pad then, that's just a stream cipher. Which do work better than one-time pads in the vast majority of cases, aside from not being "perfectly" secure.
[0] The classic result is https://arxiv.org/abs/quant-ph/9701001 and there are newer, more exact results, e.g. https://arxiv.org/abs/0810.3647
[1] https://ieeexplore.ieee.org/document/5513654
[2] https://en.wikipedia.org/wiki/Feistel_cipher
https://en.wikipedia.org/wiki/Expected_linear_time_MST_algor...
What makes you say that? This is the store-now-decrypt-later attack, and it's anything but worthless.
Oh, worthless for your OAuth? Uh… but how do you bootstrap the trust? Sounds to me like you need post-quantum to carry the whole thing anyway.
Or do you mean one key signs the next? OK, so your bet is that within the time window of an RSA key, RSA can't be cracked?
Why in the world would anyone want to depend on that? Surely you will also pair it with PQ?
https://bas.westerbaan.name/notes/2026/04/02/factoring.html
It doesn't say much by itself, but it has four very good links on the subject. One of these has a picture of the smallest known factor-21 circuit, which is vastly larger than that of the factor-15 circuit, and comparable to much larger numbers. Another is Scott Aaronson's article making the analogy of asking factoring small numbers as asking for a "small nuclear explosion" - if you're in 1940 and not able to make a small nuclear explosion, that doesn't mean you're much farther away from a big nuclear explosion.
I've seen so much change so fast that my assumption is someone has done it already and preprints are making the rounds.
This assumes that there will not be other problems that arise. I suspect that "error correcting" thousands of qubits entangled with one another will be one of those problems.
To get useful results, a quantum computer needs all of its qubits to stay entangled with each other until the entire group collapses into the result. With current technology, it is very difficult for a reasonably sized group of qubits to stay coherently entangled, so quantum computers can only solve problems that are also relatively easy to solve on classical computers.
If someone today were to figure out how to keep large numbers of qubits entangled, then quantum computing would instantly be able to break any encryption that isn't quantum safe. It's not something we are slowly working toward; it's a breakthrough, and we can't predict when, or even if, it will happen.
Shor's and Grover's are still algorithms that require a massive number of steps...
If that's the case, would the time eventually become basically irrelevant with enough compute? For instance, if what's now a data center could fit in the palm of your hand (compare early room-sized computers to today's phones). So if compute somehow becomes incredibly well optimized, or we move to something new, the way microprocessors were once the next big thing, would that then be a quantum-scale threat to 128-bit symmetric keys?
Compute has seen somewhere in the ballpark of a 5-10 order-of-magnitude increase in instructions per second over the last 40 years. We would need an additional 20-30 orders of magnitude to make brute force even close to achievable in a reasonable time frame. That isn't happening with how we make computers today.
Keep in mind here that computers today have features approaching the size of a single atom, switching frequencies where crossing a single chip from one end to the other takes multiple cycles, and power densities that require us to operate at the physical limits of heat transfer for matter at ambient conditions.
We can squeeze it quite a bit further, sure. But anything like 20-30 orders of magnitude is just laughable even with an infinite supply of unobtanium and fairy dust.
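To put rough numbers on why the orders of magnitude matter, here's a back-of-envelope sketch. The 10^18 trials-per-second rate is my own generous assumption (roughly an exascale machine doing one key trial per operation), not a real benchmark:

```python
# Back-of-envelope: classically brute-forcing a 128-bit key.
# Assumption (mine, for illustration): 10^18 key trials per second.

SECONDS_PER_YEAR = 365 * 24 * 3600

keyspace = 2**128                      # ~3.4e38 candidate keys
rate = 10**18                          # trials per second (very generous)
years = keyspace / rate / SECONDS_PER_YEAR

print(f"{years:.2e} years")            # on the order of 10^13 years
```

That's about a thousand times the age of the universe, which is why the extra 20-30 orders of magnitude are the whole ballgame.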
None of those are remotely practical, even imagining quantum computers that become as fast (and small! and long-term coherent!) as classical computers.
WPA3 moved from symmetric AES to ECDH, which is vulnerable to quantum attacks. Gonna be a tonne of IoT inverter waste.
They say the 's' in IoT stands for secure, and in my experience that is true. Pretty much nothing is getting thrown out just because it isn't secure.
...but even if they had, what realistically could they have done about it? ML-KEM was only standardized in 2024 [1].
also, the addition of ECDH in WPA3 was to address an existing, very real, not-theoretical attack [2]:
> WPA and WPA2 do not provide forward secrecy, meaning that once an adverse person discovers the pre-shared key, they can potentially decrypt all packets encrypted using that PSK transmitted in the future and even past, which could be passively and silently collected by the attacker. This also means an attacker can silently capture and decrypt others' packets if a WPA-protected access point is provided free of charge at a public place, because its password is usually shared to anyone in that place.
0: https://en.wikipedia.org/wiki/Wi-Fi_Protected_Access#WPA3
1: https://en.wikipedia.org/wiki/ML-KEM
2: https://en.wikipedia.org/wiki/Wi-Fi_Protected_Access#Lack_of...
why do you have to assume that?
you're at Acme Coffeeshop. their wifi password is "greatcoffee" and it's printed next to the cash register where all customers can see it.
with WPA2 you have to consider N possible adversaries - Acme Coffee themselves, as well as every single other person at the coffeeshop.
...and also anyone else within signal range of their AP. maybe I live in an apartment above the coffeeshop, and think "lol it'd be fun to collect all that traffic and see if any of it is unencrypted".
with WPA3 you only have to consider the single possible adversary, the coffeeshop themselves.
Grover attacks are very blatantly impractical. When someone describes Grover-type attacks in the same breath as Shor-type attacks, without caveats, that's a red flag.
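The impracticality is easy to see with rough arithmetic: Grover's search of a 128-bit keyspace needs on the order of sqrt(2^128) = 2^64 sequential oracle iterations, and those iterations can't be parallelised the way classical brute force can. A quick sketch, where the 10^9 iterations-per-second rate is my own very optimistic assumption:

```python
# Grover's algorithm needs ~sqrt(N) = 2^64 sequential oracle calls
# to search a 128-bit keyspace, and they must run serially.
# Assumption (mine): an optimistic 10^9 oracle calls per second.

SECONDS_PER_YEAR = 365 * 24 * 3600

grover_steps = 2**64                  # ~1.8e19 sequential iterations
rate = 10**9                          # oracle calls per second
years = grover_steps / rate / SECONDS_PER_YEAR

print(f"{years:.0f} years")           # hundreds of years, serially
```

And that's before accounting for the cost of running each AES evaluation in superposition with error correction, which makes the real figure far worse.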
And for ECC, I know many are using 2^255 - 19 (Curve25519) because it's unlikely to be backdoored, but it's only ~256 bits... Can't we find, say, a "2^2047 - 19" (just making that one up) and be safe for a while too?
Basically: for RSA and ECC, is there anything preventing us from using keys 10x bigger?
That's correct. The quantum computer needs to be "sufficiently larger" than your RSA key.
> Basically: for RSA and ECC, is there anything preventing us from using keys 10x bigger?
For RSA things get very unwieldy (but not technically infeasible) beyond 8192 bits. For ECC there are different challenges, some of which have nothing to do with the underlying cryptography itself: one good example is how the OpenSSH team still haven't bothered supporting Ed448, because they consider it unnecessary.
The time to run the algorithm scales roughly cubically, so a 10x larger key means about 1000x more time.
But it remains exponentially faster than classical factoring: 1 minute becomes about a day, 1 day becomes about 3 years. Still "easily" broken.
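The back-of-envelope math here, assuming the cubic-scaling claim, checks out:

```python
# If runtime scales roughly with the cube of the key size,
# a 10x larger key costs 10^3 = 1000x more time.

factor = 10**3

minute_becomes_hours = 1 * factor / 60   # 1000 minutes -> ~16.7 hours (~a day)
day_becomes_years = 1 * factor / 365     # 1000 days    -> ~2.7 years

print(f"1 minute -> {minute_becomes_hours:.1f} hours")
print(f"1 day    -> {day_becomes_years:.1f} years")
```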
you can run benchmarks yourself: openssl speed rsa1024 rsa2048
Also, this (slightly dated) javamex writeup covers it well: https://www.javamex.com/tutorials/cryptography/rsa_key_lengt...
tl;dr: the trade-off is between performance and how many years the data needs to be assumed confidential.
Every encryption scheme has at least one way to be decrypted.
Fidelity of information is one use of encryption: if you apply the solution and get garbage, something is wrong somewhere.
Occultation of information is another use, one that is commonly abused by extending undue trust. Under the proviso that encryption will eventually be broken, you can't trust encryption to keep a secret forever, but you can keep it secret long enough that it is no longer applicable to an attack or a slightly askew use case; thus aggressive rotation of keys becomes desirable.
One-time pads [0] are actually impossible to break, but they're pretty tricky to use: you must never, ever reuse them, they must be truly random, and you need some way to share them between both parties (which isn't easy, since they need to be at least as large as all the data you ever want to transmit).
[0]: https://en.wikipedia.org/wiki/One-time_pad
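The scheme itself is just XOR with a fresh random pad; a minimal sketch (function names are mine):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt with a fresh, truly random pad as long as the message."""
    pad = secrets.token_bytes(len(plaintext))        # never reuse this pad
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
    return ciphertext, pad

def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
    """XOR is its own inverse, so decryption is the same operation."""
    return bytes(c ^ k for c, k in zip(ciphertext, pad))

ct, pad = otp_encrypt(b"attack at dawn")
assert otp_decrypt(ct, pad) == b"attack at dawn"
```

The hard part isn't the code; it's that the pad is as big as the message and must be delivered secretly in advance, which is exactly the key-distribution problem encryption was supposed to solve.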
If you know something about the content, e.g. that it is for Russians or for Americans,
you can use frequency analysis to identify vowels. That goes for a simple substitution cipher that relies on low frequency of usage [one-time use] and doesn't keep the message brief.
When you further substitute numbers for words, you gain more room for verbosity.
If the stakes are high, your message in the clear should only be useful for a limited time, up to the point where it is no longer actionable.
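For illustration, here's what the frequency leak looks like against a toy substitution cipher (a Caesar shift, my own example; a properly used one-time pad shows no such bias):

```python
from collections import Counter

def caesar(text: str, shift: int = 3) -> str:
    """Toy monoalphabetic substitution: shift each letter by `shift`."""
    return "".join(
        chr((ord(c) - ord("a") + shift) % 26 + ord("a")) if c.isalpha() else c
        for c in text.lower()
    )

ciphertext = caesar("the quick brown fox jumps over the lazy dog " * 20)

# The most frequent ciphertext letters map back to the most frequent
# plaintext letters; that structure is exactly what substitution fails
# to hide and what frequency analysis exploits.
counts = Counter(c for c in ciphertext if c.isalpha())
print(counts.most_common(3))   # 'r' (from 'o') and 'h' (from 'e') dominate
```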
I'm very familiar with one-time pads, both random and keyed.
They are a little simple; you can use a triaxial scheme, or a tensor-like scheme, for more legroom and more complexity.
Depending on what you are doing, it may be necessary not to carry any pads, but to have access at some point to agreed-upon keys in order to generate a pad on the spot, or even to work in your head if you have the skill, e.g. jackdwlovemybigsphnxfqurtz as a weak example.
Right, which is why I didn't quote that part :)
> you can use a frequency analysis to identify vowels.
That will help in many cases, but not against a properly-used one-time-pad.
> but to have access at some point, to agreed upon keys, in order to generate a pad on the spot
That's not really a one-time pad then; that's just a stream cipher. Stream ciphers do work better than one-time pads in the vast majority of cases, aside from not being "perfectly" secure.
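To make the distinction concrete: "generating a pad on the spot from agreed-upon keys" means the pad is derived from a short key, which is the definition of a stream cipher. A toy sketch using a hash-based keystream (illustrative only and of my own construction; a real design would use ChaCha20 or AES-CTR):

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: hash (key, nonce, counter) blocks. Not for real use."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def stream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data with the derived keystream; same call encrypts and decrypts."""
    ks = keystream(key, nonce, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

ct = stream_xor(b"shared secret", b"nonce-1", b"meet at noon")
assert stream_xor(b"shared secret", b"nonce-1", ct) == b"meet at noon"
```

The security now rests on the keystream being unpredictable from the short key, not on information-theoretic perfection, which is exactly the trade the parent comment is describing.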