The threat estimate for a quantum computer that breaks cryptography should be based on the currently available data and on the fact that only Shor's algorithm is known to provide an exponential speedup for factorisation.
Let's give IBM credit for attempting to factor the number 35 in 2021, although the attempt failed [1]. Before that, the number 21 was successfully factorised in 2012 [2], and the first factorisation of 15 happened in 2001 [3].
Now we have three points. The trend is that the factorised number grows by roughly 10 every ten years. Extrapolating linearly, to get a quantum computer that factors an RSA-2048 modulus, which is a number of order 2^2048, we shall have to wait about 2^2048 years.
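For what it's worth, here is the napkin math, a minimal sketch with the three data points hard-coded (the linear fit is, of course, the joke):

```python
# Napkin math behind the extrapolation. Data points hard-coded from above.
years = [2001, 2012, 2021]    # factorisations of 15, 21, and the attempted 35
numbers = [15, 21, 35]

# The factored number grows by ~20 over ~20 years, i.e. ~1 per year.
per_year = (numbers[-1] - numbers[0]) // (years[-1] - years[0])

rsa_2048 = 2**2048            # an RSA-2048 modulus is of order 2^2048
wait = rsa_2048 // per_year   # integer maths: 2^2048 overflows a float
print(f"years to wait: about 2^{wait.bit_length() - 1}")
```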
I suspect they are not counting on the trend staying linear. Maybe we are at the linear-looking beginning of a sigmoid.
It seems like a reasonable bet on their part, in the sense that Google has a lot of money to play with. Even if it is unlikely that quantum computing takes off, being hit by quantum attacks would be pretty bad for them, so maybe they see it as insurance against an unlikely but catastrophic event.
I understand that it is enjoyable to rework all protocols involving public-key cryptography and propose alternatives, and I support that. It can be an excellent catalyst for coming up with new cryptographic primitives and for a better understanding of existing ones, and Google can certainly afford such activities.
The issue is the emphasis on the inevitability of the threat. Organisations start to justify implementing QC-resistant algorithms on the basis that others are doing so, and in that way convince themselves that the threat is real. After that, we would end up with government agencies requiring that only QC-safe algorithms are acceptable for security, in the process killing useful cryptographic primitives like the ElGamal cryptosystem, homomorphic encryption, blind signatures and a class of zero-knowledge proofs.
At the moment, there are only three points from which one can extrapolate further advances. Shor's algorithm requires exponential suppression of errors with the number of qubits to produce meaningful results. This is why, although we have hundred-qubit QCs, the most significant factorisations have been done with at most five qubits. Therefore, even with "exponential progress" in QC hardware, progress in factoring integers would be linear. Faster progress requires miracles. QC error correction would be no panacea either, as it would require repeated application, an unreasonable number of qubits, and a much longer runtime.
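To make the scaling argument concrete, here is a toy model; the constants are my illustrative assumptions, not measured data. Suppose factoring an n-bit number demands a per-gate error below 2^-n, while hardware error rates improve by a constant factor every year. Then the largest factorable number grows only linearly in time:

```python
import math

# Toy model of the argument above; all constants are illustrative assumptions.
E0 = 1e-2          # assumed physical error rate in year 0
IMPROVEMENT = 2.0  # assumed yearly factor of error-rate improvement
                   # ("exponential progress" in hardware)

def achievable_bits(year: int) -> int:
    """Largest n if factoring an n-bit number needs per-gate error below 2**-n."""
    error = E0 / IMPROVEMENT**year
    # 2**-n >= error  <=>  n <= -log2(error)
    return max(0, math.floor(-math.log2(error)))

for year in (0, 10, 20, 30):
    print(year, achievable_bits(year))
# Prints 6, 16, 26, 36: error rates halve every year, yet the factorable
# number grows by only ~10 bits per decade, i.e. linearly.
```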
The miracle might as well be that we figure out a way to factorise numbers on classical computers in polynomial time, or a way to break lattice-based crypto classically.
Running Shor's algorithm requires essentially error-free logical qubits. Not a single logical qubit of that quality has ever been demonstrated. The Harvard results are impressive, but their logical qubits are worse than some physical qubits.
It is not enough to place qubits on a single chip or on a grid. We already know how to do that. The hard part is keeping them isolated while allowing arbitrary control of their interference.
The core reason why Shor's algorithm is hard to run is that it requires exponential suppression of errors with the number of qubits to produce meaningful results. We don't actually see many results here because error rates have not yet reached the thresholds needed to run it with more qubits. Error correction would not be a panacea either, because it would have to be applied repeatedly to reach the threshold needed for Shor's algorithm, and that would require an unreasonable number of physical qubits.
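To put a rough number on that overhead, here is a minimal sketch using the standard surface-code scaling, where logical error falls roughly as (p/p_th)^((d+1)/2) with code distance d and each logical qubit costs about 2d^2 physical ones. All constants are my illustrative assumptions:

```python
import math

# Rough surface-code overhead estimate. All constants are textbook-style
# assumptions, not measurements of any particular device.
p = 1e-3        # assumed physical error rate
p_th = 1e-2     # assumed surface-code threshold
target = 1e-15  # assumed per-logical-qubit error budget for a 2048-bit run
logical_qubits = 4100  # assumed ~2n logical qubits for n = 2048

# Logical error ~ (p / p_th) ** ((d + 1) / 2); solve for the code distance d.
d = math.ceil(2 * math.log(target) / math.log(p / p_th) - 1)
d += (d + 1) % 2        # code distances are odd

physical_per_logical = 2 * d**2  # data plus ancilla qubits, roughly
total = logical_qubits * physical_per_logical
print(f"distance {d}: ~{physical_per_logical} physical per logical, "
      f"~{total:,} physical qubits in total")
```

Even with these fairly generous assumptions, the estimate lands at millions of physical qubits, which is the "unreasonable number" in question.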
[1]: https://arxiv.org/pdf/2103.13855v1.pdf
[2]: https://www.nature.com/articles/nphoton.2012.259
[3]: https://www.nature.com/articles/414883a