PQC readiness “is mostly actuarial/risk management—even if the chance of building a CRQC by, say, 2030 is very low (say 5 percent), the downside risk is huge,” he explained. “Combine that with very long transition engineering times, and you should have started already.”
From the Wikipedia page on integer factorization records: the largest number reliably factored by Shor's algorithm, rather than some other quantum method, is 21, which was factored in 2012.[26][27] The number 15 had previously been factored by several labs, and subsequent attempts to factorise 35 failed.[27]
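For context, the quantum speedup in Shor's algorithm is confined to the order-finding step; the classical pre- and post-processing around it is simple number theory. A minimal sketch for N = 21, with the order found by a classical brute-force loop standing in for the quantum period-finding subroutine:

```python
from math import gcd

def find_order(a, N):
    """Find the multiplicative order r of a mod N (smallest r with a^r = 1).
    This brute-force loop is the step a quantum computer would replace
    with quantum period finding."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(N, a):
    """Given a coprime to N, use the order r of a mod N to recover factors:
    if r is even and a^(r/2) != -1 mod N, then gcd(a^(r/2) +/- 1, N)
    yields nontrivial factors of N."""
    assert gcd(a, N) == 1
    r = find_order(a, N)
    if r % 2:
        return None          # odd order: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None          # trivial square root: retry with a different a
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical_part(21, 2))  # order of 2 mod 21 is 6 -> factors (7, 3)
```

The hard part, and the reason 21 remains the record for genuine Shor, is doing `find_order` coherently on quantum hardware for numbers whose order isn't already known.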
From a study examining claimed quantum factorizations of numbers larger than 21: large-scale fault-tolerant quantum computers capable of implementing Shor's algorithm are not yet available, preventing relevant benchmarking experiments. Recently, several authors have attempted quantum factorizations via reductions to SAT or similar NP-hard problems. While this approach may shed light on algorithmic approaches for quantum solutions to NP-hard problems, in this paper we study and question its practicality. We find no evidence that this is a viable path toward factoring large numbers, whether for scalable fault-tolerant quantum computers or for quantum annealing and other special-purpose quantum hardware.
I’ll be concerned when we start seeing numbers factored without relying on “compiled” Shor instances. So far, most of the big “factorization records” cheat by choosing primes that differ only in their LSBs, nothing remotely close to the primes used in a real RSA modulus. There was a good discussion of it on Security Now episode 1034 for those who are interested.
Exec summary: for QC to “break RSA”, we need 7-8 orders of magnitude more physical qubits than we had in 2019.
Error-correction cleverness and better controls have knocked down 3-4 of those, which is impressive. Now just 1,000x-100,000x more improvement is needed. When you hear of a QC advance, remember the questions are:
- How many logical qubits? (error correction is a bitch)
- How long to get data into/out of the QC? (actually a major issue)
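The “logical vs. physical qubits” question matters because error correction multiplies the hardware cost. A toy surface-code overhead estimate; every number here is an illustrative assumption, not a claim about any specific machine or paper:

```python
# Toy surface-code overhead estimate. All figures below are illustrative
# assumptions for the sake of arithmetic, not measured or published values.
code_distance = 27                        # assumed distance for a low enough logical error rate
phys_per_logical = 2 * code_distance**2   # rough surface-code qubit overhead per logical qubit
logical_needed = 6000                     # assumed order of logical qubits for an RSA-2048-scale run

print(f"{phys_per_logical} physical qubits per logical qubit")
print(f"~{logical_needed * phys_per_logical:,} physical qubits total")
```

Under these assumptions one logical qubit costs over a thousand physical ones, which is why headline physical-qubit counts say little on their own.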
I have more faith in this endpoint happening than in gen AI, but I’m just tired of tech news reporting hype.
This. Anyone interested in quantum computing owes it to themselves to watch this Caltech conference keynote:
Quantum Computing: Reality vs. Hype - John Preskill