Technology

Shor’s algorithm resource requirements for RSA-2048 fell by an order of magnitude in under a year

The cryptographic migration window is narrowing as algorithmic advances compress Q-Day timelines faster than infrastructure can respond
Susan Hill

The encryption protecting global banking, government communications, and digital identity does not fail when a quantum computer is finally built. It fails the moment adversaries acquire enough quantum capability to process the encrypted data they have already collected. That inversion — the threat arriving before the machine does — defines the actual architecture of the Q-Day problem and explains why a readiness gap measured today translates directly into a security breach measured years from now.

The mechanism at risk is not exotic. RSA encryption, the dominant public-key cryptographic standard, depends on a single mathematical asymmetry: multiplying two large prime numbers is computationally trivial, but recovering those primes from their product scales in difficulty so steeply that no classical computer can reverse the operation for key sizes of 2048 bits within any practical timeframe. The entire edifice of trusted digital communication — TLS handshakes securing web traffic, certificate authorities authenticating identities, digital signatures validating financial transactions — rests on this asymmetry holding.
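The asymmetry can be made concrete with a toy calculation. The sketch below (Python, using a modulus vastly smaller than a real 2048-bit key) multiplies two primes instantly and then recovers them by deliberately naive trial division; real attacks use far cleverer algorithms, but the direction of difficulty is the same.

```python
# Toy illustration of the asymmetry described above: multiplying two primes is
# instantaneous, while recovering them from the product already takes visible
# effort at a ~40-bit modulus. Real RSA moduli are 2048 bits (~617 decimal
# digits); even the best classical factoring algorithms, far cleverer than the
# brute force below, remain hopeless at that size.
import math

def smallest_prime_factor(n: int) -> int:
    """Brute-force trial division; cost grows roughly with sqrt(n)."""
    if n % 2 == 0:
        return 2
    for candidate in range(3, math.isqrt(n) + 1, 2):
        if n % candidate == 0:
            return candidate
    return n  # n itself is prime

p, q = 999_983, 1_000_003        # two small primes
n = p * q                        # the "easy" direction: one multiplication
print(f"modulus:  {n}")
print(f"factor:   {smallest_prime_factor(n)}")   # the "hard" direction
```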

Shor’s algorithm, first formalized in 1994, demonstrated that quantum computation dissolves this asymmetry entirely. By exploiting quantum superposition and quantum Fourier transforms to find the period of a modular arithmetic function that encodes the factoring problem, a sufficiently large quantum computer could recover RSA private keys in hours rather than the billions of years a classical machine would require. The algorithm has been known for three decades. What changed in the past year is the resource estimate for running it.
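The reduction at the heart of Shor's algorithm is classical number theory; only the period-finding step needs a quantum computer. The sketch below finds the period by brute force, which is tractable only for toy moduli, but it shows how an even period r with a^(r/2) not congruent to -1 (mod N) yields a factor through a greatest common divisor.

```python
# Sketch of the number-theoretic reduction Shor's algorithm relies on:
# factoring N reduces to finding the period (order) r of a^x mod N. A quantum
# computer finds r efficiently via the quantum Fourier transform; here the
# period is found by brute force, which only works for tiny N.
from math import gcd
from random import randrange

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (the 'period' Shor's algorithm extracts)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical(n: int) -> int:
    """Return a nontrivial factor of an odd composite n via the period-finding reduction."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d                      # lucky guess: a already shares a factor with n
        r = order(a, n)
        if r % 2:                         # need an even period
            continue
        x = pow(a, r // 2, n)
        if x == n - 1:                    # a**(r/2) == -1 (mod n) is a dead end
            continue
        return gcd(x - 1, n)              # (x-1)(x+1) == 0 (mod n) splits n

print(shor_classical(3233))  # 3233 = 53 * 61; prints 53 or 61
```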

The hardware requirements for a cryptographically relevant quantum computer were, until recently, so enormous they functioned as a practical barrier. Early estimates placed the physical qubit count needed to factor RSA-2048 at roughly a billion. By 2021, Gidney and Ekerå had driven that estimate down to approximately 20 million qubits operating for eight hours — still far beyond any system then in existence, but no longer astronomically so. Then, in the span of fewer than twelve months stretching from 2024 into 2025, three algorithmic developments collapsed the estimate by another order of magnitude.

The first was a restructuring of how modular exponentiation — the core computational operation in Shor’s algorithm — is actually performed. The conventional approach required quantum registers large enough to hold entire 2048-bit numbers simultaneously, driving qubit counts upward. Approximate modular arithmetic, developed by Chevignard, Fouque, and Schrottenloher, replaced this with a segmented approach that computes exponentiation in pieces using far smaller registers, tolerating controlled errors that can be corrected later. The quantum computer no longer needs to hold the entire problem in memory at once.

The second advance addressed the dominant cost bottleneck in fault-tolerant quantum computation: generating the specialized quantum resource states needed for gate operations that the error-correcting code cannot execute directly. Traditional magic state distillation required enormous overhead — thousands of physical qubits and dozens of error-correction cycles to produce a single high-fidelity state. Magic state cultivation, developed at Google Quantum AI, grows high-fidelity states from lower-quality ones with dramatically reduced overhead, cutting the associated qubit cost by orders of magnitude.

The third development, synthesized in a 2025 paper by Craig Gidney, combined both techniques and reduced the total Toffoli gate operations required from roughly two trillion to approximately 6.5 billion — a roughly three-hundredfold improvement in computational efficiency.

The combined result: factoring RSA-2048 now appears technically feasible with approximately one million physical qubits operating for roughly one week. The hardware gap between this requirement and existing systems remains real — today’s largest quantum processors hold on the order of a few hundred to just over a thousand physical qubits — but the trajectory of compression has changed qualitatively. What took twelve years to reduce from a billion to twenty million qubits took under a year to reduce from twenty million to under one million. That acceleration is the analytically important signal.
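The scale of these compressions follows directly from the figures quoted above; a quick arithmetic check:

```python
# Back-of-the-envelope arithmetic using only the resource figures cited in the
# text: physical-qubit estimates for factoring RSA-2048 and the Toffoli counts
# from the 2025 synthesis.
early_qubits    = 1_000_000_000   # early estimates
ge2021_qubits   = 20_000_000      # Gidney & Ekerå, 2021
gidney25_qubits = 1_000_000       # Gidney, 2025
toffoli_before, toffoli_after = 2_000_000_000_000, 6_500_000_000

print(f"billion -> 20 million qubits:   {early_qubits / ge2021_qubits:.0f}x over roughly a decade")
print(f"20 million -> 1 million qubits: {ge2021_qubits / gidney25_qubits:.0f}x in under a year")
print(f"Toffoli gate operations:        {toffoli_before / toffoli_after:.0f}x fewer")
```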

Parallel hardware developments have reinforced this trajectory. Google’s Willow chip, demonstrated in late 2024, provided the first experimental confirmation that quantum error correction can suppress noise below the surface code threshold — the physical proof that the noise assumptions underlying all resource estimates are actually achievable. IBM’s published roadmap projects the first large-scale fault-tolerant quantum computer carrying approximately 200 logical qubits by 2029. Multiple independent platforms have demonstrated two-qubit gate fidelities at or above 99.9%. The gap between theoretical resource requirements and demonstrated hardware capability, once five or more orders of magnitude wide, has compressed to roughly three.

This compression gives material urgency to a threat that was previously treated as comfortably distant: harvest now, decrypt later. Nation-state and sophisticated non-state actors that have been collecting encrypted network traffic for years — a practice that intelligence assessments have long confirmed is ongoing — hold ciphertext that becomes readable the moment a cryptographically relevant quantum computer exists. The correct temporal frame for evaluating Q-Day risk is therefore not “when will quantum computers be built” but “how long will the data I encrypt today need to remain confidential.” Any organization protecting data with a secrecy horizon extending to the early 2030s has already entered the exposure window.
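One way to make that temporal frame concrete is the inequality often attributed to Michele Mosca: if the required secrecy lifetime of the data plus the time needed to migrate exceeds the time remaining until a cryptographically relevant quantum computer exists, the exposure has already begun. The sketch below runs that check; the specific years are illustrative assumptions, not figures from this article.

```python
# Illustrative exposure-window check (the years below are assumptions, not
# figures from the article). The framing is often called Mosca's inequality:
# if shelf_life + migration_time exceeds the time to Q-Day, data encrypted
# today will still need protection after the encryption can be broken.
current_year     = 2025
shelf_life_years = 10    # how long the data must stay confidential (assumed)
migration_years  = 5     # how long the migration will take (assumed)
q_day_year       = 2032  # assumed arrival of a cryptographically relevant machine

must_stay_secret_until = current_year + migration_years + shelf_life_years
if must_stay_secret_until > q_day_year:
    print(f"Exposed: data harvested before migration completes stays sensitive "
          f"until {must_stay_secret_until}, {must_stay_secret_until - q_day_year} "
          f"years past the assumed Q-Day.")
else:
    print("No exposure under these assumptions.")
```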

The cryptographic response to this threat has a name, a set of standards, and a compliance timeline. Post-quantum cryptography replaces the integer factorization and discrete logarithm problems underlying RSA and elliptic curve cryptography with mathematical structures believed to resist both classical and quantum attack. The primary family adopted by global standards bodies is lattice-based cryptography, which grounds its security in the hardness of the shortest vector problem and related geometric challenges in high-dimensional spaces. A quantum computer running Shor’s algorithm offers no meaningful speedup against these structures; the best known quantum attacks provide only marginal improvement over classical approaches. In August 2024, the National Institute of Standards and Technology finalized three post-quantum cryptographic standards: ML-KEM for key encapsulation, ML-DSA for digital signatures, and SLH-DSA as a hash-based backup. An additional algorithm, HQC, was selected in March 2025 as a code-based alternative to ML-KEM, providing algorithmic diversity in case lattice-based assumptions are eventually weakened.

The existence of standards does not resolve the migration problem. It begins it. Cryptographic transitions of this scope have historically required fifteen to twenty years for full infrastructure penetration — and this migration is structurally more complex than any predecessor. Previous transitions, such as the shift from DES to AES for symmetric encryption, involved updating software implementations of a single algorithm class. The post-quantum migration requires re-engineering public-key infrastructure at every layer: hardware security modules that store and manage keys must be replaced or upgraded; certificate authorities must issue new credential hierarchies; TLS implementations across billions of endpoints must be updated; protocols baked into embedded systems, industrial control infrastructure, and long-lived financial systems must be audited and replaced. Many of these systems lack the cryptographic agility — the design property that allows algorithms to be swapped without architectural changes — that would make migration straightforward.
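What cryptographic agility means as a design property is worth spelling out. The sketch below is an illustrative pattern rather than any particular library's API: the protocol layer depends on an abstract key-establishment interface, so replacing a classical algorithm with a post-quantum one (or a hybrid of both) becomes a configuration change rather than a re-architecture.

```python
# Minimal sketch of "cryptographic agility" as a design property. All class
# and method names here are illustrative placeholders, not a real library.
from abc import ABC, abstractmethod

class KeyEstablishment(ABC):
    """Anything that can produce a shared secret for the session layer."""
    @abstractmethod
    def establish(self, peer_public_material: bytes) -> bytes: ...

class ClassicalECDH(KeyEstablishment):
    def establish(self, peer_public_material: bytes) -> bytes:
        raise NotImplementedError("would wrap an existing ECDH implementation")

class MLKEMEncapsulation(KeyEstablishment):
    def establish(self, peer_public_material: bytes) -> bytes:
        raise NotImplementedError("would wrap an ML-KEM (FIPS 203) implementation")

class SecureChannel:
    """The protocol layer never names an algorithm; it is injected."""
    def __init__(self, kex: KeyEstablishment):
        self.kex = kex

    def handshake(self, peer_public_material: bytes) -> bytes:
        return self.kex.establish(peer_public_material)

# Migration becomes a one-line configuration change:
channel = SecureChannel(kex=MLKEMEncapsulation())   # was: ClassicalECDH()
```

A system built this way can also run both algorithms side by side during the transition, which is exactly the hybrid posture discussed below.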

The regulatory framework has responded with a compressed timeline that reflects the urgency of the hardware trajectory. NSA’s Commercial National Security Algorithm Suite 2.0 mandates that all new national security systems be quantum-safe by January 2027. NIST’s deprecation schedule calls for quantum-vulnerable algorithms to be removed from approved standards after 2035, with high-risk systems transitioning substantially earlier. The European Union’s Network and Information Systems Cooperation Group published a coordinated implementation roadmap in 2025. The US Quantum Computing Cybersecurity Preparedness Act requires federal agencies to inventory vulnerable systems and report migration progress annually. Apple’s iMessage, Zoom, Signal, Cloudflare, and Google Chrome have each begun deploying hybrid schemes that combine classical algorithms with post-quantum alternatives, providing a transitional security posture while full migration proceeds. The IBM Institute for Business Value’s 2025 quantum-safe readiness assessment found a global average score of 25 out of 100 — a quantitative measure of the gap between the regulatory deadline and institutional preparation.

The financial sector faces particular complexity. Global banking infrastructure routes transactions through cryptographic endpoints numbering in the tens of thousands, with deep dependencies on hardware security modules, payment protocol stacks, and regulatory compliance frameworks that were not designed with algorithm agility in mind. The regulatory mandate for PQC migration collides with change-management processes that typically operate on multi-year timescales even for minor protocol updates. Banco Sabadell undertook a four-month pilot project to assess PQC adoption complexity; the results illustrated that even institutions with significant technical capacity face non-trivial integration challenges across legacy systems that cannot be simply patched.

The practical advice that emerges from this technical landscape is not panic — it is phased, prioritized action. Cryptographic inventory is the prerequisite: organizations cannot migrate what they cannot locate, and most large institutions have cryptographic dependencies embedded in systems that predate modern documentation practices. Once inventoried, systems handling data with long secrecy horizons — classified government communications, long-term financial records, health data, intellectual property — must be prioritized for early migration even before the regulatory deadline forces it. Hybrid cryptographic deployments, combining ML-KEM with classical key exchange algorithms in parallel, offer a practical bridge: data protected by a hybrid scheme requires an adversary to break both the classical and post-quantum components simultaneously, substantially raising the cost of any harvest now, decrypt later attack.
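A minimal sketch of the hybrid idea, with placeholder byte strings standing in for real X25519 and ML-KEM shared secrets: both secrets feed a single key-derivation step, so the resulting session key cannot be recovered unless both components are eventually broken.

```python
# Sketch of a hybrid key-exchange combiner. The two shared-secret values are
# placeholders for real X25519 and ML-KEM outputs; the point is the combiner:
# the session key depends on BOTH secrets, so a "harvest now, decrypt later"
# adversary must break the classical AND the post-quantum component.
import hashlib
import hmac

def hkdf_sha256(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) extract-and-expand using SHA-256."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def derive_hybrid_session_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    """Concatenate both shared secrets and run them through one KDF."""
    return hkdf_sha256(salt=b"hybrid-kex-demo", ikm=classical_secret + pq_secret,
                       info=b"session-key")

classical_ss = b"\x01" * 32   # placeholder for an X25519 shared secret
pq_ss        = b"\x02" * 32   # placeholder for an ML-KEM shared secret
print(derive_hybrid_session_key(classical_ss, pq_ss).hex())
```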

What the algorithmic developments of 2024 and 2025 have fundamentally altered is the uncertainty distribution around Q-Day. The previous consensus placed cryptographically relevant quantum computation comfortably in the 2030s, with significant error bars extending toward the 2040s. The compression of resource estimates to under one million qubits, combined with IBM’s roadmap targeting fault-tolerant systems by 2029 and Google’s experimental confirmation of below-threshold error correction, has shifted credible estimates meaningfully earlier and narrowed the uncertainty range. The question of whether Q-Day arrives in 2030 or 2033 matters less than the question of whether the entities responsible for global cryptographic infrastructure treat the migration deadline as a hard constraint or a soft aspiration.

The transition to post-quantum cryptography does not end with the deployment of lattice-based algorithms. It creates a new cryptographic surface whose long-term security depends on assumptions about the hardness of geometric problems in high-dimensional spaces — assumptions that have so far resisted decades of classical cryptanalysis but have not been tested by the quantum computers that will eventually exist at scale. The field has responded with algorithmic diversity: the inclusion of hash-based and code-based alternatives alongside lattice schemes is precisely a hedge against the possibility that any single mathematical assumption eventually yields. What the present moment requires is not certainty about when quantum computation will mature, but a clear-eyed assessment of what it means to build an institution whose security posture is still founded on the assumption that reversing a prime factorization is hard. That assumption has an expiration date. The responsible question is not whether to prepare, but whether preparation is already overdue.
