This is a thought-provoking new paper out from Google Research about what it would take to actually do RSA factorization with a quantum computer. Shor's initial algorithm was a groundbreaking result that reduced the runtime of factoring to polynomial time on a quantum computer. However, the original paper didn't try to tune the bounds to understand how practical an implementation would be. When the paper first came out in 1994, quantum computers were much further from reality than they are today. In addition, the lack of experimental data meant that it wasn't possible to closely model the actual computation needed.
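For context, the quantum part of Shor's algorithm only finds the multiplicative order r of a random base a mod N; the factor is then extracted by purely classical post-processing. Here's a minimal sketch of that classical step (the function name is my own, not from the paper):

```python
from math import gcd

def factor_from_order(N, a, r):
    """Classical post-processing in Shor's algorithm: given the
    multiplicative order r of a mod N (the part the quantum computer
    finds efficiently), try to extract a nontrivial factor of N."""
    if r % 2 != 0:
        return None  # need an even order; retry with a different base a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None  # a^(r/2) ≡ -1 (mod N) yields only trivial factors
    f = gcd(x - 1, N)
    return f if 1 < f < N else None

# Toy example: 7 has order 4 mod 15 (7^4 = 2401 ≡ 1 mod 15).
# 7^2 ≡ 4, and gcd(4 - 1, 15) = 3, a nontrivial factor of 15.
print(factor_from_order(15, 7, 4))  # → 3
```

The hard part, of course, is the quantum order-finding, which is exactly what the paper's cost estimates are about.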
The last two decades have seen a lot of progress on these fronts. Google recently presented Bristlecone, a 72-qubit superconducting processor, and it's now possible to project from Google's research roadmap what the error rates for realistic superconducting qubits will look like. The current work does all its calculations assuming a planar grid of qubits, where each qubit is connected to its four nearest neighbors, a very reasonable design for practical engineering.
In addition, considerable refinement has occurred on the core algorithms. The most expensive part of Shor’s algorithm is the modular exponentiation operation. A number of quantum algorithmic tricks have emerged that allow for additional speedups. “Windowed arithmetic” uses lookup tables cleverly to reduce the gate complexity, “coset representations” allow for integers mod N to be added more efficiently, and “oblivious carry runways” allow for carry operations to be handled efficiently.
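To give a flavor of the windowed-arithmetic idea, here's a classical analogue: fixed-window modular exponentiation, which precomputes a lookup table of small powers and then consumes the exponent several bits at a time. The quantum version plays a similar trick with table lookups to reduce the number of expensive controlled multiplications (the code below is my illustrative sketch, not the paper's circuit):

```python
def windowed_modexp(base, exponent, modulus, w=4):
    """Classical sketch of windowed arithmetic: precompute base^k mod
    modulus for every w-bit value k, then process the exponent w bits
    at a time instead of one bit at a time, trading table space for
    fewer multiplications."""
    table = [pow(base, k, modulus) for k in range(1 << w)]  # 2^w entries
    result = 1
    # Walk the exponent's bits from most significant window to least.
    for i in reversed(range(0, exponent.bit_length(), w)):
        window = (exponent >> i) & ((1 << w) - 1)  # next w exponent bits
        result = pow(result, 1 << w, modulus)      # shift: square w times
        result = (result * table[window]) % modulus
    return result
```

The same space-for-gates trade-off, done reversibly, is what cuts the gate counts in the paper's circuits.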
With these optimizations in place, it's possible to use reasonable assumptions from current superconducting qubit implementations to produce good estimates of the complexity of building this circuit in practice. I'm not an expert in quantum architecture, but the layouts presented in the paper seem to hew closely to real designs, so I'm inclined to take the estimates seriously. There's also an in-depth analysis of how hard it would be to crack various cryptographic schemes at different key lengths. The numbers vary with the key length, but the threat appears serious in all cases.
The paper also includes a lengthy section on potential future improvements. The really interesting bit is that the science of practical quantum implementation is clearly just beginning to take form. It's starting to seem likely that there's scope for considerable improvement on all the numbers presented in the work.
What does this mean for those of us in crypto? It's worth remembering that both Bitcoin and Ethereum use ECDSA signatures, which would be vulnerable to the style of attack presented in this paper. I believe that both BLS and Schnorr signatures (crucial to planned upgrades for Ethereum and Bitcoin, respectively) are also quantum-vulnerable. Some early research is underway on post-quantum crypto and public blockchains (notably on STARKs), but it seems like accelerating this work would be a smart move.