Expectations about when quantum computers capable of posing a real threat to existing cryptography will arrive are often exaggerated, leading to calls for immediate, large-scale migration to post-quantum cryptography.
Article author: Justin Thaler, a16z research partner
Source: GaryMa 吴说区块链
Original link:
https://a16zcrypto.com/posts/article/quantum-computing-misconceptions-realities-blockchains-planning-migrations/
However, these calls often overlook the costs and risks of premature migration, as well as the completely different risk profiles faced by different cryptographic primitives:
Post-quantum encryption must be deployed immediately, even if costly: 'harvest now, decrypt later' (HNDL) attacks are already occurring, because when quantum computers truly arrive, even if that is decades away, the sensitive data protected by encryption today will still have value. Although post-quantum encryption carries performance overhead and implementation risks, for data that needs long-term confidentiality, HNDL attacks mean there is no alternative.
The considerations for post-quantum signatures are entirely different. They are not affected by HNDL attacks, and their costs and risks (larger size, performance overhead, immature implementation, and potential vulnerabilities) mean that migration should be approached cautiously, rather than implemented immediately.
These distinctions matter a great deal. Misunderstanding them distorts cost-benefit analyses and leads teams to overlook more pressing security risks, such as implementation vulnerabilities.
The true challenge of successfully transitioning to post-quantum cryptography is matching 'urgency' with 'real threat'. Below, I will clarify common misconceptions about quantum threats and their implications for cryptography — including encryption, signatures, and zero-knowledge proofs — and particularly focus on the impact of these issues on blockchain.
What stage are we currently at?
The likelihood of 'quantum computers posing a real threat to cryptography (CRQC)' appearing in the 2020s is extremely low, despite some high-profile claims that have drawn attention.
Note: a quantum computer posing a real threat to cryptography, i.e. a cryptographically relevant quantum computer, will henceforth be referred to as a CRQC.
The term 'quantum computer posing a real threat to cryptography' refers to a fault-tolerant, error-corrected quantum computer capable of running Shor's algorithm at sufficient scale to attack elliptic curve cryptography or RSA (for example, breaking secp256k1 or RSA-2048 within a reasonable timeframe of up to one month of continuous computation).
Based on publicly available milestones and resource assessments, we are still far from such a quantum computer. Although some companies claim that CRQC is likely to appear before 2030 or even before 2035, the visible public progress does not support these claims.
For context, none of today's quantum computing platforms in any architecture (ion traps, superconducting qubits, or neutral atom systems) is anywhere close to running Shor's algorithm against RSA-2048 or secp256k1, which would require hundreds of thousands to millions of physical qubits (the exact number depends on error rates and the error correction scheme).
The limiting factors are not just the number of qubits, but also gate fidelity, qubit connectivity, and the sustainable depth of error-correcting circuits necessary to execute deep quantum algorithms. While some systems now exceed 1,000 physical qubits, looking solely at quantity can be misleading: these systems lack the connectivity and gate fidelity required to perform cryptographically relevant computations.
Recent systems are approaching the physical error levels where quantum error correction becomes feasible, but no one has yet demonstrated more than a handful of logical qubits with sustainable error-correcting circuit depth — let alone the thousands of high-fidelity, deep circuit, fault-tolerant logical qubits needed to run Shor's algorithm. There remains a significant gap between theoretically proving quantum error correction is feasible and actually achieving the scale necessary for cryptographic breaking.
In short: unless the number and fidelity of qubits increase by several orders of magnitude simultaneously, 'quantum computers posing a real threat to cryptography' remain out of reach.
However, corporate press releases and media reports can easily lead to misunderstandings. Common misconceptions include:
Claims of achieving 'quantum advantage' often target artificially constructed problems. These problems are not chosen for their practicality, but because they can be run on existing hardware while seemingly demonstrating significant quantum speedup — a point that is often deliberately downplayed in the promotions.
Companies claim to have achieved thousands of physical qubits. But this usually refers to quantum annealers, not the gate-model quantum computers needed to run Shor's algorithm to attack public-key cryptography.
Companies use the concept of 'logical qubits' loosely. Physical qubits are inherently noisy, while quantum algorithms require logical qubits; as noted above, Shor's algorithm requires thousands of logical qubits. With quantum error correction, one logical qubit typically must be built from hundreds to thousands of physical qubits, depending on the error rate (a rough sketch below makes this arithmetic concrete). Some companies have stretched the term to an absurd extent. One recently claimed 48 logical qubits while using just two physical qubits per logical qubit, at code distance 2. This is plainly unreasonable: a distance-2 code can only detect errors, not correct them. A genuinely fault-tolerant logical qubit requires hundreds to thousands of physical qubits, not two.
More generally, many quantum computing roadmaps use 'logical qubits' to mean qubits that support only Clifford operations. Clifford circuits can be simulated efficiently on classical computers, so such qubits cannot run Shor's algorithm, which requires thousands of error-corrected T gates (or, more generally, non-Clifford gates).
Therefore, even if a roadmap claims 'thousands of logical qubits will be achieved in a certain year X', it does not mean the company expects to run Shor's algorithm to break classical cryptography in the same year X.
These practices severely distort the public's (including industry professionals') perception of 'how close we are to a true CRQC'.
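To make the physical-versus-logical overhead concrete, here is a back-of-the-envelope sketch in Python. The surface-code formula (2d² − 1 physical qubits per logical qubit at code distance d) is a standard textbook approximation; the specific distance and logical-qubit count below are illustrative assumptions, not figures from any vendor roadmap, and the estimate ignores the large additional circuitry that T gates require.

```python
def surface_code_physical_qubits(d: int) -> int:
    # Standard (rotated) surface code: d*d data qubits plus
    # d*d - 1 syndrome-measurement qubits per logical qubit.
    return 2 * d * d - 1

# Illustrative assumptions, not any vendor's roadmap: a few thousand
# logical qubits for cryptographically relevant Shor, each protected
# at code distance d = 27.
logical_qubits = 3000
per_logical = surface_code_physical_qubits(27)  # 1,457

print(f"{per_logical:,} physical qubits per logical qubit")
print(f"~{logical_qubits * per_logical / 1e6:.1f}M physical qubits total")
# ~4.4M physical qubits -- and this ignores the magic-state factories
# needed for T gates, so published resource estimates run higher.
# Compare with the ~1,000 physical qubits of today's largest systems.
```

Even under these rough assumptions, the gap between announced hardware and a CRQC is several orders of magnitude, which is the point of the paragraphs above.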
Nevertheless, some experts are indeed excited about the progress. For instance, Scott Aaronson recently wrote that given the 'stunningly rapid development of current hardware,' he now believes that a fault-tolerant quantum computer running Shor's algorithm before the next U.S. presidential election is a real possibility.
But Aaronson later clarified that this does not mean a cryptographically relevant quantum computer: even a fully fault-tolerant run of Shor's algorithm that merely factors 15 = 3×5, a number you could factor faster with pencil and paper, would satisfy his prediction. The bar here is still a miniature execution of Shor's algorithm, not one at cryptographically relevant scale; previous quantum factorizations of 15 used simplified circuits rather than the full fault-tolerant algorithm. Nor is it an accident that quantum experiments keep choosing to factor 15: arithmetic modulo 15 is exceptionally simple, while factoring even slightly larger numbers (like 21) is much harder. Claims of quantum experiments factoring 21 have thus tended to rely on hints or shortcuts.
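To see why factoring 15 is such a low bar, here is the classical skeleton of Shor's algorithm in Python: only the order-finding step (brute-forced here) is what a quantum computer accelerates, and for N = 15 every valid base has order 2 or 4. This is a minimal illustrative sketch, not a quantum implementation.

```python
from math import gcd

def shor_classical(N: int, a: int):
    # Order finding: the ONLY step a quantum computer speeds up.
    # Brute force works for tiny N like 15. Assumes gcd(a, N) == 1.
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    # Classical post-processing: if r is even and a^(r/2) != -1 (mod N),
    # then gcd(a^(r/2) +- 1, N) yields nontrivial factors of N.
    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        x = pow(a, r // 2, N)
        return r, gcd(x - 1, N), gcd(x + 1, N)
    return r, None, None

print(shor_classical(15, 7))  # (4, 3, 5): order 4, factors 3 and 5
```

Running the same skeleton against RSA-2048 would require order finding over a 2048-bit modulus, and that single step is what demands a fault-tolerant machine.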
In short, there is no public progress supporting the expectation that a quantum computer capable of breaking RSA-2048 or secp256k1 will appear within the next five years (which is what cryptography actually cares about).
Even ten years is still an aggressive prediction. Considering how far we are from truly cryptographically relevant quantum computers, it is entirely possible to maintain excitement about progress while coexisting with a timeline of over ten years.
So what does the U.S. government's target year of 2035 for the overall migration of government systems to post-quantum cryptography mean? I believe this is a reasonable timeline for completing such a large-scale migration. However, this does not imply an expectation of 'CRQC will emerge by then.'
In which scenarios do HNDL attacks apply (and in which do they not)?
'Harvest now, decrypt later' (HNDL) attacks work as follows: attackers store encrypted communications today and wait for the day a CRQC appears to decrypt them. It is all but certain that nation-state attackers have been archiving U.S. government encrypted communications at scale for exactly this purpose. This is why encryption systems must begin migrating today, at least for entities whose data must remain confidential for 10–50 years or more.
However, digital signatures — the technology all blockchains rely on — are different from encryption: they do not possess a 'confidentiality' that can be attacked afterward.
In other words, once quantum computers truly arrive they will make forging digital signatures possible from that moment onward, but past signatures do not 'hide' a secret the way encrypted messages do. Any digital signature that can be confirmed to have been generated before CRQCs existed cannot have been forged.
Therefore, migration to post-quantum digital signatures is less urgent than migration to post-quantum encryption.
Actions by major platforms also reflect this: Chrome and Cloudflare have deployed hybrid X25519+ML-KEM in web transport layer security (TLS) encryption. [In this article, I refer to these as 'encryption schemes' for readability, although technically TLS and other secure communication protocols use key exchange or key encapsulation mechanisms rather than public-key encryption.]
The 'hybrid' here means simultaneously employing a post-quantum secure scheme (ML-KEM) alongside an existing scheme (X25519), thereby obtaining security from both. This method aims to prevent HNDL attacks through ML-KEM, while X25519 provides traditional security assurances in case ML-KEM is proven insecure against current computers.
Apple's iMessage has also deployed similar hybrid post-quantum encryption in its PQ3 protocol, and Signal has implemented this mechanism in its PQXDH and SPQR protocols.
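A minimal sketch of the hybrid idea, in Python using the `cryptography` package for the classical X25519 half. The ML-KEM half is a placeholder only (a real deployment would run actual ML-KEM encapsulation); the point is the combiner: both shared secrets feed one key derivation function, so the session key stays safe if either scheme holds.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: a real X25519 key exchange.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()
ss_classical = alice.exchange(bob.public_key())

# Post-quantum half: PLACEHOLDER ONLY. A real deployment runs ML-KEM
# encapsulation/decapsulation here; we just simulate the resulting
# 32-byte shared secret.
ss_pq = os.urandom(32)

# The combiner: derive the session key from BOTH secrets, so an
# attacker must break X25519 AND ML-KEM to recover it.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-kex-demo",
).derive(ss_classical + ss_pq)
```

This combiner structure is why hybrids both defend against HNDL today and hedge against ML-KEM turning out to be classically weak tomorrow.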
In contrast, the migration of critical web infrastructure to post-quantum digital signatures will be postponed until 'truly approaching the emergence of CRQC' because current post-quantum signature schemes bring significant performance degradation (which will be discussed later in this article).
zkSNARKs — zero-knowledge, succinct, non-interactive proofs — which are core to the scalability and privacy of future blockchains — are similar to digital signatures regarding quantum threats. The reason is that even if some zkSNARKs do not possess post-quantum security (because they use the same elliptic curve cryptography as current encryption and signatures), their 'zero-knowledge' nature remains post-quantum secure.
The zero-knowledge nature guarantees that the proof will not leak any information about the secret witness — even in the face of quantum attackers — thus there is no confidential data that can be 'collected' in advance and decrypted later.
Thus, zkSNARKs are not affected by HNDL attacks. Just as today's non-post-quantum digital signatures are secure, as long as the zkSNARK proof is generated before the emergence of CRQC, it is trustworthy (i.e., the statement of the proof must be true) — even if the zkSNARK uses elliptic curve cryptography. Only after the emergence of CRQC could an attacker construct a 'seemingly valid but actually incorrect' proof.
What does this mean for blockchains?
Most blockchains are not exposed to HNDL attacks. Most non-privacy chains, such as today's Bitcoin and Ethereum, use non-post-quantum cryptography mainly for transaction authorization; that is, they use digital signatures, not encryption.
Again: digital signatures are not susceptible to HNDL. 'Harvest now, decrypt later' applies only to encrypted data. The Bitcoin ledger, for example, is already public; the quantum threat is signature forgery (deriving private keys to steal funds), not decrypting transaction data that was never secret. HNDL therefore creates no immediate cryptographic urgency for these chains.
Unfortunately, some trusted institutions (including the U.S. Federal Reserve) still incorrectly claim that Bitcoin is susceptible to HNDL attacks in their analyses, a mistake that exaggerates the urgency of migrating to post-quantum cryptography.
However, 'reduced urgency' does not mean Bitcoin can wait indefinitely: due to the immense social coordination required for protocol upgrades, Bitcoin faces different time pressures. (The unique challenges of Bitcoin will be discussed in more detail below.)
The main exception today is privacy chains, many of which conceal recipients and amounts through encryption or related techniques. That confidential information can be harvested now and, once quantum computers can break elliptic curve cryptography, retroactively de-anonymized.
How severe the attack is depends on the chain's design. Monero's ring signatures and key images (a unique linkable tag on each output that prevents double spending) are both elliptic curve-based, so breaking the underlying curve would suffice to reconstruct the entire transaction graph from the public ledger alone. On other privacy chains the damage may be more contained; see the discussions by Zcash cryptographic engineer and researcher Sean Bowe for details.
If users consider it very important that 'transactions will not be exposed due to the emergence of quantum computers in the future', privacy chains should migrate to post-quantum primitives (or hybrid schemes) as soon as possible. Alternatively, they should adopt architectures that do not place decipherable secrets on-chain.
The unique challenge for Bitcoin: governance mechanism + abandoned coins
For Bitcoin, two real factors make the transition to post-quantum digital signatures urgent, and these two factors are entirely unrelated to quantum technology itself. The first concern is governance speed: Bitcoin's evolution is extremely slow. Any contentious issue, as long as the community cannot reach a consensus on an appropriate solution, could trigger a destructive hard fork.
The second concern is that the transition of Bitcoin to post-quantum signatures cannot be completed through passive migration: the holders of the coins must actively migrate their funds. This means that coins that have been abandoned but are still exposed to quantum threats cannot be protected. Some estimates suggest that the number of quantum-vulnerable and possibly abandoned BTC could be in the millions, valued at hundreds of billions of dollars at current prices (as of December 2025).
Even so, quantum threats are unlikely to cause a sudden, catastrophic overnight collapse of Bitcoin; a selective, gradually unfolding attack is far more likely. Quantum computers will not break every key at once: Shor's algorithm must attack public keys one target at a time, and early quantum attacks will be extremely expensive and slow. Once a quantum computer can break a single Bitcoin signing key, attackers will therefore prioritize the highest-value wallets.
Moreover, as long as users avoid address reuse and do not use Taproot addresses (which would directly expose public keys on-chain), they are essentially protected even if the protocol itself has not yet upgraded: their public keys remain hidden behind hash functions before spending. When they eventually broadcast a spending transaction, the public key becomes public, creating a brief 'real-time race window': honest users need to have their transactions confirmed quickly, while quantum attackers try to find the private key before the transaction is confirmed and spend the coins first. Thus, the truly vulnerable coins are those whose public keys have been exposed for many years: early P2PK outputs, reused addresses, and Taproot holdings.
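A small sketch of why unspent, non-reused P2PKH outputs are shielded: the chain stores only a hash of the public key until the coins are spent. This is standard Bitcoin hashing shown in Python; the example key bytes are made up.

```python
import hashlib

def hash160(pubkey: bytes) -> bytes:
    # P2PKH outputs lock coins to RIPEMD160(SHA256(pubkey)).
    # Shor's algorithm needs the public key itself; a hash of it
    # gives a quantum attacker nothing to work on.
    sha = hashlib.sha256(pubkey).digest()
    # Note: some OpenSSL builds ship ripemd160 only in the legacy
    # provider, so this call can fail on those systems.
    return hashlib.new("ripemd160", sha).digest()

pubkey = bytes.fromhex("02" + "11" * 32)  # made-up compressed key
print(hash160(pubkey).hex())
# The real public key appears on-chain only in the spending
# transaction -- opening the brief race window described above.
```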
For vulnerable coins that have already been abandoned, there is no easy solution. The options include:
The Bitcoin community agreeing on a 'flag day' after which all un-migrated coins are treated as destroyed.
Letting all abandoned, quantum-vulnerable coins be seized by whoever first possesses a CRQC.
The second option raises serious legal and security problems. Using a quantum computer to seize funds without the private keys, even under a claim of rightful ownership or good faith, would implicate theft and computer-fraud laws in many jurisdictions.
Moreover, 'abandoned' is itself an inference from inactivity; no one can know for certain that a coin's keyholder is truly gone. And even someone who can prove they once held certain coins may lack the legitimate authority to break cryptographic protections to 'recover' them. This legal ambiguity makes it likely that abandoned, quantum-vulnerable coins would end up in the hands of attackers who ignore legal constraints.
Another unique issue for Bitcoin is its extremely low transaction throughput. Even if the migration plan is ultimately finalized, moving all funds exposed to quantum threats to post-quantum secure addresses will still take months at Bitcoin's current transaction rate.
These challenges mean that Bitcoin must start planning for post-quantum migration now — not because CRQC is likely to appear before 2030, but because coordinating governance, reaching consensus, and the technical logistics of migrating billions of dollars worth of funds will take years to complete.
The quantum threat facing Bitcoin is real, but the time pressure comes from Bitcoin's own structural constraints, not the approach of quantum computers. Other blockchains also face issues of quantum vulnerable funds, but Bitcoin is particularly unique: the earliest transactions used pay-to-public-key (P2PK) outputs, directly exposing public keys on-chain, leaving a considerable proportion of BTC exposed to quantum threats. Its technical history, combined with a long chain age, high value concentration, low throughput, and rigid governance, exacerbates the problem.
It is important to note that the vulnerabilities mentioned above only apply to the cryptographic security of Bitcoin digital signatures — they do not involve the economic security of the Bitcoin blockchain. The economic security of Bitcoin comes from its proof-of-work (PoW) consensus mechanism, which is not as easily susceptible to quantum attacks as signature schemes, for three reasons:
PoW relies on hash functions, so it is exposed only to the quadratic speedup of Grover's search algorithm, not the exponential speedup Shor's algorithm delivers against public-key schemes.
The practical cost of running Grover's search is immense, making it extremely unlikely that any quantum computer will gain even a limited practical advantage on Bitcoin's PoW (see the arithmetic sketch after this list).
Even if quantum computers can achieve significant acceleration, their effect will only give large miners with quantum computing power a relative advantage, and will not fundamentally undermine the economic security model of Bitcoin.
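A quick arithmetic sketch of the first two points. The 80-bit difficulty figure is an assumed illustration, not Bitcoin's actual current difficulty.

```python
import math

def classical_evals(k: int) -> float:
    # Expected classical hash evaluations to find one k-bit PoW solution.
    return 2.0 ** (k - 1)

def grover_iterations(k: int) -> float:
    # Grover needs ~ (pi/4) * sqrt(2^k) oracle calls: a quadratic,
    # not exponential, speedup.
    return (math.pi / 4) * 2.0 ** (k / 2)

k = 80  # assumed illustrative difficulty
print(f"classical: ~2^{math.log2(classical_evals(k)):.0f} evaluations")
print(f"grover:    ~2^{math.log2(grover_iterations(k)):.1f} iterations")
# classical: ~2^79, grover: ~2^39.7 -- but each Grover iteration is a
# deep, error-corrected quantum circuit that must run serially, while
# classical miners parallelize across billions of ASIC cores. The
# quadratic gain largely evaporates in practice.
```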
Costs and risks of post-quantum signatures
To understand why blockchains should not rush to deploy post-quantum signatures, we need to weigh both the performance costs and our still-evolving confidence in post-quantum security.
Most post-quantum cryptography is based on one of the following five categories of methods: hashing, codes (error-correcting codes), lattices, multivariate quadratic equations (MQ), and isogenies.
Why five different approaches? Because the security of any post-quantum primitive rests on the assumption that quantum computers cannot efficiently solve some specific mathematical problem, and the more structure that problem has, the more efficient the cryptographic protocols we can build on it.
But that is a double-edged sword: more structure also means a larger attack surface and schemes that are easier to break. This creates a fundamental tension: stronger assumptions buy better performance at the cost of a higher chance that the assumption turns out to be wrong.
From a security standpoint, hash-based methods are the most conservative: we are most confident that quantum computers cannot attack them efficiently. Their performance is also the worst. NIST's standardized hash-based signature scheme (SLH-DSA, formerly SPHINCS+) produces 7–8 KB signatures even at its smallest parameter setting, whereas today's elliptic curve signatures are 64 bytes, roughly 100 times smaller.
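Signature sizes of this order fall directly out of the basic construction. Below is a minimal Lamport one-time signature in Python: the simplest hash-based scheme, not the standardized SLH-DSA, but it shows where multi-kilobyte signatures come from. (Lamport keys are strictly one-time; real schemes layer Merkle trees and few-time signatures on top.)

```python
import hashlib, os

def keygen():
    # One-time key: 256 pairs of random 32-byte secrets; the public
    # key is the hash of each secret.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest())
          for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes):
    d = hashlib.sha256(msg).digest()
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one secret per digest bit: 256 * 32 bytes = 8 KB.
    return [sk[i][b] for i, b in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(hashlib.sha256(sig[i]).digest() == pk[i][b]
               for i, b in enumerate(msg_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"hello")
assert verify(pk, b"hello", sig)
print(sum(len(s) for s in sig))  # 8192 bytes, vs 64 for an ECDSA sig
```

The signature reveals 256 of the 512 secrets, i.e. 8 KB, right in the range of the standardized scheme's 7–8 KB.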
Lattice schemes are the focus of current deployment. The encryption scheme NIST standardized first (ML-KEM), along with two of its three standardized signature algorithms, is lattice-based. One lattice signature scheme, ML-DSA (formerly Dilithium), produces 2.4 KB signatures at the 128-bit security level and 4.6 KB at the 256-bit level, roughly 40–70 times larger than current elliptic curve signatures. The other, Falcon, has smaller signatures (666 bytes for Falcon-512, 1.3 KB for Falcon-1024) but relies on intricate floating-point arithmetic; NIST has flagged it as a significant implementation challenge, and Thomas Pornin, one of Falcon's designers, called it 'the most complicated cryptographic algorithm I have implemented to date.'
Implementing lattice signatures securely is also far harder than implementing elliptic curve schemes. ML-DSA involves more sensitive intermediate values and complex rejection-sampling logic, all of which need protection against side-channel and fault attacks. Falcon adds the difficulty of constant-time floating-point arithmetic; several successful side-channel attacks recovering private keys from Falcon implementations have already been reported.
The risks posed by these issues are immediate, completely different from the distant threat of 'quantum computers that pose a real threat to cryptography'.
Maintaining caution regarding higher-performance post-quantum cryptographic schemes is well justified. Historically leading schemes, such as Rainbow (MQ-based signatures) and SIKE/SIDH (isogeny-based encryption), have been 'classically' broken — meaning they have been broken by today's computers rather than quantum computers.
Those breaks came after the NIST standardization process was already well underway. That reflects a healthy scientific process, but it also shows how premature standardization and deployment can backfire.
As noted earlier, internet infrastructure is taking a cautious approach to signature migration. This is worth dwelling on, because once an internet-wide cryptographic transition begins, it typically takes years to complete. Hash functions like MD5 and SHA-1 were formally deprecated by internet standards bodies years ago, yet their migration dragged on for many more years and is still not complete in some settings; and those algorithms are fully broken, not merely 'possibly breakable someday'.
Unique challenges of blockchain vs. internet infrastructure
Fortunately, blockchains maintained by open-source communities (like Ethereum and Solana) can upgrade faster than traditional internet infrastructure. On the other hand, internet infrastructure benefits from frequent key rotation, so its attack surface turns over faster than early quantum computers could keep up with; blockchains lack this property, since coins and their keys can stay exposed indefinitely. On balance, blockchains should still pace signature migration carefully, borrowing the internet's prudence: neither is exposed to HNDL-style attacks on signatures, and the costs and risks of prematurely migrating to immature post-quantum schemes remain significant regardless of key lifetimes.
Beyond that, several factors make premature migration especially risky and complex for blockchains. One is blockchains' distinctive requirements for signature schemes, above all the need to rapidly aggregate large numbers of signatures. BLS signatures are popular precisely for their efficient aggregation, but they are not post-quantum secure. Researchers are exploring SNARK-based post-quantum signature aggregation; progress is promising but still early.
For SNARKs themselves, the community currently focuses on hash-based post-quantum constructions. But a significant shift is coming: I am confident that over the next few months and years, lattice-based schemes will become a highly attractive alternative, offering better performance on several dimensions, such as shorter proofs, much as lattice signatures are shorter than hash-based ones.
The more serious issue today: implementation security
For years to come, implementation vulnerabilities will be a far more realistic and serious threat than quantum computers that genuinely threaten cryptography. For SNARKs, the primary concern is bugs.
Vulnerabilities are already a major challenge for digital signature and encryption implementations, and SNARKs are far more complex. In fact, a digital signature scheme can be viewed as a highly simplified zkSNARK proving the statement 'I know the private key corresponding to this public key, and I authorized this message.'
For post-quantum signatures, the current truly urgent risks also include implementation attacks, such as side-channel attacks and fault injection attacks. These types of attacks have extensive empirical evidence and can extract private keys from real systems. Their threats are far more pressing than 'distant future quantum attacks'.
The community will continue to identify and fix vulnerabilities in SNARK over the coming years and strengthen the implementations of post-quantum signatures to resist side-channel and fault injection attacks. Premature migration during the phase where post-quantum SNARK and signature aggregation schemes have not yet stabilized may risk locking the blockchain into suboptimal solutions — once better options emerge or current solutions reveal significant implementation vulnerabilities, it may have to migrate again.
What should we do? Seven recommendations
Based on the reality discussed above, I offer the following recommendations to different stakeholders — from developers to policymakers. The overall principle is: take quantum threats seriously, but do not act under the assumption that 'a quantum computer posing a real threat to cryptography will necessarily appear before 2030.' Current technological progress does not support this premise. However, there is still much we can, and should, begin to prepare for now:
1. Immediately deploy hybrid encryption
At least in scenarios where long-term confidentiality is important and performance costs are acceptable. Many browsers, CDNs, and messaging applications (like iMessage and Signal) have already deployed hybrid solutions. Hybrid schemes — post-quantum + classical cryptography — can both resist HNDL attacks and guard against potential weaknesses in the post-quantum schemes themselves.
2. Immediately use hash signatures in scenarios that can tolerate large signature sizes
Software and firmware updates and other low-frequency, size-insensitive scenarios should adopt hybrid hash-based signatures immediately. (The hybrid is to guard against implementation bugs in the new scheme, not because the hash security assumption is in doubt.) This is a conservative, prudent step that gives society a clear lifeboat should quantum computers arrive unexpectedly early. Without an already-deployed post-quantum software update mechanism, we would face a bootstrapping problem the moment a CRQC appears: there would be no safe way to distribute the very cryptographic updates needed to respond to it.
3. Blockchains do not need to hastily deploy post-quantum signatures — but should start planning now
Blockchain developers should learn from the web PKI's practice and advance post-quantum signature deployment deliberately. That gives post-quantum signature schemes time to mature in performance and in our understanding of their security, and gives developers time to redesign systems for larger signatures and to build better aggregation techniques. For Bitcoin and other layer-1 chains, the community needs to establish migration paths and policies for quantum-vulnerable and abandoned funds. Passive migration is impossible, so planning is essential. That Bitcoin's hardest challenges are mostly non-technical (slow governance, and a large stock of high-value, potentially abandoned, quantum-vulnerable addresses) only underscores that the community should start planning as early as possible.
Meanwhile, the research on post-quantum SNARKs and aggregatable signatures needs to continue to mature (which may take several years). Again, I emphasize that premature migration may lead to being locked into suboptimal solutions or having to migrate again upon discovering implementation vulnerabilities.
A note on Ethereum's account model: Ethereum supports two types of accounts that have different impacts on post-quantum migration: externally owned accounts (EOAs) controlled by secp256k1 private keys, and smart contract wallets with programmable authorization logic.
In non-urgent scenarios, when Ethereum adds support for post-quantum signatures, upgradable smart contract wallets can switch to post-quantum validation through contract upgrades — while EOAs may need to transfer assets to new post-quantum secure addresses (although Ethereum may also provide a dedicated migration mechanism for EOAs). In urgent quantum scenarios, Ethereum researchers have proposed hard fork solutions: freezing vulnerable accounts to allow users to recover assets using post-quantum secure SNARKs by proving they possess the mnemonic phrase. This mechanism is applicable to both EOAs and un-upgraded smart wallets.
The practical upshot for users: well-audited, upgradable smart wallets may offer a slightly smoother migration path, but the difference is modest, and it comes with trust trade-offs around wallet providers and upgrade governance. What matters more than account type is that the Ethereum community keeps advancing post-quantum primitives and contingency plans.
Broader design insight: many blockchains tightly couple account identities with specific cryptographic primitives — for example, both Bitcoin and Ethereum are bound to secp256k1, while other chains are bound to EdDSA. The difficulties of post-quantum migration highlight the value of decoupling account identities from specific signature schemes. The evolution of Ethereum towards smart accounts, along with trends in account abstraction on other chains, reflects this direction: allowing accounts to upgrade their authentication logic while retaining on-chain history and state. This will not make post-quantum migration simple, but it significantly enhances flexibility compared to fixing accounts to a single signature scheme. (This also enables other functionalities, such as payment delegation, social recovery, and multi-signatures.)
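To illustrate the decoupling, here is a hedged Python sketch of an account whose authorization logic is a swappable component rather than a hard-wired curve. This is a conceptual model only, not Ethereum's actual account-abstraction interface; all names are illustrative.

```python
from dataclasses import dataclass
from typing import Callable

# A verifier takes (auth_data, message, signature) and returns validity.
Verifier = Callable[[bytes, bytes, bytes], bool]

@dataclass
class Account:
    auth_data: bytes      # scheme-specific, e.g. a public key
    verifier: Verifier    # pluggable authorization logic

    def authorize(self, msg: bytes, sig: bytes) -> bool:
        return self.verifier(self.auth_data, msg, sig)

    def rotate_scheme(self, new_auth: bytes, new_verifier: Verifier,
                      approval_msg: bytes, approval_sig: bytes) -> None:
        # The rotation itself must be authorized under the CURRENT
        # scheme; afterwards the account accepts only the new one
        # (e.g. a post-quantum signature) while keeping its identity,
        # history, and state.
        if not self.authorize(approval_msg, approval_sig):
            raise PermissionError("rotation not authorized")
        self.auth_data, self.verifier = new_auth, new_verifier
```

The design choice is that account identity persists while `rotate_scheme` swaps the cryptography underneath it, which is exactly the flexibility a post-quantum migration needs.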
4. For privacy chains, migration should be prioritized as long as performance allows
These chains encrypt or hide transaction details, so user privacy is exposed to HNDL attacks today, though severity varies by design; chains whose full transaction graph could be reconstructed from the public ledger alone are at highest risk. They can adopt hybrid schemes (post-quantum plus classical) to hedge against the post-quantum scheme itself turning out to be classically insecure, or make structural changes that avoid putting decryptable secrets on-chain.
5. In the near term, prioritize implementation security over quantum mitigation
Especially for complex primitives like SNARK and post-quantum signatures, vulnerabilities and implementation attacks (side-channel, fault injection) will be far more realistic and urgent than CRQC for many years to come. Now is the time to invest in auditing, fuzz testing, formal verification, and multi-layer defenses — do not let concerns about quantum threats overshadow the more urgent real threat of vulnerabilities!
6. Support the development of quantum computing
From a national security perspective, we must continue to invest in research and development of quantum computing and talent training. If major rival countries achieve CRQC before the U.S., it will pose significant national security risks to the U.S. and the world.
7. Maintain the correct perspective on quantum computing-related announcements
As quantum hardware matures, many milestone announcements will arrive in the coming years. Ironically, their frequency is itself evidence of how far we remain from a CRQC: each milestone is one of many bridges on the way to the goal, and crossing each one triggers a wave of media attention and excitement. Treat press releases as progress reports to be evaluated critically, not as signals demanding immediate action.
Of course, unexpected breakthroughs may occur in the future, accelerating timelines; serious bottlenecks may also arise, slowing timelines.
I want to emphasize: I do not believe that the emergence of CRQC within five years is 'impossible'; it is just 'highly unlikely'. The above suggestions are robust against this uncertainty and can help us avoid more direct and realistic risks: vulnerabilities, hasty deployments, and various errors common in cryptographic migrations.
Justin Thaler is a research partner at a16z and an associate professor in the Department of Computer Science at Georgetown University. His research interests include verifiable computing, complexity theory, and algorithms for large-scale datasets.