The Quantum Computer Dream is Falling Apart
Based on Sabine Hossenfelder's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
High-precision classical calculations for the FeMo cofactor (FeMoco) undercut one of the best-known arguments for near-term quantum advantage in chemistry.
Briefing
Quantum computing’s promise is running into a double bind: researchers are making steady progress on the hardware and error correction, yet practical “quantum advantage” for real problems keeps slipping away. Recent results show that tasks once used to justify quantum machines can be matched—or even outperformed—by conventional computing, while broader analyses argue that the most celebrated quantum use cases have not delivered evidence of a purely quantum speedup, even for small problem sizes.
A concrete example involves simulating a biologically important molecule, the FeMo cofactor (FeMoco), which enables bacterial nitrogen fixation in soil. The standard pitch is that quantum computers could compute the molecule’s properties, specifically its ground state energy, because conventional computers struggle with the underlying quantum equations. But a Caltech team reported in a new preprint that a conventional computer cluster can calculate the ground state energy with “stunning precision.” The significance isn’t that quantum computers are useless; it’s that the supposed path to a near-term, defensible advantage is harder than expected.
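To see what “computing the ground state energy” means, here is a minimal sketch (not from the source, and the numbers are illustrative, nothing to do with FeMoco): the ground state energy is the smallest eigenvalue of the molecule’s Hamiltonian matrix. The catch for classical computers is that the matrix dimension grows exponentially with system size, which is why chemistry was pitched as a quantum target in the first place.

```python
import numpy as np

# Toy 2x2 Hamiltonian: two configurations with energies -1.0 and 1.0,
# coupled with strength 0.5. (Illustrative values only.)
H = np.array([[-1.0, 0.5],
              [ 0.5, 1.0]])

# eigvalsh returns the eigenvalues of a Hermitian matrix in ascending
# order, so the first entry is the ground state energy.
energies = np.linalg.eigvalsh(H)
ground_state_energy = energies[0]
print(ground_state_energy)  # -sqrt(5)/2 ≈ -1.1180
```

For a real molecule with n spin orbitals the matrix has dimension ~2^n, so exact diagonalization quickly becomes infeasible; the Caltech result used far more sophisticated approximate methods, not brute force like this.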
The skepticism extends beyond chemistry. The traveling salesman problem—often treated as a flagship benchmark for quantum approaches—has faced repeated attempts to recast it into a form where quantum hardware would outperform classical methods. A new paper reviews two decades of such efforts and concludes there’s little reason for optimism about purely quantum approaches delivering advantages. The authors argue for sticking with classical or hybrid classical-quantum systems instead, framing the key question as whether there’s evidence that fully quantum methods can solve even small traveling salesman instances better than hybrid strategies.
At the same time, the field’s technical trajectory is not collapsing. Error correction has been demonstrated on small circuits, qubit quality and precision have improved, and headlines have begun to use “transistor moment” language—an analogy to how microchips took off once transistors became cheaper and smaller. But the cost structure for quantum machines is different. Quantum systems rely on demanding cryogenic cooling and noise buffering, and those requirements don’t automatically get cheaper as more qubits are added.
There’s also an energy problem that complicates the “faster” narrative. A recent estimate by Olivier Ezratty suggests that once error-corrected quantum computers are large enough to do genuinely useful calculations, they could consume power on the scale of entire supercomputing clusters, potentially more depending on design. Even if quantum algorithms offer speedups for specific subroutines, achieving accurate results typically requires repeating computations many times, stretching runtimes and increasing the total energy and cost.
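The need for repetition follows from basic statistics rather than from the source directly: quantum measurement outcomes are random samples, so the standard error of an estimated quantity shrinks only as 1/√N with the number of runs. A quick sketch of that scaling (illustrative numbers, assuming unit variance):

```python
import math

def shots_needed(sigma, target_error):
    """Repetitions required so the standard error sigma/sqrt(N)
    falls to target_error: N = (sigma / target_error)^2."""
    return math.ceil((sigma / target_error) ** 2)

# Tightening the target precision by 10x costs 100x more runs.
print(shots_needed(1.0, 1e-2))  # 10000
print(shots_needed(1.0, 1e-3))  # 1000000
```

This quadratic cost of precision is one reason a per-run speedup does not automatically translate into a shorter, cheaper total computation.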
In short, quantum computing’s engineering progress is real, but the route to economical, demonstrable advantage is narrowing—especially when conventional clusters can already deliver high-precision results on some of the most persuasive targets.
Cornell Notes
Quantum computing is advancing on the engineering front—error correction has been tested on small circuits and qubit quality keeps improving—but the case for near-term, practical “quantum advantage” is weakening. A Caltech preprint reported highly precise ground-state energy calculations for the FeMo cofactor (FeMoco) using a conventional computer cluster, undercutting a common argument that quantum machines are required for such chemistry. Meanwhile, a new review of traveling salesman problem attempts finds little evidence that purely quantum approaches outperform hybrid classical-quantum methods, even for small instances. The remaining obstacles include high system costs (cryogenic cooling and noise buffering) and potentially massive energy use for large, error-corrected machines, plus the need to repeat computations to reach accuracy.
Why does the FeMoco example matter for the quantum advantage debate?
What does the traveling salesman problem have to do with quantum computing’s marketing—and what’s changing?
How can quantum computers be “faster” yet still not finish sooner overall?
What cost drivers make scaling quantum computers potentially expensive even if transistors get cheaper?
Why does energy consumption become a serious issue for large, error-corrected quantum machines?
Review Questions
- Which recent classical results weaken the argument that quantum computers are necessary for certain chemistry calculations?
- What conclusion does the traveling salesman review reach about purely quantum approaches versus hybrid methods?
- List two non-hardware reasons quantum advantage may be harder to achieve than expected (beyond “qubits are hard”).
Key Points
1. High-precision classical calculations for the FeMo cofactor (FeMoco) undercut one of the best-known arguments for near-term quantum advantage in chemistry.
2. A new review of traveling salesman problem attempts finds little evidence that fully quantum methods outperform hybrid classical-quantum approaches, even for small instances.
3. Hardware progress (error correction on small circuits and improving qubit quality) is real, but it doesn’t automatically translate into practical, defensible speedups.
4. Quantum cost scaling is dominated by cryogenic cooling and noise buffering, which may not become cheaper as qubit counts rise.
5. Energy use is a major constraint: large, error-corrected quantum systems could consume power on the scale of entire supercomputing clusters.
6. Quantum speedups for subroutines don’t guarantee shorter total runtimes because repeated runs are often needed to reach accuracy.