
The Quantum Computer Dream is Falling Apart

Sabine Hossenfelder · 5 min read

Based on Sabine Hossenfelder's video on YouTube. If you like this content, support the original creator by watching, liking, and subscribing.

TL;DR

High-precision classical calculations for the FeMo cofactor (FeMoco) undercut one of the best-known arguments for near-term quantum advantage in chemistry.

Briefing

Quantum computing’s promise is running into a double bind: researchers are making steady progress on the hardware and error correction, yet practical “quantum advantage” for real problems keeps slipping away. Recent results show that tasks once used to justify quantum machines can be matched—or even outperformed—by conventional computing, while broader analyses argue that the most celebrated quantum use cases have not delivered evidence of a purely quantum speedup, even for small problem sizes.

A concrete example involves simulating a biologically important molecule, the FeMo cofactor (FeMoco), which plays a central role in bacterial nitrogen fixation in soil. The standard pitch is that quantum computers could compute the molecule's properties, specifically its ground state energy, because conventional computers struggle with the underlying quantum equations. But a Caltech team reported in a new preprint that a conventional computer cluster can calculate the ground state energy with "stunning precision." The significance isn't that quantum computers are useless; it's that the supposed path to a near-term, defensible advantage is harder than expected.

The skepticism extends beyond chemistry. The traveling salesman problem—often treated as a flagship benchmark for quantum approaches—has faced repeated attempts to recast it into a form where quantum hardware would outperform classical methods. A new paper reviews two decades of such efforts and concludes there’s little reason for optimism about purely quantum approaches delivering advantages. The authors argue for sticking with classical or hybrid classical-quantum systems instead, framing the key question as whether there’s evidence that fully quantum methods can solve even small traveling salesman instances better than hybrid strategies.

At the same time, the field’s technical trajectory is not collapsing. Error correction has been demonstrated on small circuits, qubit quality and precision have improved, and headlines have begun to use “transistor moment” language—an analogy to how microchips took off once transistors became cheaper and smaller. But the cost structure for quantum machines is different. Quantum systems rely on demanding cryogenic cooling and noise buffering, and those requirements don’t automatically get cheaper as more qubits are added.

There's also an energy problem that complicates the "faster" narrative. A recent estimate by Olivier Ezratty suggests that once error-corrected quantum computers are large enough to do genuinely useful calculations, they could consume power on the scale of entire supercomputing clusters, and potentially more depending on the design. Even if quantum algorithms offer speedups for specific subroutines, achieving accurate results typically requires repeating computations many times, stretching runtimes and increasing the total energy and cost.

In short, quantum computing’s engineering progress is real, but the route to economical, demonstrable advantage is narrowing—especially when conventional clusters can already deliver high-precision results on some of the most persuasive targets.

Cornell Notes

Quantum computing is advancing on the engineering front (error correction has been tested on small circuits and qubit quality keeps improving), but the case for near-term, practical "quantum advantage" is weakening. A Caltech preprint reported highly precise ground-state energy calculations for the FeMo cofactor (FeMoco) using a conventional computer cluster, undercutting a common argument that quantum machines are required for such chemistry. Meanwhile, a new review of traveling salesman problem attempts finds little evidence that purely quantum approaches outperform hybrid classical-quantum methods, even for small instances. The remaining obstacles include high system costs (cryogenic cooling and noise buffering), potentially massive energy use for large, error-corrected machines, and the need to repeat computations to reach accuracy.

Why does the FeMo cofactor example matter for the quantum advantage debate?

The FeMo cofactor (FeMoco) is a biologically significant molecule central to bacterial nitrogen fixation in soil, with potential downstream benefits for fertilizer and food production. The usual justification for quantum computing is that conventional computers can't solve the molecule's quantum equations efficiently, so a quantum computer would be needed to compute properties like the ground state energy. However, a Caltech team reported in a new preprint that a conventional cluster can calculate the ground state energy with very high precision. That result doesn't prove quantum computers can't help, but it shows that some high-profile targets may be reachable with classical resources, making it harder to claim a clear, near-term quantum edge.
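
To make that target concrete, here is a minimal sketch of what a ground state energy calculation means mathematically: it is the smallest eigenvalue of the system's Hamiltonian. The two-spin model below is an invented toy, not anything resembling FeMoco or the methods used in the Caltech preprint.

```python
import numpy as np

# Toy illustration of "computing a ground state energy": build a small
# model Hamiltonian and find its smallest eigenvalue. This two-spin
# example is purely pedagogical; the real FeMo cofactor involves many
# strongly correlated electrons, so its state space is astronomically
# larger and cannot be diagonalized directly like this.

X = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli X
Z = np.array([[1.0, 0.0], [0.0, -1.0]])  # Pauli Z

# A simple two-spin coupling Hamiltonian: H = X (x) X + Z (x) Z
H = np.kron(X, X) + np.kron(Z, Z)

# The ground state energy is the lowest eigenvalue of H.
energies = np.linalg.eigvalsh(H)  # eigenvalues in ascending order
print(f"ground state energy: {energies[0]:+.6f}")  # -2.000000
```

Exact diagonalization like this scales exponentially (a 2^n-by-2^n matrix for n spins), which is why serious classical results on molecules like FeMoco rest on far more sophisticated approximation methods than this sketch.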

What does the traveling salesman problem have to do with quantum computing’s marketing—and what’s changing?

The traveling salesman problem (finding the shortest route through given locations) became a go-to benchmark for quantum computing because related optimization tasks appear across logistics, finance, and even 3D printing. Over roughly two decades, researchers tried to reformulate it so quantum hardware would have an advantage. A new paper reviews those attempts and concludes there’s little cause for optimism about purely quantum approaches solving even small traveling salesman instances better than hybrid classical-quantum methods. The authors recommend focusing on classical or hybrid strategies rather than expecting a clean quantum-only win.
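
To see why the problem is hard in the first place, here is a minimal brute-force solver; the five demo cities are invented, and nothing here reflects the quantum or hybrid methods the paper reviews.

```python
import math
from itertools import permutations

# Minimal exact traveling-salesman solver: try every possible tour.
# With city 0 fixed as the start there are (n-1)! candidate tours,
# which is why brute force stops being feasible beyond a dozen or so
# cities and why the problem serves as an optimization benchmark.

def tour_length(order, coords):
    """Length of the closed tour visiting coords in the given order."""
    return sum(
        math.dist(coords[order[i]], coords[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def brute_force_tsp(coords):
    """Return the shortest closed tour and its length."""
    candidates = ((0,) + rest for rest in permutations(range(1, len(coords))))
    best = min(candidates, key=lambda order: tour_length(order, coords))
    return best, tour_length(best, coords)

cities = [(0, 0), (0, 2), (3, 1), (2, 4), (5, 3)]  # arbitrary demo points
order, length = brute_force_tsp(cities)
print(f"best tour: {order}, length {length:.3f}")
```

Fixing the start city, this 5-city instance needs only 24 tour evaluations, but 20 cities would already require roughly 1.2 × 10^17, which is the combinatorial wall that both quantum and classical heuristics try to sidestep.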

How can quantum computers be “faster” yet still not finish sooner overall?

Speedups often apply to specific parts of a computation, not necessarily to the full end-to-end task. Quantum algorithms also typically require repeating runs many times to achieve the desired accuracy. Even if the peak computational speed is higher, the need for repeated sampling can make total runtime longer, which in turn affects cost and energy.
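
A small simulation makes the overhead concrete. Treat each run as a noisy sample of the quantity being estimated; the standard error then falls as 1/sqrt(shots), so every halving of the error quadruples the run count. The observable and values below are hypothetical.

```python
import random
import statistics

# Why repeated runs dominate runtime: each run yields a noisy +1/-1
# sample of the quantity being estimated, and the standard error of
# the mean shrinks only as 1/sqrt(shots). Halving the error therefore
# quadruples the number of runs. All numbers here are invented.

random.seed(0)
TRUE_VALUE = 0.3  # hypothetical expectation value of some observable

def one_run():
    """One simulated measurement whose long-run average is TRUE_VALUE."""
    return 1 if random.random() < (1 + TRUE_VALUE) / 2 else -1

for shots in (100, 10_000, 1_000_000):
    samples = [one_run() for _ in range(shots)]
    estimate = statistics.fmean(samples)
    stderr = statistics.stdev(samples) / shots ** 0.5
    print(f"{shots:>9} shots -> estimate {estimate:+.4f} (+/- {stderr:.4f})")
```

Running the sketch shows the error bar shrinking by roughly a factor of ten for every hundredfold increase in shots, which is the scaling that stretches total runtimes even when individual operations are fast.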

What cost drivers make scaling quantum computers potentially expensive even if transistors get cheaper?

Microchips succeeded partly because transistors became dramatically smaller and cheaper to manufacture. Quantum computers don’t scale with the same economics. Their major cost drivers include cryogenic cooling and noise buffering—requirements that are likely to remain expensive or intensify as more qubits are added. So the cost curve may not improve the way it did for classical semiconductor scaling.

Why does energy consumption become a serious issue for large, error-corrected quantum machines?

Large quantum computers that can run useful, error-corrected calculations may require power comparable to entire supercomputing clusters. Olivier Ezratty's estimate suggests that once quantum systems reach the size needed for meaningful work, their energy use could be extremely high, possibly even higher depending on the design. That means quantum computing isn't just expensive to build; it could also be costly to operate.

Review Questions

  1. Which recent classical results weaken the argument that quantum computers are necessary for certain chemistry calculations?
  2. What conclusion does the traveling salesman review reach about purely quantum approaches versus hybrid methods?
  3. List two non-hardware reasons quantum advantage may be harder to achieve than expected (beyond “qubits are hard”).

Key Points

  1. High-precision classical calculations for the FeMo cofactor (FeMoco) undercut one of the best-known arguments for near-term quantum advantage in chemistry.

  2. A new review of traveling salesman problem attempts finds little evidence that fully quantum methods outperform hybrid classical-quantum approaches, even for small instances.

  3. Hardware progress (error correction on small circuits and improving qubit quality) is real, but it doesn't automatically translate into practical, defensible speedups.

  4. Quantum cost scaling is dominated by cryogenic cooling and noise buffering, which may not become cheaper as qubit counts rise.

  5. Energy use is a major constraint: large, error-corrected quantum systems could consume power on the scale of entire supercomputing clusters.

  6. Quantum speedups for subroutines don't guarantee shorter total runtimes because repeated runs are often needed to reach accuracy.

Highlights

Caltech reported highly precise ground state energy calculations for the FeMo cofactor (FeMoco) using a conventional computer cluster, challenging a common "quantum-only" chemistry claim.
A two-decade review of quantum approaches to the traveling salesman problem finds little cause for optimism about purely quantum advantages, recommending hybrid strategies instead.
Even with error correction progress, cryogenic cooling, noise buffering, and potentially supercomputer-scale energy consumption complicate the path to economical quantum computing.
