
This Physicist Says Black Holes are Quantum Computers

Sabine Hossenfelder · 6 min read

Based on Sabine Hossenfelder's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Higher-energy collisions do not indefinitely probe shorter distances because sufficiently energetic collisions can form black holes, with the crossover near the Planck energy scale.

Briefing

Black holes may function as quantum computers because the physics that governs them blends short-distance quantum behavior with long-distance gravitational behavior, and that combination could, in principle, support fast storage and rapid processing of information. The core chain of reasoning starts with a constraint on how “higher energy” probes nature: in ordinary particle physics, higher-energy probes shorter distances, but pushing collisions to extreme energies eventually forms a black hole instead of revealing ever-finer structure. That turning point sits near the Planck energy scale, roughly 15 orders of magnitude above the collision energies of the Large Hadron Collider, implying that black holes act like a bridge between the ultraviolet (high-energy) regime and the infrared (low-energy) regime.

A key concept drawn from this is “UV/IR mixing,” where black holes effectively merge the physics of quantum fields at high energies with gravity at low energies. Taking that seriously leads to a picture in which black holes are not just classical objects but are best understood in quantum terms, as systems dominated by gravitons, the hypothetical quantum carriers of gravity. In this view, information about what falls in is encoded in the black hole’s quantum degrees of freedom, and the information is not lost: it remains accessible on the horizon. That claim aligns with the holographic principle, a framework rooted in Jacob Bekenstein’s and Stephen Hawking’s work on black-hole entropy that has become a leading candidate for resolving the black hole information loss problem.

The argument then shifts from storage to dynamics. Over the past decade, research has increasingly portrayed black holes as chaotic rather than inert. Anything that falls in gets converted into information and redistributed across the horizon extremely quickly, a behavior summarized by the idea that black holes are “fast scramblers.” The result is a system with two standout properties: unusually high information capacity per unit surface area and unusually short information-scrambling times. Together, those traits resemble the requirements for computation, especially for tasks that benefit from storing large amounts of data and rapidly mixing it.

From there, the speculative leap becomes more concrete: an advanced civilization could use black holes as data centers. The proposal is to create many small black holes, grow them to an “optimal” size, and arrange them in arrays to perform calculations. The optimal mass is framed as a tradeoff: larger black holes store more information, but retrieving or manipulating that information takes longer; smaller ones are faster but hold less. The estimate given for the computationally optimal mass is on the order of a thousand tons, which corresponds to a horizon still far smaller than an atomic nucleus.

If such black-hole data centers existed, they would likely radiate a distinctive “exhaust” signature from Hawking-like radiation. That radiation could, in principle, be detectable in sky surveys, offering another angle in the search for extraterrestrial intelligence—echoing older ideas like Freeman Dyson’s notion of large engineered structures around stars.

Even with the speculative framing, the through-line is consistent: the black-hole quantum/gravity picture, combined with fast scrambling and horizon-accessible information, makes black holes look less like cosmic dead ends and more like information-processing machines. The most provocative implication is that if black holes are that capable at storing and processing information, they might also be capable of forms of intelligence—an idea that goes beyond physics into science fiction, but is presented as a logical consequence of the underlying information-theoretic claims.

Cornell Notes

The argument links black holes to quantum computation by combining three ideas: (1) at sufficiently high energies, attempts to probe shorter distances instead produce black holes, leading to UV/IR mixing; (2) black holes encode information in quantum gravitational degrees of freedom (gravitons) and keep it accessible on the horizon via the holographic principle; and (3) black holes behave like fast scramblers, rapidly converting infalling matter into information and distributing it across the horizon. Those traits—high information capacity per surface area and extremely fast scrambling—make black holes plausible candidates for data storage and processing. A speculative extension suggests an advanced civilization could build arrays of small, optimally sized black holes (estimated around a thousand tons) and detect them through characteristic radiation signatures.

Why doesn’t “more collision energy” keep probing shorter distances forever?

In standard intuition, higher-energy probes have shorter wavelengths and therefore better spatial resolution (the same logic behind microscopes and particle colliders). But if the collision energy keeps increasing, the system eventually forms a black hole rather than revealing finer structure, and pushing further only makes larger black holes. The crossover scale is estimated near the Planck energy, roughly 15 orders of magnitude above the collision energy of the Large Hadron Collider. That implies a limit to the usual UV probing picture and motivates treating black holes as a regime where short-distance quantum behavior and long-distance gravity both matter.
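
As a quick numeric check on that gap (my own arithmetic with standard CODATA constants, not a calculation from the video), the snippet below computes the Planck energy and compares it with the LHC’s nominal 13.6 TeV center-of-mass energy:

```python
# Rough check: how far above LHC collision energies is the Planck scale?
import math

hbar = 1.054_571_817e-34   # reduced Planck constant, J*s
c    = 2.997_924_58e8      # speed of light, m/s
G    = 6.674_30e-11        # Newton's constant, m^3 kg^-1 s^-2
eV   = 1.602_176_634e-19   # joules per electronvolt

E_planck_J   = math.sqrt(hbar * c**5 / G)   # Planck energy in joules, ~1.96e9 J
E_planck_GeV = E_planck_J / eV / 1e9        # ~1.22e19 GeV
E_lhc_GeV    = 13_600.0                     # nominal LHC center-of-mass energy (13.6 TeV)

print(f"Planck energy ~ {E_planck_GeV:.2e} GeV")
print(f"Gap ~ {math.log10(E_planck_GeV / E_lhc_GeV):.1f} orders of magnitude")  # ~15
```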

What is UV/IR mixing, and why does it matter for viewing black holes as quantum information systems?

UV/IR mixing is the idea that high-energy (ultraviolet) physics and low-energy (infrared) gravity physics become intertwined in black-hole contexts. Instead of treating them as separate regimes, black holes effectively connect the two: high-energy quantum behavior influences what happens, while gravity governs the large-scale structure. This motivates a unified quantum-gravity description of black holes rather than a purely classical one.
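
One way to make the mixing tangible is a toy resolution formula (an illustrative sketch of my own, not a model from the video): below the Planck energy, the smallest resolvable distance is set by the probe’s Compton wavelength, which shrinks as energy rises; above it, the Schwarzschild radius of the collision takes over and grows with energy instead.

```python
# Toy picture of UV/IR mixing: the smallest distance an energy-E probe can resolve.
# Below the Planck energy the Compton wavelength hbar*c/E shrinks as E grows;
# above it the Schwarzschild radius 2*G*E/c**4 takes over and grows instead.
import math

hbar, c, G = 1.054_571_817e-34, 2.997_924_58e8, 6.674_30e-11
E_planck = math.sqrt(hbar * c**5 / G)   # ~1.96e9 J

for x in (0.01, 0.1, 1, 10, 100):                 # probe energy in Planck units
    E = x * E_planck
    d = max(hbar * c / E, 2 * G * E / c**4)       # whichever length scale dominates
    print(f"E = {x:>6} E_Planck  ->  resolvable distance ~ {d:.1e} m")
```

The resolvable distance bottoms out near the Planck length and then turns around, which is the UV/IR turnaround described above.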

How do gravitons and the holographic principle enter the information story?

Taking the UV/IR mixing picture seriously leads to a quantum-gravity view where black holes are described in terms of gravitons—the hypothetical quantum units of gravity. In that framework, information about what formed the black hole or what falls in is stored in the black hole’s quantum degrees of freedom. The holographic principle then claims that this information remains accessible on the horizon, tying the internal information content to surface data. That combination supports the idea that information is not destroyed, even if it becomes scrambled.
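
The standard bookkeeping behind “accessible on the horizon” is the Bekenstein-Hawking entropy, which ties information capacity to horizon area rather than volume. Here is a minimal sketch of that counting (textbook formulas; the example masses are my own choices):

```python
# Bekenstein-Hawking capacity: bits storable on a Schwarzschild horizon,
# N = S / (k_B * ln 2) = A / (4 * l_p^2 * ln 2), where A is the horizon area.
import math

hbar, c, G = 1.054_571_817e-34, 2.997_924_58e8, 6.674_30e-11

def horizon_bits(mass_kg: float) -> float:
    r_s  = 2 * G * mass_kg / c**2        # Schwarzschild radius
    area = 4 * math.pi * r_s**2          # horizon area
    l_p2 = hbar * G / c**3               # Planck length squared
    return area / (4 * l_p2) / math.log(2)

print(f"Solar-mass hole: ~{horizon_bits(1.989e30):.1e} bits")   # ~1.5e77
print(f"1000-ton hole:   ~{horizon_bits(1e6):.1e} bits")        # ~3.8e28
```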

What does “fast scrambling” mean, and how does it relate to computation?

Fast scrambling describes how quickly a black hole spreads information across its horizon after something falls in. The research trend referenced here characterizes black holes as unusually efficient at converting infalling information into a distributed, highly mixed state. For computation, that matters because many algorithms (especially those relying on mixing or rapid propagation of information) benefit from systems that can store large amounts of data and then rapidly redistribute it.
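
The usual quantitative version of this claim is the Sekino-Susskind scrambling-time conjecture, t* ≈ (ħ / (2π k_B T)) · ln(S/k_B), with T the Hawking temperature and S the Bekenstein-Hawking entropy. A short sketch assuming that formula (the solar-mass example is mine, not from the video):

```python
# Sekino-Susskind fast-scrambling estimate: t* ~ (hbar / (2*pi*k_B*T)) * ln(S/k_B),
# with T the Hawking temperature and S the Bekenstein-Hawking entropy.
import math

hbar, c, G, k_B = 1.054_571_817e-34, 2.997_924_58e8, 6.674_30e-11, 1.380_649e-23

def scrambling_time(mass_kg: float) -> float:
    T = hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)   # Hawking temperature
    r_s = 2 * G * mass_kg / c**2
    S_over_kB = math.pi * r_s**2 * c**3 / (hbar * G)      # S/k_B = A / (4 l_p^2)
    return hbar / (2 * math.pi * k_B * T) * math.log(S_over_kB)

print(f"Solar-mass hole: t* ~ {scrambling_time(1.989e30):.1e} s")  # ~3e-3 s
```

On this estimate a solar-mass black hole scrambles in milliseconds, which is what earns black holes the “fast scrambler” label.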

How is the “optimal” black hole size for computation estimated, and what tradeoff drives it?

The tradeoff is between storage capacity and retrieval/processing time. Larger black holes can store more information because information capacity grows with mass (or, equivalently, with horizon area), but the time scales associated with interacting with or extracting information also increase with mass. Smaller black holes are faster but hold less. The estimate given for the computationally optimal mass is around a thousand tons, corresponding to a horizon tiny compared with an atomic nucleus yet heavy enough to balance the competing requirements.
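
To make the tradeoff concrete, here is a small sketch using textbook Schwarzschild formulas (the sample masses and the photon-only lifetime estimate are illustrative assumptions of mine, not numbers from the video). Capacity grows like the square of the mass, while the evaporation lifetime grows like its cube:

```python
# Capacity-vs-speed tradeoff: capacity scales like M^2 (horizon area), while the
# evaporation lifetime scales like M^3, so heavier holes store more but are slower.
import math

hbar, c, G = 1.054_571_817e-34, 2.997_924_58e8, 6.674_30e-11

def stats(mass_kg: float):
    r_s  = 2 * G * mass_kg / c**2
    bits = math.pi * r_s**2 * c**3 / (hbar * G) / math.log(2)      # A / (4 l_p^2 ln2)
    life = 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)      # photon-only estimate
    return r_s, bits, life

for m in (1e3, 1e6, 1e9):   # 1 ton, 1000 tons, a million tons
    r, b, t = stats(m)
    print(f"M = {m:.0e} kg: r_s ~ {r:.1e} m, capacity ~ {b:.1e} bits, lifetime ~ {t:.1e} s")
```

On these rough numbers, a thousand-ton hole stores roughly 10^28 bits but evaporates in about a minute, illustrating how sharply speed and capacity trade against each other.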

What observational signature could such black-hole data centers produce?

The speculative extension predicts characteristic “exhaust” radiation from the black holes, tied to Hawking-like radiation. If many black holes were used in arrays, their combined radiation could produce a detectable signature in sky surveys. That would offer a nontraditional route to searching for extraterrestrial technology, complementing older ideas such as Dyson-style megastructures.
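
For a feel of what that signature might look like, here is an order-of-magnitude sketch (my own estimates from the textbook Hawking temperature and a photon-only luminosity formula; the video does not quote these numbers):

```python
# Order-of-magnitude "exhaust" of a hypothetical 1000-ton black hole:
# Hawking temperature, photon-only luminosity, and typical photon energy ~ k_B*T.
import math

hbar, c, G, k_B = 1.054_571_817e-34, 2.997_924_58e8, 6.674_30e-11, 1.380_649e-23
eV = 1.602_176_634e-19
M  = 1e6   # kg, roughly 1000 metric tons

T = hbar * c**3 / (8 * math.pi * G * M * k_B)        # ~1e17 K
L = hbar * c**6 / (15360 * math.pi * G**2 * M**2)    # ~4e20 W (photons only)
E_typ_TeV = k_B * T / eV / 1e12                      # ~10 TeV

print(f"T ~ {T:.1e} K, L ~ {L:.1e} W, typical photon ~ {E_typ_TeV:.0f} TeV")
```

If these estimates are in the right ballpark, the emission would peak at multi-TeV photon energies, which would make gamma-ray sky surveys a natural place to look.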

Review Questions

  1. What physical argument sets a limit on using higher-energy collisions to probe shorter distances, and what energy scale is associated with that limit?
  2. How do UV/IR mixing, the holographic principle, and fast scrambling each contribute to the case for black holes as information processors?
  3. Why does the proposed computationally optimal black hole mass involve a tradeoff between information storage and information retrieval time?

Key Points

  1. Higher-energy collisions do not indefinitely probe shorter distances because sufficiently energetic collisions can form black holes, with the crossover near the Planck energy scale.
  2. UV/IR mixing links high-energy quantum behavior with low-energy gravitational behavior in black-hole physics, motivating a unified quantum-gravity description.
  3. A graviton-based quantum picture suggests black holes store information in quantum gravitational degrees of freedom rather than destroying it.
  4. The holographic principle frames that stored information as accessible on the horizon, aligning with leading approaches to the black hole information loss problem.
  5. Black holes are increasingly characterized as chaotic “fast scramblers,” rapidly converting infalling information into a horizon-wide distribution.
  6. If black holes combine high information capacity with rapid scrambling, they could be engineered as data centers using many small black holes in arrays.
  7. The speculative detectability hinges on characteristic radiation from such black holes, potentially observable in sky surveys.

Highlights

At extreme energies, attempts to probe ever-smaller distances give way to black-hole formation, implying a fundamental limit to the usual “higher energy = shorter distance” intuition.
UV/IR mixing treats black holes as a bridge between quantum high-energy physics and classical/low-energy gravity.
Fast scrambler behavior turns black holes into unusually efficient information mixers, not inert cosmic objects.
An estimated computational sweet spot for black-hole data centers is around a thousand tons, balancing storage capacity against retrieval time.
If engineered, black holes could emit a distinctive radiation signature that might be detectable in astronomical surveys.
