
How Much Information is in the Universe?

PBS Space Time · 6 min read

Based on PBS Space Time's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

The Bekenstein bound limits maximum information in a region by surface area in Planck units, not by volume.

Briefing

The maximum amount of information that can fit inside any region of space is set by its surface area, not its volume—an idea tied to black hole entropy and the Bekenstein bound. That surface-area rule is surprising because everyday intuition treats storage like thumb drives: more space means more capacity. But when physicists translate “capacity” into the information needed to specify every possible state within a region, the limit behaves differently. A region can’t be independently “filled” with arbitrary detail in its interior; once the information density gets too high, the region’s contents collapse behind an event horizon.

The argument starts by estimating how much information would be required to describe the observable universe at the smallest meaningful length scale. Using the Planck length (about 1.6 × 10^-35 meters) as a minimal distance, the universe’s radius—roughly 47 billion light-years, or on the order of 10^61 Planck lengths—implies about 10^183 Planck-volume cells. If each cell held one bit (a yes/no “empty or full” quantum voxel), the count would be around 10^180 bits. That’s already a rough underestimate because real quantum states depend on more than position: momentum, spin direction, and other degrees of freedom increase the number of possible quantum states.
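The voxel count is easy to reproduce. A minimal sketch in Python, working directly in Planck units and assuming the rounded radius of ~10^61 Planck lengths used above:

```python
import math

# Back-of-envelope "quantum voxel" count, in Planck units.
# R ~ 1e61 Planck lengths is the rounded radius from the text;
# real quantum states carry more than one bit per cell, so this
# is an underestimate.
R = 1e61                           # radius of the observable universe
cells = (4 / 3) * math.pi * R**3   # Planck-volume cells
print(f"Planck-volume cells ~ 10^{int(math.log10(cells))}")  # → 10^183
```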

Yet the Bekenstein bound cuts deeper. It limits the information in a region by the number of Planck areas on the region’s boundary. For the observable universe, the surface area corresponds to roughly 10^120 to 10^124 Planck areas, putting the information limit a factor of about 10^60 (60 orders of magnitude) below the naive “one bit per Planck volume” estimate. The universe therefore must be describable using far fewer independent degrees of freedom than a volume-based grid would suggest.
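Under the same rounded radius, the area count, the volume count, and the gap between them can be checked directly. A sketch assuming R ≈ 10^61 Planck lengths:

```python
import math

R = 1e61                              # radius in Planck lengths (assumed)
area = 4 * math.pi * R**2             # boundary, in Planck areas
volume = (4 / 3) * math.pi * R**3     # interior, in Planck volumes
print(f"area-based limit   ~ 10^{int(math.log10(area))}")           # → 10^123
print(f"volume-based count ~ 10^{int(math.log10(volume))}")         # → 10^183
print(f"gap                ~ 10^{int(math.log10(volume / area))}")  # → 10^60
```

The gap is simply R/3 in Planck lengths, which is why a larger region makes the volume-based overcount more severe.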

One way to reconcile the mismatch is to recognize that most of space is empty and that information is concentrated in occupied quantum states. Counting occupied phase-space elements leads to a particle-based estimate: the observable universe contains on the order of 10^80 protons, with comparable numbers of electrons, and far more information carried by abundant particles like neutrinos and photons. The Cosmic Microwave Background alone contributes about 10^89 photons across the observable universe. Rounding up for the dominant contributions yields roughly 10^90 bits of information in particles—comfortably below the Bekenstein bound.
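As a rough tally, using the particle counts quoted above and the simplifying assumption of order one bit per particle:

```python
import math

# Particle counts from the text; "~1 bit per particle" is a
# simplifying assumption for this order-of-magnitude tally.
counts = {
    "protons": 1e80,
    "electrons": 1e80,
    "CMB photons": 1e89,
    "neutrinos": 1e89,   # "similarly abundant" to CMB photons
}
total = sum(counts.values())
print(f"particle information ~ 10^{math.ceil(math.log10(total))} bits")  # → 10^90
```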

Black holes then become the real information bottleneck. Black hole entropy is proportional to the area of the event horizon, and it measures hidden information: the number of distinct initial states that could produce the same black hole. For a supermassive black hole like the one at the center of the Milky Way (Sagittarius A*), with an event horizon area around 10^90 to 10^91 Planck areas, the hidden information content matches the entropy scale of the rest of the universe’s matter and radiation. With hundreds of billions of galaxies, the total black-hole information in the observable universe is estimated at around 10^101 to 10^102 bits, still below the universe-wide Bekenstein limit.

The bound isn’t just a bookkeeping rule; it predicts what happens if information exceeds capacity. Stuffing too much information into a region would force gravitational collapse, producing a black hole whose event horizon expands to the scale of the region’s boundary—effectively ending the possibility of storing the excess information as ordinary interior degrees of freedom.

The episode closes with a challenge: estimate how large a “black hole computer” would need to be—using memory at the Bekenstein bound on its event horizon—to simulate the entire observable universe, under simplifying assumptions such as ignoring dark matter and the high-entropy components (neutrinos, cosmic background radiation). A second extra-credit prompt asks how long such a simulation would take, referencing Seth Lloyd’s “computational capacity of the universe” framework.

Cornell Notes

The observable universe’s information capacity is limited by the Bekenstein bound: the maximum information in a region scales with its surface area measured in Planck units, not with its volume. A naive Planck-volume estimate gives an enormous number of bits (around 10^180), but the surface-area limit for the observable universe is far smaller (about 10^120–10^124 Planck areas, a factor of roughly 10^60 below the volume-based count). The actual information in particles is estimated at around 10^90 bits, with neutrinos and the Cosmic Microwave Background dominating. Black holes carry far more hidden information because their entropy scales with horizon area; a Milky Way–scale black hole already matches the entropy of the rest of the universe’s matter and radiation. Even so, the total black-hole information remains below the universe-wide Bekenstein limit, and exceeding it would trigger gravitational collapse into a black hole.

Why does the information limit scale with surface area instead of volume?

The Bekenstein bound ties maximum entropy (hidden information) in a region to the number of Planck-area elements on its boundary. The key insight comes from black holes: black hole entropy is proportional to the area of the event horizon, not the interior volume. Since entropy is a measure of hidden information, the same area-based limit applies to any region—if you try to pack too much information into the interior, the region collapses into a black hole whose horizon reflects the bound.

How large is the naive “one bit per Planck volume” estimate for the observable universe?

Using the Planck length (~1.6 × 10^-35 m) as the smallest distance scale, the observable universe’s radius is about 47 billion light-years, or on the order of a few × 10^61 Planck lengths. That implies a volume of roughly (4/3)πR^3, or about 10^183 Planck-volume cells. If each cell held one bit, the estimate lands near 10^180 bits (noting this is a lower bound because quantum states depend on more than just occupancy of position cells).

What does the Bekenstein bound imply numerically for the observable universe’s information capacity?

The observable universe’s surface area corresponds to about 10^120 to 10^124 Planck areas, depending on rounding conventions. Because the bound scales with surface area, the maximum information is a factor of about 10^60 lower than the naive volume-element count. In other words, the universe can’t store independent information for every Planck-volume cell; the boundary-area limit is the controlling factor.

Where does most of the universe’s particle information come from?

Most information is concentrated in occupied quantum states, not empty space. A particle-based estimate counts abundant species: protons and electrons number around 10^80 each, but neutrinos and photons dominate because they are far more numerous. The Cosmic Microwave Background contributes about 10^89 photons across the observable universe, and neutrinos are similarly abundant. Rounding up for the dominant contributions yields roughly 10^90 bits of information in particles.

How do black holes change the information picture?

Black holes dominate entropy because their entropy is proportional to the event-horizon area measured in Planck areas. The entropy corresponds to the number of possible initial states that could collapse into the same black hole, meaning the hidden information is enormous. For Sagittarius A* (mass ~4 million Suns), the event horizon area is about 10^90 to 10^91 Planck areas—comparable to the entropy scale of the rest of the universe’s matter and radiation. With hundreds of billions of galaxies, the total black-hole information is estimated at around 10^101 to 10^102 bits, still below the universe-wide Bekenstein bound.
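The Sagittarius A* figure follows from the Schwarzschild radius. A sketch assuming the ~4 million solar mass value quoted above:

```python
import math

G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8           # speed of light, m/s
M_SUN = 1.989e30      # solar mass, kg
L_PLANCK = 1.616e-35  # Planck length, m

M = 4e6 * M_SUN                  # Sagittarius A* (assumed mass)
r_s = 2 * G * M / c**2           # Schwarzschild radius, m
horizon = 4 * math.pi * r_s**2   # event-horizon area, m^2
planck_areas = horizon / L_PLANCK**2
print(f"horizon area ~ 10^{int(math.log10(planck_areas))} Planck areas")  # → 10^90
```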

What happens if information exceeds the Bekenstein bound in a region?

Exceeding the bound would force gravitational collapse. The region would become a black hole with an event horizon sized to the relevant scale (described as reaching the current cosmic horizon scale in the thought experiment). The episode frames the bound as applying to engineered storage too: once the capacity is exceeded, the interior can’t remain as ordinary stored degrees of freedom.

Review Questions

  1. What assumptions lead to the naive 10^180-bit estimate, and why is it likely an underestimate?
  2. How does black hole entropy connect to the Bekenstein bound, and why does that make surface area the key quantity?
  3. Compare the estimated information in particles (~10^90 bits) with the estimated information in black holes (~10^101–10^102 bits). What does that imply about where entropy is stored?

Key Points

  1. The Bekenstein bound limits maximum information in a region by surface area in Planck units, not by volume.

  2. Black hole entropy provides the core evidence: it scales with event-horizon area, so hidden information follows the same rule.

  3. A Planck-volume “one bit per voxel” estimate for the observable universe is around 10^180 bits, but the area-based bound is a factor of roughly 10^60 lower.

  4. Most particle information is carried by neutrinos and Cosmic Microwave Background photons, with a rough total around 10^90 bits.

  5. Black holes dominate entropy because their horizon area encodes hidden information about many possible prior states.

  6. If a region is forced to exceed its information capacity, it should collapse into a black hole with a horizon matching the region’s scale.

  7. A proposed challenge asks how large a black-hole-based computer would need to be to simulate the observable universe under simplifying assumptions.

Highlights

The universe’s maximum information capacity scales like area: the boundary matters more than the interior.
A naive “one bit per Planck volume” count overshoots the Bekenstein bound by about 60 orders of magnitude.
Sagittarius A*’s event-horizon entropy is comparable to the entropy of the rest of the observable universe’s matter and radiation.
Packing too much information into a region would trigger black hole formation, effectively enforcing the bound.
