How Much Information is in the Universe?
Based on PBS Space Time's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
The Bekenstein bound limits maximum information in a region by surface area in Planck units, not by volume.
Briefing
The maximum amount of information that can fit inside any region of space is set by its surface area, not its volume—an idea tied to black hole entropy and the Bekenstein bound. That surface-area rule is surprising because everyday intuition treats storage like thumb drives: more space means more capacity. But when physicists translate “capacity” into the information needed to specify every possible state within a region, the limit behaves differently. A region can’t be independently “filled” with arbitrary detail in its interior; once the information density gets too high, the region’s contents collapse behind an event horizon.
The argument starts by estimating how much information would be required to describe the observable universe at the smallest meaningful length scale. Using the Planck length (about 1.6 × 10^-35 meters) as a minimal distance, the universe's radius of roughly 47 billion light-years, on the order of 10^61 Planck lengths, implies about 10^183 Planck-volume cells. At one bit per cell (a yes/no "empty or full" quantum voxel), that is roughly 10^183 bits. Even so, this is an underestimate, because real quantum states depend on more than position: momentum, spin direction, and other degrees of freedom increase the number of possible quantum states.
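A quick sanity check of that voxel count, as a rough sketch in Python (the constants are approximate, and the observable universe is treated as a simple sphere of comoving radius 47 billion light-years):

```python
# Back-of-envelope check of the naive "one bit per Planck volume" estimate.
import math

PLANCK_LENGTH = 1.616e-35       # meters (approximate)
LIGHT_YEAR = 9.461e15           # meters
R_UNIVERSE = 47e9 * LIGHT_YEAR  # comoving radius of the observable universe

radius_in_planck_lengths = R_UNIVERSE / PLANCK_LENGTH
planck_volumes = (4 / 3) * math.pi * radius_in_planck_lengths**3

print(f"radius ~ {radius_in_planck_lengths:.1e} Planck lengths")
print(f"Planck-volume cells ~ 10^{math.log10(planck_volumes):.0f}")
# Cubing the rounded 10^61 radius gives the ~10^183 order quoted above;
# keeping the leading digits and the 4*pi/3 prefactor nudges this to ~10^185.
# Either way, one bit per cell is an enormous number of bits.
```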
Yet the Bekenstein bound cuts deeper. It limits the information in a region not by its volume but by the number of Planck areas on its boundary. For the observable universe, the surface area corresponds to roughly 10^120 to 10^124 Planck areas, so the information limit sits about 60 orders of magnitude (a factor of roughly 10^60) below the naive "one bit per Planck volume" estimate. The universe therefore must be describable using far fewer independent degrees of freedom than a volume-based grid would suggest.
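The area-based figure can be checked the same way; a sketch under the same assumptions, counting Planck areas on a sphere of that radius:

```python
# Rough check of the area-based (holographic) limit for the observable universe.
import math

PLANCK_LENGTH = 1.616e-35       # meters (approximate)
LIGHT_YEAR = 9.461e15           # meters
R_UNIVERSE = 47e9 * LIGHT_YEAR  # comoving radius of the observable universe

surface_area = 4 * math.pi * R_UNIVERSE**2
planck_areas = surface_area / PLANCK_LENGTH**2

print(f"boundary ~ 10^{math.log10(planck_areas):.0f} Planck areas")
# ~10^124, at the upper end of the 10^120-10^124 range quoted above and
# roughly 60 orders of magnitude below the naive volume-based bit count.
```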
One way to reconcile the mismatch is to recognize that most of space is empty and that information is concentrated in occupied quantum states. Counting occupied phase-space elements leads to a particle-based estimate: the observable universe contains on the order of 10^80 protons, with comparable numbers of electrons, and far more information is carried by abundant particles like neutrinos and photons. The Cosmic Microwave Background alone contributes about 10^89 photons across the observable universe. With a handful of bits per particle for the dominant species, the total comes to roughly 10^90 bits of information in particles, comfortably below the Bekenstein bound.
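The CMB photon figure is easy to reproduce from the measured photon number density of about 411 photons per cubic centimeter; a sketch, again treating the observable universe as a sphere of radius 47 billion light-years:

```python
# Order-of-magnitude check of the CMB photon count.
import math

LIGHT_YEAR = 9.461e15            # meters
R_UNIVERSE = 47e9 * LIGHT_YEAR   # comoving radius of the observable universe
CMB_PHOTON_DENSITY = 4.11e8      # photons per cubic meter (~411 per cm^3)

volume = (4 / 3) * math.pi * R_UNIVERSE**3
cmb_photons = CMB_PHOTON_DENSITY * volume

print(f"observable-universe volume ~ {volume:.1e} m^3")
print(f"CMB photons ~ 10^{math.log10(cmb_photons):.0f}")
# ~10^89 photons; a handful of bits per particle for the dominant species
# lands near the quoted ~10^90 bits.
```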
Black holes then become the real information bottleneck. Black hole entropy is proportional to the area of the event horizon, and it measures hidden information: the number of distinct initial states that could produce the same black hole. For a supermassive black hole like the one at the center of the Milky Way (Sagittarius A*), with an event horizon area of around 10^90 to 10^91 Planck areas, the hidden information content matches the entropy scale of the rest of the universe's matter and radiation. With hundreds of billions of galaxies, the total black-hole information in the observable universe is estimated at around 10^101 to 10^102 bits, still below the universe-wide Bekenstein limit.
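The Sagittarius A* figure follows from its horizon area; a minimal sketch assuming a non-rotating (Schwarzschild) black hole of about 4 million solar masses:

```python
# Horizon area of Sagittarius A* in Planck areas (Schwarzschild approximation).
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8              # speed of light, m/s
SOLAR_MASS = 1.989e30    # kg
PLANCK_LENGTH = 1.616e-35

mass = 4.0e6 * SOLAR_MASS
r_s = 2 * G * mass / C**2            # Schwarzschild radius, ~1.2e10 m
horizon_area = 4 * math.pi * r_s**2
planck_areas = horizon_area / PLANCK_LENGTH**2

print(f"Schwarzschild radius ~ {r_s:.1e} m")
print(f"horizon area ~ 10^{math.log10(planck_areas):.0f} Planck areas")
# ~10^91 Planck areas, matching the 10^90-10^91 range quoted above; the
# Bekenstein-Hawking entropy in natural units is this area divided by 4.
```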
The bound isn’t just a bookkeeping rule; it predicts what happens if information exceeds capacity. Stuffing too much information into a region would force gravitational collapse, producing a black hole whose event horizon expands to the scale of the region’s boundary—effectively ending the possibility of storing the excess information as ordinary interior degrees of freedom.
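That collapse claim can be made quantitative with the standard form of the Bekenstein bound, S ≤ 2πk_B R E/(ħc). The sketch below (an addition for illustration, not part of the episode's own derivation) checks that an object saturating the bound at its own Schwarzschild radius has exactly the Bekenstein-Hawking entropy of a black hole of that mass:

```python
# Saturating the Bekenstein bound at the Schwarzschild radius reproduces the
# Bekenstein-Hawking entropy, illustrating why "too much information" in a
# region means "black hole". Constants approximate; any mass works.
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
C = 2.998e8            # m/s
HBAR = 1.055e-34       # J s
K_B = 1.381e-23        # J/K
SOLAR_MASS = 1.989e30  # kg

mass = 10 * SOLAR_MASS              # illustrative choice
energy = mass * C**2
r_s = 2 * G * mass / C**2

s_bekenstein = 2 * math.pi * K_B * r_s * energy / (HBAR * C)
s_hawking = K_B * (4 * math.pi * r_s**2) * C**3 / (4 * G * HBAR)

print(f"Bekenstein bound at r_s:    {s_bekenstein:.3e} J/K")
print(f"Bekenstein-Hawking entropy: {s_hawking:.3e} J/K")
```

The two printed values agree, which is the content of the claim above: a region holding the maximum allowed information within its own gravitational radius is already a black hole.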
The episode closes with a challenge: estimate how large a "black hole computer" would need to be, using memory capacity at the Bekenstein bound on its event horizon, to simulate the entire observable universe under simplifying assumptions such as ignoring dark matter and the highest-entropy components (neutrinos, cosmic background radiation). A second, extra-credit prompt asks how long such a simulation would take, referencing Seth Lloyd's "computational capacity of the universe" framework.
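As a starting point only (not the answer), a hypothetical helper shows how the sizing part of the challenge could be set up: given a bit count, find the horizon radius whose area spans that many Planck areas, assuming one bit per Planck area. The 10^90-bit input is just the particle estimate from earlier; the challenge's own assumptions (no dark matter, no neutrinos or background radiation) would shrink it.

```python
# Rough setup for the "black hole computer" sizing challenge.
import math

PLANCK_LENGTH = 1.616e-35  # meters (approximate)

def horizon_radius_for_bits(bits: float) -> float:
    """Radius of a spherical horizon whose area spans `bits` Planck areas."""
    area = bits * PLANCK_LENGTH**2
    return math.sqrt(area / (4 * math.pi))

# Illustrative input: the ~10^90-bit particle estimate from earlier sections.
print(f"required horizon radius ~ {horizon_radius_for_bits(1e90):.1e} m")
```

That comes out to a few billion meters, smaller than Sagittarius A*'s horizon, which fits the earlier observation that a single supermassive black hole already carries entropy comparable to the universe's ordinary matter and radiation.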
Cornell Notes
The observable universe's information capacity is limited by the Bekenstein bound: the maximum information in a region scales with its surface area measured in Planck units, not with its volume. A naive Planck-volume estimate gives an enormous number of bits (around 10^183), but the surface-area limit for the observable universe is far smaller (about 10^120–10^124 Planck areas, roughly 60 orders of magnitude below the volume-based count). The actual information in particles is estimated at around 10^90 bits, with neutrinos and the Cosmic Microwave Background dominating. Black holes carry far more hidden information because their entropy scales with horizon area; the Milky Way's central black hole alone already matches the entropy of the rest of the universe's matter and radiation. Even so, the total black-hole information remains below the universe-wide Bekenstein limit, and exceeding that limit would trigger gravitational collapse into a black hole.
Why does the information limit scale with surface area instead of volume?
How large is the naive “one bit per Planck volume” estimate for the observable universe?
What does the Bekenstein bound imply numerically for the observable universe’s information capacity?
Where does most of the universe’s particle information come from?
How do black holes change the information picture?
What happens if information exceeds the Bekenstein bound in a region?
Review Questions
- What assumptions lead to the naive 10^183-bit estimate, and why is it likely an underestimate?
- How does black hole entropy connect to the Bekenstein bound, and why does that make surface area the key quantity?
- Compare the estimated information in particles (~10^90 bits) with the estimated information in black holes (~10^101–10^102 bits). What does that imply about where entropy is stored?
Key Points
1. The Bekenstein bound limits maximum information in a region by surface area in Planck units, not by volume.
2. Black hole entropy provides the core evidence: it scales with event-horizon area, so hidden information follows the same rule.
3. A Planck-volume "one bit per voxel" estimate for the observable universe is around 10^183 bits, but the area-based bound is roughly a factor of 10^60 (60 orders of magnitude) lower.
4. Most particle information is carried by neutrinos and Cosmic Microwave Background photons, with a rough total around 10^90 bits.
5. Black holes dominate entropy because their horizon area encodes hidden information about many possible prior states.
6. If a region is forced to exceed its information capacity, it should collapse into a black hole with a horizon matching the region's scale.
7. A proposed challenge asks how large a black-hole-based computer would need to be to simulate the observable universe under simplifying assumptions.