
Can Space Be Infinitely Divided?

PBS Space Time · 5 min read

Based on PBS Space Time's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

The Planck length (~1.6×10^-35 m) is treated as the smallest meaningful scale for distance because, below it, making probes more energetic increases rather than decreases measurement uncertainty.

Briefing

Halving the distance between two perfectly tracked points runs into a hard wall at the Planck length: around 1.6×10^-35 meters. The reason isn’t a lack of imagination or coordination—it’s that, at extremely small scales, the usual idea of a smooth, continuous space stops being operationally meaningful. Physicists treat the Planck length as the scale where “distance” itself becomes too uncertain to define, so attempts to keep subdividing space don’t translate into measurable structure.

That Planck-length limit traces back to Max Planck’s quantum breakthrough of 1900. While analyzing blackbody (thermal) radiation, Planck found that energy is not infinitely divisible; it comes in discrete quanta. The Planck constant—nonzero even if tiny—sets the size of that quantum “chunkiness.” Combining the Planck constant with the gravitational constant and the speed of light yields the Planck length, a derived scale thought to mark where space “goes quantum.” The key point is that this number isn’t just a mathematical curiosity: it emerges as the scale where measurement itself runs into quantum and gravitational constraints.
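As a quick numerical check (a sketch using standard SI values for the constants, not something computed in the video), the Planck length is the unique length that can be built from ħ, G, and c:

```python
import math

# CODATA-style values for the fundamental constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck length: sqrt(hbar * G / c^3) -- the only length you can
# form from these three constants by dimensional analysis
planck_length = math.sqrt(hbar * G / c**3)
print(f"Planck length ~ {planck_length:.2e} m")  # ~1.6e-35 m
```

Dimensional analysis alone fixes this combination; the result matches the ~1.6×10^-35 m quoted above.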

A thought experiment—Heisenberg’s microscope—shows why. Measuring distance with a laser requires timing the light’s round trip, but the measurement can only be pinned down to about one wavelength. Shorter wavelengths improve timing precision, yet they also carry more momentum and energy. That extra momentum transfer feeds back into the target’s position through the Heisenberg uncertainty principle, producing a tradeoff between how precisely position can be measured and how much momentum (and thus energy) becomes uncertain.
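The tradeoff can be made concrete with the photon relations p = h/λ and E = hc/λ. The script below (an illustrative sketch, not from the video) shows how halving the wavelength, which halves the timing uncertainty, also doubles the momentum kick delivered to the target:

```python
h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light, m/s

def photon_momentum(wavelength_m):
    """de Broglie relation for light: p = h / lambda."""
    return h / wavelength_m

def photon_energy(wavelength_m):
    """Photon energy: E = h * c / lambda."""
    return h * c / wavelength_m

# Halving the wavelength doubles the momentum transferred to the target
for lam in (500e-9, 250e-9, 125e-9):
    print(f"lambda = {lam:.0e} m -> p = {photon_momentum(lam):.2e} kg*m/s")
```

This inverse relationship is the quantum half of the tradeoff; the gravitational half enters once the energy E = hc/λ becomes large.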

Einstein’s ingredients sharpen the limit. If the measuring photon is pushed to ever shorter wavelengths, its energy becomes large enough to create a gravitational field. Even though photons are massless, energy acts like effective mass (via E=mc^2), stretching spacetime and adding a new uncertainty to the distance being measured. The argument balances two competing effects: the usual quantum improvement from shorter wavelengths versus the growing gravitational distortion. When the photon wavelength reaches the Planck length, the two uncertainties match; pushing smaller makes the total uncertainty worse. In this picture, the Planck length becomes the best possible resolution for distance.
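A rough way to see the balance (an order-of-magnitude sketch; the exact numerical prefactors are conventions, not results stated in the video) is to model the total distance uncertainty as a quantum term of about one wavelength plus a gravitational term of order G·E/c⁴, then look for where the sum is smallest:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant
c = 2.99792458e8        # speed of light, m/s
l_p = math.sqrt(hbar * G / c**3)  # Planck length

def total_uncertainty(lam):
    # Quantum term: resolution is limited to ~ one wavelength
    quantum = lam
    # Gravitational term: distortion from the probe's energy
    # E ~ hbar*c/lam, of order G*E/c^4 = l_p**2 / lam
    gravitational = G * (hbar * c / lam) / c**4
    return quantum + gravitational

# Scan wavelengths around the Planck length; the sum is smallest near l_p,
# because one term grows and the other shrinks as lambda changes
lams = [l_p * f for f in (0.25, 0.5, 1.0, 2.0, 4.0)]
best = min(lams, key=total_uncertainty)
print(f"minimum total uncertainty near lambda ~ {best:.2e} m (l_p = {l_p:.2e} m)")
```

The two terms go as λ and l_p²/λ, so their sum is minimized exactly when λ equals the Planck length, which is the "best possible resolution" claim in quantitative form.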

Trying to measure an object smaller than that scale triggers more dramatic consequences. A photon energetic enough to probe below the Planck length would effectively generate a black hole with a horizon of comparable size, swallowing the region it was meant to resolve. Another route—localizing an electron—also fails. Confining an electron’s energy to a Planck-length-diameter volume forces energy uncertainty up to the electron’s full mass-energy, enabling pair production: virtual electron–positron pairs pop into existence, making the electron’s position effectively “flit” rather than settle.
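The black-hole threshold can be sketched numerically (assumed form: the standard Schwarzschild radius r_s = 2Gm/c² applied to the photon's effective mass, an order-of-magnitude move rather than an exact general-relativistic treatment):

```python
import math

h = 6.62607015e-34      # Planck constant, J*s
hbar = h / (2 * math.pi)
G = 6.67430e-11         # gravitational constant
c = 2.99792458e8        # speed of light, m/s

def schwarzschild_radius_of_photon(lam):
    """Horizon scale for the photon's effective mass m = E/c^2 = h/(lam*c)."""
    m_eff = h / (lam * c)
    return 2 * G * m_eff / c**2

# Solving r_s(lam) = lam gives lam = sqrt(2*G*h/c^3):
# the horizon of the probe photon matches its own wavelength
lam_critical = math.sqrt(2 * G * h / c**3)
l_p = math.sqrt(hbar * G / c**3)
print(f"critical wavelength ~ {lam_critical:.2e} m "
      f"({lam_critical / l_p:.1f} Planck lengths)")
```

The crossover lands within a small numerical factor of the Planck length, which is why the exact prefactors don't matter for the argument.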

All of this doesn’t prove that space is literally made of discrete Lego-like chunks. Instead, it supports a narrower claim: distances are undefined on the Planck scale. General relativity is expected to break down there because spacetime curvature can’t be defined sharply. The leading intuition is that spacetime becomes a quantum froth—often dubbed “spacetime foam,” with virtual fluctuations and even transient black holes—until a full theory of quantum gravity clarifies what “space” and “time” really mean at the smallest scales.

Cornell Notes

The Planck length (~1.6×10^-35 m) acts as the smallest meaningful scale for distance because measurement runs into combined quantum and gravitational limits. Heisenberg’s microscope shows that improving distance precision by using shorter-wavelength light increases momentum/energy uncertainty. Adding Einstein’s ideas, highly energetic photons also curve spacetime, creating an additional distance uncertainty that grows as the wavelength shrinks. When the wavelength reaches the Planck length, the quantum and gravitational uncertainties balance; further shrinking makes measurement worse. Attempts to probe smaller regions would effectively form black holes or trigger pair production, and at that scale spacetime curvature becomes fundamentally uncertain—so “distance” itself loses operational definition.

Why does repeatedly halving a distance eventually stop being meaningful in physics?

Because defining smaller and smaller distances requires more precise measurements, and quantum mechanics plus gravity impose a minimum resolution. As the probe wavelength shrinks, quantum uncertainty from momentum/energy transfer decreases, but gravitational effects from the probe’s energy increase. At the Planck length, these effects balance; below it, the total uncertainty grows, so “distance” can’t be defined sharply enough to claim smaller structure.

How does Heisenberg’s microscope connect measurement precision to the Planck constant?

Distance measurement relies on timing light’s travel time, which is only well-defined to about one wave-cycle—roughly one wavelength. Using shorter wavelengths improves timing precision, but photons then carry larger momentum (momentum ~ h/λ). That momentum transfer feeds into the target’s position via the Heisenberg uncertainty principle, yielding a fundamental tradeoff between position precision and momentum uncertainty. The Planck constant sets the scale of that quantum tradeoff.

What new uncertainty appears when photons get extremely energetic?

Einstein’s mass–energy equivalence implies that energy gravitates. Even though photons are massless, their energy corresponds to an effective mass (E/c^2), producing a gravitational field. That field changes the geometry between the measuring apparatus and the target, stretching space by a length on the order of G·m_eff/c^2—the Schwarzschild scale of the effective mass. This spacetime warping adds an uncertainty to the measured distance that grows as photon energy increases.
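A small numerical illustration (assumed formulas: m_eff = h/(λc) from E = hc/λ and E = mc², and a distortion scale of order G·m_eff/c²; the argument only cares about how this scale grows, not its exact coefficient):

```python
h = 6.62607015e-34  # Planck constant, J*s
G = 6.67430e-11     # gravitational constant
c = 2.99792458e8    # speed of light, m/s

def effective_mass(lam):
    """m_eff = E / c^2 = h / (lam * c), combining E = h*c/lam with E = m*c^2."""
    return h / (lam * c)

def gravitational_length(lam):
    """Order-of-magnitude scale of the spacetime distortion: G * m_eff / c^2."""
    return G * effective_mass(lam) / c**2

# For ordinary light the distortion is utterly negligible...
print(f"green light (500 nm): ~{gravitational_length(500e-9):.1e} m")
# ...but it grows without bound as the probe wavelength shrinks
print(f"1e-30 m probe:        ~{gravitational_length(1e-30):.1e} m")
```

Because the distortion scales as 1/λ, it eventually overtakes the quantum gain from shorter wavelengths, which is the crossover the Planck length marks.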

Why does the Planck length mark the “best possible” distance resolution in this argument?

Two uncertainties compete: (1) the usual quantum measurement uncertainty that improves with shorter wavelength, and (2) the gravitational distortion uncertainty that worsens as wavelength decreases. The argument sets the photon wavelength equal to the Planck length when the two contributions become comparable. Past that point, the gravitational term dominates, so shrinking the wavelength further makes the measurement less precise.

What happens if someone tries to measure a region smaller than the Planck length?

In the light-based picture, the required photon energy would be enough to create a black hole with an event horizon of roughly Planck-length size, preventing the region from being resolved. In the particle-localization picture, confining an electron’s energy to a Planck-length-diameter volume drives energy uncertainty up to the electron’s mass-energy scale, enabling pair production (virtual electron–positron pairs). The resulting continual creation and annihilation makes the electron’s position effectively unstable.
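The pair-production threshold can be estimated with a sketch (assumed relation: ΔE ≈ ħc/Δx from the uncertainty principle; the crossover it predicts is the reduced Compton wavelength, which sits far above the Planck length):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
m_e = 9.1093837015e-31  # electron mass, kg
G = 6.67430e-11         # gravitational constant

def energy_uncertainty(delta_x):
    """Confining to delta_x forces delta_p ~ hbar/delta_x, so delta_E ~ hbar*c/delta_x."""
    return hbar * c / delta_x

rest_energy = m_e * c**2

# Pair production becomes possible once delta_E >= m_e*c^2, i.e. below the
# reduced Compton wavelength -- about 22 orders of magnitude above the
# Planck length, so a Planck-scale box is far past the threshold
compton = hbar / (m_e * c)
l_p = math.sqrt(hbar * G / c**3)
print(f"pair production below ~{compton:.1e} m; Planck length is {l_p:.1e} m")
```

Squeezing an electron into a Planck-length box therefore supplies vastly more than the electron's rest energy, so electron–positron pairs appear long before the Planck scale is reached.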

Does this mean space is made of discrete chunks?

Not necessarily. The argument supports that distances are undefined at the Planck scale, not that space must be a simple lattice of fixed-size pieces. It suggests that spacetime curvature becomes fundamentally uncertain and that quantum fluctuations of spacetime—sometimes described as “spacetime foam” with virtual black holes and wormholes—dominate, while the precise nature of spacetime requires a theory of quantum gravity.

Review Questions

  1. In Heisenberg’s microscope, why does using shorter-wavelength light improve timing precision but worsen the position measurement through momentum transfer?
  2. How do gravitational effects from a high-energy photon change the uncertainty budget, and why does that lead to a minimum meaningful length?
  3. What two different failure modes appear when trying to probe below the Planck length—one using black-hole formation and one using pair production?

Key Points

  1. The Planck length (~1.6×10^-35 m) is treated as the smallest meaningful scale for distance because measurement uncertainty grows when probes get more energetic.

  2. Heisenberg’s microscope links distance measurement limits to wavelength: timing precision is about one wave-cycle, and shorter wavelengths increase photon momentum.

  3. The Planck constant sets the scale of quantum “chunkiness,” and it feeds into the Planck length through a specific combination of fundamental constants.

  4. Einstein’s E=mc^2 implies that energetic photons gravitate, so shrinking wavelength increases spacetime warping and adds a new distance uncertainty.

  5. When photon wavelength reaches the Planck length, quantum and gravitational uncertainties balance; further reduction makes measurements worse.

  6. Probing smaller than the Planck length would require energies that can form black holes with Planck-scale horizons or trigger pair production that prevents stable localization.

  7. At the Planck scale, spacetime curvature and even the concept of distance become fundamentally uncertain, suggesting the need for quantum gravity to define what “space” is doing there.

Highlights

At the Planck length, two uncertainty sources—quantum measurement limits and gravitational distortion from the probe’s energy—become comparable, and pushing smaller makes things less measurable.
A photon energetic enough to probe below the Planck length would effectively create a black hole of similar size, blocking the attempt to resolve smaller structure.
Localizing an electron to within a Planck-length-diameter volume drives energy uncertainty to the point where pair production prevents a stable position definition.
The Planck length is derived from fundamental constants and is interpreted as the scale where “distance” stops being operationally well-defined, not necessarily where space becomes a simple discrete lattice.
