
The Quantum Experiment that Broke Reality | Space Time | PBS Digital Studios

PBS Space Time
5 min read

Based on PBS Space Time's video on YouTube. If you like this content, support the original creators by watching, liking, and subscribing to their channel.

TL;DR

Single-particle double-slit experiments build an interference pattern in the distribution of detection locations, even though each event deposits energy at one spot.

Briefing

A single-particle double-slit experiment delivers the central shock: interference patterns emerge even when photons (or electrons, or even large molecules) are fired one at a time. Each detection event lands at a single, well-defined spot—yet the long-run distribution of those spots builds the same alternating bright and dark bands expected from wave interference. That mismatch between “particle-like” detection and “wave-like” statistics forces a rethink of what quantum objects are doing between preparation and measurement, and it matters because it exposes how quantum reality can’t be reduced to the everyday intuition of objects traveling along definite paths.

The explanation begins with the familiar water-and-light analogy. In classical wave physics, two slits produce an interference pattern because waves add constructively when peaks meet peaks and destructively when peaks meet troughs. Light historically fit this wave picture: Thomas Young’s 1801 double-slit results showed alternating light and dark stripes, and later work by James Clerk Maxwell established light as an electromagnetic wave. But quantum theory adds a twist. Light also arrives in indivisible energy packets—photons—an idea tied to Einstein’s photoelectric effect and Max Planck’s quantized energy levels. If a photon can’t split into two halves, it should choose one slit or the other. Still, the interference pattern appears in the final positions even when photons are sent individually.

The pattern doesn’t come from any spreading of a single photon’s energy across the screen. Each photon deposits its energy at one location. Instead, the interference emerges statistically: after many independent photons, the landing positions match the probability distribution produced by a wave that passes through both slits. The same behavior shows up with electrons and with whole atoms and molecules under special conditions, including buckminsterfullerene (“buckyballs”), a 60-carbon spherical molecule. The implication is stark: each individual quantum entity behaves as if it travels through both slits in a wave-like way, and that wave-like behavior determines where it is most likely to be detected.
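This statistical emergence is easy to see in a small simulation. The sketch below (all parameters are illustrative, not from the video) draws one localized "detection" at a time from the probability distribution a two-slit wave predicts, then histograms the hits. No single draw shows fringes; the accumulated counts do.

```python
import numpy as np

# Hypothetical double-slit geometry (illustrative values, not from the video):
d, lam, L = 10e-6, 500e-9, 1.0   # slit spacing (m), wavelength (m), screen distance (m)

x = np.linspace(-0.2, 0.2, 2001)          # candidate positions on the screen (m)
# Phase difference between the two paths, small-angle approximation: d*x/L.
phase = 2 * np.pi * d * x / (lam * L)
intensity = np.cos(phase / 2) ** 2        # |psi1 + psi2|^2 up to normalization
p = intensity / intensity.sum()           # probability of landing in each bin

rng = np.random.default_rng(0)
hits = rng.choice(x, size=50_000, p=p)    # each draw = one localized detection event

counts, _ = np.histogram(hits, bins=100)
# The histogram of single-spot detections reproduces the fringe pattern:
# tall bins at constructive positions, near-empty bins between them.
```

Each sampled value models one photon arriving at a single point; only the histogram over many trials reveals the bright-and-dark band structure.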

Quantum mechanics formalizes this with a wave function: a mathematical description of how properties like position (and, in other experiments, momentum, energy, and spin) behave as wave-like distributions. The wave function encodes possible outcomes at every stage of the journey, effectively mapping a family of possible paths from source to screen. The remaining mystery is what turns that cloud of possibilities into a single detected result.
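The difference between "through both slits as a wave" and "through one slit or the other" can be made concrete with a toy wave function. In the sketch below (geometry and wavenumber are assumed illustrative values), each slit contributes a complex amplitude; the quantum prediction squares the *sum* of amplitudes, while a classical which-slit mixture sums the squared amplitudes, and only the former oscillates.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 1001)          # screen coordinate (arbitrary units)
k, d = 40.0, 0.5                          # wavenumber and slit separation (arbitrary)

def amplitude(x, slit_pos):
    # Toy 2D spherical-wave amplitude from a slit to each screen point.
    r = np.hypot(x - slit_pos, 5.0)       # distance, with screen 5 units away
    return np.exp(1j * k * r) / np.sqrt(r)

psi1 = amplitude(x, -d / 2)
psi2 = amplitude(x, +d / 2)

both_slits = np.abs(psi1 + psi2) ** 2                  # wave through both slits
one_or_other = np.abs(psi1) ** 2 + np.abs(psi2) ** 2   # classical "which slit" mixture

# both_slits oscillates (fringes); one_or_other is smooth. Their difference is
# exactly the cross term 2*Re(psi1 * conj(psi2)) -- the interference contribution.
```

The cross term is what the which-slit picture leaves out, and it is why the wave function must describe both paths at once to reproduce the observed distribution.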

One influential answer is the Copenhagen interpretation associated with Werner Heisenberg and Niels Bohr. It treats the wave function not as a physical object but as pure possibility. Before detection, it’s meaningless to assign definite properties to the particle; only when measurement occurs does the wave function “collapse,” selecting a specific location and path. In this view, different possible paths act like competing alternatives whose interactions shape the final probability distribution, while the actual outcome remains fundamentally random within the constraints of the wave function.

Other interpretations keep the same experimental predictions but change the ontology—some give the wave function physical reality, and the many-worlds approach (teased for later) replaces collapse with branching outcomes. Either way, the double-slit result remains the benchmark: reality yields interference not just from many events, but from single events whose statistics only make sense through wave-like quantum behavior.

Cornell Notes

Single-particle double-slit experiments produce interference patterns even when photons, electrons, or large molecules are fired one at a time. Each detection is a single, localized event, but the accumulated landing positions form the same bright-and-dark bands expected from classical wave interference. Quantum theory captures this with a wave function, which encodes probabilities for outcomes and possible paths between preparation and detection. The key interpretive divide is what the wave function “is” and what causes the transition from many possibilities to one observed result. The Copenhagen interpretation (Heisenberg and Bohr) treats the wave function as pure possibility and describes measurement as wave-function collapse, while other interpretations assign more physical reality to the wave function or avoid collapse altogether.

Why does the double-slit experiment still produce an interference pattern when particles are fired one at a time?

In the classical picture, interference arises because waves from two slits add constructively or destructively. Quantum particles behave differently at the level of individual detections: a photon (or electron) deposits all its energy at one spot on the screen. Yet after many single-particle trials, the distribution of those single landing positions matches the interference pattern expected from a wave passing through both slits. The pattern therefore reflects the probability structure encoded by the wave function, not a literal splitting of energy into two halves that later recombines.

What does the wave function represent in the quantum-mechanics description?

The wave function is the mathematical object that describes wave-like distributions of quantum properties. In the double-slit setup, it encodes possible final positions and also the possible positions (and paths) at intermediate stages between preparation and detection. More broadly, quantum behavior shows similar waviness for momentum, energy, and spin in other experiments. The wave function’s role is central because it determines the probabilities of measurement outcomes.

What is the Copenhagen interpretation’s account of measurement and “collapse”?

The Copenhagen interpretation treats the wave function as pure possibility rather than a physical wave with definite properties. Before detection, it argues that assigning definite properties like “which slit” is not meaningful. Measurement triggers “collapse,” selecting a specific location and path consistent with the wave function. In this framework, multiple possible paths act like alternatives whose interactions shape the final probability distribution, but the actual outcome is fundamentally random within those constraints.

How do electrons and buckyballs strengthen the case that quantum behavior isn’t limited to light?

The same qualitative effect—single-event detections building an interference pattern—appears when electrons are fired through a double-slit apparatus. It also shows up for whole atoms and molecules under special conditions, including buckminsterfullerene (“buckyballs”), a 60-carbon spherical molecule. Observing interference with increasingly massive systems supports the idea that wave-like behavior is a general feature of quantum entities, not a peculiarity of photons.
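A rough sense of why massive-molecule interference is hard comes from the de Broglie relation λ = h/(mv). The arithmetic below uses the standard Planck constant and C60 mass; the ~200 m/s beam speed is an assumed, experiment-typical value, not a figure from the video.

```python
# de Broglie wavelength of a C60 buckyball: lambda = h / (m * v).
h = 6.626e-34                 # Planck constant (J*s)
amu = 1.661e-27               # atomic mass unit (kg)
m_c60 = 60 * 12.011 * amu     # mass of one C60 molecule (kg)
v = 200.0                     # ASSUMED molecular-beam speed (m/s), illustrative

lam = h / (m_c60 * v)         # de Broglie wavelength (m)
# lam comes out on the order of a few picometers -- far smaller than the
# molecule itself, which is why molecular interference needs very fine
# gratings and carefully prepared "special conditions."
```

The wavelength shrinks as mass grows, so observing fringes with ever-larger systems is a direct, quantitative test of how far wave-like behavior extends.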

What tension does the double-slit result create with everyday intuition?

Everyday intuition treats objects as having definite trajectories. The double-slit experiment instead produces particle-like detections at single points while requiring wave-like structure to explain the eventual distribution. The result implies that between creation and detection, a quantum object cannot be fully described as traveling along a single definite path; it behaves as if it carries information about multiple possible paths until measurement selects one outcome.

Review Questions

  1. In what sense is the interference pattern “wave-like” if each individual detection event is localized?
  2. According to the Copenhagen interpretation, why is it considered meaningless to define a particle’s properties before detection?
  3. How does observing double-slit interference with electrons and buckyballs affect interpretations of what quantum entities are doing?

Key Points

  1. Single-particle double-slit experiments build an interference pattern in the distribution of detection locations, even though each event deposits energy at one spot.

  2. Classical interference depends on constructive and destructive overlap of waves; quantum interference instead reflects probabilities encoded by the wave function.

  3. Photons are indivisible energy packets, yet their single-event detections still follow interference statistics, implying wave-like behavior beyond simple “which slit” choices.

  4. Electrons and large molecules like buckminsterfullerene (“buckyballs”) also produce double-slit interference, indicating the effect is not limited to light.

  5. The wave function is the central mathematical tool for describing wave-like distributions of quantum properties across possible paths and intermediate stages.

  6. The Copenhagen interpretation (Werner Heisenberg and Niels Bohr) treats the wave function as pure possibility and describes measurement as wave-function collapse.

  7. Interpretations differ on what the wave function “is” and whether collapse is fundamental, even though they must match the same experimental predictions.

Highlights

Interference emerges from many single, localized detections: the screen’s bright-and-dark bands are a statistical signature of quantum wave-like probability.
A photon can’t split into two halves, yet it behaves as if it carries information about both slits until detection selects one outcome.
The same interference logic extends to electrons and to buckminsterfullerene (“buckyballs”), pushing the phenomenon beyond light.
The Copenhagen interpretation frames the wave function as pure possibility, with “collapse” occurring at detection rather than before.
The wave function encodes possible paths and possible outcomes throughout the journey, not just at the moment of measurement.