
What is NOT Random?

Veritasium · 5 min read

Based on Veritasium's video on YouTube. If you like this content, support the original creators by watching, liking, and subscribing to their channel.

TL;DR

Information is framed as a measure of unpredictability: redundant patterns (like common letter sequences) reduce information and enable compression.

Briefing

The universe isn’t “random” in the everyday sense—many outcomes are predictable—but the arrow of time and the limits of prediction point to a deeper kind of randomness that keeps generating new information. The key idea is that information is tied to entropy: highly ordered states carry little information and compress easily, while maximally disordered states carry so much information that they can’t be compressed. That framing matters because it links what people call “meaning” and “predictability” to the physical laws governing how systems evolve.

At the start, the argument borrows Laplace’s famous thought experiment: if every particle’s position and velocity were known, then the future of the universe would be fully determined. That would imply no randomness anywhere—not even in human behavior—because everything would follow from the universe’s state at a given time. But the discussion then shifts from determinism to what “information” actually is. Information is first described as order—DNA’s molecular sequence, the order of bits in data streams, and the arrangement of letters and words. Yet not all order counts equally. Common patterns are redundant: after a “Q” comes a “U” most of the time, and Shannon estimated English has about 75% redundancy, making it compressible. Even video is compressible because neighboring pixels cluster by color and most pixels barely change from frame to frame.
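To make the compressibility claim concrete, here is a minimal sketch (added for this summary, not from the video) using Python's standard zlib module: a repetitive English phrase shrinks to a small fraction of its size, while the same number of random bytes barely compresses at all.

```python
import os
import zlib

# Highly ordered data: one English phrase repeated many times.
ordered = b"the quick brown fox jumps over the lazy dog. " * 100

# Maximally disordered data of the same length: random bytes.
noise = os.urandom(len(ordered))

for label, data in [("ordered text", ordered), ("random bytes", noise)]:
    compressed = zlib.compress(data, level=9)
    print(f"{label}: {len(data)} -> {len(compressed)} bytes "
          f"(ratio {len(compressed) / len(data):.2f})")
```

The repeated phrase typically compresses to a few percent of its original size, while the random bytes stay at or slightly above their original length, mirroring the redundancy argument above.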

From there, the concept flips. The most informative objects are those that are as unpredictable as possible: random strings of zeros and ones, or white-noise video where each pixel is independent and changes without pattern. Such states have maximum entropy and can’t be compressed. But they also carry no human meaning—randomness doesn’t build words, organisms, or recognizable structures. Meaning tends to emerge in the middle ground: neither perfectly ordered nor maximally disordered, but structured enough for patterns to be detected.

The discussion then connects this to physics. If Laplace's determinism held perfectly, the universe's information content would stay constant, and since information is equated with entropy, entropy would stay constant too. Instead, the second law of thermodynamics says entropy increases over time, implying that the universe's information content rises. The question becomes: where does the new information come from? The best candidate offered is quantum mechanics. Quantum theory is probabilistic: it can't predict with absolute certainty where an electron will be found, only the probabilities of the possible outcomes. When a measurement pins down a specific result, it yields information that wasn't determined beforehand.

Those quantum "new bits" are proposed as a driver of increasing entropy, and thus of the second law. The argument also reframes the second law as a prerequisite for genuine unpredictability: in a universe where entropy must rise, the future can't be fully fixed in advance. Finally, chaos, the extreme sensitivity of some systems to initial conditions, suggests that even microscopic quantum randomness could matter at macroscopic scales, potentially influencing brain states and supporting the felt experience of free will. The result is a universe where the future is at least somewhat undetermined, not perfectly predictable.

Cornell Notes

The transcript links predictability, information, and thermodynamics. Order makes data compressible and low in information; maximum randomness produces high entropy and can’t be compressed, but it also lacks meaning. If the universe were fully deterministic in Laplace’s sense, entropy (and thus information) would remain constant. Instead, the second law says entropy increases, implying new information is continually generated. Quantum mechanics supplies a mechanism: probabilistic outcomes become definite only when measured, creating information that wasn’t knowable with certainty beforehand; chaos then amplifies tiny quantum differences into large effects.

How does the transcript connect “information” to “order” and “entropy”?

Information is first described in terms of order: DNA's sequence encodes how an organism is built, and the order of bits, letters, and words encodes messages. But order can be redundant (after a "Q", a "U" is very likely), so predictable "order" carries less information. Shannon's estimate that English has about 75% redundancy illustrates why patterns compress well. The transcript then defines the extreme: maximum information corresponds to maximum unpredictability (a random string of zeros and ones), which has maximum entropy. In that view, information and entropy are fundamentally the same quantity: low-entropy ordered strings contain little information, while high-entropy random strings contain so much information they can't be compressed.
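As an informal illustration of that equivalence (a sketch written for this summary, not taken from the transcript), Shannon entropy can be computed directly from symbol frequencies: a perfectly ordered string scores zero bits per symbol, redundant English-like text lands in between, and uniformly random letters approach the maximum of log2(26) ≈ 4.7 bits per symbol.

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per symbol: H = sum(p_i * log2(1/p_i))."""
    counts = Counter(text)
    n = len(text)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

ordered = "A" * 1000                                    # perfectly ordered
english = "after a q comes a u most of the time " * 25 # redundant, patterned
noise = "".join(random.choices(string.ascii_lowercase, k=1000))  # near-random

for label, s in [("ordered", ordered), ("english-like", english), ("random", noise)]:
    print(f"{label:>12}: {shannon_entropy(s):.2f} bits/symbol")
```

Note that this single-symbol measure only captures letter frequencies; the Q-then-U kind of redundancy shows up when longer blocks of text are analyzed, which is how Shannon arrived at his much higher redundancy estimate.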

Why does maximum randomness carry “no meaning,” even though it contains maximum information?

A maximally random sequence is described as white noise: each pixel or digit is independent of the others, so nothing can be predicted from what came before. That makes it incompressible and information-rich in a technical sense. But meaning for humans depends on structure that supports recognizable patterns: random strings of letters don't generally form words, and random DNA sequences don't build organisms. So the transcript draws a middle-ground claim: meaning tends to appear neither in perfect order (too little information) nor in perfect disorder (too little structure to interpret), but in complex patterns between the two extremes.

What would Laplace’s determinism imply about entropy, and why is that a problem?

Laplace's idea is that if the positions and velocities of all fundamental particles were known, the entire future would be determined. The transcript extends this to the universe's information content: if the future is fully fixed by the present state, then the universe's total information would never change over time. Since information is equated with entropy, that would mean entropy stays constant. The problem is the second law of thermodynamics, which says entropy increases with time. That observed increase implies the universe's information content rises, contradicting the constant-information picture implied by strict determinism.

How does quantum mechanics supply “new information” in the transcript’s framework?

Quantum mechanics is presented as probabilistic: it can’t predict with absolute certainty where an electron will be at a later time, only the probabilities of where it’s likely to be found. When an interaction or measurement locates the electron at a specific point, that result becomes known and therefore counts as information gained that wasn’t determined beforehand with certainty. The transcript connects these measurement events to the growth of entropy: each quantum “pinning down” generates new information, increasing disorder in the thermodynamic sense.
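As a toy model (a classical simulation written for this summary, not actual quantum dynamics), sampling outcomes from Born-rule probabilities shows the shape of the argument: before measurement only a distribution exists, and each measurement produces one definite bit that was not fixed in advance. The amplitudes below are arbitrary example values.

```python
import random

# Hypothetical qubit amplitudes (0.6, 0.8); Born-rule probabilities are
# the squared amplitudes: P(0) = 0.36, P(1) = 0.64.
probabilities = {"0": 0.6 ** 2, "1": 0.8 ** 2}

# Each "measurement" draws one definite outcome from the distribution.
# The expected information gained per draw is
# -(0.36*log2(0.36) + 0.64*log2(0.64)) ~= 0.94 bits.
outcomes = random.choices(
    list(probabilities), weights=list(probabilities.values()), k=20
)
print("measured bits:", "".join(outcomes))
```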

Why does chaos matter for turning tiny quantum randomness into large outcomes like free will?

The transcript argues that quantum events might seem too small to matter, but chaotic systems are sensitive to initial conditions. In chaos, extremely small differences can produce dramatically different later behavior—often summarized as the butterfly effect. If brains and people are modeled as physical systems with chaotic dynamics, then quantum-level randomness could influence which macroscopic neural states occur. That provides a route from probabilistic quantum events to undetermined future behavior, supporting the idea that free will requires the second law’s unpredictability.
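A standard way to see this sensitivity (a generic illustration, not taken from the video) is the logistic map in its chaotic regime: two trajectories whose starting points differ by one part in a trillion typically disagree completely within about forty iterations.

```python
R = 4.0  # logistic map parameter in the fully chaotic regime

def iterate(x: float, steps: int) -> float:
    """Apply the logistic map x -> R*x*(1-x) repeatedly."""
    for _ in range(steps):
        x = R * x * (1.0 - x)
    return x

x0 = 0.4           # one trajectory
y0 = x0 + 1e-12    # a second trajectory, perturbed by one part in a trillion

for steps in (10, 30, 50):
    gap = abs(iterate(x0, steps) - iterate(y0, steps))
    print(f"after {steps:>2} steps: |difference| = {gap:.2e}")
```

On average the gap roughly doubles each step, so the initial 1e-12 separation reaches order one within a few dozen iterations; this is the amplification the transcript appeals to when scaling quantum-sized differences up to macroscopic ones.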

Review Questions

  1. If information is treated as entropy, what does that imply about compressibility of ordered versus random data?
  2. What chain of reasoning links the second law of thermodynamics to the claim that the future is not fully determined?
  3. How do quantum measurement and chaos jointly affect the transcript’s view of unpredictability in human behavior?

Key Points

  1. Information is framed as a measure of unpredictability: redundant patterns (like common letter sequences) reduce information and enable compression.
  2. Maximally random states have maximum entropy and can't be compressed, but they also lack the structured patterns needed for meaning.
  3. Laplace-style determinism would imply constant information over time, which would contradict the second law's observed entropy increase.
  4. The second law is interpreted as evidence that new information is continually generated rather than merely revealed.
  5. Quantum mechanics is presented as a source of genuinely probabilistic outcomes, with measurement producing information that wasn't fixed in advance.
  6. Chaos provides a mechanism for amplifying tiny quantum uncertainties into large-scale differences, potentially affecting brain dynamics and free will.

Highlights

English is estimated to have about 75% redundancy, illustrating why language is compressible: predictable patterns carry less information.
A maximally informative object is essentially incompressible white noise—high entropy doesn’t automatically translate into meaning.
The second law’s entropy increase is treated as a sign that the universe’s information content grows, not stays constant.
Quantum measurement is described as the moment probabilistic possibilities become definite facts, generating new information.
Chaos is used to argue that microscopic quantum randomness could cascade into macroscopic outcomes, including decisions.