What is NOT Random?
Based on Veritasium's video on YouTube. If you like this material, support the original creators by watching, liking, and subscribing to their channel.
Briefing
The universe isn’t “random” in the everyday sense—many outcomes are predictable—but the arrow of time and the limits of prediction point to a deeper kind of randomness that keeps generating new information. The key idea is that information is tied to entropy: highly ordered states carry little information and compress easily, while maximally disordered states carry so much information that they can’t be compressed. That framing matters because it links what people call “meaning” and “predictability” to the physical laws governing how systems evolve.
At the start, the argument borrows Laplace’s famous thought experiment: if every particle’s position and velocity were known, then the future of the universe would be fully determined. That would imply no randomness anywhere—not even in human behavior—because everything would follow from the universe’s state at a given time. But the discussion then shifts from determinism to what “information” actually is. Information is first described as order—DNA’s molecular sequence, the order of bits in data streams, and the arrangement of letters and words. Yet not all order counts equally. Common patterns are redundant: after a “Q” comes a “U” most of the time, and Shannon estimated English has about 75% redundancy, making it compressible. Even video is compressible because neighboring pixels cluster by color and most pixels barely change from frame to frame.
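To make the redundancy-and-compression point concrete, here is a minimal Python sketch (my own illustration, not from the video; zlib and the sample phrase are arbitrary choices): repetitive, English-like text shrinks dramatically under compression, while the same amount of uniformly random data barely shrinks at all.

```python
import os
import zlib

# Highly redundant, English-like data: a short phrase repeated many times.
redundant = b"the quick brown fox jumps over the lazy dog. " * 200

# Comparison data: the same number of uniformly random bytes.
random_bytes = os.urandom(len(redundant))

for label, data in [("redundant text", redundant), ("random bytes", random_bytes)]:
    compressed = zlib.compress(data, 9)
    print(f"{label:15s}: {len(data)} -> {len(compressed)} bytes "
          f"({len(compressed) / len(data):.0%} of original)")
```

Running this, the repeated phrase typically compresses to a few percent of its original size, whereas the random bytes stay essentially the same size.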
From there, the concept flips. The most informative objects are those that are as unpredictable as possible: random strings of zeros and ones, or white-noise video where each pixel is independent and changes without pattern. Such states have maximum entropy and can’t be compressed. But they also carry no human meaning—randomness doesn’t build words, organisms, or recognizable structures. Meaning tends to emerge in the middle ground: neither perfectly ordered nor maximally disordered, but structured enough for patterns to be detected.
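As a rough numerical illustration of that middle-ground framing (again my own sketch, counting only single-letter frequencies rather than longer patterns), the snippet below estimates Shannon entropy per symbol: a string dominated by one letter scores far below the maximum, while uniformly random letters sit near log2(26) ≈ 4.7 bits per symbol, leaving a compressor nothing to exploit.

```python
import math
import random
import string
from collections import Counter

def entropy_bits_per_symbol(s):
    """Empirical Shannon entropy, -sum(p_i * log2 p_i), treating symbols as independent."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

alphabet = string.ascii_lowercase                  # 26 symbols -> maximum entropy log2(26)
ordered = "e" * 9000 + "t" * 1000                  # highly ordered: almost all one letter
uniform = "".join(random.choice(alphabet) for _ in range(10_000))  # uniformly random letters

print(f"maximum possible : {math.log2(len(alphabet)):.2f} bits/symbol")
print(f"ordered string   : {entropy_bits_per_symbol(ordered):.2f} bits/symbol")
print(f"random string    : {entropy_bits_per_symbol(uniform):.2f} bits/symbol")
```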
The discussion then connects this to physics. If Laplace's determinism held perfectly, the universe's information content would stay constant, and since information is identified with entropy, entropy would be constant too. Instead, the second law of thermodynamics says entropy increases over time, implying that the universe's information content rises. The question becomes: where does the new information come from? The best candidate offered is quantum mechanics. Quantum theory is probabilistic: outcomes such as where an electron will be found cannot be predicted with certainty, only assigned probabilities. When a measurement pins down a specific result, it yields information that wasn't determined beforehand.
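A toy simulation can make the "new bits" idea concrete. The sketch below is not a real quantum calculation; it simply uses random number generation as a stand-in for measurement, showing that a state which only fixes probabilities still yields a definite, previously undetermined bit each time it is "measured."

```python
import math
import random

# Toy illustration only: a qubit a|0> + b|1> fixes nothing but the
# probabilities |a|^2 and |b|^2 of the two possible outcomes.
a, b = math.sqrt(0.8), math.sqrt(0.2)    # amplitudes chosen so |a|^2 + |b|^2 = 1
p_one = b ** 2                           # Born rule: probability of measuring 1

# Each "measurement" produces a definite bit that was not determined beforehand;
# random.random() stands in for whatever nature does when the outcome is registered.
outcomes = [1 if random.random() < p_one else 0 for _ in range(20)]
print("measured bits :", outcomes)
print("fraction of 1s:", sum(outcomes) / len(outcomes), "(expected ~0.2)")
```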
Those quantum "new bits" are proposed as a driver of increasing entropy, and thus of the second law. The argument also reframes the second law as a prerequisite for genuine unpredictability: in a universe where entropy must rise, the future can't be fully fixed in advance. Finally, chaos, in which tiny differences in initial conditions later explode into large divergences, suggests that even microscopic quantum randomness could matter at macroscopic scales, potentially influencing brain states and supporting the felt experience of free will. The result is a universe whose future is at least somewhat undetermined, not perfectly predictable.
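The amplification step can also be illustrated numerically. The logistic map below is a standard textbook example of chaos (an assumption on my part, not necessarily the system discussed in the video): two trajectories that start 10^-10 apart diverge within a few dozen iterations, which is the sense in which tiny quantum-scale differences could grow into macroscopic ones.

```python
# Logistic map x -> r * x * (1 - x) in its chaotic regime (r = 4.0).
# Two trajectories starting 1e-10 apart diverge until they are effectively unrelated.
r = 4.0
x, y = 0.4, 0.4 + 1e-10

for step in range(1, 61):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
```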
Cornell Notes
The transcript links predictability, information, and thermodynamics. Order makes data compressible and low in information; maximum randomness produces high entropy and can’t be compressed, but it also lacks meaning. If the universe were fully deterministic in Laplace’s sense, entropy (and thus information) would remain constant. Instead, the second law says entropy increases, implying new information is continually generated. Quantum mechanics supplies a mechanism: probabilistic outcomes become definite only when measured, creating information that wasn’t knowable with certainty beforehand; chaos then amplifies tiny quantum differences into large effects.
- How does the transcript connect “information” to “order” and “entropy”?
- Why does maximum randomness carry “no meaning,” even though it contains maximum information?
- What would Laplace’s determinism imply about entropy, and why is that a problem?
- How does quantum mechanics supply “new information” in the transcript’s framework?
- Why does chaos matter for turning tiny quantum randomness into large outcomes like free will?
Review Questions
- If information is treated as entropy, what does that imply about compressibility of ordered versus random data?
- What chain of reasoning links the second law of thermodynamics to the claim that the future is not fully determined?
- How do quantum measurement and chaos jointly affect the transcript’s view of unpredictability in human behavior?
Key Points
1. Information is framed as a measure of unpredictability: redundant patterns (like common letter sequences) reduce information and enable compression.
2. Maximally random states have maximum entropy and can't be compressed, but they also lack the structured patterns needed for meaning.
3. Laplace-style determinism would imply constant information over time, which would contradict the second law's observed entropy increase.
4. The second law is interpreted as evidence that new information is continually generated rather than merely revealed.
5. Quantum mechanics is presented as a source of genuinely probabilistic outcomes, with measurement producing information that wasn't fixed in advance.
6. Chaos provides a mechanism for amplifying tiny quantum uncertainties into large-scale differences, potentially affecting brain dynamics and free will.