Where Does Complexity Come From? (Big Picture Ep. 3/5)
Based on minutephysics's video on YouTube. If you like this content, support the original creators by watching, liking, and subscribing to their channel.
Entropy measures how many microscopic arrangements correspond to the same macroscopic state, while complexity measures how hard it is to describe macroscopic properties.
Briefing
The universe’s march toward higher entropy doesn’t prevent complex structures from appearing—it often helps explain why they show up in the first place. The key distinction is that entropy and complexity measure different things: entropy tracks how many microscopic arrangements produce the same macroscopic state, while complexity tracks how difficult it is to describe the macroscopic state in detail. That difference resolves the apparent contradiction between the Second Law of Thermodynamics (increasing disorder overall) and the emergence of intricate systems like stars, life, and even cats.
Entropy can decrease locally without violating the Second Law, as long as the rest of the universe pays the cost. Cooling water to form ice is the classic example: order increases in the water, but the surrounding environment gains at least as much disorder. Yet even with that accounting, the deeper question remains—why do complex, information-rich patterns ever arise if the overall trend is toward greater disorder?
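The freezing example can be checked with a back-of-envelope calculation. This sketch is my own illustration, not from the transcript; it uses the standard latent heat of fusion of water and ΔS = Q/T for heat exchanged at (roughly) constant temperature.

```python
# Back-of-envelope check (illustration, not from the video): freezing 1 g
# of water at 0 °C inside a freezer whose air sits at -10 °C. The water's
# entropy drops as it orders into ice, but the heat it releases raises the
# entropy of the colder surroundings by more, so the total still goes up.
L = 334.0          # latent heat of fusion of water, J/g (standard value)
T_water = 273.15   # K, freezing point of water
T_env = 263.15     # K, freezer air at -10 °C

dS_water = -L / T_water   # entropy lost by the water (J/K per gram)
dS_env = +L / T_env       # entropy gained by the surroundings

print(dS_water)           # negative: local order increases
print(dS_water + dS_env)  # positive: the Second Law holds overall
```

Because the environment is colder than the water, the same heat Q raises the environment's entropy (Q/T_env) by more than it lowers the water's (Q/T_water), which is exactly the "rest of the universe pays the cost" accounting.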
The transcript’s answer hinges on how complexity evolves during mixing. Consider a cup initially split into half coffee and half milk. At first, the setup has relatively low entropy because swapping coffee molecules with each other (or milk with each other) doesn’t change the macroscopic appearance much, but swapping coffee with milk would. The system is also simple to describe: milk sits on top, coffee on the bottom.
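The "swapping molecules" intuition can be made quantitative with Boltzmann-style microstate counting. The sketch below is my own illustration (the cell sizes and counts are made up): divide the cup into cells, describe the macrostate by how many milk molecules each cell contains, and count the arrangements compatible with that description.

```python
# Illustration (assumed setup, not from the video): coarse-grained entropy
# S = ln(Omega), where Omega counts the molecular arrangements consistent
# with a per-cell description of the cup (in units of Boltzmann's constant).
from math import comb, log

def coarse_entropy(milk_counts, n_per_cell):
    """ln of the number of ways to place the given number of milk
    molecules in each cell of n_per_cell molecules."""
    return sum(log(comb(n_per_cell, k)) for k in milk_counts)

n = 100  # molecules per cell (arbitrary toy number)
separated = [n, n, n, n, 0, 0, 0, 0]   # milk on top, coffee on the bottom
mixed     = [n // 2] * 8               # uniform 50/50 blend

print(coarse_entropy(separated, n))    # 0.0: only one arrangement per cell
print(coarse_entropy(mixed, n))        # ~534: 8 * ln C(100, 50), far larger
```

The separated state has entropy zero at this resolution because each cell is all-milk or all-coffee: swapping any coffee molecule with a milk molecule would change the macrostate. The mixed state admits astronomically many indistinguishable arrangements, which is what "high entropy" means here.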
As the liquids mix, entropy rises because the macroscopic state becomes less sensitive to which specific molecules occupy which micro-positions. Swapping coffee and milk molecules becomes increasingly unnoticeable once the mixture is thoroughly blended. Complexity, however, behaves differently at first: describing the evolving, interwoven “tendrils” of coffee and milk requires increasingly detailed information. In other words, entropy increases while complexity can initially grow because the system’s macroscopic features become harder to summarize.
Eventually the mixture reaches equilibrium, where coffee and milk are thoroughly blended. At that point, entropy is high and complexity falls again: the system looks like a uniform blend, so there’s little intricate structure left to specify. The general pattern is therefore a rise and fall, not a cycle: as entropy increases, complexity tends to grow early, peak, and then decline as equilibrium simplifies the macroscopic description.
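The whole mixing story can be sketched as a toy simulation. This is my own illustration, inspired by compression-based complexity measures rather than taken from the transcript: a 1-D "cup" mixes by random neighbor swaps, coarse-grained entropy is counted as above, and the gzip-compressed size of the coarse picture stands in for "how hard the state is to describe." Whether that proxy visibly rises and then falls depends on the resolution and coarse-graining chosen, so only the entropy increase is asserted here.

```python
# Toy diffusion sketch (assumed setup, not from the video): milk (1) sits
# on top of coffee (0); random adjacent swaps model molecular diffusion.
import gzip, random
from math import comb, log

random.seed(0)
N, CELLS = 4096, 64
cup = [1] * (N // 2) + [0] * (N // 2)   # sharply separated start

def coarse(cup):
    w = N // CELLS
    return [sum(cup[i * w:(i + 1) * w]) for i in range(CELLS)]

def entropy(counts):
    # ln(Omega): arrangements of milk molecules per cell consistent
    # with the coarse-grained picture
    w = N // CELLS
    return sum(log(comb(w, k)) for k in counts)

def complexity(counts):
    # rough description-length proxy: gzip size of the coarse picture
    return len(gzip.compress(bytes(counts)))

s_start = entropy(coarse(cup))          # 0.0: every cell is pure
for _ in range(300_000):                # let the liquids interdiffuse
    i = random.randrange(N - 1)
    cup[i], cup[i + 1] = cup[i + 1], cup[i]
s_end = entropy(coarse(cup))

print(s_start < s_end)   # True: mixing raises coarse-grained entropy
print(complexity(coarse(cup)))  # description length of the blurred boundary
```

Tracking `complexity(coarse(cup))` at checkpoints during a longer run is the natural way to look for the rise-peak-fall arc the briefing describes.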
This framework is then applied to the universe. The early universe is described as smooth and dense—low entropy and extremely simple. The far future is expected to be smooth again but very dilute—high entropy and simple once more. The “complex” era is the middle ground: medium entropy, where structures like stars, galaxies, mineral veins, swirling clouds, amino acids, proteins, and living beings can form. Just as the coffee-and-milk mixture becomes simpler at equilibrium, the transcript suggests that in the far future complicated structures will be simplified out of existence as the universe approaches a more uniform, high-entropy state.
Cornell Notes
Entropy and complexity are not the same measure. Entropy counts how many microscopic arrangements correspond to the same macroscopic state, while complexity measures how hard it is to describe the macroscopic properties in detail. In a coffee-and-milk mixing example, entropy rises as the liquids blend and become less sensitive to which molecules are where, while complexity initially grows because the interwoven structure becomes harder to summarize. Once equilibrium is reached and the mixture becomes uniform, complexity drops even though entropy is high. The same pattern is used to frame cosmic history: the early universe is low-entropy and simple, the far future is high-entropy and simple, and the middle era is where complexity peaks.
Why doesn’t the Second Law automatically rule out organized structures like living beings?
How does the coffee-and-milk example show entropy increasing while complexity can first rise and then fall?
What does “equilibrium” mean in this context, and why does it reduce complexity?
How is the pattern from mixing liquids mapped onto the universe’s timeline?
What’s the central conceptual takeaway about “order” versus “complexity”?
Review Questions
- In the coffee-and-milk scenario, what changes at the point of equilibrium that causes complexity to drop even though entropy remains high?
- How does the transcript’s definition of complexity differ from entropy, and why does that distinction matter for understanding the emergence of complex structures?
- What does the “medium-entropy middle” imply about when complexity should peak in the universe’s history?
Key Points
1. Entropy measures how many microscopic arrangements correspond to the same macroscopic state, while complexity measures how hard it is to describe macroscopic properties.
2. Local decreases in entropy are allowed as long as increases elsewhere compensate, so organization can arise without violating the Second Law.
3. In mixing systems, entropy tends to rise as micro-level details become less observable at the macro level.
4. Complexity can increase during the early stages of mixing because the evolving structure becomes harder to summarize.
5. At equilibrium, high entropy coincides with low complexity because the system becomes homogeneous and easy to describe.
6. The universe is framed as low-entropy/simple in the early era, high-entropy/simple in the far future, with complexity peaking during the intermediate period.
7. The emergence of stars, galaxies, chemistry, and life is treated as a natural consequence of the entropy–complexity interplay during the universe’s middle stage.