
The Most Misunderstood Concept in Physics

Veritasium · 6 min read

Based on Veritasium's video on YouTube. If you like this content, support the original creator by watching, liking, and subscribing.

TL;DR

Carnot’s ideal heat engine is reversible but still cannot achieve 100% efficiency because each cycle requires dumping heat to a colder reservoir.

Briefing

Earth receives a steady stream of energy from the sun, but the deeper mystery is what that energy *doesn’t* do: it doesn’t simply vanish, and it doesn’t spread out in a way that would make life impossible. The thread tying storms, engines, and even the arrow of time together is the second law of thermodynamics—energy tends to spread out, becoming less able to do work. That “spreading” is quantified by entropy, and it’s why heat flows one way, why useful energy degrades, and why the universe has a preferred direction from past to future.

The story begins with a deceptively simple question: what does Earth get from the sun? People answer with light and warmth—then the conversation turns to energy accounting. If Earth absorbs energy and later radiates it back to space, why doesn’t everything just reach equilibrium and stop changing? The answer points to a key misconception: energy isn’t “used up,” but its *usefulness* depends on how concentrated it is. Carnot’s work on heat engines makes this precise. In an ideal, frictionless engine, heat can be converted into work through a cycle that is fully reversible in principle. Yet even then, efficiency is limited because each cycle must dump some heat into a colder reservoir. The maximum efficiency depends only on the temperatures of the hot and cold sides, not on the engine’s materials or design. Reaching 100% efficiency would require an impossible scenario—an infinite hot temperature or an absolute zero cold temperature.
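The temperature-only limit described above can be made concrete with a short calculation. This is a minimal sketch; the reservoir temperatures are illustrative, not taken from the video:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum efficiency of any heat engine running between two
    reservoirs, by Carnot's theorem: eta = 1 - T_cold / T_hot.
    Temperatures must be absolute (kelvin)."""
    if t_cold_k <= 0 or t_hot_k <= t_cold_k:
        raise ValueError("require 0 < T_cold < T_hot in kelvin")
    return 1.0 - t_cold_k / t_hot_k

# Illustrative numbers (assumed): an 800 K hot side, 300 K surroundings.
print(carnot_efficiency(800.0, 300.0))  # 0.625 -> at most 62.5% efficient
```

Note that nothing about the engine's materials or mechanism appears in the formula; pushing the result to 1 would require `t_cold_k = 0` or an infinite `t_hot_k`, matching the impossibility stated above.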

From there, entropy enters as the measure of how spread out energy becomes. Clausius framed the thermodynamic laws in terms of conservation of energy and a relentless rise in the universe’s entropy. That rise explains everyday irreversibility: hot things cool, gases expand, and perpetual motion machines fail—not because energy disappears, but because the fraction of energy that can do work shrinks as it disperses.

Boltzmann’s statistical view clarifies why this directionality appears. Heat flowing from cold to hot isn’t forbidden; it’s just overwhelmingly improbable for large numbers of atoms. In tiny systems, rare fluctuations can occur. Scale up to real materials with astronomically many particles, and the probability becomes so small that it effectively never happens. This same logic resolves a seeming contradiction: air conditioning cools a room by moving heat from cold to hot. The local entropy decrease is paid for by a larger entropy increase elsewhere—at the power plant and in waste heat expelled to the environment.

Life and structure persist because Earth is not a closed system. The sun delivers low-entropy, concentrated energy that organisms can convert step-by-step into more dispersed forms. Each photon arriving from the sun is eventually transformed into many lower-energy photons leaving Earth, and that conversion process is what allows complexity to exist while entropy still increases overall.

The arrow of time then extends to cosmology. The universe began in a remarkably low-entropy state: once gravity is taken into account, a nearly uniform matter distribution is an extremely unlikely, low-entropy configuration. Over time, matter clumps, kinetic energy turns into heat, and entropy rises. Black holes further complicate the picture by holding enormous entropy proportional to their surface area; Hawking radiation and the thermodynamic properties of black holes reinforce that most of the universe’s entropy is locked away in these objects. Eventually, as black holes evaporate and energy spreads out, the universe approaches heat death, where forward and backward time would become indistinguishable on large scales. Yet complexity doesn’t peak at maximum entropy; it thrives in the middle—like patterns that appear when tea and milk mix—making the low-entropy window of the universe the stage where stars, planets, and life can form.

Cornell Notes

Entropy is the key quantity behind thermodynamics’ arrow of time: energy tends to spread out, making it less able to do work. Carnot’s ideal heat engine shows why perfect efficiency is impossible even without friction—some heat must always be dumped to a colder reservoir, limiting efficiency to a function of hot and cold temperatures. Boltzmann explains why entropy increase looks irreversible: heat can flow “backwards” only through extremely unlikely fluctuations, which become negligible for systems with huge numbers of particles. Earth avoids a thermodynamic dead end because it is not closed; the sun supplies a steady stream of concentrated, low-entropy energy that life can convert into more dispersed forms. The universe’s low-entropy beginning—shaped by gravity and an early near-uniform state—sets the direction from past to future, with black holes later dominating the entropy budget.

Why can’t an ideal heat engine reach 100% efficiency even when it’s reversible?

Carnot’s ideal engine is reversible in the sense that running the cycle backward would restore the system to its original state. But each forward cycle increases the flywheel’s energy by the difference between heat absorbed from the hot reservoir and heat expelled to the cold reservoir. Efficiency is therefore work output divided by heat input, and it stays below 1 because some heat must be transferred to the cold side to complete the cycle. Kelvin’s absolute temperature idea makes the limit depend only on hot and cold temperatures: 100% would require an infinite hot temperature or absolute zero cold temperature—both unattainable.
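The per-cycle accounting can be sketched numerically. For a reversible cycle, no net entropy is produced, so the heats exchanged satisfy Q_hot / T_hot = Q_cold / T_cold; the numbers below are assumed for illustration:

```python
# Energy and entropy bookkeeping for one reversible Carnot cycle.
T_hot, T_cold = 500.0, 300.0   # reservoir temperatures (K), illustrative
Q_hot = 1000.0                 # heat absorbed from the hot reservoir (J)

# Reversibility means zero net entropy change over a full cycle:
#   Q_hot / T_hot = Q_cold / T_cold
Q_cold = Q_hot * (T_cold / T_hot)   # heat that MUST go to the cold side
W = Q_hot - Q_cold                  # work delivered (e.g. to the flywheel)
eta = W / Q_hot                     # efficiency = 1 - T_cold / T_hot

print(Q_cold, W, eta)  # 600.0 400.0 0.4
```

The 40% result equals 1 − 300/500, showing how the mandatory heat dump to the cold reservoir, not friction, is what caps the efficiency.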

What does entropy measure, beyond the common “disorder” label?

Entropy tracks how spread out energy is and how much of it remains available to do work. When energy is concentrated in a hot reservoir, entropy is lower; when it spreads into the chamber walls, axle, and surroundings, entropy rises. Clausius summarized the thermodynamic laws as conservation of energy and an entropy tendency toward a maximum in the universe. That rise explains why heat flows from hot to cold and why usable energy degrades even when total energy is conserved.
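Clausius's bookkeeping can be checked directly for heat flowing from hot to cold: the hot body loses entropy Q/T_hot, but the cold body gains the larger amount Q/T_cold. A minimal sketch with assumed temperatures:

```python
# Clausius entropy change for a small heat transfer Q between two bodies.
# Illustrative numbers (assumed): Q flows from a 400 K body to a 300 K body.
Q = 100.0                     # joules transferred
T_hot, T_cold = 400.0, 300.0  # kelvin

dS_hot = -Q / T_hot    # hot body loses entropy
dS_cold = +Q / T_cold  # cold body gains MORE entropy than the hot one lost
dS_total = dS_hot + dS_cold
print(round(dS_total, 4))  # 0.0833 J/K > 0: spreading raises total entropy
```

Running the transfer the "wrong" way (cold to hot) would flip both signs and give a negative total, which is exactly what the second law forbids.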

How does Boltzmann’s probability framework explain why we never see cold-to-hot heat flow?

Boltzmann treated energy packets as hopping randomly among atoms. In a small model, a snapshot can show more energy packets in the initially cold side, meaning heat flowed “uphill.” That outcome is not impossible, just improbable. As the number of atoms and energy packets grows, the probability of such a fluctuation collapses (e.g., from a noticeable fraction in tiny systems to about 0.05% in a larger toy model, and effectively zero in real solids with ~10^14 trillion atoms). The second law emerges statistically: the “wrong” direction happens so rarely it’s never observed.
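The collapse of fluctuation probability with system size can be illustrated with a toy model in the spirit of the one above (my own simplified version, not the video's exact numbers): treat each energy packet as equally likely to sit on either half of a solid, and ask how often a snapshot shows a large majority on the initially cold half.

```python
from math import comb

def prob_big_fluctuation(n_packets: int) -> float:
    """Toy Boltzmann model: each energy packet independently sits on
    either half of a solid with probability 1/2. Returns the chance
    that at least 60% of the packets land on the initially COLD half,
    i.e. a snapshot in which heat appears to have flowed uphill."""
    k_min = (3 * n_packets + 4) // 5  # ceil(0.6 * n) via integer math
    total = 2 ** n_packets
    return sum(comb(n_packets, k)
               for k in range(k_min, n_packets + 1)) / total

for n in (10, 100, 1000):
    print(n, prob_big_fluctuation(n))
# The probability collapses as the packet count grows; for the
# ~10^26 atoms of a real solid it is effectively zero.
```

With 10 packets the "uphill" snapshot occurs a noticeable fraction of the time; by 1000 packets it is already around one in ten billion, which is the statistical origin of the second law's apparent absoluteness.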

How can air conditioning cool a room if entropy must increase?

Cooling a room decreases entropy locally because heat moves from cold interior to hot exterior. But the process requires work from outside (typically from a power plant), and that work ultimately produces a larger entropy increase elsewhere—through waste heat released to the environment. The net effect across the whole system still follows the second law: any local entropy decrease is outweighed by a greater global increase.
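The entropy trade described above can be tallied explicitly for an idealized air conditioner. The numbers are assumed for illustration, and the power plant's own waste heat is left out (it would only increase the total further):

```python
# Entropy bookkeeping for an idealized air conditioner (assumed numbers).
T_room, T_outside = 295.0, 310.0  # K: cool interior, hot exterior
Q_room = 1000.0                   # J of heat pulled out of the room
W = 200.0                         # J of electrical work driving the pump

dS_room = -Q_room / T_room             # local entropy DECREASE
dS_outside = (Q_room + W) / T_outside  # exterior receives Q_room + W
dS_total = dS_room + dS_outside
print(round(dS_room, 3), round(dS_outside, 3), round(dS_total, 3))
# dS_total > 0: the room's entropy drop is paid for elsewhere.
```

Shrinking `W` toward zero would eventually drive `dS_total` negative, which is the second law's way of saying a heat pump cannot run without outside work.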

Why does life persist if the universe’s entropy keeps increasing?

Life depends on a continuous supply of low-entropy energy from outside: Earth is not a closed system. The sun provides concentrated energy that plants capture to build sugars; animals then consume plants, and energy becomes more dispersed at each trophic step. Eventually, nearly all incoming energy is converted into thermal energy and radiated away, but the ongoing inflow prevents Earth from reaching a static maximum-entropy equilibrium. Without a source of concentrated energy and a way to discard dispersed energy, complexity would fade.
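The "one photon in, many photons out" picture can be estimated from the effective emission temperatures alone. A rough sketch with standard textbook values (assumed, not from the video): a thermal photon's typical energy scales with temperature, so re-radiating the same energy at Earth's temperature takes roughly T_sun / T_earth times as many photons.

```python
# Rough photon bookkeeping for sunlight in, infrared out (assumed values).
T_sun = 5800.0    # K, effective temperature of the solar surface
T_earth = 288.0   # K, effective emission temperature of Earth

# Typical thermal photon energy ~ k_B * T, so conserving total energy
# while emitting at the lower temperature multiplies the photon count.
n_out_per_in = T_sun / T_earth
print(round(n_out_per_in))  # ~20 low-energy photons out per solar photon in
```

Since entropy grows with the number of photons carrying the same energy, this factor of roughly twenty is the entropy "payment" that lets ordered structures persist on Earth.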

What sets the arrow of time in cosmology?

The arrow of time aligns with the direction from unlikely to more likely macrostates as entropy increases. The universe’s early state is argued to have been exceptionally low entropy—likely because it was hot, dense, and nearly uniform, which gravity would tend to destabilize by clumping matter. Over cosmic time, gravity drives structure formation, turning potential energy into kinetic energy and then into heat, raising entropy. Black holes later dominate entropy due to their enormous entropy proportional to surface area, reinforcing the one-way thermodynamic evolution toward heat death.
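The claim that black holes dominate the entropy budget can be sanity-checked with the Bekenstein–Hawking formula, S = k_B · A / (4 l_p²), where A is the horizon area and l_p the Planck length. A sketch using standard SI constants for a solar-mass black hole:

```python
import math

# Bekenstein-Hawking entropy of a solar-mass black hole, in units of k_B.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
M_sun = 1.989e30   # solar mass, kg

r_s = 2 * G * M_sun / c**2  # Schwarzschild radius (~3 km)
A = 4 * math.pi * r_s**2    # horizon area, m^2
l_p2 = hbar * G / c**3      # Planck length squared, m^2
S_over_k = A / (4 * l_p2)   # entropy in units of k_B

print(f"{S_over_k:.1e}")  # ~1e77
```

For comparison, the Sun's ordinary thermal entropy is commonly estimated around 10^58 k_B, so collapsing a star into a black hole multiplies its entropy by roughly nineteen orders of magnitude, which is why black holes end up holding most of the cosmic entropy budget.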

Review Questions

  1. How does Carnot’s temperature-based efficiency limit follow even for an ideal, frictionless, reversible engine?
  2. In Boltzmann’s statistical picture, what changes as the number of atoms grows that makes entropy-increasing behavior effectively certain?
  3. What role does the sun play in keeping Earth an open system with a continuing supply of usable, low-entropy energy?

Key Points

  1. Carnot’s ideal heat engine is reversible but still cannot achieve 100% efficiency because each cycle requires dumping heat to a colder reservoir.
  2. Maximum efficiency depends only on the hot and cold reservoir temperatures, not on the engine’s construction details.
  3. Entropy measures how spread out energy becomes and how much energy remains available to do work, not just “messiness.”
  4. Entropy’s apparent irreversibility comes from probability: cold-to-hot heat flow is possible but overwhelmingly unlikely for macroscopic systems.
  5. Local entropy decreases (like in air conditioning) require larger entropy increases elsewhere, preserving the second law globally.
  6. Earth sustains complexity because it is an open system: the sun supplies concentrated, low-entropy energy that life and ecosystems convert into more dispersed forms.
  7. The universe’s arrow of time is linked to a low-entropy early state and ongoing entropy growth, with black holes holding a dominant share of cosmic entropy.

Highlights

Carnot’s engine can be run backward without leaving traces, yet it still can’t be perfectly efficient because completing the cycle forces heat to be expelled to the cold side.
Boltzmann’s key move turns the second law into a statistical statement: entropy-increasing outcomes are overwhelmingly likely, while entropy-decreasing ones become effectively impossible as systems grow large.
Air conditioning doesn’t violate the second law; it shifts entropy around by trading a room’s entropy decrease for a larger entropy increase in the power plant and waste heat.
Most of the universe’s entropy is tied up in black holes, making them central to how entropy budgets evolve toward heat death.
The sun’s low-entropy energy supply is what allows Earth to maintain structure and life despite the universe’s overall tendency toward maximum entropy.