In 1867 James Clerk Maxwell proposed a thought experiment that seemed to break the second law of thermodynamics. Imagine a container of gas, divided in two, with a tiny trapdoor in the wall between chambers. A small demon sits at the trapdoor and watches the molecules. When a fast molecule approaches from the left, it opens the door to let it through to the right. When a slow molecule approaches from the right, it opens the door to let it back left. After enough time, the right chamber is hotter and the left is cooler — without any work being done. The demon has sorted the molecules using only information, and heat has flowed from cold to hot.
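The demon's sorting step can be caricatured in a few lines. This is a toy partition of speeds, not real gas dynamics: `demon_sort`, the exponential speed distribution, and the fixed threshold are all illustrative choices, standing in for the idea that temperature tracks the average kinetic energy on each side.

```python
import random

def demon_sort(n=10000, threshold=1.0, seed=42):
    rng = random.Random(seed)
    # Every molecule is drawn from the same speed distribution; the demon
    # never adds energy, it only chooses which side of the trapdoor each
    # molecule ends up on.
    speeds = [rng.expovariate(1.0) for _ in range(2 * n)]
    left = [s for s in speeds if s <= threshold]   # slow molecules kept left
    right = [s for s in speeds if s > threshold]   # fast molecules passed right
    mean = lambda xs: sum(xs) / len(xs)
    return mean(left), mean(right)

cold, hot = demon_sort()
print(f"left (cold): {cold:.2f}   right (hot): {hot:.2f}")
```

The two chambers start statistically identical; after sorting, the mean speed on the right exceeds the mean on the left, which is exactly the heat flow from cold to hot that the demon appears to get for free.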
This bothered people for the better part of a century. In 1929 Leo Szilard identified that something must pay the thermodynamic debt, and guessed it was the demon's act of measuring — of looking at each molecule and deciding. Measurement, he thought, must have an unavoidable energetic cost. The second law was saved by making knowledge expensive.
Charles Bennett showed in 1982 that Szilard had the wrong transaction. Measurement is reversible. You can observe a molecule's velocity without disturbing the universe's entropy ledger at all, at least in principle. What you cannot do for free is forget. The demon's memory fills up as it watches molecules. To keep working indefinitely, it eventually has to erase what it has recorded. And that erasure — the act of clearing a stored bit, of converting a definite state back to an unknown one — is what costs. The second law isn't about the price of knowing. It's about the price of unknowing.
Rolf Landauer had worked out the physics of erasure in 1961. He showed that erasing one bit of information in a system at temperature T releases a minimum of kT ln 2 of energy as heat, where k is Boltzmann's constant. At room temperature, that's about 3 × 10⁻²¹ joules — an almost incomprehensibly small amount, but not zero. The bound is physical and fundamental. You cannot write a zero over a one without that transaction appearing somewhere in the entropy balance of the universe.
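The number follows directly from the constants. Boltzmann's constant is exact in the 2019 SI redefinition, and 300 K stands in for room temperature:

```python
import math

k_B = 1.380649e-23             # Boltzmann's constant, J/K (exact since 2019 SI)
T = 300.0                      # roughly room temperature, K

E_min = k_B * T * math.log(2)  # Landauer bound: minimum heat per erased bit
print(f"{E_min:.2e} J")        # ~2.87e-21 J, i.e. about 3e-21 J
```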
This remained a theoretical result for fifty years. Then in 2012, Antoine Bérut and colleagues at ENS Lyon measured it directly. They trapped a single silica bead in a double-well potential using focused laser beams — one well on the left, one on the right. The bead could sit in either well, forming a one-bit memory. To erase the bit (force the bead to one well regardless of where it started), they tilted the potential and waited. Measuring the heat exchanged with the surrounding fluid, they found that slow erasures approached Landauer's limit from above, and fast erasures produced more heat than the minimum. The closer they ran the operation to the quasi-static limit, the closer the dissipation came to kT ln 2. The bound is real and the experiment touched it.
What Bennett's resolution implies for computing is striking. If measurement costs nothing and only erasure costs, then a computer that never erases — that stores the result of every intermediate calculation rather than discarding it — could, in principle, compute at near-zero energy cost. This is reversible computing: run the calculation forward, copy out the output, then run the calculation backward to uncompute every intermediate state, so nothing ever has to be irreversibly erased. Bennett described it in 1973. The challenge is that running backwards takes time, and the intermediate states require storage, and practical chips have always been so far above the Landauer limit that the theoretical minimum has seemed irrelevant. Modern processors waste roughly a billion times more energy per operation than the Landauer bound requires.
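The compute–copy–uncompute pattern can be made concrete with reversible gates. The sketch below builds AND out of a Toffoli gate (which is its own inverse) and a CNOT; the helper names are mine, but the structure — compute into a clean ancilla, copy the answer out, then run the computation backward so the ancilla returns to zero — is the standard reversible construction.

```python
def toffoli(a, b, c):
    # Toffoli (CCNOT): flips c iff a and b are both 1. Self-inverse:
    # applying it twice returns the original inputs.
    return a, b, c ^ (a & b)

def cnot(a, b):
    # CNOT: flips b iff a is 1. Also self-inverse.
    return a, b ^ a

def reversible_and(a, b):
    a, b, work = toffoli(a, b, 0)      # compute: work = a AND b
    work, out = cnot(work, 0)          # copy the result to the output wire
    a, b, work = toffoli(a, b, work)   # uncompute: work is 0 again
    assert work == 0                   # the ancilla is clean; nothing to erase
    return out

print([reversible_and(a, b) for a in (0, 1) for b in (0, 1)])  # -> [0, 0, 0, 1]
```

Nothing in this circuit is ever overwritten: every step is a bijection on the machine's state, so in principle no step need dissipate kT ln 2.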
But that ratio won't hold indefinitely. As chips scale down and power density becomes a harder constraint, the Landauer limit moves from philosophical curiosity toward engineering target. A startup called Vaire Computing is trying to build reversible processors, using slower operation speeds compensated by parallelism to achieve net efficiency gains. Whether this works at scale is unsettled. What is settled is the physics: the minimum cost of a computation is set by how much information you throw away in the process.
The part I keep returning to is Bennett's inversion of Szilard. Maxwell's demon was unsettling because it seemed to defeat entropy by knowing things that ordinary matter couldn't know. The resolution came out inverted: the demon's knowing is fine. Knowing doesn't cost anything. What costs is the clearing of the slate — the transition from definite record to blank — which must eventually happen in any finite memory. The second law is not a tax on observation. It is a tax on erasure. Every time you empty the trash, the universe collects.