What Is Entropy? A Measure of Just How Little We Really Know.
In investigating the limits of extracting work from their real-world information engine, Bechhoefer and Still have found that, in certain regimes, it can significantly outperform conventional engines. Inspired by Still’s theoretical work, they’ve also tracked the inefficiency associated with receiving only partial information about the bead’s state.
The information engine is now shrinking to the quantum scale with the help of Natalia Ares, a physicist at the University of Oxford who served on a panel with Still at the retreat. On silicon chips the size of a coaster, Ares traps a single electron inside a thin carbon wire, which is suspended between two pillars. This “nanotube,” which is cooled to within thousandths of a degree of absolute zero, vibrates like a guitar string, and its oscillation frequency is determined by the state of the electron inside. By tracking the nanotube’s minuscule vibrations, Ares and colleagues plan to diagnose the work output of different quantum phenomena.
Ares has a long list of experiments to probe quantum thermodynamics scribbled across chalkboards up and down the halls. “It’s basically all of the industrial revolution, but nano,” she said. One planned experiment takes after Still’s idea. It involves adjusting how perfectly the nanotube’s vibrations depend on the electron (versus other unknown factors), essentially providing a knob for tuning the ignorance of the observer.
Ares and her team are probing the limits of thermodynamics on the smallest scales — the motive power of quantum fire, in a sense. Classically, the limit for how efficiently the motion of particles can be transformed into work is set by Carnot’s theorem. But in the quantum case, with a menagerie of entropies to choose from, it’s much more complicated to determine which one will set relevant bounds — or how to even define work output. “If you have a single electron like we have in our experiments, what does it mean, entropy?” Ares said. “In my experience, we are still very lost here.”
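(For a sense of the classical benchmark: Carnot’s theorem says that any engine running between a hot reservoir at temperature T_hot and a cold one at T_cold can convert at most a fraction 1 − T_cold/T_hot of the heat it draws into work. An engine running between boiling water at roughly 373 kelvins and ice water at roughly 273 kelvins, for instance, tops out near 1 − 273/373 ≈ 27% efficiency, however cleverly it is built. The numbers here are illustrative only, not drawn from Ares’ experiments.)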
A recent study led by Nicole Yunger Halpern, a physicist at the National Institute of Standards and Technology, shows how common definitions of entropy production that are usually synonymous can disagree in the quantum realm, again because of uncertainty and observer dependence. On this tiny scale, it’s impossible to know certain pairs of properties at the same time. And the order in which you measure certain quantities can affect the measurement outcomes. Yunger Halpern thinks we can use this quantum weirdness to our advantage. “There are extra resources available in the quantum world that are not available classically, so we can bend around Carnot’s theorem,” she said.
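To get a feel for that order dependence, here is a minimal numerical sketch. It is not taken from Yunger Halpern’s study; it simply uses two textbook quantum observables (the Pauli X and Z matrices) to show that the order of operations changes the answer:

```python
import numpy as np

# Two textbook single-qubit observables (Pauli matrices).
# Illustrative only -- not the observables from Yunger Halpern's study.
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# Classically, applying two measurements in either order gives the same answer.
# Quantum mechanically, X and Z do not commute, so order matters:
print(X @ Z)                      # [[ 0 -1]
                                  #  [ 1  0]]
print(Z @ X)                      # [[ 0  1]
                                  #  [-1  0]]
print(np.allclose(X @ Z, Z @ X))  # False -- the two orders disagree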
Ares is pushing these new boundaries in the lab, hoping to pave a path for more efficient energy harvesting, charging of devices or computation. The experiments may also provide insight into the mechanics of the most efficient information processing systems we know of: ourselves. Scientists aren’t sure how the human brain can perform immensely complicated mental gymnastics using only 20 watts of power. Perhaps the secret to biology’s computational efficiency also lies in harnessing random fluctuations at small scales, and these experiments aim to sniff out any possible advantage. “If there is some win in this, there’s a chance that nature actually uses it,” said Janet Anders, a theorist at the University of Exeter who works with Ares. “This fundamental understanding that we’re developing now hopefully helps us in the future understand better how biology does things.”
The next round of Ares’ experiments will take place in a hot-pink refrigeration chamber that dangles from the ceiling of her lab in Oxford. She jokingly suggested the makeover to the manufacturers a few years ago, but they cautioned that metallic paint particles would hamper her experiments. Then the company secretly brought the fridge to an auto shop to cover it in a flashy pink film. Ares sees her new experimental arena as a symbol of changing times, reflecting her hope that this new industrial revolution will be different from the last one — more conscientious, environmentally friendly and inclusive.
“It feels very much like we’re at the start of something big and wonderful,” she said.
Embracing Uncertainty
In September 2024, a few hundred researchers gathered in Palaiseau, France, to pay homage to Carnot on the 200th anniversary of his book. Participants from across the sciences discussed how entropy features in each of their research areas, from solar cells to black holes. At the welcome address, a director of the French National Center for Scientific Research apologized to Carnot on behalf of her country for overlooking the impact of his work. Later that night, the researchers gathered in a decadent golden dining room to listen to a symphony composed by Carnot’s father and performed by a quartet that included one of the composer’s distant descendants.
Carnot’s reverberating insight emerged from an attempt to exert ultimate control over the clockwork world, the holy grail of the Age of Reason. But as the concept of entropy diffused throughout the natural sciences, its purpose shifted. The refined view of entropy is one that sheds the false dreams of total efficiency and perfect prediction and instead concedes the irreducible uncertainty in the world. “To some extent, we’re moving away from enlightenment in a number of directions,” Rovelli said — away from determinism and absolutism and toward uncertainty and subjectivity.
Like it or not, we are slaves of the second law; we can’t help but compel the universe toward its fate of supreme disorder. But our refined view on entropy allows for a more positive outlook. The trend toward messiness is what powers all our machines. While the decay of useful energy does limit our abilities, sometimes a new perspective can reveal a reservoir of order hidden in the chaos. Furthermore, a disordered cosmos is one that’s increasingly filled with possibility. We cannot circumvent uncertainty, but we can learn to manage it — and maybe even embrace it. After all, ignorance is what motivates us to seek knowledge and construct stories about our experience. Entropy, in other words, is what makes us human.
You can bemoan the inescapable collapse of order, or you can embrace uncertainty as an opportunity to learn, to sense and deduce, to make better choices, and to capitalize on the motive power of you.
This work was supported by a fellowship with the MIP.labor. MIP.labor is hosted at Freie Universität Berlin and is funded by the Klaus Tschira Foundation. Quanta Magazine is an editorially independent publication funded by the Simons Foundation.