Entropy in the Physical Sciences

Original by Chris Hillman (Last modified by Chris Hillman 2 Feb 2001.)


The thermodynamical notion of entropy was introduced in 1854 by Rudolf Clausius, who built on the work of Carnot. His ideas were later extended and clarified by Helmholtz and others. In the 1870s, Ludwig Boltzmann found a "statistical" definition of entropy which, he claimed, reduced to the earlier notion of Clausius. Around the same time, Josiah Willard Gibbs introduced a slightly different statistical notion of entropy. Here are some pages discussing these ideas:
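For reference, the two statistical definitions contrasted here are usually written as follows (standard textbook notation, where W counts the microstates compatible with a macrostate and p_i is the probability of microstate i):

```latex
S_{\text{Boltzmann}} = k_B \ln W,
\qquad
S_{\text{Gibbs}} = -k_B \sum_i p_i \ln p_i
```

The Gibbs form reduces to the Boltzmann form when all W microstates are equally likely, i.e. when each p_i = 1/W.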

Around 1930, John von Neumann (also a pioneer of the stored-program computer) introduced the entropy operator, which is the analog of numerical entropy for quantum mechanics. When I get a chance, I hope to write a very brief explanation of his (really very simple) idea. These days, various physicists are attempting to generalize von Neumann's operator entropy in the context of noncommutative C*-algebras (the mathematical setting for the rigorous theory of statistical mechanics introduced by Ruelle and others).
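For reference, the quantity in question, now usually called the von Neumann entropy, has the standard form (where ρ is the density operator describing the quantum state):

```latex
S(\rho) = -k_B \,\mathrm{Tr}\left(\rho \ln \rho\right)
```

When ρ is diagonal with eigenvalues p_i, this reduces to the Gibbs form -k_B Σ p_i ln p_i, which is one sense in which it is the quantum analog of the classical statistical entropy.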

In 1957, E. T. Jaynes discovered a beautiful connection, the Principle of Maximal Entropy, between statistical mechanics and the entropy introduced by Shannon. See:
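The content of Jaynes's principle can be illustrated with a small sketch: among all probability distributions with a prescribed mean energy, the entropy maximizer is the Gibbs/Boltzmann form p_i ∝ exp(-λE_i). The three energy levels and target mean below are invented purely for illustration; the Lagrange multiplier λ is found by bisection:

```python
import math

# Hypothetical toy system: three energy levels and a target mean energy,
# both chosen arbitrarily for illustration.
energies = [0.0, 1.0, 2.0]
target_mean = 0.8

def maxent_distribution(lam):
    """Gibbs form p_i proportional to exp(-lam * E_i) -- the known maximizer
    of Shannon entropy subject to a fixed mean energy."""
    weights = [math.exp(-lam * e) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def mean_energy(lam):
    return sum(p * e for p, e in zip(maxent_distribution(lam), energies))

# Solve for lam by bisection: mean_energy is monotonically decreasing in lam.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target_mean:
        lo = mid  # mean too high: need a larger multiplier
    else:
        hi = mid
lam = 0.5 * (lo + hi)

p = maxent_distribution(lam)
entropy = -sum(pi * math.log(pi) for pi in p)
print(lam, p, entropy)
```

Any other distribution meeting the same mean-energy constraint has strictly lower Shannon entropy, which is exactly the bridge Jaynes drew between Shannon's information measure and statistical mechanics.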

A very detailed, modern presentation of statistical mechanical entropies in classical and quantum systems from the point of view initiated by Jaynes may be found in the following document:

In the mid-1970s, Jacob Bekenstein and Stephen Hawking discovered an astonishing connection between thermodynamical entropy, quantum mechanics, and general relativity, which has inspired much current work on quantum gravity. For more information, see:

The theory of non-equilibrium statistical mechanics began with Lars Onsager (who won a Nobel Prize for related work). For the past several decades, Ilya Prigogine and his coworkers have been developing this theory (Prigogine has also received a Nobel Prize for his contributions in this area).

Entropy is closely related to the multifractal "thermodynamical formalism" which has become very popular in the numerical study of dynamical systems. New entropic quantities are introduced (it seems) daily as physicists (among others) struggle to pin some kind of meaningful number on dynamical phenomena too complex to model in the conventional way. For example, see:
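As one concrete example of such an entropic quantity, here is a minimal numerical sketch (the map, parameter, seed, and iteration counts are all chosen for illustration) estimating the Lyapunov exponent of the logistic map at r = 4. For this map, Pesin's identity says the Kolmogorov-Sinai entropy equals the Lyapunov exponent, whose known value is ln 2:

```python
import math

# Logistic map x -> r*x*(1-x) at r = 4, where the dynamics are chaotic.
r = 4.0
x = 0.3  # arbitrary seed, away from the fixed points

# Discard a transient so we sample the invariant distribution.
for _ in range(1000):
    x = r * x * (1 - x)

# Average log|f'(x)| = log|r*(1 - 2x)| along the orbit.
n = 200_000
acc = 0.0
for _ in range(n):
    acc += math.log(abs(r * (1 - 2 * x)))
    x = r * x * (1 - x)

lyap = acc / n
print(lyap, math.log(2))  # estimate vs. the exact value ln 2
```

This is the simplest member of the family: more refined multifractal quantities (Rényi entropies, generalized dimensions) are built from the same kind of orbit averages.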

There are some very severe conceptual difficulties in reconciling statistical mechanics with classical thermodynamics. Shortly after Boltzmann claimed to have reduced thermodynamics to the statistics of molecules, Josef Loschmidt and Ernst Zermelo (among others) pointed out still more paradoxes. Some claim that the famous Recurrence Theorem proven in 1890 by Henri Poincaré resolved these paradoxes, but I have my doubts; when I have time, I hope to explain my own objections at greater length.

Maxwell himself (who was the first to suggest that thermodynamics might simply be the macroscopic effect of the statistics of molecules) introduced his infamous Demon, who continues to bedevil physicists today. In 1929, Leo Szilard claimed to have resolved this paradox using Heisenberg's Uncertainty Principle. Szilard was one of the first, incidentally, to use the word "information" in a scientific paper. Later insights due to R. Landauer (1961) and C. Bennett (1982) overturned this paradigm and gave rise to a quite different resolution of Maxwell's paradox. According to Landauer, the erasure of a stored bit inevitably dissipates heat and thus increases entropy. You can read about these ideas in the following paper:
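Landauer's bound is concrete enough to compute: erasing one bit must dissipate at least k_B T ln 2 of heat. A quick sanity check (the temperature is an assumed value for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0           # assumed room temperature, in kelvin

# Minimum heat dissipated by erasing one bit, per Landauer.
E_min = k_B * T * math.log(2)
print(E_min)  # about 2.87e-21 J per bit
```

The smallness of this number is why the Demon's bookkeeping, not its measurements, turns out to carry the thermodynamic cost.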

Over the years there have been numerous attempts by various people to justify their political or social agendas on the basis of the Second Law of Thermodynamics and other impressive-sounding phrases. Here is a paper debunking a recent instance of such entropic pseudoscience:

The various aspects of entropy that arise in a variety of applications in statistical physics, thermodynamics and other sciences are given mathematically rigorous and accurate treatment in the book Entropy (edited by Andreas Greven, Gerhard Keller and Gerald Warnecke), which also offers a nice unifying view of the subject.

Further Reading

There are over a hundred statistical mechanics and thermodynamics textbooks, and at least a dozen books have so far been published on the arrow of time alone. Some of the more recent offerings include the following:


Back to Entropy on the World Wide Web.