Original by Chris Hillman
(Last modified by Chris Hillman 2 Feb 2001.)
The thermodynamical notion of entropy was introduced in 1854
by Rudolf Clausius, who built on the work of Carnot.
His ideas were later extended and clarified by Helmholtz and others.
In the 1870's, Ludwig Boltzmann found a "statistical"
definition of entropy
which, he claimed, reduced to the earlier notion of Clausius.
Around the same time,
Josiah Willard Gibbs
introduced a slightly different statistical notion of entropy.
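Neither formula is written out above, so here is a minimal sketch (in Python, with Boltzmann's constant set to 1 for illustration) of the two definitions: Boltzmann's S = k ln W for W equally likely microstates, and Gibbs's S = -k Σ p ln p for a general distribution. The two agree on the uniform distribution.

```python
import math

def gibbs_entropy(probs, k=1.0):
    """Gibbs entropy S = -k * sum(p * ln p) over microstate probabilities."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_entropy(n_microstates, k=1.0):
    """Boltzmann entropy S = k * ln W for W equally likely microstates."""
    return k * math.log(n_microstates)

# When all W microstates are equally likely, the two definitions agree:
W = 8
uniform = [1.0 / W] * W
assert abs(gibbs_entropy(uniform) - boltzmann_entropy(W)) < 1e-12

# A non-uniform distribution has strictly lower Gibbs entropy:
skewed = [0.7, 0.1, 0.1, 0.05, 0.05]
assert gibbs_entropy(skewed) < boltzmann_entropy(5)
```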
Here are some pages discussing these ideas:
John von Neumann
(a pioneer of the programmable computer) introduced the operator entropy,
which is the analog of statistical entropy for quantum mechanics.
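As an illustration (a sketch, not von Neumann's own presentation), the operator entropy S(ρ) = -Tr(ρ ln ρ) reduces to a sum over the eigenvalues of the density matrix ρ. The snippet below handles the real symmetric 2x2 case in plain Python:

```python
import math

def eig2(rho):
    """Eigenvalues of a real symmetric 2x2 density matrix [[a, b], [b, d]]."""
    (a, b), (_, d) = rho
    m = (a + d) / 2.0                          # mean of the two eigenvalues
    r = math.sqrt(((a - d) / 2.0) ** 2 + b * b)
    return m - r, m + r

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho) = -sum of lam * ln lam over eigenvalues."""
    return -sum(lam * math.log(lam) for lam in eig2(rho) if lam > 1e-12)

pure = [[1.0, 0.0], [0.0, 0.0]]    # pure state: entropy is zero
mixed = [[0.5, 0.0], [0.0, 0.5]]   # maximally mixed qubit: entropy is ln 2
print(von_neumann_entropy(pure))   # zero for a pure state
print(von_neumann_entropy(mixed))  # ln 2, about 0.6931
```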
When I get a chance, I hope to write a very brief explanation of his (really very simple) definition.
These days, various physicists are attempting to generalize von Neumann's
operator entropy in the context of noncommutative C*-algebras (the mathematical setting
for the rigorous theory of statistical mechanics introduced by Ruelle and others).
The Page of Entropy,
by Dave Slaven
(Physics, Saginaw Valley State University) offers a very nice,
non-technical introduction to the statistical notion of entropy.
How to teach statistical physics,
from the Physics Department of Seoul National University, Korea, offers some
Java applets illustrating microcanonical ensembles, thermal relaxation,
fluctuations in entropy, the Maxwell-Boltzmann distribution, and other good stuff.
E. T. Jaynes discovered a beautiful connection,
the Principle of Maximal Entropy,
between statistical mechanics and the entropy introduced by Shannon.
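A small illustration of Jaynes's principle (a sketch of the standard textbook setup, not taken from any of the pages above): among all distributions on the faces of a die with a prescribed mean, the entropy-maximizing one belongs to the Gibbs/exponential family, and its Lagrange multiplier can be found by bisection.

```python
import math

def maxent_die(target_mean, faces=6, tol=1e-10):
    """Maximum-entropy distribution on faces 1..faces with a fixed mean.

    Jaynes's principle: among all distributions matching the constraint,
    pick the one with the largest Shannon entropy. The maximizer is a
    Gibbs/exponential distribution p_i proportional to exp(-beta * i);
    the Lagrange multiplier beta is found here by bisection.
    """
    def mean_for(beta):
        w = [math.exp(-beta * i) for i in range(1, faces + 1)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, faces + 1), w)) / z

    lo, hi = -50.0, 50.0                 # bracket the multiplier
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_for(mid) > target_mean:  # mean_for is decreasing in beta
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2.0
    w = [math.exp(-beta * i) for i in range(1, faces + 1)]
    z = sum(w)
    return [wi / z for wi in w]

# An unconstrained-looking mean of 3.5 recovers the uniform distribution;
# a mean of 4.5 tilts the probabilities exponentially toward high faces.
p = maxent_die(4.5)
```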
A very detailed, modern presentation of statistical mechanical entropies in
classical and quantum systems from the point of view initiated by Jaynes
may be found in the following document:
Information and correlation in statistical mechanical systems,
the PhD thesis of David Richard Wolf (Physics, University of Texas, Austin).
"In this dissertation the question of how information is carried in a physical system is examined. The systems
studied here are simple, as are all systems which have a presentable analysis. The point of view that the
states of the physical system of interest may be treated probabilistically, that there is an underlying
distribution which describes the probability that a particular state occurs, is taken thoroughly. Certainly the
thermodynamic systems studied here are treated on this basis, but more generally whenever such a
distribution exists and is known, or is learnable, the methods of this work apply."
A particularly interesting aspect of Wolf's work is the introduction of higher order information measures.
In the mid-1970's, Jacob Bekenstein and Stephen Hawking discovered an astonishing connection
between thermodynamical entropy, quantum mechanics, and general relativity, which has
inspired much current work on quantum gravity.
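The connection is summarized by the Bekenstein-Hawking area formula S = k_B c^3 A / (4 G hbar), where A is the area of the black hole's event horizon. A rough numerical sketch (the formula is standard; the constants below are approximate SI values):

```python
import math

# Approximate physical constants (SI units):
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
k_B  = 1.381e-23   # Boltzmann constant, J/K

def bekenstein_hawking_entropy(mass_kg):
    """S = k_B * c^3 * A / (4 * G * hbar), A the Schwarzschild horizon area."""
    r_s = 2.0 * G * mass_kg / c ** 2      # Schwarzschild radius
    area = 4.0 * math.pi * r_s ** 2       # horizon area
    return k_B * c ** 3 * area / (4.0 * G * hbar)

M_sun = 1.989e30  # kg
S = bekenstein_hawking_entropy(M_sun)
print(S / k_B)    # dimensionless entropy of order 1e77 for a solar-mass hole
```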
The theory of non-equilibrium statistical mechanics began with Lars Onsager (who won
a Nobel Prize for related work).
For the past several decades, Ilya Prigogine
and his coworkers have been developing this
(Prigogine has also received a Nobel Prize for his contributions in this area).
Entropy is closely related to the multifractal "thermodynamical formalism" which has become
very popular in the numerical study of dynamical systems, and new entropic quantities are
introduced (it seems) daily as physicists (among others) struggle to come to grips with
the problem of pinning some kind of meaningful number on dynamical phenomena which are too
complex to model in the conventional way. For example, see:
Local Entropy Characterization of
Correlated Random Microstructures,
by C. Andraud (Laboratoire d'Optiques des Solides, Université Pierre et Marie Curie),
A. Beghdadi (LPMTM-CNRS, Institut Galilée, Université Paris Nord),
E. Haslund (Institute of Physics, University of Oslo),
R. Hilfer (Institute of Physics, University of Oslo and Institut für Physik),
J. Lafait (Laboratoire d'Optiques des Solides, Université Pierre et Marie Curie),
and B. Virgin (Institute of Physics, University of Oslo).
There are some very severe conceptual difficulties in reconciling statistical
mechanics with classical thermodynamics.
Shortly after Boltzmann claimed to have reduced thermodynamics to the statistics of
molecules, Josef Loschmidt and Ernst Zermelo
(among others) pointed out still more paradoxes.
Some claim that the famous Recurrence Theorem proven in 1890 by Henri Poincaré
resolved these paradoxes, but I have my doubts; when I have time, I hope
to explain my own objections at greater length.
Maxwell himself (who was the first to suggest that thermodynamics might simply
be the macroscopic effect of the statistics of molecules) introduced his infamous
Demon, who continues to bedevil physicists today.
In 1929, Leo Szilard claimed to have resolved this paradox
using Heisenberg's Uncertainty Principle. Szilard was one of the first,
incidentally, to use the word "information" in a scientific paper.
In 1982, new insights due to R. Landauer and C. Bennett overturned this paradigm
and gave rise to a quite different resolution of Maxwell's paradox.
According to Landauer, the erasure of a stored bit inevitably dissipates heat, so it is
the clearing of the Demon's memory, rather than its measurements, that pays the
entropic cost demanded by the Second Law.
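Landauer's bound is quantitative: erasing one bit at temperature T dissipates at least k_B T ln 2 of heat. A back-of-the-envelope sketch (using an approximate value of Boltzmann's constant):

```python
import math

k_B = 1.381e-23  # Boltzmann constant, J/K (approximate)

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated by erasing one bit: E = k_B * T * ln 2."""
    return k_B * temperature_kelvin * math.log(2)

E = landauer_limit(300.0)  # room temperature
print(E)                   # roughly 3e-21 joules per erased bit
```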
Over the years there have been numerous attempts by various people
to justify their political or social agenda on the basis of the Second Law
of Thermodynamics and other impressive sounding phrases.
Here is a paper debunking a recent instance of such entropic reasoning.
The various aspects of entropy that arise in a variety of applications
in statistical physics, thermodynamics and other sciences are given
mathematically rigorous and accurate treatment in the book
Entropy (edited by Andreas Greven, Gerhard Keller and Gerald Warnecke),
which also offers a nice unifying view of the subject.
There are over a hundred statistical mechanics and thermodynamics textbooks,
and at least a dozen books have so far been published on the arrow of time alone.
Some of the more recent offerings include the following:
- Statistical Dynamics, a stochastic approach to nonequilibrium thermodynamics,
by R. F. Streater. Imperial College Press, 1995.
Covers entropy in classical statistical dynamics (including the Boltzmann map and Legendre duality)
and also in quantum statistical dynamics.
- Engines, energy, and entropy: a thermodynamics primer,
by John B. Fenn, W.H. Freeman, 1982.
- Thermodynamics and statistical mechanics,
by Walter Greiner, Ludwig Neise, and Horst Stocker, Springer-Verlag, 1995.
Covers quantum statistical mechanics.
- Time's arrow : the origins of thermodynamic behavior,
by Michael C. Mackey, Springer-Verlag, 1992.
- Thermodynamics of chaotic systems,
by C. Beck and F. Schloegl, Cambridge University Press, 1993.
A graduate level introduction.
- Complexity: hierarchical structures and scaling in physics,
by R. Badii and A. Politi, Cambridge University Press, 1997.
Warmly recommended to me by Thierry D. de Wit (Centre de Physique Theorique, Marseille),
this book covers entropy, thermodynamical formalism,
algorithmic and grammatical complexities, hierarchical scaling complexities, and more.
Written at a somewhat higher level than the book by Beck and Schloegl.
- Evolution of complex systems: self-organization, entropy, and development,
by Rainer Feistel and Werner Ebeling, Kluwer Academic Publishers, 1989.
- The physical basis of the direction of time,
by H.-Dieter Zeh, Springer-Verlag, 1989.
- Information and the internal structure of the universe:
an exploration into information physics,
by Tom Stonier, Springer-Verlag, 1990.
- Entropy, large deviations, and statistical mechanics,
by Richard S. Ellis, Springer-Verlag, 1985.
Has a good discussion of the connection, via Legendre-Fenchel duality,
between the entropy of statistical mechanics and Shannon's entropy.
- Quantum entropy and its use,
by Masanori Ohya and Denes Petz. Springer-Verlag, 1993.
A research monograph on current efforts to find the proper formulation of entropy
in terms of noncommutative C*-algebras.