Last modified by Roland Gunesch on Sept 1, 2003.
Original by Chris Hillman.
In the mid-1950s,
Andrei Kolmogorov
imported Shannon's probabilistic notion of entropy
into the theory of dynamical systems and showed how entropy can (sometimes) be used
to tell whether two dynamical systems are non-conjugate (i.e., non-isomorphic).
His work inspired a whole new approach in which entropy appears as a numerical
invariant of a class of dynamical systems.
Kolmogorov's metric entropy is an invariant of measure-theoretic dynamical systems
and is closely related to Shannon's source entropy.
It was a spectacular vindication of Kolmogorov's ideas when Donald Ornstein showed that
metric entropy suffices to completely classify two-sided Bernoulli processes,
a basic problem which for many decades appeared completely intractable.
(Recently, Selim Tuncel has shown how to classify one-sided Bernoulli processes;
this turns out to be quite a bit harder).
In 1961, Roy Adler et al. introduced topological entropy, which is the analogous invariant
for topological dynamical systems.
It soon turned out that there is a simple relationship between these quantities:
maximizing the metric entropy over a suitable class of measures defined on a dynamical system
gives its topological entropy.
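In symbols, this variational principle (a standard formulation, stated here for reference rather than quoted from any of the sources below) reads:

```latex
h_{\mathrm{top}}(T) \;=\; \sup_{\mu} h_{\mu}(T),
```

where the supremum is taken over all T-invariant Borel probability measures mu.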
Here are two expository papers and two on-line (graduate level) textbooks which stress the role of entropy in
understanding dynamical systems:
-
Symbolic Dynamics and Coding Applications,
by
Brian Marcus (IBM Almaden Research Center).
An abridged version of the Plenary Lecture
given at the 1995 IEEE International Symposium on Information Theory.
An elementary introduction to symbolic dynamics motivated by two problems:
studying "natural" dynamical systems by highly idealized symbolic models,
and efficient encoding of a digital data stream.
-
An Entropy Primer,
by Chris Hillman.
An introduction to Shannon's discrete entropy from the dynamical systems point of view,
at an advanced undergraduate to beginning graduate level.
Covers Shannon's Noiseless Coding Theorem and a much more general
version of the Equipartition Theorem than the version in Cover & Thomas.
Together, these two theorems give Shannon's way of understanding what his entropy "means".
-
Dynamical Systems and Ergodic Theory,
by Mark Pollicott (Mathematics, Manchester University)
and Michiko Yuri (Mathematics, Sapporo University) (also available in its published form
in the LMS Student Series).
An excellent introduction to topological and measure-theoretic entropy and their relationship
via the variational principle, ergodic measures, and other elementary concepts of ergodic
theory and topological dynamics.
Highly recommended!
-
Equilibrium States in Ergodic Theory,
by Gerhard Keller (Mathematics, Erlangen) makes an excellent companion
to Pollicott and Yuri and is also highly recommended background reading.
-
Attractors and Attracting Measures,
by
Karl Petersen
(Mathematics, University of North Carolina),
offers an in-depth introduction to the ergodic theory treatment of equilibrium states
and especially their use in Pesin theory (part of the theory of chaotic dynamical systems),
including an introduction to the variational principle relating metric and topological entropy.
Highly recommended!
(See also Prof. Petersen's textbook on ergodic theory, published by Cambridge University Press.)
-
Entropy of Compact Group Automorphisms,
by
Tom Ward
(Mathematics, University of East Anglia, Norwich).
These lecture notes are effectively a book length graduate level exposition of the theory
of metric and topological entropy from the ergodic theory viewpoint, at a more sophisticated level
than the previous two items,
and including material (e.g. Fourier analysis on locally compact abelian groups)
not found elsewhere.
There is a principle in dynamical systems which says that
- there are a great variety of ways to define an "entropy" in the context of dynamical systems,
- nonetheless, these quantities are all closely related to one another
(typically, they agree in an appropriate limit).
These relations show that entropy can be regarded quite generally as a measure of:
- unpredictability,
- incompressibility,
- asymmetry,
- delayed recurrence.
Entropy can also be regarded as the fractal dimension of an appropriate compact set.
Here are two papers which discuss these ideas:
-
All Entropies Agree for an SFT,
an expository paper by myself.
Discusses the relationships between various notions of entropy including metric entropy,
topological entropy, algorithmic entropy (see below) and Galois entropy,
which relates entropy to the notion of "geometric asymmetry".
-
Typical Sequences and All That: Entropy, Pattern Matching, and Data Compression,
by the late
Aaron D. Wyner
(formerly of the Mathematics of Communication Department, Bell Labs).
This is an excellent expository paper which
describes (among other things) the recurrence interpretation of entropy.
There is another general principle which says that
- in one dimensional symbolic systems, entropies are often very easy to compute,
- in higher dimensional symbolic systems, they can be almost impossible to compute
(this is very bad news because entropy is one of the crudest of all dynamical invariants,
so this implies that higher dimensional systems are almost impossible to classify).
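To illustrate the first part of this principle: the topological entropy of a one-dimensional shift of finite type is simply the logarithm of the largest (Perron) eigenvalue of its transition matrix. A minimal sketch of that computation (the function names and power-iteration routine are my own illustration):

```python
# The entropy of a one-dimensional shift of finite type is the log of
# the Perron (largest) eigenvalue of its transition matrix.  This is a
# minimal sketch; the function names are my own, not from any paper.
from math import log

def sft_entropy(matrix, iters=200):
    # Power iteration for the Perron eigenvalue of a nonnegative,
    # irreducible 0-1 transition matrix.
    n = len(matrix)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return log(lam)

# Golden mean shift: binary sequences with no two consecutive 1s.
# From symbol 0 one may follow with 0 or 1; from 1 only with 0.
golden_mean = [[1, 1],
               [1, 0]]
print(sft_entropy(golden_mean))  # log((1 + sqrt 5)/2), about 0.4812
```

The same recipe works for any shift of finite type once it is recoded so that the forbidden words have length two.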
Here are two papers which discuss the first and second parts (respectively) of this principle:
-
Symbolic Dynamics in the Interior of the Mandelbrot Set,
by myself.
This was originally a posting to a newsgroup,
and unfortunately I found it necessary to assume considerable knowledge of complex dynamics,
but until something better comes along it will have to do.
-
The Hard Square Entropy Constant,
by Steven Finch (Mathsoft).
This is an expository article concerning the difficult problem
of computing the topological entropy of a certain two dimensional
shift of finite type, although Finch doesn't use the language of symbolic dynamics.
A two-dimensional shift is a set of functions Z^2 --> B, where B = {0,1} and Z denotes the integers,
which is invariant under the "horizontal" and "vertical" shifts and closed
in the product topology. A shift of finite type can be described by giving
a finite list of forbidden patterns; in this case these patterns are the two by two patterns
of zeros and ones which have at least one pair of "adjacent" ones (adjacent in the sense discussed
in Finch's essay). Features many links to other high quality web pages.
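The transfer-matrix approach behind such computations can be sketched numerically. The snippet below is a rough illustration of my own (the width and iteration count are arbitrary choices, not taken from Finch's essay): it estimates the hard square entropy by taking strips of bounded width.

```python
# Transfer-matrix estimate of the entropy of the hard square shift.
# A rough numerical sketch; the width and iteration count are
# arbitrary choices of mine, not taken from Finch's essay.
from math import log

def hard_square_entropy(width, iters=500):
    # Row states: bitmasks of the given width with no two adjacent 1s.
    states = [s for s in range(1 << width) if s & (s << 1) == 0]
    # Power iteration on the transfer matrix T[a][b] = 1 iff rows a, b
    # can be stacked, i.e. share no vertically adjacent pair of 1s.
    v = [1.0] * len(states)
    lam = 1.0
    for _ in range(iters):
        w = [sum(v[j] for j, b in enumerate(states) if a & b == 0)
             for a in states]
        lam = max(w)
        v = [x / lam for x in w]
    # Per-site entropy of a strip of this width; as the width grows this
    # decreases toward the true value log(1.50304...) ~ 0.40749.
    return log(lam) / width

print(hard_square_entropy(8))
```

Even this crude method converges reasonably fast in the strip width, which is what makes the hard square constant computable to many digits despite the general intractability of two-dimensional entropies.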
Further Reading
Hundreds of books have already been published on dynamical system theory,
of which several dozen at least stress the role of the various kinds of dynamical entropies.
Recent offerings include the following:
-
Coping with Chaos,
by Edward Ott, Tim Sauer, and James A. Yorke,
Wiley-Interscience, 1994.
Includes concise introductions to key concepts (including Lyapunov
exponents and metric entropy) and reprints of landmark papers.
Oriented toward the experimental analysis of "real" chaotic
dynamical systems.
-
Symbolic Dynamics and Coding
by Doug Lind and Brian Marcus, Cambridge U Press, 1995.
A very readable introduction to this important field, aimed at undergraduates.
Covers all the important invariants including topological entropy,
with several useful techniques for computing (topological) entropies.
The first half of this book, in fact, I regard as well-nigh indispensable
for anyone who uses entropy.
The second half, unfortunately, will be useful only for those exclusively
interested in certain kinds of coding problems or classification problems
in dynamical systems theory; I would have preferred to see several chapters
worth of exposition on the applications of symbolic dynamics to the study
of dynamical systems arising elsewhere in mathematics or in applications.
As a very poor substitute, see my sketch of how techniques explained in
this book can help you to compute entropies of shifts of finite type
hiding inside the interior of the Mandelbrot set.
-
The Ergodic Theory of Discrete Sample Paths,
by Paul C. Shields.
This is the only recent textbook to fully treat such indispensable topics
as entropy estimation, recurrence, cutting and stacking, as well as a simplified
proof of the all-important Birkhoff ergodic theorem, based on a packing lemma.
Probably the most useful graduate-level text for the typical entropy user
interested in learning more about the ergodic theory viewpoint.
Note that a "sample path" is called a "word" in the book by Lind and Marcus.
-
Introduction to the Modern Theory of Dynamical Systems
by Anatole Katok and Boris Hasselblatt, Cambridge University Press, 1995.
An authoritative and extremely ambitious attempt to survey and organize
the enormous field of dynamical systems. Carefully distinguishes between
various levels of structure (topological, measure-theoretic, etc.) and
is remarkable for presenting not just statements but proofs of theorems.
Describes Brin-Katok entropy and an interesting algebraic entropy
due, I think, to R. Bowen, as well as the standard notions of metric and
topological entropy. This book will surely become a standard and
oft-quoted reference in the field.
-
Smooth Ergodic Theory of Random Dynamical Systems,
by P.-D. Liu and M. Qian (Peking University, Beijing),
Springer-Verlag, 1996.
Lecture Notes in Mathematics Vol. 1606.
Focuses on the central role of Pesin's formula, which connects
the metric entropy of a smooth dynamical system with its Lyapunov exponents.
-
Dynamical Systems of Algebraic Origin,
by Klaus Schmidt (Mathematics, University of Vienna).
In one dimensional symbolic dynamics, the dimension is usually interpreted as time.
However, it makes perfect sense to define higher dimensional symbolic systems; indeed, these
arise naturally in statistical mechanics and related fields (cellular automata are nothing but
endomorphisms of higher dimensional symbolic dynamical systems).
Unfortunately, it has been known for many years that even two dimensional symbolic dynamics is
far more complex than one dimensional dynamics.
Thus, Schmidt's book, which introduces and "tames" a large class of higher dimensional symbolic
dynamical systems, is tremendously inspiring.
In particular, he explicitly computes the entropy as well as some more sophisticated invariants
characterizing these systems.
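The parenthetical remark above, that cellular automata are endomorphisms of higher dimensional symbolic systems, can be checked concretely: a cellular automaton update commutes with the shift. A minimal sketch (the rule number and helper names are my own illustration, not from Schmidt's book):

```python
# A cellular automaton is a shift-commuting map on a symbolic system.
# Quick check for elementary rule 110 on a cyclic row (rule number and
# helper names are my own illustration, not from Schmidt's book).

def step(cells, rule=110):
    # One synchronous update; neighborhoods wrap around cyclically.
    n = len(cells)
    return [(rule >> (4 * cells[(i - 1) % n]
                      + 2 * cells[i]
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

def shift(cells):
    # The cyclic left shift.
    return cells[1:] + cells[:1]

row = [0, 1, 1, 0, 1, 0, 0, 1]
# Endomorphism property: updating commutes with shifting.
print(step(shift(row)) == shift(step(row)))  # True
```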
Back to
Entropy on the World Wide Web.