A few years after his revolutionary application of Shannon's concept of entropy to the study of dynamical systems, Kolmogorov saw how to introduce a new concept of entropy: the algorithmic entropy of an individual object, such as a finite binary string.

Here are several expository papers explaining these ideas:

**Lecture Notes on Descriptional Entropy and Randomness,** by Peter Gacs (Computer Science, Boston University). A good introduction to the notion of algorithmic entropy (a.k.a. "descriptional entropy" or "Kolmogorov complexity").

**Randomness and Mathematical Proof,** by Gregory J. Chaitin (IBM Research). This is an electronic reprint of a 1975 Scientific American article in which Chaitin discussed the connection he discovered between Gödel's Theorem and algorithmic complexity.

**Randomness in Arithmetic,** by Gregory Chaitin (a reprint of a 1987 Scientific American article).

**How to Run Algorithmic Information Theory on a Computer,** by Gregory Chaitin, who has continued to develop increasingly sophisticated versions of this theory over the past thirty-odd years. In the "classical" versions of the theory, it is very difficult to actually compute or even estimate the algorithmic entropy of a specific binary string; indeed, these entropies are formally uncomputable. In this talk, Chaitin proposes a new version of algorithmic information theory which gets around this limitation by fixing a particular universal Turing machine (i.e., roughly speaking, by working with a specific general programming language related to LISP). You can find a web tutorial which allows you to play around with the theory, plus many, many other on-line papers and books, at Chaitin's home page.
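Although the algorithmic entropy itself is uncomputable, it can always be bounded from above: any description of a string that a fixed decompressor can expand counts as a program for it. As a rough illustration (not a method from any of the papers above), here is a minimal Python sketch that uses the standard `zlib` compressor to get such an upper bound; the function name `complexity_upper_bound` is just an illustrative choice:

```python
import os
import zlib

def complexity_upper_bound(s: bytes) -> int:
    """Length in bits of a zlib-compressed encoding of s.

    Since the zlib decompressor can reconstruct s from this encoding,
    its length is an upper bound on a (machine-dependent) version of
    the algorithmic entropy of s.  It says nothing about lower bounds:
    a shorter description may always exist.
    """
    return 8 * len(zlib.compress(s, 9))

# A highly patterned string compresses far better than random noise,
# mirroring the intuition that it has low algorithmic entropy.
patterned = b"01" * 500       # 1000 bytes of obvious structure
noisy = os.urandom(1000)      # 1000 bytes of (pseudo)random data
```

A compressor can only ever certify that a string is *simple*, never that it is *random*; this one-sidedness is exactly the uncomputability phenomenon the paragraph above describes.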

**The Discovery of Algorithmic Probability,** by Ray Solomonoff (Oxbridge Research). A fascinating and very personal account of the inception of algorithmic probability.

Computer scientists are, for (obvious?) reasons, interested in the mathematical theory of logical reasoning, which turns out to have some connections to maximal entropy. See:

**Random Worlds and Maximum Entropy,** by Adam J. Grove (NEC Research Institute), Joseph Y. Halpern (IBM Almaden Research Center), and Daphne Koller (Computer Science, Stanford University). This paper studies a kind of axiomatic system, the "random world", which includes not only axioms (specific statements accepted as true) but also "statistical facts". For each world, the authors define an entropy (relative to some statement) and show how maximum entropy methods can be used to compute the "degree of belief" in the statement.
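To see how a maximum entropy computation assigns degrees of belief in a toy setting (Jaynes' classic dice problem, not an example taken from the paper above), one can ask: given only the statistical fact that a die's average roll is 4.5, what probabilities should we assign to each face? The maximum entropy answer has the exponential form p_i ∝ exp(λi), with λ fixed by the mean constraint. A short stdlib-only Python sketch, solving for λ by bisection (the function name `maxent_die` is illustrative):

```python
import math

def maxent_die(mean, faces=range(1, 7), tol=1e-10):
    """Maximum-entropy distribution on the given faces subject to a
    mean constraint.  The solution is p_i proportional to exp(lam*i);
    lam is found by bisection, since the mean of this exponential
    family increases monotonically with lam."""
    def avg(lam):
        weights = [math.exp(lam * i) for i in faces]
        z = sum(weights)
        return sum(i * w for i, w in zip(faces, weights)) / z

    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if avg(mid) < mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * i) for i in faces]
    z = sum(weights)
    return [w / z for w in weights]

p = maxent_die(4.5)  # degrees of belief in faces 1..6
```

With a mean above 3.5, the resulting probabilities tilt toward the higher faces; with mean exactly 3.5 the answer is uniform, recovering the principle of indifference as a special case.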

**An Introduction to Kolmogorov Complexity and its Applications,** by Ming Li and Paul Vitanyi, Springer-Verlag, 1993. This readable introduction is still the only textbook on algorithmic information theory; the second edition has recently appeared.

**The Limits of Mathematics,** by G. J. Chaitin (IBM Research), Springer, in press. "This course uses algorithmic information theory to show that mathematics has serious limitations, and features a new more didactic approach to algorithmic information theory using lisp and Mathematica software. The thesis of the book is that the incompleteness phenomenon discovered by Gödel is much more widespread and serious than hitherto suspected."
