Entropy in Logic and the Theory of Algorithms

Original by Chris Hillman (last modified 2 Feb 2001).

A few years after his revolutionary application of Shannon's concept of entropy to the study of dynamical systems, Kolmogorov saw how to introduce a new concept of entropy, the algorithmic complexity of a finite sequence of bits. At virtually the same time, Ray Solomonoff devised a notion of algorithmic probability, and Martin-Löf used similar ideas to develop a theory of algorithmic randomness. Meanwhile, Gregory Chaitin (then an undergraduate at CUNY) independently discovered many of the same ideas. Later, Chaitin discovered a startling connection with the notorious incompleteness theorem of Kurt Gödel.
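The algorithmic complexity of a bit string is uncomputable, but a practical compressor gives a computable upper bound on it (up to an additive constant). A minimal sketch, using Python's standard `zlib` module, which is my illustrative choice and not anything from the sources above:

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Compressed length in bytes: an upper bound (up to an
    additive constant) on the algorithmic complexity of data."""
    return len(zlib.compress(data, 9))

# A highly regular sequence has a short description...
regular = b"01" * 5000          # 10000 bytes of pattern
# ...while a random sequence is essentially incompressible.
noise = os.urandom(10000)       # 10000 bytes of randomness

print(complexity_upper_bound(regular))  # far below 10000
print(complexity_upper_bound(noise))    # close to 10000
```

The gap between the two outputs is exactly the intuition behind Martin-Löf randomness: a string is "random" when no description of it is appreciably shorter than the string itself.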

Here are several expository papers explaining these ideas:

Computer scientists are, for (obvious?) reasons, interested in the mathematical theory of logical reasoning, which turns out to have some connections to maximum entropy. See:
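The maximum-entropy principle can be illustrated with Jaynes' well-known Brandeis dice problem: among all distributions on a die's faces with a prescribed mean, the one maximizing Shannon entropy has the exponential form p_i proportional to exp(lam * i). A minimal sketch (the function name and the bisection solver are my own choices, not from the sources above):

```python
import math

def maxent_die(target_mean: float, faces=tuple(range(1, 7))):
    """Maximum-entropy distribution over the faces of a die
    subject to a fixed mean: p_i proportional to exp(lam * i).
    The multiplier lam is found by bisection, since the mean
    is monotone increasing in lam."""
    def mean(lam):
        w = [math.exp(lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]

# A die whose long-run average is 4.5 instead of the fair 3.5:
p = maxent_die(4.5)
```

With a target mean above 3.5, the resulting probabilities increase monotonically toward the high faces, exactly the "least biased" assignment consistent with the constraint.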

Further Reading
