Biologists were quick to realize the potential applications of Shannon's ideas in their subject, but this field has so far lagged far behind its brethren in development. Some of the early work by ecologists attempted to use Shannon's entropy as a measure of ecological diversity; my impression is that this line of thought has since died a merciful death. Evolutionary biologists continue to wrestle with the program of explaining in detail the fact of evolution as a (nonequilibrium) thermodynamic process; closely related to this work are the many attempts to quantify hierarchical organization in ecosystems using entropy measures (see for instance the books by Ayres, Brooks and Wiley, Wicken, and Oyama cited below).

See **A General Review of Dynamical Systems**, by John Guckenheimer, for a survey of dynamical systems theory aimed especially at ecologists and population biologists.

Information theory has also played an important role in neurobiology; see the book by Rieke et al. for an up-to-date survey (thanks to Safa Sadeghpour for this reference).

Since DNA sequences by their very nature invite statistical analysis, it is not surprising that molecular biologists continue to use Shannon's concept of entropy in the analysis of patterns in gene sequences; generally speaking they seem to have grown more sophisticated since the earliest such applications (see the book by Yockey for a recent survey). Unfortunately, in a recent and already notorious paper published by Science (which ought to have known better), Mantegna et al. repeated a serious mistake first made by Zipf, discoverer of the very real but not very meaningful Zipf's Law. (More sophisticated biologists were quick to point out the error, which was first explained by Benoit Mandelbrot of fractal fame.)
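The single-symbol Shannon entropy that such analyses start from is easy to compute. Here is a minimal sketch in Python; the function name and the example strings are our own illustration, not taken from any of the papers cited (and real studies, like the Loewenstern-Yianilos work below, use far subtler estimators that account for correlations between symbols):

```python
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Empirical Shannon entropy, in bits per symbol, of a sequence."""
    counts = Counter(seq)
    n = len(seq)
    # H = -sum over symbols of p * log2(p), with p the observed frequency.
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform four-letter alphabet attains the maximum of 2 bits per symbol;
# a composition-biased sequence comes in strictly below that.
print(shannon_entropy("ACGT" * 25))         # uniform composition
print(shannon_entropy("AAAAACGTAAAAACGT"))  # biased toward A
```

Note that this single-symbol estimate ignores all higher-order structure; a sequence can have 2 bits per symbol by this measure and still be perfectly predictable (e.g. `ACGTACGT...`), which is one reason naive entropy estimates of DNA are easy to over-interpret.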

Another area for possible application of information theory arises from the viewpoint that enzymes are really tiny "molecular machines", with the capacity not only to perform physical work but to "make decisions".

**Significantly Lower Entropy Estimates for Natural DNA Sequences,** by David Loewenstern and Peter N. Yianilos, reports on entropy estimates for natural DNA.

**Molecular Information Theory and the Theory of Molecular Machines,** a set of pages maintained by Tom Schneider (Laboratory of Mathematical Biology, NIH), one of the researchers whose work is profiled in the Economist article. Features several readable articles.

At least 23 books on biological information theory have been published since 1964. I regard all of these as more or less controversial; the issues discussed in these books are subtle and extremely complex. Some of the more recent offerings include:

**Information Theory and Molecular Biology,** by Hubert P. Yockey, Cambridge University Press, 1992.

**Information, entropy, and progress: a new evolutionary paradigm,** by Robert U. Ayres, AIP Press, 1994.

**Evolution as entropy: toward a unified theory of biology**, by Daniel R. Brooks and E.O. Wiley, University of Chicago Press, 1986.

**Spikes: exploring the neural code,** by Fred Rieke et al., MIT Press, 1997.

**Evolution, thermodynamics & information: extending the Darwinian program,** by Jeffrey S. Wicken, Oxford University Press, 1987.

**The ontogeny of information: developmental systems and evolution,** by Susan Oyama, Cambridge University Press, 1985.
