See A General Review of Dynamical Systems, by John Guckenheimer, for a survey of dynamical systems theory aimed especially at ecologists and population biologists.
Information theory has also played an important role in neurobiology; see the book by Rieke et al. for an up-to-date survey (thanks to Safa Sadeghpour for this reference).
Since DNA sequences by their very nature invite statistical analysis, it is not surprising that molecular biologists continue to use Shannon's concept of entropy in the analysis of patterns in gene sequences; generally speaking, these applications have grown more sophisticated since the earliest attempts (see the book by Yockey for a recent survey). Unfortunately, in a recent and already notorious paper published by Science (which ought to have known better), Mantegna et al. repeated a serious mistake first made by Zipf, discoverer of the very real but not very meaningful Zipf's Law. (More sophisticated biologists were quick to point out the error, which was first explained by Benoit Mandelbrot of fractal fame.)
Another area for possible application of information theory arises from the viewpoint that enzymes are really tiny "molecular machines", with the capacity not only to perform physical work but to "make decisions".
At least 23 books on biological information theory have been published since 1964. I regard all of these as more or less controversial; the issues they discuss are subtle and extremely complex. Some of the more recent offerings include: