>... The simplest demonstration of this is to set
>information/entropy proportional to the number of possible microstates in a
>system. If a system consists of a population of animals and a microstate is
>a genotype, then as the number of genotypes possessed by that population
>increases, so does the entropy of the system, and thus so does the
>information of the system.
I suspect it was only a slip of the fingers, but what Kevin wrote here
cannot be correct as stated. Measures of information/entropy are never
proportional to the number of possibilities or microstates (as Kevin
states above). Rather, they are proportional to the *logarithm* of such a
number: for W equally likely microstates, Boltzmann's formula gives
S = k*log(W), and the Shannon information is log2(W) bits. The logarithm
is what makes entropy additive when independent systems, whose microstate
counts multiply, are combined.
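
To make the point concrete, here is a minimal Python sketch (my own
illustration, not anything from Kevin's post), assuming equally likely
microstates and base-2 Shannon entropy. It shows why the logarithm is
needed: when two independent systems are combined, their microstate
counts multiply, but their entropies should add.

  import math

  # Two independent subsystems with W1 and W2 microstates each.
  W1, W2 = 8, 16

  # The combined system can be in any pairing of microstates,
  # so its microstate count is the product W1 * W2.
  W_total = W1 * W2

  # Entropy as the log (base 2, in bits) of the microstate count:
  H1 = math.log2(W1)            # 3 bits
  H2 = math.log2(W2)            # 4 bits
  H_total = math.log2(W_total)  # 7 bits

  # The logarithm turns the multiplication of counts into an
  # addition of entropies; a measure proportional to W itself
  # would give 128 for the combined system, not 3 + 4 = 7.
  assert math.isclose(H_total, H1 + H2)
  print(H1, H2, H_total)  # 3.0 4.0 7.0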
David Bowman
dbowman@georgetowncollege.edu