Re: Lee Spetner's book

Biochmborg@aol.com
Tue, 14 Sep 1999 16:23:10 EDT

In a message dated 9/14/99 2:04:05 PM Mountain Daylight Time,
dbowman@tiger.georgetowncollege.edu writes:

> I suspect it was only a slip of the fingers, but what Kevin wrote here can
> not be correct as stated. Measures of information/entropy are never
> proportional to the number of possibilities or microstates (as Kevin
> states above). Rather, they are proportional to the *logarithm* of such a
> number.

Thanks for the comment, Dave. The actual relationship is (according to
Brooks and Wiley) Hmax = log A, where Hmax is the maximal entropy capacity of
the system (which is equivalent to the maximal information capacity) and A is
the number of microstates. (Sometimes it can be written as N log A, where N
is the number of entities within the system, or k log A, where k is a
constant that converts the result into bits. Collier has used k N log A.)
Since that is the equality, I figured that the log function could be treated
like a proportionality constant, so that one could loosely say Hmax is
proportional to A. That is true only in the qualitative sense that Hmax
increases as A increases; strictly speaking, Hmax is proportional to log A
rather than to A, so the value of A does not equal the value of Hmax.
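
For concreteness, here is a small Python sketch (my own illustration, not
from Brooks and Wiley; the function name h_max and the choice of natural
logs with a bit-conversion constant are my assumptions) showing that
Hmax = k N log A grows with A without being proportional to it:

    import math

    BITS = 1.0 / math.log(2)   # k: converts a natural-log entropy into bits

    def h_max(A, N=1, k=BITS):
        # Hmax = k * N * log A, where A is the number of microstates
        # and N is the number of entities within the system
        return k * N * math.log(A)

    print(h_max(4))    # 2.0 bits
    print(h_max(8))    # 3.0 bits: doubling A adds one bit, not double Hmax
    print(h_max(64))   # 6.0 bits: A grew 16-fold, Hmax merely tripled

So the two quantities rise and fall together, but the growth of Hmax is
logarithmic, not linear, in A.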

My apologies, however, if I committed a mathematical faux pas.

Kevin L. O'Brien