[...]
>
>>PM>And the information theory entropy has no relationship to the
>>entropy as defined by thermodynamics.
>
>>SJ>Someone better tell the author of my daughter's university physics
>>textbook! As he points out, there is a "relationship" between
>>"entropy as defined by thermodynamics" and "information theory":
>
>PM>I disagree since there is no second law of information content.
>
SJ:==
>Your claim was that "information theory entropy has no relationship
>to the entropy as defined by thermodynamics". I responded with a
>quote from a "university physics textbook" that there is a
>"`relationship' between `entropy as defined by thermodynamics' and
>`information theory' ". You replied with an "unsupported assertion"
>that you "disagree since there is no second law of information
>content." Please supply quotes or references to support your
>position "that information theory entropy has no relationship to the
>entropy as defined by thermodynamics."
>
First, let me apologize for not having kept up with this thread.
In particular, I did not see the quote from your daughter's
physics text, nor is it particularly clear to me exactly why
you want to establish a relation between thermodynamic and
informational entropy. At first glance, such a stance seems
odd to me. Suppose there were a second law for informational
entropy. Such a law would guarantee that informational
entropy increases with time. But informational entropy is
just another word for information content, so such a law
would guarantee an increase in information content with time.
Is this what you want?
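
Just to make that concrete, here is a minimal sketch (my own
illustration, in Python, not anything from Pim or from the
textbook) of why nothing like a second law forces Shannon
entropy upward: an ordinary deterministic processing step can
only leave it unchanged or push it down.

from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -SUM p_i log2(p_i), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair six-sided die: six equally likely outcomes.
die = {face: 1.0 / 6.0 for face in range(1, 7)}
print(shannon_entropy(die.values()))      # log2(6), ~2.585 bits

# "Process" the die reading by recording only even/odd -- a
# deterministic, many-to-one map.  The entropy of the result
# is smaller, not larger.
parity = {"even": 0.0, "odd": 0.0}
for face, p in die.items():
    parity["even" if face % 2 == 0 else "odd"] += p
print(shannon_entropy(parity.values()))   # 1.0 bit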

Well, I'm afraid I agree with Pim on this, but I'll quickly
add that this issue is very controversial, with noted
experts on both sides of the aisle. For example, Shannon
and his colleague Weaver disagreed on this point. Having
brought up Shannon, I cannot resist repeating the humorous
anecdote about how Shannon arrived at the name "entropy" for
his measure of information content. Apparently Shannon wanted
to call it a measure of information content but hesitated for
fear of confusion, since "information" has so many different
meanings. He then discussed his little "problem" with his
friend von Neumann, who advised him to call it "entropy" for
two reasons:
"First, the expression is the same as the expression for
entropy in thermodynamics and as such you should not use
two different names for the same mathematical expression,
and second, and more importantly, entropy, in spite of
one hundred years of history, is not very well understood
yet and so as such you will win every time you use entropy
in an argument."
If only Shannon had not listened to von Neumann ;-)

OK, I'll finally answer your request by giving a few quotes
that outline the reasons for saying that there is no relation
between the informational and thermodynamic entropies. Even
though some experts have taken the opposing view, the
arguments presented in the following quotes seem to me
pretty convincing:
==========================================================
As a result of its independent lines of development in
thermodynamics and information theory, there are in science
today two "entropies." This is one too many (see also Denbigh,
1982). It is not science's habit to affix the same name to
different concepts, since common names suggest shared meanings.
Given the inevitable tendency for connotations to flow from the
established to the new, the Shannon entropy began from the
beginning to take on colorations of thermodynamic entropy. In
his introductory chapter to Shannon's paper, Weaver revealingly
quotes Eddington as follows: "The law that entropy always
increases--the second law of thermodynamics--holds, I think, the
supreme position among the laws of Nature" (Shannon and Weaver,
1949). Shannon [*] goes on to rhapsodize that "thus when one
meets the concept of entropy in communication theory, he has
a right to be excited--a right to expect that one has hold of
something that may turn out to be basic and important" (Shannon
and Weaver, 1949).
<<[*] a misprint, this quote is from Weaver not Shannon--BH>>
To appreciate the importance of restricting "entropy" to
thermodynamic applications--or, at most, to other probabilistic
applications where microstate-macrostate relationships provide
a condition of irreversibility--one need only reflect on these
remarks. While an indisputably important contribution to science,
the Shannon formulation does not make contact with the second
law. Shannon himself made no claims that it did. But as long as
the term "entropy" buttresses the Shannon formula, the second
law remains a steady source of justification for ideas that must
find their own grounds of support. If it were possible to treat
"entropy" simply as an equation, with properties dependent on area
of application, calling Shannon's function by that name would be
relatively unproblematic. In point of fact, most who use the
term "entropy" feel something of Weaver's conviction about
contacting a universal principle that provides sweeping laws
of directional change.
[...]
Brooks and Wiley (1985) correctly point out that the Boltzmann
equation did not originate with Boltzmann or with thermodynamic
relationships, but with the eighteenth-century mathematician
DeMoivre's analysis of games of chance. This is just more
evidence that equations are not equivalent to concepts. Everything
that bears the stamp
H = - k SUM P_i log P_i
does not have the property of increasing in time. Irreversibility
must be independently demonstrated. If one has a nonequilibrium
system governed by stochastic dynamics and negotiable energy
barriers, then entropy will increase.
--- Wicken, J.S. (1988). "Thermodynamics, Evolution, and
Emergence: Ingredients for a New Synthesis," in
<Entropy, Information, and Evolution>, Editors B.H.
Weber, D.J. Depew and J.D. Smith, MIT Press, Cambridge,
MA, pp. 139-169.
==================================================
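
A quick aside from me before the next quote: here is a minimal
sketch, in Python, of Wicken's point that a shared formula does
not make two concepts the same. The same functional
-SUM p_i log(p_i) is evaluated below for a Boltzmann distribution
over three made-up energy levels and for the symbol frequencies
of a short made-up message; the energies and the message are
invented purely for illustration. Both computations use the
identical expression, but nothing in the second one involves
temperature, energy, or irreversibility -- that, as Wicken says,
has to be demonstrated from the dynamics, not read off the
formula.

from math import exp, log

def neg_sum_p_log_p(probs):
    """The shared functional -SUM p_i ln(p_i) (natural log)."""
    return -sum(p * log(p) for p in probs if p > 0)

# (a) A Boltzmann distribution over three hypothetical energy
#     levels, with energies expressed in units of kT.
energies = [0.0, 1.0, 2.0]
weights = [exp(-e) for e in energies]
z = sum(weights)                      # partition function
boltzmann = [w / z for w in weights]

# (b) Relative symbol frequencies of a short made-up message.
msg = "abracadabra"
freqs = [msg.count(c) / len(msg) for c in set(msg)]

print(neg_sum_p_log_p(boltzmann))   # same formula...
print(neg_sum_p_log_p(freqs))       # ...entirely different subject

==================================================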
That H and S [**] are irreconcilably different should have been
clear from Shannon's original identification of "the entropy
of the set of probabilities." The thermodynamic entropy S is
a state function property of a physical system. A crystal,
an organism, a star, each has a value of S characteristic of
it. H, on the other hand, cannot refer to a physical system
since it is identified with a mathematical construct. Pursuing
thermodynamics a step further, the quantity TS is an
energy-related, well-defined physical parameter of any physical
system at uniform temperature. What in the world is the quantity
TH? For that matter, what is T for a "set of probabilities" or
a sequence of symbols? If H and S are related in any way that
is more profound than sharing a mathematical formalism, it
should be possible to identify a temperature-analog that can
be meaningfully and usefully applied to sets and sequences.
Wicken has drawn biologists' attention to the incommensurateness
of physical and informational "entropies": "One should begin
then by distinguishing carefully between two distinct kinds
of entropy.... First is thermodynamic entropy itself, which
involves statistical ensembles of micro-structures.... Second
is the entropy of a sequence of elements, which can be defined
as the minimum algorithm or information required for its
unambiguous specification.... It must be appreciated that
sequence entropies are not thermodynamic entropies" (Wicken,
1983 p. 439). Wicken is correct, but he does not go far enough,
in my view. Retaining the name "entropy" for the minimum
algorithm of a sequence ensures that confusion will accompany
attempts to interpret this quantity. Elsasser has expressed this
problem very cogently. As he points out, "Gravitational and
electrostatic equilibrium are characterized by the same
(Laplace's) differential equation. Nobody would think that
gravitational attraction and electrostatic attraction or
repulsion are otherwise identical" (Elsasser, 1983, p. 107).
The reason that nobody makes this mistake is that gravity and
electrostatics have different names. Let us call the minimum
information of a sequence by some other name than entropy.
Perhaps 'information potential' would do, for reasons expressed
below.
--- Olmsted, J. III (1988). "Observations on Evolution," in
<Entropy, Information, and Evolution>, Editors B.H. Weber,
D.J. Depew and J.D. Smith, MIT Press, Cambridge, MA,
pp. 263-274.
<<[**] H and S are the common symbols for the Shannon and
thermodynamic entropies, respectively--BH>>
======================================================
There are a number of papers in the literature in which an
attempt is made to establish a relation between Shannon entropy
and Maxwell-Boltzmann-Gibbs entropy. Eigen (1971) follows
Brillouin (1962) in confusing Maxwell-Boltzmann-Gibbs entropy
with Shannon entropy. This point is discussed in section 12.1.
There is such a relationship if and only if the probability
spaces are isomorphic; in that case they have the same entropy
(section 2.2.1). For example, the entropy of the probability
space characteristic of the tossing of two dice is not equal to
the entropy of the probability space of the tossing of a fair
coin because these probability spaces are not isomorphic.
Furthermore, there is a Shannon entropy for each alphabet
depending on the arbitrary choice of the number of letters.
There is a relationship between the Shannon entropies of
different alphabets if and only if they are related by a code.
It is clear from section 2.2.1 that Shannon entropy and Maxwell-
Boltzmann-Gibbs entropy do not pertain to probability spaces
that are isomorphic. In addition, these probability spaces are
not related by a code. For that reason Maxwell-Boltzmann-Gibbs
entropy is not applicable to the probability spaces used in
communication theory or in the genetic information system of
molecular biology. Furthermore, in thermodynamics and statistical
mechanics there is an integral of the motion of the system,
namely, the conservation of energy, which has no counterpart
in information theory. For these reasons there is, therefore,
no relation between Maxwell-Boltzmann-Gibbs entropy of statistical
mechanics and the Shannon entropy of communication systems
(Yockey, 1974, 1977c).
--- Yockey, H. (1992). <Information Theory and Molecular Biology>,
Cambridge University Press, p. 70.
=================================================================
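
And to put numbers on Yockey's dice-and-coin example, here is my
own quick calculation, in Python, of the two Shannon entropies he
mentions. I am taking the two-dice space to be the 36 equally
likely ordered outcomes; the point is only that the two entropies
differ.

from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -SUM p_i log2(p_i), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Probability space for one toss of a fair coin: 2 equally
# likely outcomes.
coin = [0.5, 0.5]

# Probability space for one toss of two fair dice: 36 equally
# likely ordered outcomes.
two_dice = [1.0 / 36.0] * 36

print(shannon_entropy(coin))       # 1.0 bit
print(shannon_entropy(two_dice))   # log2(36), about 5.17 bits
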
Brian Harper
Associate Professor
Applied Mechanics
The Ohio State University
"If cucumbers had anti-gravity,
sunsets would be more interesting"
-- Wesley Elsberry