The above quote reminded me that I had been meaning to say
something about the conservation of information.
First, let me nitpick the above quote. Surely the authors
do not mean the genetic code. The code is not itself a
storehouse for information; it merely translates the
information stored in DNA into a different "alphabet", that
of proteins. So they must mean genetic information rather
than genetic code.
OK, one commonly sees statements that information can never
increase but may decrease (e.g. Spetner). In contrast,
we also see claims of a law of conservation of information
(Dembski). What I wanted to point out is that one cannot hold
both of these positions simultaneously. Now I'm curious
whether Spetner endorses Dembski's conservation law. Anyone
know? Anyway, the reason is that if one has a localized
decrease in information, then one must also have an increase
in information somewhere else, or else information is not
being conserved.
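The bookkeeping here is the same as for any conserved quantity. A toy sketch (my own illustration, with made-up numbers, not anything from Spetner or Dembski):

```python
# Toy bookkeeping for a conserved quantity: if the total is fixed,
# a local decrease must be balanced by an increase elsewhere.
before = {"subsystem A": 7, "subsystem B": 3}  # total = 10
after = {"subsystem A": 5, "subsystem B": 5}   # A lost 2 units...

# ...so B must have gained exactly those 2 units, or the
# "conservation law" is violated.
assert sum(before.values()) == sum(after.values())
print(after["subsystem B"] - before["subsystem B"])  # 2
```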
Well, I had always taken the conservation of information to be
a bold and daring hypothesis. Lest people misunderstand, bold and
daring hypotheses are good things in science :). After all the
discussion surrounding this hypothesis, I was really floored to
read the following in <Science>:
#"Basic principles of physics teach that information in
#the universe is preserved: If you had perfect knowledge
#of the present, you could, in theory at least, reconstruct
#the past and predict the future. (Such perfect knowledge
#is impossible in practice, of course.) Suppose you threw
#an encyclopedia into a fire, for example; if you had perfect
#knowledge of the radiation emitted and the ensuing motions
#of all the atoms and molecules, you could, with infinite
#attention to the details, reconstruct the knowledge inside
#the encyclopedia. Physicists refer to their equations as
#'unitary'--that is, they preserve information."
#-- Gary Taubes <The Holographic Universe>,
# Science 285:515, 23 July 1999.
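The "unitary" claim amounts to saying the dynamics are reversible: every present state has exactly one past. Here is a toy finite-state analogue (my own illustration, not from the article) contrasting a reversible update rule with an irreversible one:

```python
# Reversible ("unitary"-like) rule on 4 states: a bijection,
# so it can be inverted and the past reconstructed.
reversible = {0: 2, 1: 0, 2: 3, 3: 1}
inverse = {v: k for k, v in reversible.items()}
assert all(inverse[reversible[s]] == s for s in range(4))

# Irreversible rule: states 1 and 3 both map to 0.
irreversible = {0: 2, 1: 0, 2: 3, 3: 0}

# Seeing state 0 now leaves the past ambiguous: it could have
# been 1 or 3. That ambiguity is exactly a loss of information.
preimages_of_0 = [s for s in range(4) if irreversible[s] == 0]
print(preimages_of_0)  # [1, 3]
```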
Quite a remarkable statement. I guess the first thing that
occurs to me is that what little I know about info theory
is in the context of either communication theory (Shannon
info theory) or complexity theory (algorithmic info theory).
I was aware that physicists also use the word information;
perhaps they mean something different by the term? Any
physicists like to clarify? David Bowman?
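For what it's worth, the Shannon measure I'm used to is defined purely over symbol frequencies, with no reference to physical dynamics at all. A minimal sketch (the function name is mine):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol, from empirical frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A string uniform over 4 symbols carries 2 bits per symbol;
# a constant string carries none.
print(shannon_entropy("ACGT"))         # 2.0
print(shannon_entropy("AAAA") == 0.0)  # True
```

Nothing in this definition says the quantity is conserved under physical evolution, which is part of why the <Science> passage puzzles me.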
More interesting would be if anyone would like to defend
that statement. A bad sign is that the author talks about
the knowledge inside the encyclopedia. The best-case scenario
would be that, with "infinite attention to the details", one
might be able to reconstruct the physical arrangement of
ink on paper. But this physical arrangement does not itself
constitute knowledge. And even this ideal case, I would say,
is still impossible. It is reminiscent of Laplace's dream,
which I had thought most everyone now recognizes as
a pipe dream. Am I wrong about this?
Brian Harper | "If you don't understand
Associate Professor | something and want to
Applied Mechanics | sound profound, use the
The Ohio State University | word 'entropy'"
                          |  -- Morowitz