>
>Let me try again. If the entire data set of organisms consists of say,
>100,000 genes, and the number of organisms in the data set is (apparently)
>changing through time, through the development of new combinations of these
>genes, doesn't the total complexity of the biome increase without an
>increase in information? This would predispose the history of life to
>exhibit an increase in complexity, just by the different number of
>combinations expressed, wouldn't it? At least this is a manifestation of
>the difficulty I have in equating information and complexity.
Ah, I think I understand what you're getting at now, sorry for
being slow.
I believe this may be an example of where it matters which
measure of information is being used. I just received my Dec 11
<Science>, which contains a special section on C. elegans that
I believe you were referring to in your previous post. One of
the articles (Ruvkun and Hobert) has some discussion of what I
believe is the primary mechanism for increasing information
content, namely gene duplication and divergence. Despite what
some individuals have said in the past :), I don't think that
duplication by itself results in much of an increase. Thinking
in terms of algorithmic (descriptive) complexity, the duplication
itself can be described simply as "repeat". So the real increase
comes from the subsequent divergence of the duplicated section. Once
divergence occurs, a simple "repeat" no longer suffices. From the
point of view of algorithmic complexity, this duplication/divergence
process would have to be considered an increase in both information
and complexity.
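To make that concrete, here is a rough sketch (my own illustration, not from
the post or the papers mentioned) using zlib compression as a crude, computable
stand-in for algorithmic complexity, which is itself uncomputable. An exact
duplication compresses away almost entirely ("repeat"), while a diverged copy
does not; the gene string and mutation rate are made up for the demo.

```python
import random
import zlib

random.seed(0)
# a made-up "gene": 1000 random DNA bases
gene = ''.join(random.choice('ACGT') for _ in range(1000))

original = gene
duplicated = gene + gene  # exact copy: describable as "repeat"

# diverge the second copy: re-randomize ~20% of its positions
mutated = list(gene)
for i in random.sample(range(len(mutated)), 200):
    mutated[i] = random.choice('ACGT')
diverged = gene + ''.join(mutated)

def c(s):
    """Compressed size in bytes, a crude proxy for descriptive complexity."""
    return len(zlib.compress(s.encode(), 9))

# duplication adds only a few bytes; divergence adds many more
print(c(original), c(duplicated), c(diverged))
```

Compression is only a lower-bound proxy, of course, but it shows the shape of
the argument: the duplicated string costs barely more to describe than the
original, and almost all of the added description length comes from divergence.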
But I think your statement above may be approaching information
content from the Shannon point of view. Unfortunately, our
discussions here often seem to muddle these two information
concepts together when actually there are some important differences.
There are several people on this list who know a great deal more
about this than I do. Hopefully they'll correct me if I botch this
up :). From the Shannon angle, it seems one might be able to take
"the entire data set of organisms" that you refer to above (which
I would take to include all possible organisms, whether actualized
or not) as the ensemble of messages in Shannon's theory, with
the information being the uncertainty as to which is actualized.
If it's possible to look at it this way, then I would say that
this information measure would be constant and that evolution
would be a process of actualizing specific possibilities.
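A toy sketch of that Shannon view (again my own, with a made-up four-member
ensemble): the entropy is a property of the probability distribution over
possible messages, so it stays fixed no matter which member of the ensemble
happens to be actualized; it only changes if the distribution itself changes.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# e.g. four equally likely possible "organisms"
ensemble = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(ensemble))  # 2.0 bits

# actualizing one of the four possibilities doesn't alter this number;
# the measure belongs to the ensemble, not to the outcome
```

On this reading, evolution picks out which possibility is realized without
changing the ensemble's entropy, which is why the Shannon measure can sit
still while the algorithmic measure of a particular genome moves.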
If the above is correct, then we have an interesting case where
one information measure is stationary while another may be
increasing or decreasing. Anyone have comments/refutations?
Brian Harper
Associate Professor
Applied Mechanics
The Ohio State University
"He who establishes his arguments
by noise and command shows that
reason is weak" -- Montaigne