I'm not quite sure what you are getting at with this quote. I
think most of us are aware that Yockey is perhaps the foremost
expert on biological applications of information theory alive
today. Yet he disagrees completely and totally with what you say.
I'm not trying to use this as an argument from authority, just
curious why you're quoting Yockey at this point. One of the major
"campaigns" of his book is to argue that Shannon entropy and
Kolmogorov complexity *do* measure biological information
and he warns his reader *against* the temptation to try to include
meaning or specificity as part of the definition of biological
information.
I think I can sympathize with your concern about specificity. I
have the same concern. But it's important not to throw the baby
out with the bath water. The Shannon entropy and K-complexity
provide really great, objective definitions for the amount of
information contained in a message. If they cannot measure specificity,
so what? There are other ways of getting at that. And actually,
we learn some interesting things when we start looking at the
specificity side of it. For example, Yockey estimates that there
are about 9.7 x 10^93 iso-1-cytochrome c sequences that carry the
same specificity. I think this is somewhat contrary to one's
intuitive notion of specificity. For example, many probability
calculations are carried out under the assumption that one and
only one sequence of amino acids will suffice for a particular
function. So, iso-1-cytochrome c does require specificity in
that not just any old sequence will do, yet there is still quite
a bit of arbitrariness in the specificity ;-).
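Just to make the "amount of information" side of this concrete,
here is a little Python sketch of my own (the sequence in it is
completely made up, not a real protein fragment) that computes the
empirical Shannon entropy of a symbol string. Notice that the
number it produces says nothing at all about function or
specificity:
----------------------------------------------------------
# Empirical Shannon entropy of a symbol sequence, in bits per symbol.
# The example string below is arbitrary and purely illustrative.
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """H = -sum p_i * log2(p_i), estimated from symbol frequencies."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * log2(c / n) for c in counts.values())

seq = "ACDEFGHIKLMNPQRSTVWYAAGGLLKK"  # made-up string over the 20-letter amino acid alphabet
print(f"{shannon_entropy(seq):.2f} bits per residue")
# The maximum for a 20-letter alphabet is log2(20), about 4.32 bits/residue.
# A random scramble of the same letters gives exactly the same number,
# which is the point: this measures amount, not specificity.
----------------------------------------------------------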
But it's important to keep this issue separate from the amount
of information as measured by Shannon entropy or K-complexity.
In this way one can keep distinct two important and independent
properties of functional biopolymers: (1) the amount of information
stored in such polymers is extremely large, and (2) functionality
requires specificity. I believe that you want to combine the
two. I say, your arguments against naturalistic abiogenesis will
be much stronger if you keep them distinct. It is the combination
of these two features (provided their independence is maintained)
which gives Manfred Eigen headaches. If you allow Eigen the liberty
of combining them by introducing a "value parameter," then his
hypercycles do a pretty good job of explaining how informational
biomolecules could arise in a primordial soup. If you do not
allow him this luxury, he's cooked in his own soup. Yockey goes
into these things in detail in his book.
I would advise Steve and all creationists to give this some thought.
If you want to include meaning in the definition of information, then
certainly you cannot deny Eigen the same luxury.
>SJ>Dembski says that "information" is not only "the transmission of
>signals across a communication channel":
>
>"The fundamental intuition underlying information is not, as is
>commonly thought, the transmission of signals across a communication
>channel, but rather, the ruling out of possibilities. To be sure,
>when signals are transmitted across a communication channel,
>invariably a set of possibilities is ruled out, namely, those signals
>which were not transmitted. But to acquire information remains
>fundamentally a matter of ruling out possibilities, whether these
>possibilities comprise signals across a communication channel or take
>some other form." (Dembski W.A., "Intelligent Design as a Theory of
>Information", January 1997, Indiana, USA)
>
>>BH>Steve Jones wanted to define information in terms of specified
>>complexity but this definition never was particularly clear to me.
>
>SJ>It wasn't *my* definition. That is how origin of life specialist
>Orgel defined it:
>
>"Living organisms are distinguished by their specified complexity.
>Crystals fail to qualify as living because they lack complexity;
>mixtures of random polymers fail to qualify because they lack
>specificity." (Orgel L.E., "The Origins of Life", 1973, p189, in
>Thaxton C.B., et al., "The Mystery of Life's Origin", 1992, p130)
>
>Interestingly, it appears to be also Dawkins' definition, namely a
>"quality, specifiable in advance":
>
>"This has been quite a long, drawn-out argument, and it is time to
>remind ourselves of how we got into it in the first place. We were
>looking for a precise way to express what we mean when we refer to
>something as complicated. We were trying to put a finger on what it
>is that humans and moles and earthworms and airliners and watches
>have in common with each other, but not with blancmange, or Mont
>Blanc, or the moon. The answer we have arrived at is that
>complicated things have some quality, specifiable in advance, that
>is highly unlikely to have been acquired by random chance alone."
>(Dawkins R., "The Blind Watchmaker", 1991, p9)
>
First of all, I appreciate your going to the trouble of backing
up your points with some quotes from authorities. I would like
to try to head off, if I possibly can, one avenue of discussion
before we waste too much time on it. One of the things that makes
this topic so interesting is that one can find a really wide
variety of viewpoints expressed by various "authorities" with
a great amount of arguing and disagreement. So, if your intent
is to present various authorities who disagree with Brian, then
let's get that over with now. I'm pretty confident that for
every authority you can find who disagrees with Brian, I'll find
at least two more that also disagree with Brian :). For example,
Manfred Eigen, Jeffrey Wicken, J.P. Ryan, and H.A. Johnson also
disagree with Brian. I would be much harder pressed to find
authorities who agree with me. There are some, but I could
probably only name, say, one for every five or so that disagree.
So, I joyfully admit that mine is a minority view. If you
intend to win the argument by a vote of authorities then let's
end it here. You win hands down ;-).
>BH>The paper by Dembski (which I haven't read completely yet) tries
>>to make the definition of specified complexity (he calls it CSI,
>>complex specified information) more concrete and objective. Looks
>>very interesting.
>
>SJ>Indeed. Here is the abstract:
>
[ skipped abstract]
>----------------------------------------------------------
>
>BH>In particular, he tries to develop a Law of
>>Conservation of Information, which is exactly what I had suggested
>>to Steve needs to be done.
>
>SJ>Brian was actually requiring that *I* do it:
>
Steve was kind enough to provide what I wrote below. Interested
parties can read it and see that, as a matter of fact, it was
a suggestion of what needed to be done, and that nowhere did I
state or imply that I was requiring Steve to do it. For example,
my very first sentence contains the phrase "... I've come up with a
suggestion that might be worth pursuing".
>----------------------------------------------------------
>Date: Sun, 01 Dec 1996 20:32:15 -0500
>To: evolution@calvin.edu
>From: "Brian D. Harper" <harper.10@osu.edu>
>Subject: Re: design: purposeful or random?
>
>[...]
>
>I've done a little thinking on this since my initial question to
>you and I've come up with a suggestion that might be worth
>pursuing. Note that this is just an intuitive guess on my
>part (a non-expert) but it seems to me that some measure
>of mutual information content probably will satisfy the condition
>of not increasing due to a random mutation (the classical Shannon
>information would increase). I'm also suspecting that mutual
>information may also capture the basic ideas of irreducible
>complexity with the added bonus of being objectively measurable.
>
>I kind of like this idea, I wonder if someone has already thought
>of it. Probably :). Let's suppose that some clever person comes
>up with a measure of mutual information content and can show
>that this information cannot increase due to a random mutation.
>This would be only a small step towards providing any threat to
>neo-Darwinism. One would still have to show that this information
>measure doesn't increase due to the combination random mutation
>plus selection mechanism. Good luck. It seems to me that one is
>searching for some general principle that might be called the
>"conservation of information". Stated this way it almost makes
>me giggle out loud. OK, I confess, I did giggle a little as I wrote
>that :). But who knows, maybe there is such a principle at least
>in some narrower sense, i.e. "information is always conserved
>for certain types of physical processes X, Y and Z".
>
>OK, I've helped you out enough already ;-). When you discover
>the CoI Principle and become famous, don't forget where you
>got the idea ;-).
>----------------------------------------------------------
>
>The unreasonableness of Brian's requirement that I develop a theory
>of "conservation of information" in order to support my claim that:
>
> "...one thing that `random mutation' cannot do is to `create new
> information'"
>
>is seen by the fact that philosopher-mathematician Dembski is only
>just now breaking new ground in developing such a theory himself.
>
See, you missed your big chance already ;-).
Looking back over what I wrote previously, I think there is another
good idea (or at least one worth pursuing), namely mutual information.
This is an objective measure which gets at the idea of irreducible
complexity, although perhaps indirectly and without addressing
functionality. What it could do, however, is measure in an objective
way the degree of interconnectedness of various components of a
system.
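For anyone who wants to play with the idea, here is a rough Python
sketch of the sort of calculation I have in mind: the empirical
mutual information between two "components," here two aligned
sequence positions. The toy data are invented purely for
illustration and carry no biological significance:
----------------------------------------------------------
# Empirical mutual information between two sequence positions, in bits.
# The toy "alignment columns" below are invented for illustration only.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) = sum over (x,y) of p(x,y) * log2[ p(x,y) / (p(x)*p(y)) ]."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# When position 1 is G, position 2 is always C; when A, always T.
pos1 = ['G', 'G', 'A', 'G', 'A', 'G', 'A', 'G']
pos2 = ['C', 'C', 'T', 'C', 'T', 'C', 'T', 'C']
print(f"I(pos1; pos2) = {mutual_information(pos1, pos2):.2f} bits")
# A value near zero means the two positions vary independently; larger
# values indicate the kind of interconnectedness described above.
----------------------------------------------------------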
Just so I'm perfectly clear: in no way, shape, or form am I requiring
Steve Jones to pursue this topic.
Brian Harper
Associate Professor
Applied Mechanics
Ohio State University
"Aw, Wilbur" -- Mr. Ed