On Mon, 17 Feb 1997 00:50:49 -0500, Brian D Harper wrote:
[...]
>BH>If defined a la Shannon or Kolmogorov then it seems pretty clear
>that an increase in information is expected due to random
>mutations.
>SJ>Agreed. But this may not be what is meant by "information" in
>biology, namely the assignment of specified meaning:
>
>"...The assignment of meaning or in biology, specificity, to
>certain particular member of the ensemble lies outside information
>theory." (Yockey H.P., Journal of Theoretical Biology, 46, 1974,
>pp371-372)
BH>I'm not quite sure what you are getting at with this quote. I
>think most of us are aware that Yockey is perhaps the foremost
>expert on biological applications of information theory alive
>today. Yet he disagrees completely, totally with what you say.
That's strange, because I agree with what Yockey says above!
BH>I'm not trying to use this as an argument from authority, just
>curious why you're quoting Yockey at this point. One of the major
>"campaigns" of his book is to argue that Shannon entropy and
>Kolmogorov complexity *do* measure biological information
>and he warns his reader *against* the temptation to try to include
>meaning or specificity as part of the definition of biological
>information.
That's OK. We are just using the word "information" differently. He
rules out "the assignment of meaning or in biology, specificity"
from his definition of information, whereas that is exactly what I
mean by it. There is no necessary conflict between us, provided the
word "information" is defined up front, as Yockey does in this
article.
BH>I think I can sympathize with your concern about specificity. I
>have the same concern. But it's important not to throw the baby
>out with the bath water. The Shannon entropy and K-complexity
>provide really great, objective definitions for the amount of
>information contained in a message.
No doubt. But if information theory says that:
AAAAAAAAAAAAAAAAAAAAAAAAXAAAAAAAAAAAAAAAAAAAAAAA
has more "information" than:
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
then we are using the word "information" differently. The first
string might carry more "information" in terms of information theory,
but both strings would carry no "information" in terms of
specified complexity.
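To make this concrete, here is a small illustrative sketch in Python
(my own, not anyone's published method): it computes the empirical
Shannon entropy of the symbol frequencies, and uses compressed length
as a crude, computable stand-in for Kolmogorov complexity.

import math, zlib

s1 = "A" * 24 + "X" + "A" * 23   # the string containing the single X
s2 = "A" * 48                    # the uniform string

def shannon_bits_per_symbol(s):
    # empirical Shannon entropy of the symbol frequencies in s
    n = len(s)
    probs = [s.count(c) / n for c in set(s)]
    return sum(p * math.log2(1.0 / p) for p in probs)

for label, s in (("with X ", s1), ("all A's", s2)):
    print(label,
          round(shannon_bits_per_symbol(s), 3), "bits/symbol;",
          len(zlib.compress(s.encode())), "bytes compressed")

The string with the X scores higher on both counts (about 0.15
bits/symbol versus 0, and typically a byte or two more when
compressed), yet neither string is "specified" in the biological
sense.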
BH>If it cannot measure specificity, so what, there are other ways
>of getting at that.
This merely confirms my point: "The Shannon entropy and K-complexity
provide really great, objective definitions for the amount of
information contained in a message" *but* "it cannot measure
specificity". Since "specificity" is what I mean by "information", we
are all agreed.
BH>And actually, we learn some interesting things when we start
>looking at the specificity side of it. For example, Yockey
>estimates that there are about 9.7 x 10^93 iso-1-cytochrome c
>sequences that carry the same specificity. I think this is somewhat
>contrary to one's intuitive notion of specificity. For example,
>many probability calculations are carried out under the assumption
>that one and only one sequence of amino acids will suffice for a
>particular function. So, iso-1-cytochrome c does require
>specificity in that not just any old sequence will do, yet there is
>still quite a bit of arbitrariness in the specificity ;-).
I would not agree that less-than-perfect specificity is necessarily
due to "arbitrariness". It could be exactly as planned, the result
either of some degeneration over time, or of a built-in redundancy,
or both.
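To give a feel for the numbers, here is a naive back-of-the-envelope
sketch (my own; the chain length of 110 residues is an assumption for
illustration only, not Yockey's actual figure or method):

import math

functional = 9.7e93        # Yockey's estimate of functionally equivalent sequences
chain_len = 110            # hypothetical chain length (my assumption)
total = 20.0 ** chain_len  # all possible sequences of that length
print("total sequence space ~ 10^%d" % round(math.log10(total)))
print("functional fraction  ~ 10^%d" % round(math.log10(functional / total)))

On that assumption, even 9.7 x 10^93 "equivalent" sequences are still
only about one part in 10^49 of the possible sequences of that
length: a great deal of "arbitrariness" in absolute terms, yet a
great deal of specificity relative to the whole sequence space.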
BH>But it's important to keep this issue separate from the amount
>of information as measured by Shannon entropy or K-complexity.
>In this way one can keep distinct two important and independent
>properties of functional biopolymers: (1) the amount of information
>stored in such polymers is extremely large.
Agreed. Information theory is useful in that it can measure the
quantity of "information" without being able to measure what it
means.
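As a rough illustration of just how large point (1) can get, a tiny
sketch (mine, and it assumes, unrealistically, equiprobable and
independent symbols, so it overstates the true Shannon entropy):

import math

bits_per_amino_acid = math.log2(20)  # ~4.32 bits, assuming 20 equiprobable residues
bits_per_nucleotide = math.log2(4)   # 2 bits, same assumption
print("100-residue protein:", round(100 * bits_per_amino_acid), "bits")
print("1,000,000-base DNA :", round(1000000 * bits_per_nucleotide), "bits")

Roughly 430 bits for a modest protein and two million bits for a
modest genome, and again this says nothing at all about what any of
it means.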
BH>(2) functionality requires specificity.
Agreed. To perform any meaningful function, there must be
"specificity".
BH>I believe that you want to combine the two.
Brian puts words into my mouth again. I am doing my best to
distinguish between the two types of "information", but Brian keeps
claiming that I mean the same thing by both!
BH>I say, your arguments against naturalistic abiogenesis will be
>much stronger if you keep them distinct.
That's exactly what I *am* doing!
BH>It is the combination of these two features (provided their
>independence is maintained) which gives Manfred Eigen headaches. If
>you allow Eigen the liberty of combining them by introducing a "value
>parameter" then his Hypercycles do a pretty god job of explaining how
>informational biomolecules could arise in a primordial soup. If you
>do not allow him this luxury, he's cooked in his own soup. Yockey
>goes into these things in detail in his book.
Agreed, but Eigen's theory fails anyway because it requires chance to
get the origin of life started:
"In the form of the self organization scenario due to Eigen (1971)
and Eigen & Schuster (1977) chance plays its role at the "beginning"
so that the origin of life started from random events. According to
them there is an initial period of stochastic scanning before the
autocatalytic hypercycles appear. The information carriers which he
regards as being nucleic acids are believed to have formed by self
organization from random polymers. "Orderliness" then emerges by
means of self-sustaining autocatalytic hypercycles. The
calculations given above show that the required number of high
probability sequences cannot be scanned with the material and the
time available so that the scenario cannot get started." (Yockey
H.P., "Self Organization Origin of Life Scenarios and Information
Theory", Journal of Theoretical Biology, 91, 1981, p23)
And as Thaxton et al point out, dependence on chance is a major
weakness in the whole chemical evolutionary theory:
"In his comprehensive application of non equilibrium thermodynamics
to the evolution of biological systems, Eigen has shown that
selection could produce no evolutionary development in an open system
unless the system were maintained far from equilibrium. The reaction
must be autocatalytic but capable of self-replication. He develops
an argument to show that in order to produce a truly self-
replicating system the complementary base-pairing instruction
potential of nucleic acids must be combined with the catalytic
coupling function of proteins. Kaplan has suggested a minimum of
20-40 functional proteins and a similar number of nucleic acids would
be required by such a system. Yet as has previously been noted, the
chance origin of even one protein of 100 amino acids is essentially
zero....Periodically, we see reversions (perhaps inadvertent ones) to
chance in the theoretical models advanced to solve the problem.
Eigen's model illustrates this well. The model he sets forth must
necessarily arise from chance events and is nearly as incredible as
the chance origin of life itself. The fact that generally chance has
to be invoked many times in the abiotic sequence has been called by
Brooks and Shaw "a major weakness in the whole chemical evolutionary
theory" (Thaxton C.B., Bradley W.L. & Olsen R.L., "The Mystery of
Life's Origin", 1992, p154)
BH>I would advise Steve and all creationists to give this some
>thought. If you want to include meaning in the definition of
>information then certainly you cannot deny Eigen the same luxury.
Brian still fails to realise that the one word "information" is being
used in two different senses.
[...]
>BH>Steve Jones wanted to define information in terms of specified
>complexity but this definition never was particularly clear to me.
>SJ>It wasn't *my* definition. That is how origin of life specialist
>Orgel defined it:
>
>"Living organisms are distinguished by their specified complexity.
>Crystals fail to qualify as living because they lack complexity;
>mixtures of random polymers fail to qualify because they lack
>specificity." (Orgel L.E., in Thaxton C.B., et al., "The Mystery
>of Life's Origin", 1992, p130)
>
>Interestingly, it appears to be also Dawkins' definition, namely a
>"quality, specifiable in advance":
>
>"...complicated things have some quality, specifiable in advance,
>that is highly unlikely to have been acquired by random chance
>alone." (Dawkins R., "The Blind Watchmaker", 1991, p9)
BH>First of all, I appreciate your going to the trouble of backing
>up your points with some quotes from authorities.
This is a pleasant change. A few weeks ago Brian was claiming that
quoting authorities was "abusing the group".
BH>I would like to try to head off, if I possibly can, one avenue of
>discussion before we waste too much time on it. One of the things
>that makes this topic so interesting is that one can find a really
>wide variety of viewpoints expressed by various "authorities" with
>a great amount of arguing and disagreement. So, if your intent is
>to present various authorities who disagree with Brian, then let's
>get that over with now. I'm pretty confident that for every
>authority you can find who disagrees with Brian, I'll find at least
>two more that also disagree with Brian :). For example, Manfred
>Eigen, Jeffery Wicken, J.P. Ryan, H.A. Johnson also disagree with
>Brian. I would be much harder pressed to find authorities who agree
>with me. There are some, but I could probably only name say one for
>every five or so that disagree. So, I joyfully admit that mine is a
>minority view. If you intend to win the argument by a vote of
>authorities then let's end it here. You win hands down ;-).
I am not interested in winning an argument against Brian or anyone
else; I am only interested in finding the truth. I would rather lose
an argument and find truth, than win an argument and remain in
error.
>BH>The paper by Dembski (which I haven't read completely yet) tries
>to make the definition of specified complexity (he calls it CSI,
>complex specified information) more concrete and objective. Looks
>very interesting.
>
>SJ>Indeed. Here is the abstract:
[...]
>BH>In particular, he tries to develop a Law of Conservation of
>Information, which is exactly what I had suggested to Steve needs to
>be done.
>SJ>Brian was actually requiring that *I* do it:
BH>Steve was kind enough to provide what I wrote below. Interested
>parties can read it and see that as a matter of fact it was
>a suggestion of what needed to be done and that nowhere did I
>state or imply that I was requiring Steve to do it. For example,
>my very first sentence contains the phrase "... I've come with a
>suggestion that might be worth pursuing".
That's not how I took it. What else could "I've helped you out
enough already...When you discover..." mean? It was all part of
Brian's unreasonable request that I come up with an objective
definition of information in terms of specified complexity.
>----------------------------------------------------------
>Date: Sun, 01 Dec 1996 20:32:15 -0500
>To: evolution@calvin.edu
>From: "Brian D. Harper" <harper.10@osu.edu>
>Subject: Re: design: purposeful or random?
[...]
>OK, I've helped you out enough already ;-). When you discover
>the CoI Principle and become famous, don't forget where you
>got the idea ;-).
>----------------------------------------------------------
>SJ>The unreasonableness of Brian's requirement that I develop a
>theory of "conservation of information" in order to support my claim
>that...is seen by the fact that philosopher-mathematician Dembski
>is only just now breaking new ground in developing such a theory
>himself.
BH>See, you missed your big chance already ;-).
No. There was no "chance" that I could have done it.
BH>Looking back over what I wrote previously I think there is another
>good idea (or at least one worth pursuing), namely mutual information.
>This is an objective measure which gets at the idea of irreducible
>complexity, although perhaps indirectly and without addressing
>functionality. What it could do, however, is measure in an objective
>way the degree of interconnectedness of various components of a
>system.
>
>Just so I'm perfectly clear. In no way shape or form am I requiring
>Steve Jones to pursue this topic.
Whew. That's a relief!
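For what it is worth, here is a rough sketch of the sort of thing
mutual information measures (a toy example of my own in Python; the
pairing of symbols and the tiny sample are my assumptions, not
Brian's proposal):

from collections import Counter
import math

def mutual_information(xs, ys):
    # I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x)*p(y)) )
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

xs = "ACGTACGTACGT"
ys = "TGCATGCATGCA"   # complementary pairing: knowing x fixes y
print(round(mutual_information(xs, ys), 3), "bits")

Two components that vary independently give a mutual information
near zero; tightly coupled components, as in this toy complementary
pair (2.0 bits), give a large value, which is the kind of
"interconnectedness" Brian refers to.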
God bless.
Steve
-------------------------------------------------------------------
| Stephen E (Steve) Jones ,--_|\ sejones@ibm.net |
| 3 Hawker Avenue / Oz \ Steve.Jones@health.wa.gov.au |
| Warwick 6024 ->*_,--\_/ Phone +61 9 448 7439 (These are |
| Perth, West Australia v my opinions, not my employer's) |
-------------------------------------------------------------------