Greg expressed his opinion that macroevolution can't occur. He didn't cite
any data; I cited the last chapter of Gilbert's Developmental Biology.
Why wouldn't you be curious enough to at least go look at that book rather
than accept an opinion that agrees with your own as authoritative?
>
>But I am a bit confused as to what you think creates the information: is it
>generational or sequential? First we were arguing generational, then you
>changed to sequential, which doesn't seem to have much effect on the whole
>organism in terms of major change.
The information of a DNA molecule is sequential. One must measure H over
the nucleotides along the sequential axis, NOT the generational axis. The
generational axis was the only way I could make sense of the statement in
your first post, when you said on Tue, 23 Jun 1998 22:17:39 +0800:
>A DNA sequence of AAAAATAAAA will output this each and every
>time eg: AAAAATAAAA AAAAATAAAA AAAAATAAAA
If I take the above statement as being sequential then it is an assertion
that DNA is repeated segments of
AAAAATAAAAAAAAATAAAAAAAAATAAAAAAAAATAAAA..., which is clearly nonsense and
observationally wrong. But if placed in a generational axis, the above
statement makes sense. I was merely trying to give you the benefit of the
doubt.
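To make the sequential measurement concrete, here is a minimal Python
sketch of my own (the sequences are invented for illustration) that
computes H from the nucleotide frequencies along a single strand:

    import math

    # Shannon entropy H measured along the sequential axis of ONE
    # DNA string, from the observed nucleotide frequencies p[i].
    def sequential_entropy(seq):
        n = len(seq)
        counts = {}
        for base in seq:
            counts[base] = counts.get(base, 0) + 1
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    print(sequential_entropy("AAAAATAAAA"))    # ~0.47 bits: one base dominates
    print(sequential_entropy("ACGTTGCAGTCA"))  # 2.0 bits: all bases equiprobable

Copying the same string generation after generation changes nothing in
this calculation; the generational axis contributes no terms to H.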
>
>i.e., one cell with a faulty protein isn't going to be passed to the
>offspring. In fact wouldn't it have to be DNA replicating in a sex cell
>which can create the necessary information that you require?
Absolutely. What is passed on are the germ cells and the DNA contained in
them. Thus when a body carries a strongly advantageous mutation, it is
likely to leave more offspring, thereby multiplying the germ cells that
carry that mutation.
>
>I would like a clarification of the position you are arguing, it seems to
>vary a bit as the discussion continues.
I try to stay put, but I am responding to various issues, and you may
perceive them as different; they aren't. I also think that part of the
problem is that you haven't read Yockey and so can't quite grasp the
connotation of the words as Yockey uses them. I follow Yockey's work,
which I find quite credible and consistent.
>> Once again you are equivocating on the term information as used
>> for knowledge or intelligence rather than information as a
>> mathematically defined concept. See Greg Billock's post
>> yesterday or see Yockey's "An Application of Information Theory
>> to the Central Dogma and the Sequence Hypothesis," Journal of
>> Theoretical Biology 46:369-406.
>>
>
>I disagree here. Information is related to and requires meaning. You cannot
>have information without meaning. You CAN have information carrying capacity
>without meaning, but no information.
Then you will be disagreeing with what is becoming the norm in information
science. And on this we will continue to disagree. Yockey notes:
"Accordingly, we will keep in mind that the word *information* is the name
of a mathematical function. We must not allow ourselves to ascribe any
other meaning associated with the word *information* as used in ordinary
language. We are, after all, constructing a mathematical formalism to avoid
the lack of exactness characteristic of ordinary discourse. We will see
later that a plausible argument in ordinary language may lead to
conclusions which are more than just inexact; they are false." ~H. P.
Yockey, "An Application of Information Theory to the Central Dogma and the
Sequence Hypothesis," Journal of Theoretical Biology 46(1974):369-406, p. 375
and
"One must further remember that entropy is a name of a mathematical
function, nothing more. One must not ascribe meaning to the function that
is not in the mathematics. For example, the word 'information' in this
book is never used to connote knowledge or any other dictionary meaning of
the word not specifically stated here. It will be helpful to remember that
a communication system, including the genetic information system, is
essentially a data recording and processing system. Thus one avoids much
of the anthropomorphism that words convey."~Hubert Yockey, Information
Theory and Molecular Biology, (Cambridge: Cambridge University Press,
1992), p. 68.
[Of DNA, Brad wrote:]
>I thought the error rate was pretty small, around 10^-9 if I remember
>correctly; that is easily small enough to compare to a CD.
The error rate applies only to the generational axis. There is NO error
rate in the sequential axis of a non-reproducing DNA sequence.
>
>Either DNA stores information or it does not; which one is it? If it does,
>then it can be compared to other information storage devices. I do not know
>what you mean about treating it as source and receiver; I was treating it as
>neither, rather as the channel.
We may be having a semantic dispute here, in which different
applications of information theory use the word channel differently. Yockey
separates the source from the channel and the receiver. If your texts don't
make that distinction, that would explain the difference. Let me quote Yockey:
"A discrete memoryless source is defined as one in which there are no
restrictions or intersymbol influence between letters of the alphabet. If
the response of the channel to the input letters is independent of previous
letters, the channel is called a discrete memoryless channel. A channel
that allows input symbols to be transmitted in any sequence is called an
unconstrained channel. A source that transmits messages written in natural
languages is not a memoryless source, since natural languages do have
intersymbol influence. Monod, in his philosophy of chance and necessity,
says that protein sequences are due to chance and justifies this on the
ground that one cannot predict missing amino acids from the properties of
their neighbors. That reflects only the fact that there is no intersymbol
influence in proteins as there is in natural languages....
"Thus the DNA-mRNA-protein system is discrete, memoryless and
unconstrained. It transmits the same message to the destination
repeatedly, as in a tape recorder. The particular message recorded in the
DNA is independent of the genetic information apparatus.
"In communication theory the messages that have meaning, or in
molecular biology what is called specificity, are imbedded in the ensemble
of random sequences that have the same statistical structure, that is, the
same entropy. We know the statistical structure of the ensemble but not
that of the individual sequences. For that reason, the output of any
information source, and, in particular, DNA in molecular biology, is
regarded as a random process that is completely characterized by the
probability spaces [Omega, A, p],[Omega, B, p]." ~H. P. Yockey, Information
Theory and Molecular Biology, (Cambridge: Cambridge University Press,
1992), p. 114.
As you can see, Yockey separates the channel from the DNA.
>
>ANY information storage device IS a channel. Do not make the mistake of
>thinking anything that has an output is a source. This is not correct.
>
I think there is some difference in terminology as noted above.
>>
>
>I would like to use an information theory text to correct your application
>of information theory. It was you who started using info theory (incorrectly)
>and that is what I am debating.
That is still a matter of dispute. Two people have told you that meaning
is not part of information theory and even one of your own quotes
equivocates on that difference.
>> Because you use the CD analogy in a equivocation between
>> information used as 'knowledge' or 'intelligence' rather than
>> information as defined by H = -k sum(p[i] log p[i]). My son told
>                              ^^^^^^^^^^^^^^^^^^^^^^^^
>This is the most inaccurate method of analysing anything but the most trivial
>examples of sources. DNA is certainly not accurately modeled by this. I
>showed this at great length in earlier posts and you have never refuted my
>arguments (or even acknowledged them).
I did acknowledge them and replied to them. You should go look again
before you make such charges.
>White noise does not have any information. White noise has the capacity to
>carry a great deal of information but it isn't carrying it because nobody
>put any into it. Do you get this yet?
Here you are once again equivocating on information as used in ordinary
language rather than as defined by mathematics. See the Yockey quote
above. I fully agree that white noise doesn't have information in the
ordinary, colloquial usage of the term. But we are dealing with a
mathematical, NOT a DICTIONARY, definition of information.
>
>Now mutations in electronic devices are EXACTLY what I study and you are
>totally wrong on this one. If you think that random changes add information
>to a CD then you are sadly mistaken and should go ask anyone involved in
>electronics.
As I keep repeating, information in the mathematical sense is not the English
language, intelligence, or knowledge. And I agree that white noise will not
add the dictionary definition of information to anything. But that is not
what we are talking about. If we become stymied at this point, there is not
much use in going on. Information is defined as I noted above.
"For this reason we will call H the message enropy of the ensemble of
messages. It is also called the information content since the number of
messages of length N is 2^NH. Each message can be assigned a meaning or
specificity and thus carries *information*[italicized to differentiate it
from the previous use of information--grm], knwoledge or intelligence. H
reflects the character of the probability distribution P[i]. A broad
distribution provides a maximum choice at the source or uncertainty at the
receiver. H is a maximum if all p[i]=1/n." ~H. P. Yockey, "An Application
of Information Theory to the Central Dogma and the Sequence Hypothesis,"
Journal of Theoretical Biology 46(1974):369-406, p. 373
I repeat the last sentence:
"H is a maximum if all p[i]=1/n." This means that all the characters are
equally probable: RANDOM, RANDOM, RANDOM. H is a maximum if the sequence
is RANDOM.
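A quick numerical check (my own sketch; the skewed probabilities are
invented) confirms that the uniform distribution maximizes H:

    import math

    # H = -sum(p[i] * log2(p[i])), the entropy of a probability
    # distribution; maximal when every p[i] = 1/n.
    def H(p):
        return -sum(x * math.log2(x) for x in p if x > 0)

    print(H([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: the maximum for n=4
    print(H([0.70, 0.10, 0.10, 0.10]))  # ~1.36 bits: any bias lowers H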
>
>Random errors in a CD will ALWAYS reduce the information, and the meaning
>and everything else.
I agree that they will reduce info in the dictionary sense, but not in the
mathematical sense.
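The distinction is easy to demonstrate. In this sketch of my own (the
string and the 30% error rate are invented), randomly corrupting a highly
ordered string destroys its "meaning" yet RAISES its mathematical H:

    import math, random

    def H(seq):
        n = len(seq)
        return -sum((seq.count(c) / n) * math.log2(seq.count(c) / n)
                    for c in set(seq))

    original = "AAAAATAAAA" * 100
    corrupted = "".join(random.choice("ACGT") if random.random() < 0.3 else c
                        for c in original)

    print(H(original))   # ~0.47 bits per symbol
    print(H(corrupted))  # ~1.3 bits: the random errors INCREASED H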
>> Once again you are making a fundamental error in information
>> theory. See Greg Billock's post if you won't believe me.
>
>Hmm, amazing how I can make such fundamental errors in info theory yet still
>achieve high marks while studying it at university level....
Interesting that you won't listen to Yockey, or Greg or me on the idea that
information is not what you think it is in information theory.
>It is entirely possible to know that you have the best compression possible;
>that is what info theory is all about. This has NOTHING to do with
>algorithmic complexity at all.
>
>By finding the true information content of a signal you can devise
>compression to compress it to as close to 100% efficiency as you like; it
>just takes more processing power. This is known as "Shannon's noiseless
>coding theorem" and involves taking extensions of a source to find the true
>information content and then coding using this.
>
>This was what I did to refute your first post about the addition of
>information. You have never shown any error I made in doing this so the
>result stands.
>
I have answered it. You haven't understood the answer. You wrote:
>>The general rule is that any information source that repeats a
>>sequence will have zero information. In plain English this is:
>>
>>"If we already know what is being sent then there is no information
>>being sent"
>>
>>btw this is the basic principle that all compression software relies on.
I answered you by showing that DNA is not a set of repeats. That
invalidates your analysis.
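The difference is easy to exhibit with a general-purpose compressor
standing in (crudely) for an ideal Shannon coder. A sketch of my own,
with invented sequences:

    import random, zlib

    repeated = b"AAAAATAAAA" * 400                       # pure repeats
    nonrepeating = bytes(random.choice(b"ACGT") for _ in range(4000))

    # The repeating 'message' compresses to almost nothing; the
    # non-repeating one stays near its entropy limit of 2 bits/base.
    print(len(zlib.compress(repeated)))      # tiny
    print(len(zlib.compress(nonrepeating)))  # on the order of 1000 bytes

DNA behaves like the second string, not the first: it is not a tandem
array of one repeated segment, so the zero-information rule for repeats
does not apply to it.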
>> That is why you can't tell whether I am writing real Mandarin
>> Chinese (pinyin) below or real gibberish.
>>
>> Ni xue yao xuexi hen duo!
>
>No I cannot tell, but with a bit of investigation I could.
Do it without consulting a Chinese student or a Chinese textbook. Go ahead.
And let me give you two others; tell me which is and which isn't information.
Zhe ge mao you mao.
xi gong zuo chi xiao xue.
When you have the answers I will tell you which is and isn't Mandarin,
which I do speak 'ee diar diar' (a little bit).
>Once I know which
>one it is I would be able to find the true information content. If you were
>speaking Mandarin I would be able to find the information content of that
>language; if it was gibberish I would just ignore you. You see, if you speak
>gibberish and I ignore you I gain the same information as if I listen to
>you; this is therefore the ultimate compression.
There is real Mandarin in the above. Tell me which!
>
>You are not helping your case with examples like this.
Actually I am. I am showing you that you cannot use information theory to
determine meaning. Which of the following is a terrible curse in Mandarin?
If information theory is about meaning, tell me the bad word!
lao wu gui
hen xiao chun
jiu dian zhong
>
>Hehe, well it is also a real exam question on information theory. Tell us
>Glenn, is noise the most informative source?
>
>just post a yes/no answer, it's not hard.
By the dictionary definition it is not. By the mathematical definition it
is!!! See Yockey above. H is maximized when the probability for each
character is the same.
I wrote:
>> Now, let me get this straight. You didn't know who Yockey was;
>> originally you said you hadn't read any of his articles; you
>> didn't understand that a Markov transition matrix could be
>> constructed which didn't rely on the previous character (you
>> admitted that I was correct);
>
>You were correct, but I knew it was possible. I just have never heard of
>anyone using it in that way.
So, knowing that it was possible is why you wrote on Sat, 27 Jun 1998
17:06:50 +0800:
>It is therefore impossible by definition to have a markov source where
>the next symbol is not dependent on the previous.
That is an awfully funny way to say you knew it was possible. There must
be some mutations in Australian English that I am unaware of.
I wrote:
>> you didn't know ...
>> using mathematics for a non-degenerate code;
>
>I know that the code is not important in the maths I used.
I cite Yockey again: "The third term in equation (7) is one of the aspects
of information theory in biology which differs from information theory in
electrical engineering. This is because there is no degeneracy in the
codes used in communications." ~H. P. Yockey, "An Application of
Information Theory to the Central Dogma and the Sequence Hypothesis,"
Journal of Theoretical Biology 46(1974):369-406, p. 37
Can you cite the third term of equation (7) from electrical engineering texts?
I would suggest that if we are not going to make any progress on the issue
of meaning vs. information, then we are at a standstill. I have cited Yockey;
you have cited your opinion that information means 'meaning'. If all we
are going to do is have you say (after I have cited a textbook)
"information theory deals in meaning" while I repeat the citation, then we
are at an impasse. I have enjoyed the debate, but I see no way to make
progress beyond this point if you won't believe a textbook.
But if you can tell me which of the Chinese statements are meaningful, I
will tell you their meaning. I don't expect to hear from you on this.
glenn
Adam, Apes and Anthropology
Foundation, Fall and Flood
& lots of creation/evolution information
http://www.isource.net/~grmorton/dmd.htm