RE: Emergence of information out of nothing?

From: Glenn Morton (glenn.morton@btinternet.com)
Date: Wed May 01 2002 - 23:30:36 EDT

    Peter wrote:

    >The amount of meaningful or semantic information contained in a system
    >may be defined as the minimal length of an algorithm capable of
    >specifying it (M.V. Volkenstein, "Punctualism, non-adaptionism,
    >neutralism and evolution", BioSystems 20 (1987), 289). This would
    >exclude all features irrelevant for meaning or functionality. The
    >meaningful information contained in today's biosphere may be
    >approximated by a (purely theoretical) minimal set of genome parts
    >"streamlined" to include the code for whatever is really required for
    >the organisms represented in the biosphere, but nothing else. Its amount
    >is such that the improbability of its generation by random-variation /
    >natural-selection processes, starting with a prebiotic universe, is
    >vastly transastronomical.

    Those of us who use information theory in our jobs know that what you write
    is not based upon information theory at all. "Meaning" or "semantics" has no
    place in information theory, and that is the fundamental misunderstanding of
    many people who try to apply this area to creation/evolution. Consider this
    passage from the opening of the seminal paper by Claude Shannon (who died in
    2001), the paper that began the field of information theory:

    "The fundamental problem of communication is that of reproducing
    at one point either exactly or approximately a message selected
    at another point. Frequently the messages have _meaning-, that
    is they refer to or are correlated according to some system with
    certain physical or conceptual entities. These semantic aspects
    of communication are irrelevant to the engineering problem." C.
    E. Shannon, " A Mathematical theory of Communication" The Bell
    System Technical Journal, 27(1948):3:379-423, p. 379

    Why is this so? If I want to transmit to you the output of my random number
    generator, that output has NO meaning, but it does carry information.
    Information is measured by the quantity H, a purely mathematical function
    that anyone can compute. Say you are listening to a star on a particular
    frequency and you get the bits

    11010010001011110111101001010111.

    You don't know if it is from an alien civilization saying "I like your
    shoes" or if it is just gibberish. You can measure the information content
    of the message but the meaning is beyond you.

    H = -K sum( p(i) log(p(i)) )

    There are 19 1's and 13 0's. That means the probability of getting a 1 is
    p(1) = 19/32 = .594, and the probability of getting a 0 is p(0) = 13/32 = .406.
    Letting K = 1 and using base-10 logarithms, we find that

    H = -.594 log(.594) - .406 log(.406) = .594*.226 + .406*.391 = .134 + .159 = .293

    It has information which is measured by H. It has no meaning that either of
    us knows about.
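
    For anyone who wants to check the arithmetic, here is a small Python sketch
    (my own illustration, not anything from Shannon's paper) that computes H for
    that bit string:

    import math

    def shannon_H(seq, base=10):
        # H = -K sum( p(i) log(p(i)) ), with K = 1, over the symbols in seq
        n = len(seq)
        return -sum((seq.count(s) / n) * math.log(seq.count(s) / n, base)
                    for s in set(seq))

    bits = "11010010001011110111101001010111"
    print(round(shannon_H(bits), 3))   # 0.293, matching the hand calculation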

    As for the claim, made by many, that mutation can't increase information:
    mutate one of the 1's to a 0 and you increase the information of the sequence.

    The sequence then has 18 1's and 14 0's, with H = .298.
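
    Here is the same sort of sketch, again purely illustrative, run before and
    after the "mutation":

    import math

    bits    = "11010010001011110111101001010111"
    mutated = bits.replace("1", "0", 1)   # flip a single 1 to a 0

    for s in (bits, mutated):
        p1, p0 = s.count("1") / len(s), s.count("0") / len(s)
        h = -p1 * math.log10(p1) - p0 * math.log10(p0)
        print(s.count("1"), "ones:", round(h, 3))   # 19 ones: 0.293, then 18 ones: 0.298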

    The information content has increased via mutation. Indeed, it can be shown
    that the sequence with the most information is the least ordered and most
    random-looking. The sequence

    000000000000000000 has no information: H = 0. Why? Because the probability
    of getting a 0 in that sequence is exactly 1, and the log of 1 is zero
    [log(1)=0].
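
    In the same illustrative style, the all-zeros case:

    import math

    zeros = "000000000000000000"
    p0 = zeros.count("0") / len(zeros)   # the probability of a 0 is exactly 1.0
    print(-p0 * math.log10(p0))          # -1 * log(1) = -0.0, i.e. zero information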

    So information can come out of change. All you have to do to increase the
    informational content of a universe that starts from zero H is to allow
    change.

    Remember what Warren Weaver, Shannon's co-author on _The Mathematical
    Theory of Communication_, said:
    "The word information, in this theory, is used in a special sense
    that must not be confused with its ordinary usage. In particular,
    _information_ must not be confused with meaning.
            "In fact, two messages, one of which is heavily loaded with
    meaning and the other of which is pure nonsense, can be exactly
    equivalent, from the present viewpoint, as regards information.
    It is this, undoubtedly, that Shannon means when he says that
    'the semantic aspects of communication are irrelevant to the
    engineering aspects.' But this does not mean that the engineering
    aspects are necessarily irrelevant to the semantic aspects.
            "To be sure, this word information in communication theory
    relates not so much to what you do say, as to what you could say.
    That is, information is a measure of one's freedom of choice when
    one selects a message. If one is confronted with a very
    elementary situation where he has to choose one of two
    alternative messages, then it is arbitrarily said that the
    information associated with this situation, is unity. Note that
    it is misleading (although often convenient) to say that one or
    the other message conveys unit information. The concept of
    information applies not to the individual messages (as the
    concept of meaning would), but rather to the situation as a
    whole, the unit information indicating that in this situation one
    has an amount of freedom of choice, in selecting a message, which
    it is convenient to regard as a standard or unit amount." Warren
    Weaver, "Some Recent Contributions to the Mathematical Theory of
    Communication," in Claude E. Shannon and Warren Weaver, _The
    Mathematical Theory of Communication_ (Urbana: University of
    Illinois Press, 1949), pp. 8-9

    glenn

    see http://www.glenn.morton.btinternet.co.uk/dmd.htm
    for lots of creation/evolution information
    anthropology/geology/paleontology/theology
    personal stories of struggle


