Limits on the production of complexity

From: Chris Cogan (ccogan@telepath.com)
Date: Fri Sep 29 2000 - 14:27:00 EDT


    Since small random changes (including small additions) to *replications*
    of strings of information can produce, ultimately, *any* degree and type of
    complexity of information that can be represented at all, we have to ask: What
    might *prevent* some particular kind of complexity from occurring?

    My answers are:
    1. Insufficient number of replications with variations. If we start with a
    one-bit string and can only add one bit per generation, then a
    two-billion-bit string cannot be produced in only one billion generations.

    2. Exclusion of some variations from further replication. If *all* possible
    approaches to a particular complex structure-representation are excluded
    from having "offspring," then there is a kind of "solid wall" around that
    particular structure that prevents it from occurring.

    3. Insufficient exhaustiveness of the changes. If the changes that can
    occur are sufficiently *non-*random in certain ways, certain results can be
    prevented from occurring. For example, if all changes produce *only* "on"
    bits, then all strings will consist solely of "on" bits, and significant
    complexity will not arise.
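
    The three limits above can be made concrete with a small toy model. The
    following sketch is my own illustration (the `replicate` helper and its
    `only_on` flag are inventions for this example, assuming a simple
    bit-string model of replication-with-variation):

```python
import random

def replicate(bits, only_on=False, rng=random):
    """One replication with one small random variation: flip one
    existing bit or append one new bit.  With only_on=True every
    change writes a '1', modeling limit 3 (changes that are
    non-random in a way that excludes some results)."""
    bits = list(bits)
    change = "1" if only_on else rng.choice("01")
    if bits and rng.random() < 0.5:
        bits[rng.randrange(len(bits))] = change  # small change
    else:
        bits.append(change)                      # small addition
    return "".join(bits)

rng = random.Random(1)  # fixed seed so the run is repeatable

# Limit 1: at most one added bit per generation bounds string length,
# so a two-billion-bit string needs at least two billion generations.
s = "0"
for _ in range(10):
    s = replicate(s, rng=rng)
assert len(s) <= 11

# Limit 3: if changes write only "on" bits, a zero can never reappear;
# at most the original seed bit stays "0", so variety collapses.
t = "0"
for _ in range(1000):
    t = replicate(t, only_on=True, rng=rng)
assert t.count("0") <= 1
```

    The unbiased case, given enough generations and no exclusions
    (limit 2), can in principle reach any bit string; the two assertions
    show how limits 1 and 3 each close off parts of that space.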

    What Dembski et al. must prove, in order to establish design (or at least
    that naturalistic Darwinian theory is essentially wrong), is that there haven't
    been enough generations to allow for the kinds of complexity they talk
    about, or that that complexity is systematically *prevented* from occurring
    by natural processes, or that there is some major failure of modified
    replications to be sufficiently random to produce the specified form of
    complexity.

    Paradoxically, assuming that limits 1 and 3 do not apply, the *failure* to produce
    any particular form of complexity would be a better indication of design
    than the production of it, because it would indicate that something is
    *systematically* excluding it. Evolution does select against complexity in
    some cases, because the cost of the complexity outweighs any advantages it
    might provide. But, for it *always* to do so, under *all* possible
    environmental conditions, would be a remarkable fact indeed, one that would
    require unusual explanation. That explanation might be a "designer."

    Since quite an array of complex structures does exist in biology, I
    conclude that *neither* nature nor a designer is excluding complexity, and
    that, therefore, ID theory cannot effectively use the *lack* of any
    particular type of complexity as evidence for design.

    To be sure, certain complex structures *are* systematically excluded, but
    only because they "cost" the organism too much in survival terms. That is, the
    fact of natural selection against some complex structures cannot be used as
    evidence for design, because biological harmfulness is sufficient
    explanation for the exclusion of those structures when they begin to occur.



    This archive was generated by hypermail 2b29 : Fri Sep 29 2000 - 14:31:22 EDT