>> No one has shown that "unguided evolutionary processes" are
>> "probable". Indeed, the very opposite. The famous 1966 symposium at
>> the Wistar Institute of Anatomy and Biology in the University of
>> Pennsylvania entitled 'Mathematical Challenges to the Neo-Darwinian
>> Interpretation of Evolution', examined this question. Murray Eden,
>> Professor of Engineering at the Massachusetts Institute of Technology,
>> concluded that "an adequate scientific theory of evolution must await
>> the discovery and elucidation of new natural laws - physical,
>> physico-chemical and biological'."
I read the paper Eden presented at that symposium. His argument was
essentially the probabilistic argument against getting a functional protein
molecule in a primordial soup. I don't remember anything about cumulative
selection or about evolution of existing life. Eden served the useful
purpose of making some evolutionary biologists and abiogenesis theorists
think about the probabilistic arguments and how to deal with them, but he
did not destroy abiogenesis as some folks have suggested. BTW in the paper
he strikes an attitude of basic acceptance of evolution. His stance is "I
accept the theory, but here are some issues which are puzzling and which
need to be resolved," rather than "here are some issues that in my view
destroy the theory of evolution."
>> Marcel P. Schutzenberger, a
>> computer scientist from the University of Paris, agreed that
>> "spontaneous improvement and enlargement of the code through mutations
>> and natural selection was 'not conceivable'" and that "if we try to
>> simulate such a situation by making changes randomly at the
>> typographic level (by letter or by blocks, the size of the unit does
>> not really matter) on computer programmes, we find that we have no
>> chance (that is, less than one chance in 10^10,000) even to see what
>> the modified programme would compute: it just jams." (Hitching F.,
>> "The Neck of the Giraffe", Ticknor & Fields: New York, 1982, pp82-83)
This is a spurious, perhaps even disingenuous, argument, for two reasons:
1. He talks about the chances of improving computer programs by mutating
them. I presume he means traditional computer programs (see below for an
alternative). In a conventional computer program, at either the source code
level or the machine code level, _any_ random change is far more likely to
cause a malfunction than an improvement, or even a still-functional program.
Computers and their conventional languages (machine language, macro
assembler, Pascal (let's not discuss C, ok? :-)) are highly structured, and
do not tolerate experimentation that violates the rules. So
Schutzenberger's probabilities are not news. They are exactly what you
would expect. No one in his right mind would try such an approach (making
random modifications in conventional programs) to developing computer
programs.
2. More recently (since sometime in the 1970s), genetic programming
researchers have shown that there is another type of computer program that
_can_ be improved by mutation (as well as crossover) and natural selection.
Although I haven't read the literature yet, genetic programming is an
approach to programming in which you characterize a program as a genome and
then use a genetic algorithm to optimize the program to perform some task.
The difference between this and making random modifications on say a Pascal
program is that the programmer provides a mechanism for translating a
genome into a program that, while not necessarily correct, is runnable.
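To make the "genome into a runnable program" idea concrete, here is a
deliberately tiny sketch (the operation set and decoding scheme are my own
invention for illustration, not Koza's actual representation): the genome is
just a list of integers, and decoding always yields a program that runs,
whatever the integers happen to be.

```python
import random

# A closed set of primitive operations on a single value.  Because decoding
# only ever selects from this set, every genome maps to a runnable program,
# even if that program computes nonsense -- which is the point.
OPS = [
    ("add1",   lambda v: v + 1),
    ("double", lambda v: v * 2),
    ("square", lambda v: v * v),
    ("negate", lambda v: -v),
]

def decode(genome):
    """Translate a genome into a runnable program: apply the chosen
    primitive operations in order.  The modulus keeps any integer legal."""
    def program(x):
        for gene in genome:
            name, op = OPS[gene % len(OPS)]
            x = op(x)
        return x
    return program

rng = random.Random(1)
genome = [rng.randrange(len(OPS)) for _ in range(5)]
program = decode(genome)
print(genome, "->", program(3))   # runs no matter what the genome says
```

Contrast this with mutating Pascal source text: here there is no "syntax" to
break, so random variation explores the space of behaviors rather than the
space of compiler errors.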
One book which describes genetic programming is
Koza, John R. (1992), "Genetic Programming: On the Programming of Computers
by Means of Natural Selection," Cambridge, MA, MIT Press. 840 pages, 270
illustrations. ISBN 0-262-11170-5. (Koza is in the Computer Science
Department at Stanford University.)
The abstract on John Koza's home page
(http://www-cs-faculty.stanford.edu/~koza/jaws1.html) is:
The recently developed genetic programming paradigm provides a way to
genetically breed a computer program to solve a wide variety of problems.
Genetic programming starts with a population of randomly created computer
programs and iteratively applies the Darwinian reproduction operation and
the genetic crossover (sexual recombination) operation in order to breed
better individual programs. The book describes and illustrates genetic
programming with 81 examples from various fields.
Also relevant is John Holland's work on classifier systems, which is covered
in
Holland, J. H. (1975), "Adaptation in Natural and Artificial Systems," Ann
Arbor, MI, University of Michigan Press, and
Goldberg, David E. (1989), "Genetic Algorithms in Search, Optimization, and
Machine Learning," Reading, MA, Addison-Wesley.
Genetic programming is the paradigm that should be used if one is going to
compute the probability that programs can be developed by natural
selection.
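Under a representation like that, mutation plus selection can do exactly
what Schutzenberger said was inconceivable at the typographic level:
improve the program. A minimal sketch (again my own toy, not Koza's system;
the target task and parameters are arbitrary) evolves a genome toward a
target behavior by keeping the fitter half of the population and refilling
it with mutated copies:

```python
import random

# Same toy representation as before: a genome is a list of integers
# selecting primitive operations, so every genome decodes to a runnable
# program.
OPS = [lambda v: v + 1, lambda v: v * 2, lambda v: v * v, lambda v: -v]

def decode(genome):
    def program(x):
        for gene in genome:
            x = OPS[gene % len(OPS)](x)
        return x
    return program

def fitness(genome, cases):
    """Negative total error on the test cases; 0 means a perfect program."""
    prog = decode(genome)
    return -sum(abs(prog(x) - y) for x, y in cases)

def mutate(genome, rng):
    """Change one randomly chosen gene to a random legal value."""
    child = list(genome)
    child[rng.randrange(len(child))] = rng.randrange(len(OPS))
    return child

# Target behavior: f(x) = 2 * (x + 1), reachable as [add1, double].
cases = [(x, 2 * (x + 1)) for x in range(5)]
rng = random.Random(42)
pop = [[rng.randrange(len(OPS)) for _ in range(2)] for _ in range(20)]

for _ in range(100):
    pop.sort(key=lambda g: fitness(g, cases), reverse=True)
    if fitness(pop[0], cases) == 0:
        break                      # a perfect program has evolved
    survivors = pop[:10]           # selection: keep the fitter half
    pop = survivors + [mutate(rng.choice(survivors), rng) for _ in range(10)]

best = max(pop, key=lambda g: fitness(g, cases))
print("best genome:", best, "fitness:", fitness(best, cases))
```

Nothing here "jams": every intermediate genome runs, so selection has a
smooth gradient of partially-correct programs to climb, which is the
feature Schutzenberger's typographic-mutation scenario lacks.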
>
>Ahem.
>
>I seem to recall, several months ago, quite a few voices on this group
>claiming that computer simulations of macroevolution (e.g. increasing
>complexity via stochastic processes; infrequent, rapid morphological
>change from "point mutations") were essentially irrelevant as to whether
>or not biological macroevolution was true.
Well, of course _I_ never sang in that chorus...
>
>Sauce for the goose.... :-)
>
Bill Hamilton | Vehicle Systems Research
GM R&D Center | Warren, MI 48090-9055
810 986 1474 (voice) | 810 986 3003 (FAX)
hamilton@gmr.com (office) | whamilto@mich.com (home)