On Sat, 27 Jun 1998 17:04:22 -0500, Glenn R. Morton wrote:
[...]
GM>BTW, Stephen, if we want to play your son against mine, my
>MS electrical engineer son has no problem with what I am saying.
[...]
My son Brad is an adult and does his own thing. Since he was
studying Information Theory, I asked him for advice about your
Information Theory posts, and he decided he wanted to respond to
you directly.
If your son wants to do the same, he can. But I would point out that
Brad is not just a (potential) "electrical engineer." He is doing a
degree in *Information* Technology Engineering. That involves the
design of electronic *information* systems, as distinct from the
electrical equipment in which most electrical engineers are trained.
Here is a quote from Gitt (Director and Professor at the German
Federal Institute of Physics and Technology), which supports my
claim that Information Theory uses a special, restricted meaning of
"information", one that cannot be invoked without qualification when
claiming that random mutations can add information:
"A1 The Statistical View of Information
A1.1 Shannon's Theory of Information
Claude E. Shannon (born 1916), in his well-known book "A
mathematical theory of communications" [S7, 1948], was the first
person to formulate a mathematical definition of information. His
measure of information, the "bit" (binary digit), had the advantage
that quantitative properties of strings of symbols could be formulated.
The disadvantage is just as plain: SHANNON'S DEFINITION OF
INFORMATION ENTAILS ONLY ONE MINOR ASPECT OF
THE NATURE OF INFORMATION as we will discuss at length.
The only value of this special aspect is for purposes of transmission
and storage. The questions of MEANING, comprehensibility,
correctness, worth or worthlessness ARE NOT CONSIDERED AT
ALL. The important questions about the origin (sender) and for
whom it is intended (recipient) are also ignored. FOR SHANNON'S
CONCEPT OF INFORMATION IT IS COMPLETELY
IMMATERIAL WHETHER A SEQUENCE OF SYMBOLS
REPRESENTS AN EXTREMELY IMPORTANT AND
MEANINGFUL TEXT, or whether it was produced by a random
process. It may sound paradoxical, but in this theory a random
sequence of symbols represents the maximum value of information
content - the corresponding value or number for a meaningful text of
the same length is smaller.
Shannon's concept: His definition of information is based on a
communications problem, namely to determine the optimal
transmission speed. For technical purposes THE MEANING AND
IMPORT OF A MESSAGE ARE OF NO CONCERN, so that these
aspects were not considered. Shannon restricted himself to
information that expressed something new, so that, briefly,
information content = measure of newness, WHERE "NEWNESS"
DOES NOT REFER TO A NEW IDEA, A NEW THOUGHT, OR
FRESH NEWS - WHICH WOULD HAVE ENCOMPASSED AN
ASPECT OF MEANING. But it only concerns the surprise effect
produced by a rarely occurring symbol. Shannon regards a message
as information only if it cannot be completely ascertained beforehand,
so that information is a measure of the unlikeliness of an event. An
extremely unlikely message is thus accorded a high information
content. The news that a certain person out of two million
participants has drawn the winning ticket is for him more "meaningful"
than if every tenth person stood a chance, because the first event is
much more improbable.
(Gitt W., "In the Beginning was Information," [1994], CLV, English
Edition, 1997, p. 170. Emphasis mine).
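For what it is worth, here is a small Python sketch of my own (my
illustration, not Gitt's) of the arithmetic behind his lottery example.
Shannon assigns an event of probability p a "surprisal" of -log2(p)
bits, so the one-in-two-million win carries far more "information" than
a one-in-ten chance, and a random string of letters scores more bits
per symbol than repetitive text, even though the measure says nothing
about meaning:

    import math
    import random
    import string
    from collections import Counter

    def surprisal_bits(p):
        # Shannon's measure: an event of probability p carries -log2(p) bits.
        return -math.log2(p)

    # Gitt's lottery illustration: one winner among two million entrants
    # versus a one-in-ten chance.
    print(surprisal_bits(1 / 2_000_000))  # about 20.9 bits
    print(surprisal_bits(1 / 10))         # about 3.3 bits

    def entropy_per_symbol(s):
        # Average bits per symbol, estimated from the string's own symbol
        # frequencies (a crude stand-in for the source statistics).
        counts = Counter(s)
        n = len(s)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    random.seed(0)
    random_text = "".join(random.choice(string.ascii_lowercase) for _ in range(1000))
    repetitive_text = "the quick brown fox jumps over the lazy dog " * 23

    print(entropy_per_symbol(random_text))      # close to log2(26), about 4.7
    print(entropy_per_symbol(repetitive_text))  # about 4.3; lower, and the gap
                                                # widens with better models of English

The random string comes out "richer" in Shannon's bits, which is
exactly the paradox Gitt points to.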
Also here are two quotes by Berlinski. The first distinguishes between
the information *of* the message (i.e. its capacity) and the
information *in* the message (i.e. its *content* or *meaning*):
"In the simplest of set-ups (and no more is required), a single
message might consist of a single letter, a sequence of messages of a
sequence of letters, separated, perhaps, by marks of punctuation.
Whatever the source, each message that can occur occurs with a
fixed and definite probability. WHAT IS BEING MEASURED IS
THUS THE INFORMATION *OF* THE MESSAGE AND NOT
THE INFORMATION *IN* THE MESSAGE." (Berlinski D.,
"Information," in "Black Mischief," 1988, p66. Asterisks emphases
italicised in original. Capitalised emphasis mine).
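To make that distinction concrete, here is a small sketch of my own
(the uniform toy alphabet is my assumption, not Berlinski's). Under
Shannon's measure, a meaningful message and a same-length piece of
gibberish drawn from the same symbol probabilities are assigned exactly
the same number of bits, because the measure captures the information
*of* the message, never the information *in* it:

    import math

    def message_information_bits(message, symbol_probs):
        # Shannon's information *of* a message: the sum of -log2(p) over
        # its symbols, fixed entirely by the source's symbol probabilities.
        return sum(-math.log2(symbol_probs[ch]) for ch in message)

    # Toy source: the 26 lowercase letters and the space, all equally likely.
    probs = {ch: 1 / 27 for ch in "abcdefghijklmnopqrstuvwxyz "}

    meaningful = "send help now"
    gibberish  = "qzx vkf jwp b"   # same length, same alphabet

    print(message_information_bits(meaningful, probs))  # about 61.8 bits
    print(message_information_bits(gibberish, probs))   # exactly the same

Both come out at 13 x log2(27), roughly 61.8 bits, whether or not the
message means anything.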
In the second quote, Berlinski distinguishes between *noise* and
*information*:
"A communication system Shannon defined as a conceptual object of
four parts. First, there are the messages themselves: discrete bits of
data in the form of letters, words, sentences, or numbers. Then there
is the source putting out messages, the receiver, who gets them, and
the channel over which the messages are sent. Now, in the very
process of communication, something inevitably gets lost. There are
electrical disturbances, stray noises, mysterious glitches. (Indeed, just
recently, two physicists, fine-tuning their astronomical radios,
concluded that the inevitable, ineliminable hum that they heard
represented nothing less than the cosmic cackle of the Big Bang.)
Shannon was interested in THE STUFF THAT GOT THROUGH
DESPITE THE NOISE. AND THIS HE DEFINED AS
INFORMATION." (Berlinski,1988, pp65-66. Emphasis mine).
Steve
"Evolution is the greatest engine of atheism ever invented."
--- Dr. William Provine, Professor of History and Biology, Cornell University.
http://fp.bio.utk.edu/darwin/1998/slides_view/Slide_7.html
--------------------------------------------------------------------
Stephen E (Steve) Jones ,--_|\ sejones@ibm.net
3 Hawker Avenue / Oz \ Steve.Jones@health.wa.gov.au
Warwick 6024 ->*_,--\_/ Phone +61 8 9448 7439
Perth, West Australia v "Test everything." (1Thess 5:21)
--------------------------------------------------------------------