FW: Emergence of information out of nothing?

From: Glenn Morton (glenn.morton@btinternet.com)
Date: Sun May 12 2002 - 23:04:03 EDT


    This is Lucien's latest reply.

    >-----Original Message-----
    >From: Lucien Carroll [mailto:ucarrl01@umail.ucsb.edu]
    >Sent: Sunday, May 12, 2002 12:16 AM
    >To: Glenn Morton
    >Subject: Re: Emergence of information out of nothing?
    >
    >
    >I forgot to change the reply field last time, so neither my posting nor
    >yours made it to the list. I don't have a copy of what I wrote. Feel
    >free to forward the conversation to the list, but since this won't make
    >sense to the list without the previous posts, I'm just sending it
    >to you directly.
    >
    >Glenn Morton wrote:
    >> [...]
    >> > The third thing, conventions, is likewise an
    >> >objective feature of the environment.
    >>
    >> I think I only partly agree with this. Conventions are objective, but
    >> they can't be recognized a priori and thus are also part of the
    >> subjective. Obviously for us, the language you and I are communicating
    >> in has an objective standard (dictionary) for what things mean. But the
    >> dictionary, as my professor who taught me Wittgenstein said, is a great
    >> tautology. It is a game in which all words are defined by reference to
    >> other words. Because of this, the structure of a language simply can't
    >> be objectivized in the way in which Shannon entropy can be. Shannon
    >> entropy is defined such that a given sequence will have a given
    >> informational content.
    >
    >Doesn't Shannon entropy also depend on the system that a state is drawn
    >from? The microstate 153832 has one information content if it is a
    >decimal string and another if it is a hexadecimal string, because of the
    >partition function. If you take a protein, can you determine the
    >information content without knowing that the amino acids represented are
    >the only ones possible?
    >
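
    A minimal sketch of the point being raised here, in Python. The helper
    name info_content_bits and the peptide string are just illustrative, and
    the calculation assumes every symbol is equally likely and independent:

        import math

        def info_content_bits(sequence, alphabet_size):
            # Bits needed to specify the sequence, assuming each symbol is
            # drawn uniformly and independently from the given alphabet.
            return len(sequence) * math.log2(alphabet_size)

        s = "153832"
        print(info_content_bits(s, 10))           # ~19.9 bits read as decimal
        print(info_content_bits(s, 16))           # 24.0 bits read as hexadecimal
        print(info_content_bits("MKTAYIAK", 20))  # ~34.6 bits for a peptide, assuming
                                                  # only the 20 standard amino acids

    The same string carries a different Shannon measure depending on which
    alphabet you assume it was drawn from, which is the protein question above.
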
    >> But when we speak of semantic information, we have moved into several
    >> different games, as Wittgenstein called it, in which tautological
    >> definitions rule and there is no objective standard. Chinese zi dan's
    >> [I think that is the word for dictionary] define words in reference to
    >> Chinese zi's [characters].
    >
    >usually cidian in pinyin
    >
    >>[...]
    >> I think you miss the problem. Don't limit yourself to known language
    >> speakers on earth. Let's say we receive a string of 1's and 0's from
    >> Alpha Centauri on 1630 MHz. People record it, and if you bunch it into
    >> bytes of 5 bits it fits Zipf's law. For those who don't know, Zipf's
    >> law is a mathematical relation which fits languages: the frequency of a
    >> word is roughly inversely proportional to its rank. Anyway, if we find
    >> this, does it mean that we have found those proverbial little green
    >> men? I don't think so. In fact I am not sure we would understand that
    >> we have a language. How do we know that the sequence is to be 5-bit
    >> bytes? Zipf's law fits many other things, like the population of
    >> cities, and no one 'designed' that relationship.
    >>
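
    A rough sketch in Python of what such a Zipf check looks like; the file
    name "corpus.txt" is only a placeholder for any large natural-language
    text:

        from collections import Counter

        def zipf_check(text, top_n=10):
            # Rank words by frequency; under Zipf's law, freq * rank is
            # roughly constant across the top ranks.
            counts = Counter(text.lower().split())
            for rank, (word, freq) in enumerate(counts.most_common(top_n), 1):
                print(f"{rank:2d}  {word:15s}  freq={freq:6d}  freq*rank={freq * rank}")

        # e.g. zipf_check(open("corpus.txt").read())

    The catch, as noted above, is that passing such a check shows only a
    Zipf-like frequency distribution, which city populations and other
    undesigned rankings also exhibit.
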
    >Like the cyphered string, we would likely not be able to recognize the
    >meaning of this string (assuming it did have meaning). A string that we
    >can find no meaning in may or may not have meaning, but one in which we
    >can find meaning does have it. We can certainly find good reason to say
    >a particular string might or might not have meaning, such as the Zipf's
    >law test perhaps, but we are left in doubt.
    >
    >> [...]
    >> Wo you ee ge wen tea. Ni hui shou Zhong guo hua, ma? [I have a
    >> question. Can you speak Chinese?]
    >>
    >Wo de Zhongwen hai hen cha, danshi wo hui shuo. :) [My Chinese is still
    >quite poor, but I can speak it.]
    >
    >>[...]
    >> >Perhaps Shannon information would be easier to explain to people if
    >> >semantic information were better understood.
    >>
    >> Probably, but as Shannon said, his form of 'information' has nothing
    >> whatsoever to do with semantic information. The problem is the logical
    >> equivocation upon the word 'information', and that is where the
    >> confusion comes from.
    >
    >I still don't think they are totally unrelated. I'm getting out of my
    >depth here (or maybe I have been all along), but I think this is a valid
    >example: there is a limit to how much you can compress a megapixel,
    >16-million-color image. There are all kinds of ways of making it fill
    >more space, but the image (and its compression algorithm, say, to avoid
    >cheating) must occupy more than some minimum amount of space. I suspect
    >that if that minimum could be achieved, the Shannon information content
    >of that space would be at its maximum.
    >
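
    One way to see the connection being suspected here, sketched in Python
    with the standard-library zlib. It is only an illustration under the
    assumption that good compression removes redundancy, not a proof of any
    compression minimum: a well-compressed stream looks nearly random, so its
    empirical Shannon entropy per byte climbs toward the 8-bit ceiling.

        import math, zlib
        from collections import Counter

        def entropy_per_byte(data):
            # Empirical Shannon entropy of the byte distribution, in bits per
            # byte; 8.0 is the maximum possible for byte-valued data.
            n = len(data)
            counts = Counter(data)
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        redundant = b"the quick brown fox jumps over the lazy dog " * 2000
        packed = zlib.compress(redundant, 9)

        print(len(redundant), round(entropy_per_byte(redundant), 2))  # large, low entropy per byte
        print(len(packed), round(entropy_per_byte(packed), 2))        # far smaller, entropy per byte
                                                                      # much closer to the ceiling
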
    >Semantic shift in language (the word "information" itself, for example)
    >makes it clear to me that the semantic kind of information does arise
    >naturally in this kind of system. (Unlike the case of "information", the
    >vast majority of language change is not conscious or by design.) Denying
    >that it exists, or denying that it is a valid field of scientific study,
    >won't make it go away.
    >
    >--
    >Lucien S Carroll ucarrl01@umail.ucsb.edu
    >"All mankind is stupid, devoid of knowledge."
    >-Jeremiah 51:17a


