From: David_Bowman@georgetowncollege.edu
>Regarding Richard Wein's question:
>
>>Dave, thanks for all the info about 2LT. To be honest the details are
>>beyond me. As a non-physicist, it would be nice to have a simple way of
>>rebutting the creationist/ID argument, if there is one.
>
>Which "creationist/ID argument" do you mean? The general argument that
>ID is detectable in nature, or the specific argument that the 2nd law
>opposes natural self-organization of material systems?
The latter.
> Or something else?
>If you mean the argument that the 2nd law prevents spontaneous-self
>organization, I suppose the simplest rebuttal is to point out that the
>2nd law doesn't address the issue of the macroscopic organization of
>matter, or its lack, one way or the other, and that some processes in
>nature self-organize and some self-disorganize, and some have complicated
>mixtures of both, that all these cases are fully consistent with the
>operation of the 2nd law, and that for none of these cases is the 2nd
>law "overcome", "circumvented", made inapplicable, etc.
Thanks. I'll remember that.
>>It seems to me sufficient to point out that the 2LT only applies to closed
>>systems and that living organisms are not closed systems. Is this a fully
>>adequate argument?
>
>Not really.
OK. I'll have to be careful to avoid arguing in such oversimplistic terms in
future.
> The 2nd law applies to all systems of a macroscopic number
>of microscopic degrees of freedom, whether or not the system is closed.
>It's just that when the system is closed the formulation of the law is
>simpler and cleaner. When the system is closed the 2nd law states that
>the thermodynamic entropy of the system is nondecreasing in time.
I've read that this is the form in which the second law was originally
stated, in the 19th century. So when a creationist argues that the second
law tells us that entropy is nondecreasing over time, could I respond as
follows?
This is the "classical" second law, as formulated in the 19th century, and
applies only to closed systems. In the 20th century the law was extended to
include open systems, but the extended form of the law does not require that
entropy be nondecreasing in open systems.
>When
>the system is coupled to its environment so it is not closed off from its
>surroundings (w.r.t. energy-exchanging interactions among the degrees of
>freedom of the system and those of the environment) then the system's
>entropy alone is insufficient as a relevant parameter for the 2nd law.
>For such an open system the system's entropy may go up or down, or
>oscillate in a host of complicated ways with time. *But* whatever it
>does, the *sum* of the system's entropy *and* the contribution to the
>entropy of the system's surroundings pertaining to the environmental
>degrees of freedom that happen to interact with the system and with each
>other *does* increase (or at least doesn't decrease) with time. The
>minimal part of the environment's entropy that needs to be included in
>this sum is the entropy that is generated in the environment by virtue of
>the interaction of the degrees of freedom of the environment with the
>system and with each other. Because we can always surround any system
>with a bigger environment we are safe to say that the total entropy of
>the universe always increases with time. This last formulation
>effectively takes the whole universe as the system so there are no
>surroundings to worry about.
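Let me see if I've followed that. Here's a toy calculation of my own (the
numbers are made up purely for illustration) showing how an open system's
entropy can *decrease* while the total entropy of system plus environment
still goes up:

```python
# My own sketch, not Dave's: an open system dumps heat Q into a cooler
# reservoir. The system's entropy drops, but the environment's entropy
# rises by more, so the *sum* obeys the 2nd law.

Q = 1000.0      # joules of heat leaving the system (illustrative)
T_sys = 350.0   # system temperature in kelvin (illustrative)
T_env = 300.0   # cooler environment temperature in kelvin

dS_system = -Q / T_sys        # system entropy falls: about -2.86 J/K
dS_environment = +Q / T_env   # environment entropy rises: about +3.33 J/K

dS_total = dS_system + dS_environment
print(dS_system < 0)   # True: the open system's entropy decreased
print(dS_total > 0)    # True: the total still increased
```

If I've got this right, the creationist mistake is watching only
`dS_system` and ignoring `dS_environment`.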
>
>Since it is a royal pain to have to deal with the goings on of both the
>system at hand and its environment (which could conceivably be the whole
>rest of the universe) it is very desirable to have a formulation of the
>2nd law that only makes reference to what is going on in the system and
>not in the environment. Such a formulation is possible in some
>circumstances such as when the environment plays a relatively innocuous
>background role that merely enforces some macroscopic constraints on the
>system.
>
>For instance, suppose the system's immediate environment is maintained
>at a constant temperature and acts as a heat reservoir for the open
>system (where heat is allowed to be freely exchanged between the system
>and its surroundings). Then in this case the 2nd law is modified to
>*not* refer to the system's entropy, but to its so-called availability or
>free energy. In this case the system's free energy *decreases* (or at
>least doesn't increase) with time. (As the system's free energy is
>monotonically decreasing, its entropy may be increasing, decreasing, or
>oscillating). The system's equilibrium state occurs when the system is
>equilibrated with itself and its environment and has the environment's
>temperature throughout. When the system is in equilibrium then its free
>energy is *minimized*.
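So, if I understand, at constant reservoir temperature the bookkeeping
"total entropy is nondecreasing" can be repackaged as "the system's free
energy is nonincreasing". A little arithmetic check of my own (made-up
numbers again):

```python
# My sketch: with a reservoir at fixed temperature T, a change in the
# system's free energy F = U - T*S equals -T times the total entropy
# change, so F decreasing is the same statement as total entropy rising.

T = 300.0                # reservoir temperature (K), illustrative
U1, S1 = 5000.0, 10.0    # system internal energy (J) and entropy (J/K), before
U2, S2 = 4700.0, 9.5     # after: note the system's *entropy* decreased too

F1 = U1 - T * S1
F2 = U2 - T * S2
dF = F2 - F1                  # change in the system's free energy

# Heat expelled into the reservoir raises the reservoir's entropy:
dS_env = -(U2 - U1) / T
dS_total = (S2 - S1) + dS_env  # system + environment

print(dF < 0)                        # True: free energy fell
print(abs(dF + T * dS_total) < 1e-9) # True: dF = -T * dS_total exactly
```

Here the system's entropy went *down* (S2 < S1), yet nothing violated the
2nd law, because the free energy decreased, which is what matters for an
open system at fixed temperature.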
>
>Now there are multiple definitions of free energy because which kind of
>free energy is the relevant one for a given situation depends on some of
>the other constraints of interaction that exist between the system and
>its environment. For instance, if the environment of the system forces
>the system to have a fixed constant volume and the system is effectively
>a fluidic system then the pressure exerted on the system by the
>environment is subject to change as time goes on, and the relevant free
>energy is the Helmholtz free energy.
>
>But if, instead, the environment maintains a constant pressure and
>temperature on the system, then a fluidic system's volume is allowed to
>change with time. In this latter case the relevant free energy that
>decreases is the Gibbs free energy. If the boundary between the system
>and its environment is permeable to particle fluxes, then the relevant
>free energy is something else. If the system is an anisotropic solid,
>then the relevant free energy is, again, something else. Etc., etc.
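To make sure I have the two commonest definitions straight (my own recap,
with illustrative numbers):

```python
# My sketch of the two free energies Dave names. Which one decreases
# depends on what the environment holds fixed:
#   constant T and V  ->  Helmholtz:  F = U - T*S
#   constant T and P  ->  Gibbs:      G = U + P*V - T*S
# All numbers below are invented for illustration.

U = 5000.0    # internal energy (J)
T = 300.0     # temperature (K)
S = 10.0      # entropy (J/K)
P = 101325.0  # pressure (Pa), about 1 atm
V = 0.001     # volume (m^3), one litre

helmholtz = U - T * S
gibbs = U + P * V - T * S   # differs from Helmholtz by the P*V term

print(helmholtz)            # 2000.0
print(gibbs - helmholtz)    # 101.325, i.e. the P*V work term
```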
>
>>I seem to remember some creationist claiming that there's a form of the
>>2LT that applies to open systems. Is that true?
>
>Yes. See above.
>
>>Or perhaps he was talking about informational entropy, which has nothing
>>to do with the 2LT. Any comment?
>
>Actually, as I explained in my previous long post, the thermodynamic
>entropy is a *special case* of "informational entropy". All kinds of
>entropy are "informational" in the sense that they all measure the
>average amount of *information* missing in a context of uncertainty
>where, because of the uncertainty, there are multiple possible outcomes
>consistent with what is known about the situation, and a probability
>measure can be defined on the set of those outcomes. The entropy is a
>functional on the probability distribution. Each probability
>distribution has its own (information theoretic) entropy. It just so
>happens that the thermodynamic entropy of a physical system is the
>"informational entropy" associated with the probability distribution of
>the system's microscopic states consistent with the macroscopic
>description of that system. But it is *only* the thermodynamic entropy
>that is subject to the 2nd law of thermodynamics. The other kinds of
>entropy are not required to increase with time for an isolated system.
>In general, the other kinds of entropy don't have to obey any particular
>special laws other than that the entropy be a nonnegative real number
>for any given discrete probability distribution. Also various
>conditional entropies would typically have to satisfy a few sum rules as
>well so as to keep all the different conditional entropy measures
>mutually self-consistent. BTW, there are other kinds of entropy that
>can have physical significance for a physical system besides the
>thermodynamic entropy. But any non-thermodynamic entropy doesn't
>necessarily have any particular trend with time.
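If I've understood, the point is that entropy is a functional of a
probability distribution, and thermodynamic entropy is that same
functional applied to the distribution over microstates. My own little
sketch of the two (the distributions are invented examples):

```python
import math

# My sketch: "informational" entropy is a functional on a probability
# distribution; thermodynamic (Gibbs) entropy is the same functional
# over microstate probabilities, scaled by Boltzmann's constant.

def shannon_entropy(p):
    """H(p) = -sum p_i * log2(p_i), in bits; terms with p_i = 0 drop out."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

k_B = 1.380649e-23  # Boltzmann's constant (J/K)

def gibbs_entropy(p):
    """S = -k_B * sum p_i * ln(p_i): thermodynamic entropy as a special
    case of the same information-theoretic functional."""
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25] * 4               # four equally likely outcomes
print(shannon_entropy(uniform))    # 2.0 bits: maximal missing information
print(shannon_entropy([1.0]))      # 0.0 bits: nothing is uncertain
```

And, as Dave says, only the thermodynamic case is subject to the 2nd law;
nothing obliges `shannon_entropy` of some arbitrary distribution to trend
any particular way over time.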
Thanks. I think I'm starting to understand!
Richard Wein (Tich)
--------------------------------
"Do the calculation. Take the numbers seriously. See if the underlying
probabilities really are small enough to yield design."
-- W. A. Dembski, who has never presented any calculation to back up his
claim to have detected Intelligent Design in life.
This archive was generated by hypermail 2b29 : Thu Oct 26 2000 - 05:41:37 EDT