Defining Entropy — Molecular Microstates and Probability
The Second Law of Thermodynamics can be described at the macroscopic level of everyday objects, or at the microscopic level of tiny atoms and molecules. In an excellent book, Introduction to Chemical Thermodynamics (see the description at the bottom of this page), Davies shows how entropy is mathematically related to "the number of microstates corresponding to each distribution, and hence is [logarithmically] proportional to the probability of each distribution."
Each microstate is a different way to disperse
the same amount of energy in the microscopic realm of atoms and molecules.
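In symbols, this is the well-known Boltzmann entropy relation (the equation itself is added here for reference; it isn't quoted from Davies):

    S = k_B \ln W

where W is the number of microstates and k_B is Boltzmann's constant. Because the logarithm turns multiplication into addition, the entropies of independent parts of a system simply add, even though their microstate counts multiply.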
Davies explains how the number of microstates — with energy dispersed
in all possible ways throughout the molecules' energy levels — depends
on the properties of the molecules, such as the magnitude and spacing of their energy levels.
He also explains how microstates are related to entropy and to the equilibrium
state that the chemical system will reach after its molecules have finished
reacting. And he describes a useful application of the Second Law: as a chemical system moves toward its equilibrium state, the number of possible microstates the system can be in (while still having the same overall macrostate) increases with time; since entropy depends on the number of microstates, entropy also increases with time, and the total entropy (of the universe) reaches its maximum at equilibrium.
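To make this concrete, here is a minimal Python sketch (added for illustration; the Einstein-solid model, the oscillator counts, and all names are my choices, not from Davies). It counts the microstates of two small model solids sharing a fixed number of energy quanta, and finds the energy split with the most microstates:

    from math import comb

    def multiplicity(n_oscillators, q_quanta):
        # Microstates of an Einstein solid: the number of ways to place
        # q indistinguishable quanta into n distinguishable oscillators,
        # which is the stars-and-bars count C(q + n - 1, q).
        return comb(q_quanta + n_oscillators - 1, q_quanta)

    N_A, N_B, Q = 30, 30, 40          # two small solids sharing 40 quanta
    splits = [(q_a, multiplicity(N_A, q_a) * multiplicity(N_B, Q - q_a))
              for q_a in range(Q + 1)]
    q_best, w_best = max(splits, key=lambda s: s[1])
    w_total = sum(w for _, w in splits)
    print(f"most probable split: solid A holds {q_best} of {Q} quanta")
    print(f"fraction of all microstates at that split: {w_best / w_total:.3f}")

For two identical solids the even split (20 quanta each) has the most microstates, so it is the equilibrium macrostate; with zillions of molecules instead of 30 oscillators, the peak becomes so sharp that any other split is effectively never observed. Smaller energy-level spacing means more quanta for the same total energy, and hence more microstates, which is the dependence on level spacing mentioned above.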
Here is another explanation (with a familiar dice-illustration that isn't from Davies) of these ideas:
At the micro-level,
a system's entropy is
a property that depends on the number of ways that
energy can be distributed among the particles in the system. Entropy
is a measure of probability, because if energy
can be distributed in more ways in a certain state, that state is more probable. A
useful analogy is to think about the number of ways that two dice can produce sum-states of 7 (this can occur in six ways: 1-6, 6-1, 2-5, 5-2, 3-4, 4-3) and 2 (this occurs in only one way: 1-1), and why this makes 7 more probable than 2. For
similar reasons, because of probability, the chemicals in a system tend to
eventually end up in the particular state (their equilibrium
state) that can occur in the greatest number of ways when energy is
distributed among the zillions of molecules.
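A few lines of Python (an illustrative sketch added here, not part of the original dice discussion) verify the counting and the resulting probabilities:

    from collections import Counter
    from itertools import product

    # Enumerate all 36 equally likely rolls of two dice and tally each sum.
    ways = Counter(a + b for a, b in product(range(1, 7), repeat=2))
    for total in sorted(ways):
        print(f"sum {total:2d}: {ways[total]} of 36 ways "
              f"(probability {ways[total] / 36:.3f})")

The output shows six ways for a sum of 7 (probability about 0.167) and one way for a sum of 2 (about 0.028), which is exactly why 7 turns up more often.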
Basically,
the Second Law is just a description of probability, simply recognizing that in
every naturally occurring reaction, whatever is most probable (when all things
are considered) is most likely to happen. But probability is
related to entropy and, in a more precise form, the Second Law states that during
any reaction the entropy of the universe will increase.
But what about real systems, which are always smaller than the universe? An isolated system (one that cannot exchange energy or matter with its surroundings) is thermodynamically equivalent to a miniature universe, so during any spontaneous reaction its entropy will increase. But the entropy of an open system (which can exchange energy, and even matter, with its surroundings) can increase, decrease, or stay constant.
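The standard bookkeeping behind this distinction (a well-known relation, stated here for clarity rather than taken from Davies) is that the Second Law constrains only the total:

    \Delta S_{\text{universe}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \geq 0

so the entropy of a non-isolated system can decrease, provided the entropy of its surroundings increases by at least as much.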
About the book, Introduction to Chemical Thermodynamics, by Paul Davies:
I thoroughly enjoyed studying (not just reading) this book, and as a chemist & educator I was impressed by how cleverly it integrates the micro-level properties of molecules with their macro-level behaviors.* Using logic and examples, Davies clearly explains how to connect perspectives that previously had been unrelated (for me), showing how a wide range of ideas can be elegantly combined. This learning changed the way I now think about the properties-and-reactions of molecules. {from a book review in the Journal of Chemical Education}
* He shows the reader the relationships among: the quantum mechanics of energy levels (translational, rotational, vibrational, electronic) in molecules; probabilities & statistical mechanics; chemical thermodynamics (which combines the two factors, energy & entropy, in Gibbs Free Energy); and chemical equilibrium.
This page was written by Craig Rusbult.