What is Entropy?
It depends on who you are talking to because even within a
language there are different communities of language users
that refer to different ideas using the same words. And
even if they are referring to the same concept, it might
have a different connotation or purpose. The Christian
use of the word entropy, for example, does not have to
agree with the scientific use of the word in order
to have meaning within its own group of language users.
But what we are interested in is not the meaning of the
word entropy as defined within a group, but the meaning of
the concept of entropy beyond language, as a fundamental
law of the universe.
Penrose brings up an interesting point about the second
law of thermodynamics; it is given in the form of an
inequality. When we think of laws, we think of Newton's
second law, F = ma, or the conservation of energy and
momentum, where we can predict with precision how objects
will behave. But what does it mean that the change in entropy
of an isolated system will always be greater than or equal to zero? What
is the process that causes the continuous generation of
entropy?
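For reference, the standard inequality statements of the second law (my summary of the usual textbook forms, not Penrose's wording) are

\[
dS \;\ge\; \frac{\delta Q}{T}
\qquad\text{and, for an isolated system,}\qquad
\Delta S \;\ge\; 0,
\]

so the law only constrains the direction of change; it does not pin down a unique outcome the way F = ma does.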
Let’s first take a look at how entropy is used in
language. In the study of thermodynamics, entropy is
defined as “a measure of molecular disorder.” A system of
fluids has an associated entropy value assigned to it.
Just as the total internal energy of a system can be
determined, we can also determine the amount of entropy a
system has. When we see the use of entropy in language
(such as in the Thermodynamics textbook I am using this
semester), we see that entropy is something that can be
generated, created, transferred, that there can be a
change in entropy, and so on. Entropy sounds like something
tangible, similar to energy (of course we cannot create
energy, but we can transfer it into and out of a system through heat and work).
But what exactly is being generated, what is being
transferred? This is what I believe to be the problem
causing the confusion in understanding entropy.
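One way to make the textbook vocabulary precise (this is my paraphrase of the usual closed-system entropy balance, not a quotation from the book) is

\[
\Delta S_{\text{system}} \;=\; \int \frac{\delta Q}{T} \;+\; S_{\text{gen}},
\qquad S_{\text{gen}} \;\ge\; 0,
\]

where the integral is the entropy transferred along with heat across the boundary and S_gen is the entropy generated by irreversibilities inside the system. The bookkeeping works, but it still does not say what the quantity being booked actually is.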
Penrose attempts to explain entropy in the book Cycles
of Time, but sadly he does not explain the
terms that he is using, so the only people who could
understand what he is saying are those who need no explanation.
But from reading through his exposition of entropy, it
seems that at the heart of it, entropy is not a difficult
concept, and I will attempt to demystify what entropy is,
through the words of Penrose. He defines entropy in terms
of coarse-graining regions and phase spaces. I am not
exactly sure what he means by this, but I will try to
summarize the gist of what I think he is saying. He seems
to be giving us a graphical representation of the
statistical behavior of many-particle systems. Each degree
of freedom in the system is given its own dimension in a phase space,
so that the state of the system is a single point in a space with
trillions of dimensions, one for each degree of freedom, and the
entropy is tied to the volume of the coarse-graining region that point sits in.
As an infinitesimal slice of time dt passes, the
possibilities for each degree of freedom multiply (not in
the sense that the system actually gains more possible
futures, but in the sense that we cannot possibly know
the exact state of every degree of freedom), so that the
coarse-graining volume the system occupies tends to increase.
At each step, the system's point in phase space is surrounded by
regions of larger volume and regions of smaller volume, but the
larger-volume regions outnumber the smaller-volume ones by some
enormous exponential factor. Again, when I say possible futures,
I am speaking in a statistical sense; in a deterministic sense there is
only one possible future, but there is no way it can be
determined because of the sheer number of degrees of
freedom. I hope I am interpreting Penrose correctly, but
the basic picture seems to be that entropy is a
statistical statement about a system, not some abstract
essence that is in a system.
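The formula that ties this picture together is Boltzmann's, which Penrose's coarse-graining account builds on: the entropy assigned to a coarse-graining region of phase-space volume V is

\[
S \;=\; k_{\mathrm{B}} \ln V,
\]

so a "gain in entropy" is just the system's point wandering into an exponentially larger region, not some substance accumulating inside the box.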
However, the way entropy is used in language paints a
different picture of what entropy is. We get an understanding of
concepts from the way they are used in language. Most of the
time, this kind of linguistic analysis allows us to
understand a concept. For example, we know that running is
a type of action just by its grammatical structure. It is
a participle that follows the verb "be," which
tells us it is an action done by something. But I think
the problem with entropy is that entropy is defined as
disorder. When we state that entropy is generated in a
system, it sounds as if some physical property like
internal energy or pressure has increased in the system
because of some physical cause, but in reality, nothing in
the system is increasing. When we say there is an increase
in entropy, what has increased is the number of possible
configurations the system might be in as far as we can tell,
something we can only describe probabilistically. This has
nothing to do with a system being more chaotic or with a system
gaining free will, but simply reflects that there are so many
billions and trillions of particles, each with hundreds of possible
configurations, that there is no way to determine the system's
future states with 100% precision.
Entropy seems to be a concept formed from a statistical
fact about systems with many particles.
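A toy calculation (my own illustration, not taken from Penrose or either textbook) makes the statistical reading concrete: put N particles in a box, record only how many are in the left half as the macroscopic description, and count the microscopic arrangements behind each description. The balanced descriptions are backed by astronomically more arrangements than the ordered ones, and taking the logarithm of that count is exactly a Boltzmann-style entropy.

    # Toy model (my own illustration): N particles, each independently in the
    # left or right half of a box. The "macrostate" records only how many are
    # on the left; the "microstates" are the individual left/right arrangements.
    import math

    N = 100  # number of particles; increase this and the imbalance explodes

    all_left  = math.comb(N, N)       # arrangements with every particle on the left
    half_half = math.comb(N, N // 2)  # arrangements with an even 50/50 split

    print(f"all on the left : {all_left} microstate(s)")
    print(f"50/50 split     : {half_half} microstates")

    # Boltzmann-style entropy (with k_B set to 1): S = ln(number of microstates)
    print(f"entropy, all left : {math.log(all_left):.2f}")
    print(f"entropy, 50/50    : {math.log(half_half):.2f}")

For N = 100 the 50/50 description already corresponds to about 10^29 arrangements against a single "all on the left" arrangement, which is the sense in which the larger-volume futures overwhelm the rest.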
Imagine that there is one hydrogen atom in a box, and
a supercomputer measuring every possible
property of that atom: position, velocity, direction of
motion, energy, and so on. Would it make sense to use
entropy for this system? How would we define disorder in
this case? Is the atom less disorderly in the center of
the box or in the top-left corner of the box? It does not
really make sense because the notion of entropy is not
useful in this case. What if we put two atoms in a box?
Then the number of possible joint configurations grows
exponentially with the number of degrees of freedom, but with
a supercomputer everything about the system is still known,
and it is unnecessary to introduce the concept of entropy.
Entropy only becomes useful when there
are so many particles in a system that we cannot possibly
keep track of what each individual particle is doing, so
we take its possible state as a statistical whole.
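As a rough sketch of why the bookkeeping becomes hopeless (the "hundreds of possible configurations per particle" figure is just the essay's illustrative number, not a measured one), the count of joint configurations multiplies with every particle added:

    # Rough sketch: with k possible configurations per particle, n particles
    # have k**n joint configurations, so tracking each particle fails quickly.
    k = 100  # illustrative "hundreds of configurations" per particle
    for n in (1, 2, 10, 50, 100):
        print(f"{n:>3} particles: about {k**n:.1e} joint configurations")

Even at 100 particles the count is around 10^200, and a real gas has on the order of 10^23 particles, which is why only a statistical description is possible.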
So what is entropy? I believe that entropy is a concept
that came out of a thermodynamics context so that
thermodynamic calculations could be made, but it should be
defined in a statistical or a probabilistic sense rather
than in terms of disorder. Looking over our physics
textbook, I think it does an excellent job of explaining
entropy, and its definition and explanation are very
different from those given by my thermodynamics textbook.