Conclusion
If entropy is just a statistical fact about a system, why
is it often described as something that is being created,
generated, or transferred in a thermodynamic context?
Quite simply, because it is convenient to think of entropy
in those terms. We often treat heat and energy as physical
quantities that flow into and out of a system, so it is
easy to think of entropy in the same way. But I think it
would be a mistake to define entropy as chaos or disorder;
it should instead be described in terms of possible states. Instead
of saying that a system has generated x amount of
disorder, it would be more accurate to say that the number
of possible states of a system has increased, just as it
would be better to say that there is an increase in
kinetic energy in an object rather than saying that an
object generated kinetic energy. I often hear that entropy
is a difficult concept to understand, even mysterious,
but when you look at entropy in terms of
usefulness, I don't believe that to be the case. I am not
saying that the amount of entropy in a system is easy to
calculate, or even that I necessarily understand entropy;
I am just saying that entropy seems to be a simple
concept at heart.
While working on this web project, I came up with some interesting
ideas concerning entropy. Zero entropy is defined as a
system with only one possible state (so a system of
particles arrayed in a perfect crystalline lattice at zero
Kelvin would satisfy this condition). If entropy has
been constantly increasing, then there was a finite time in
the past when entropy was zero, because it makes no sense
to have negative entropy. If this is the case, it is
evidence for the Big Bang, because an entropy value of
zero could only be reached when the entire universe was a
singularity. However, it is also possible that entropy
approaches zero but never actually reaches it. This gives
us a paradox: an increase in temperature means an increase
in entropy, yet the universe is extremely hot as we
approach the Big Bang, precisely when its entropy should be lowest.
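The claim that negative entropy makes no sense follows directly from the state-counting picture; a short derivation, assuming the standard Boltzmann form:

```latex
S = k_B \ln W, \qquad W \in \{1, 2, 3, \dots\}
\;\;\Longrightarrow\;\; \ln W \ge 0
\;\;\Longrightarrow\;\; S \ge 0,
\quad \text{with } S = 0 \iff W = 1 .
```

Since W counts microstates it is a positive integer, so S can never drop below zero, and the minimum S = 0 is attained only by a system with exactly one possible state, such as the perfect crystal at zero Kelvin mentioned above.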
On another note, the entropy of a black hole, and of the
boundary of the universe, must be infinite because we
cannot know their possible statistical configurations.
But maybe what I am referring to is informational entropy
rather than thermodynamic entropy. Hawking and Bekenstein
gave us a formula for calculating the entropy of a black
hole, but the problem I have with their formula is that
it only accounts for the area of the event horizon. I
understand this is meant to capture the increase of entropy
due to mass, but it does not take into consideration all
the degrees of freedom of every particle entering the
black hole. If entropy were simply a function of mass,
then entropy would not increase with an increase in
temperature. Also, how many degrees of freedom do we
assign to a single unit of dark matter that enters a black
hole? To me, it makes no sense to even attempt to
calculate the entropy of a black hole when its possible
states are impossible to determine.
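For reference, the Bekenstein-Hawking formula mentioned above is S = k_B c³ A / (4 ħ G), where A is the area of the event horizon. A minimal sketch in Python (the solar-mass example and rounded constants are my own assumptions) shows that, because the Schwarzschild radius grows linearly with mass, the formula makes entropy grow with the square of the mass rather than linearly:

```python
import math

# Physical constants (SI units, rounded)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
HBAR = 1.055e-34    # reduced Planck constant, J s
K_B = 1.381e-23     # Boltzmann constant, J/K
M_SUN = 1.989e30    # solar mass, kg

def bekenstein_hawking_entropy(mass):
    """S = k_B * c^3 * A / (4 * hbar * G), with A the event-horizon area."""
    r_s = 2 * G * mass / C**2       # Schwarzschild radius
    area = 4 * math.pi * r_s**2     # horizon area
    return K_B * C**3 * area / (4 * HBAR * G)

# Entropy scales with horizon area, hence with mass squared:
# doubling the mass quadruples the entropy.
s1 = bekenstein_hawking_entropy(M_SUN)
s2 = bekenstein_hawking_entropy(2 * M_SUN)
print(f"S(1 M_sun) = {s1:.3e} J/K")
print(f"S(2 M_sun) = {s2:.3e} J/K (ratio: {s2 / s1:.1f})")
```

This quadratic scaling is one way to see the point made above: the formula is not simply linear in mass, yet it still depends only on the horizon geometry, not on the internal degrees of freedom of whatever fell in.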