  Entropy

Entropy is a measure of the amount of disorder in a system. If a system is highly ordered, it has low entropy, while if a system is very randomly ordered, it has high entropy. Entropy can also be thought of in terms of probability: the more microscopic arrangements there are that produce a given macroscopic state, the higher that state's entropy and the more likely it is to occur. So a less ordered state is more likely to occur than a highly ordered one.
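This link between entropy and the number of microscopic arrangements is captured by Boltzmann's formula, S = k ln W, where W is the number of microstates consistent with a macrostate. A minimal Python sketch (the function name and example counts are illustrative, not from the original page):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k * ln(W) for a macrostate realized by W microstates."""
    return K_B * math.log(microstates)

# A macrostate realized in exactly one way (perfect order) has zero entropy.
print(boltzmann_entropy(1))
# More ways to arrange the system means higher entropy.
print(boltzmann_entropy(10**6) > boltzmann_entropy(10))
```

The key point is the logarithm: entropy grows with the number of arrangements, so disordered macrostates, which can be realized in vastly more ways, have higher entropy.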

Thanks to entropy, we have a very important law of physics that describes nature at large.
The Second Law of Thermodynamics: The entropy of an isolated system will either stay the same or increase, but never decrease.
So basically this means that, without outside forces acting on a system, order will move toward disorder and the macroscopic state will become more random.

[Image from cattime.com]

The probability that disorder will occur in an isolated system is intuitively much higher than the probability that order will randomly occur. Unless the system is at equilibrium, in which case entropy stays the same, the system will "run down" toward disorder.
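A quick way to see why disorder is so much more likely is to count arrangements for a simple system of coin flips (the coin example is an illustration, not from the original page). An "ordered" macrostate like all heads can happen in only one way, while a "disordered" half-and-half macrostate can happen in an enormous number of ways:

```python
from math import comb

N = 100  # number of coins

# Number of microstates (specific head/tail sequences) for each macrostate:
ordered = comb(N, N)        # all 100 heads: exactly 1 arrangement
disordered = comb(N, N // 2)  # 50 heads, 50 tails: C(100, 50) arrangements

print(ordered)     # 1
print(disordered)  # roughly 10**29 arrangements
```

With every sequence equally likely, the half-and-half macrostate is about 10^29 times more probable than perfect order, which is why an isolated system drifts toward disorder.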

[Image from themetapicture.com]