Entropy is a fundamental concept in physics and chemistry that measures energy dispersal within a system. It is often understood as the level of disorder or randomness present in a system’s components. In thermodynamics, this measurable property is represented by the internationally recognized symbol, the capital letter \(S\).
Identifying the Symbol
The symbol \(S\) was introduced in 1865 by the German physicist Rudolf Clausius, one of the founders of thermodynamics. Clausius coined the term “entropy” from a Greek word meaning “transformation” or “turning,” deliberately choosing a word that resembled “energy.” Although Clausius never documented why he selected \(S\), it is believed he chose it simply as a letter for the new state variable, or possibly in homage to Sadi Carnot, whose work was foundational to the theory.
In thermodynamic equations, \(S\) represents the absolute entropy of a system, while \(\Delta S\) represents the change in entropy. Absolute entropy values require a reference point (set by the Third Law of Thermodynamics, discussed below), so \(\Delta S\) is the quantity most often measured and calculated in practical applications. The standard SI unit for entropy is joules per kelvin (\(J/K\)), which links an amount of energy (heat) to the temperature at which that energy is transferred.
Entropy in Thermodynamics
The classical definition of entropy is rooted in the flow of heat and the system’s temperature. From a macroscopic perspective, the change in entropy (\(\Delta S\)) for a process at constant temperature is defined by the equation \(\Delta S = Q_{rev}/T\). This formula connects the amount of heat transferred reversibly (\(Q_{rev}\)) to the absolute temperature (\(T\)) at which the transfer occurs.
The term \(Q_{rev}\) represents the heat transferred in a reversible process. A reversible process is an idealized theoretical path in which the change occurs infinitely slowly, keeping the system in equilibrium at every step. This concept is necessary because entropy is a state function, meaning its change depends only on the initial and final states, not the path taken.
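Written out in full, the definition accumulates these reversible heat transfers over the entire path; the simple ratio above is the special case in which the temperature stays constant:
\[
\Delta S = \int \frac{dQ_{rev}}{T},
\]
which reduces to \(\Delta S = Q_{rev}/T\) for an isothermal process.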
By calculating the change along this idealized reversible path, the actual entropy change for any process between the same two states can be determined. For example, during the phase change of ice melting into water, the heat absorbed is the heat of fusion. When ice melts at its normal melting point (\(273.15\) K), the change in the system’s entropy is \(\Delta S = \Delta H_{fusion} / T_{melting}\). Because heat is absorbed at a constant temperature, the system’s entropy increases as the ordered solid transforms into the less ordered liquid state.
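As a concrete check, using the commonly tabulated molar heat of fusion of ice, about \(6.01\) kJ/mol:
\[
\Delta S = \frac{\Delta H_{fusion}}{T_{melting}} = \frac{6010 \text{ J/mol}}{273.15 \text{ K}} \approx 22.0 \text{ J/(mol·K)},
\]
a positive value, consistent with the solid becoming a less ordered liquid.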
Statistical Mechanics and Microstates
While the thermodynamic definition links entropy to heat, statistical mechanics provides a microscopic perspective relating entropy to probability and particle arrangement. This view, pioneered by Ludwig Boltzmann, is summarized by his equation: \(S = k \ln W\).
In this equation, \(S\) is the entropy, \(k\) is the Boltzmann constant (approximately \(1.38 \times 10^{-23} J/K\)), and \(W\) represents the number of microstates. Microstates are the specific ways the energy and position of every particle can be arranged while the system appears the same macroscopically. For example, a gas filling a room (a macrostate) can have countless different arrangements of its individual molecules (microstates).
The equation shows that entropy is directly related to the number of accessible microstates. A system naturally tends toward the macrostate with the highest number of possible microstates, because that state is overwhelmingly more probable. A gas spreading out to fill a container is a high-entropy state because there is an immense number of ways the molecules can be distributed. Conversely, all of the molecules spontaneously collecting in one corner would be a low-entropy state with a drastically lower \(W\), making it statistically improbable.
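A rough back-of-the-envelope calculation shows how quickly \(W\) grows. If a gas of \(N\) molecules is allowed to spread into twice its original volume, and each molecule is treated independently with twice as many positions available to it, then:
\[
\frac{W_2}{W_1} = 2^N, \qquad \Delta S = k \ln\!\left(\frac{W_2}{W_1}\right) = N k \ln 2,
\]
which for one mole of gas (\(N \approx 6.02 \times 10^{23}\)) gives \(\Delta S = R \ln 2 \approx 5.76\) J/(mol·K). The reverse process, in which every molecule gathers back into half the container, would require \(W\) to shrink by a factor of \(2^N\), which is why it is never observed.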
The Universal Constraints on Entropy
Entropy’s role is cemented by two fundamental physical laws that place universal limits on all processes. The Second Law of Thermodynamics states that the total entropy of an isolated system can never decrease; it must either increase or remain constant. Since the universe is considered an isolated system, its total entropy is always increasing, driving all natural processes in a specific direction.
This constant increase in the universe’s entropy is why the Second Law is associated with the “arrow of time.” The process of breaking a glass, for instance, increases entropy and is irreversible. The Second Law explains why time moves forward and why natural events, such as a hot cup of coffee cooling down, proceed in only one direction.
The Third Law of Thermodynamics establishes the absolute zero point for entropy. It states that the entropy of a perfect crystal of a pure substance approaches zero as the temperature approaches absolute zero (\(0\) K). At this theoretical limit, all thermal motion ceases, and the system exists in a state of perfect order with only one possible microstate (\(W=1\)). This makes \(S = k \ln(1) = 0\) and provides the baseline for calculating the absolute entropy of any substance at temperatures above absolute zero.
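This reference point is what makes tabulated absolute entropies possible. In outline, and assuming the heat capacity \(C_p\) has been measured down to temperatures near absolute zero, the absolute entropy at a temperature \(T\) is built up as:
\[
S(T) = \int_0^T \frac{C_p(T')}{T'} \, dT' + \sum \frac{\Delta H_{transition}}{T_{transition}},
\]
where the sum adds the entropy contributed by any phase changes (such as melting or boiling) that occur between \(0\) K and \(T\).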