The change in entropy (ΔS) is calculated by dividing the heat transferred reversibly by the absolute temperature: dS = dQ_rev/T. That foundational formula, introduced by Rudolf Clausius, adapts into several practical versions depending on the process: isothermal expansion, heating or cooling, phase changes, and chemical reactions each have their own streamlined equation. The key concept underlying all of them is that entropy is a state function, meaning it depends only on where you start and where you end, not on the path between those states.
The Core Formula
Entropy was defined through the observation that for any reversible cycle, the integral of heat divided by temperature around the entire cycle equals zero. Because of that property, dQ/T must represent the change in some state quantity. Clausius named that quantity entropy, giving us:
ΔS = ∫ dQ_rev / T
The subscript “rev” is critical. You must use the heat transferred along a reversible path, even if the actual process you’re analyzing is irreversible. Entropy is measured in joules per kelvin (J/K), and for per-mole calculations, J/(mol·K).
Isothermal Processes
When temperature stays constant, the integral simplifies because T comes out in front. For an ideal gas expanding or compressing isothermally, the entropy change is:
ΔS = nR ln(V₂ / V₁)
Here, n is the number of moles, R is the gas constant (8.314 J/(mol·K)), V₁ is the initial volume, and V₂ is the final volume. If the gas expands (V₂ > V₁), entropy increases. If it’s compressed, entropy decreases.
You can also express this using pressures. Since pressure and volume are inversely related for an ideal gas at constant temperature, the equivalent form is:
ΔS = nR ln(P₁ / P₂)
Notice the pressures are flipped relative to the volumes: higher final pressure means compression, which lowers entropy.
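As a quick check that the two forms agree, here is a minimal Python sketch (the function names are illustrative, not from any library):

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def delta_s_isothermal_volume(n, v1, v2):
    """Entropy change for n moles of ideal gas changing volume at constant T."""
    return n * R * math.log(v2 / v1)

def delta_s_isothermal_pressure(n, p1, p2):
    """Equivalent form in terms of pressures (note the flipped ratio)."""
    return n * R * math.log(p1 / p2)

# Doubling the volume of 1 mole halves the pressure, so both forms give
# the same answer: nR ln(2) ≈ 5.76 J/K
print(delta_s_isothermal_volume(1.0, 1.0, 2.0))
print(delta_s_isothermal_pressure(1.0, 2.0, 1.0))
```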
Heating or Cooling at Varying Temperature
When you heat or cool a substance without a phase change, the temperature itself is changing throughout the process. The formula accounts for this by integrating the heat capacity divided by temperature:
ΔS = ∫ (Cₚ / T) dT, integrated from T₁ to T₂
If the heat capacity (Cₚ) is roughly constant over the temperature range, this simplifies to:
ΔS = Cₚ ln(T₂ / T₁)
For one mole of a substance, use the molar heat capacity. For a specific mass, use mass times specific heat capacity in place of Cₚ. The natural log means that heating something from 200 K to 400 K produces a larger entropy change than heating it from 400 K to 600 K, even though both involve a 200 K increase. Entropy gains are larger at lower temperatures.
In practice, heat capacity does vary with temperature for many substances, especially over wide ranges. Reference tables often give Cₚ as a polynomial in T (like a + bT + cT²). In that case, you substitute the polynomial into the integral and evaluate it term by term.
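Both cases are easy to evaluate in code. The sketch below assumes a hypothetical substance with a constant Cₚ of 30 J/(mol·K), plus a made-up polynomial fit to show the term-by-term integration; neither set of numbers comes from a real table:

```python
import math

def delta_s_constant_cp(cp, t1, t2):
    """ΔS = Cp ln(T2/T1) when Cp is constant over the range."""
    return cp * math.log(t2 / t1)

def delta_s_polynomial_cp(a, b, c, t1, t2):
    """ΔS = ∫(a + bT + cT²)/T dT = a ln(T2/T1) + b(T2−T1) + (c/2)(T2²−T1²)."""
    return a * math.log(t2 / t1) + b * (t2 - t1) + 0.5 * c * (t2**2 - t1**2)

cp = 30.0  # J/(mol·K), hypothetical constant heat capacity
# Same 200 K rise, different entropy change: gains are larger at low T
print(delta_s_constant_cp(cp, 200, 400))  # 30 ln(2)   ≈ 20.8 J/K
print(delta_s_constant_cp(cp, 400, 600))  # 30 ln(1.5) ≈ 12.2 J/K

# Hypothetical polynomial fit Cp(T) = 25 + 0.01·T (with c = 0)
print(delta_s_polynomial_cp(25.0, 0.01, 0.0, 200, 400))  # ≈ 19.3 J/K
```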
Phase Transitions
Melting, boiling, and other phase changes happen at a fixed temperature while heat flows into or out of the system. That makes the calculation straightforward:
ΔS = ΔH_transition / T_transition
ΔH is the enthalpy of the transition (enthalpy of fusion for melting, enthalpy of vaporization for boiling), and T is the temperature at which the transition occurs, in kelvin.
Worked Example: Melting Ice
The enthalpy of fusion for water is 6.01 kJ/mol, and ice melts at 273 K. For 1.0 mole of ice melting to liquid water:
ΔS = 6010 J / 273 K = 22.0 J/K
This positive value makes intuitive sense. Liquid water molecules have more ways to arrange themselves than molecules locked in a crystal, so the system becomes more disordered. Vaporization produces an even larger entropy change because the jump from liquid to gas involves a much bigger increase in molecular freedom.
The same formula applies to freezing, condensation, and deposition, but with the sign reversed. When water freezes, the system loses 22.0 J/K of entropy per mole at 273 K.
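In code, this is a one-liner; here is a small sketch using the numbers from the worked example:

```python
def delta_s_transition(delta_h, t):
    """Entropy change for a phase transition: ΔS = ΔH / T (J and K)."""
    return delta_h / t

print(delta_s_transition(6010, 273))   # melting:  ≈ +22.0 J/K per mole
print(delta_s_transition(-6010, 273))  # freezing: ≈ −22.0 J/K per mole
```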
Chemical Reactions
For a chemical reaction, you calculate the standard entropy change using tabulated standard molar entropy values (S°) for each substance:
ΔS°_rxn = Σ(coefficients × S° of products) − Σ(coefficients × S° of reactants)
Unlike enthalpy of formation, where elements in their standard state have a value of zero, every substance has a nonzero standard molar entropy. You look up S° for each reactant and product, multiply each by its coefficient in the balanced equation, then subtract the reactant total from the product total.
A reaction that produces more moles of gas than it consumes will almost always have a positive ΔS° because gases have far higher entropy than solids or liquids. Conversely, reactions that consume gas or form more ordered structures tend to have a negative entropy change.
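Here is a short Python sketch of the products-minus-reactants bookkeeping, using the ammonia synthesis reaction with approximate textbook S° values:

```python
# Approximate standard molar entropies at 298 K, J/(mol·K)
S_STANDARD = {"N2(g)": 191.6, "H2(g)": 130.7, "NH3(g)": 192.5}

def delta_s_reaction(products, reactants, table=S_STANDARD):
    """ΔS°_rxn = Σ ν·S°(products) − Σ ν·S°(reactants).
    Each side is a dict mapping species to its stoichiometric coefficient."""
    side_total = lambda side: sum(nu * table[sp] for sp, nu in side.items())
    return side_total(products) - side_total(reactants)

# N2(g) + 3 H2(g) → 2 NH3(g): four moles of gas become two
print(delta_s_reaction({"NH3(g)": 2}, {"N2(g)": 1, "H2(g)": 3}))
# ≈ −198.7 J/K — negative, as expected when gas moles decrease
```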
Irreversible Processes
Real-world processes are rarely reversible. Gas escaping from a punctured tire, a hot pan cooling on the counter, cream mixing into coffee: these are all irreversible. You can still calculate the entropy change because entropy is a state function. It doesn’t matter what path the system actually took. You just need to construct any reversible path that connects the same initial and final states, then calculate ΔS along that imaginary path.
For example, suppose an ideal gas expands irreversibly and adiabatically (a free expansion into a vacuum, with no heat exchange) from volume V to 2V. Because the gas does no work and absorbs no heat, its internal energy, and therefore its temperature, stays constant. To find the entropy change, you imagine the gas expanding reversibly and isothermally from V to 2V instead. The entropy change for that reversible path is nR ln(2), and since both paths share the same start and end states, the entropy change is nR ln(2) for the irreversible process too.
One important distinction: for the irreversible process, the entropy of the universe increases (the total entropy change of system plus surroundings is positive). For the reversible version, the universe’s total entropy stays the same. The system’s entropy change, however, is identical in both cases.
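The bookkeeping for system and surroundings in the free-expansion example looks like this, sketched for 1 mole:

```python
import math

R = 8.314  # J/(mol·K)
n = 1.0    # moles

# System ΔS is path-independent: same as the reversible isothermal doubling
ds_system = n * R * math.log(2)  # ≈ +5.76 J/K

# Irreversible free expansion: no heat leaves the surroundings
ds_surr_irreversible = 0.0
# Reversible isothermal path: the surroundings supply q_rev = nRT ln(2),
# so their entropy falls by exactly nR ln(2)
ds_surr_reversible = -n * R * math.log(2)

print(ds_system + ds_surr_irreversible)  # ≈ +5.76 J/K: universe gains entropy
print(ds_system + ds_surr_reversible)    # 0.0: reversible, no net change
```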
The Statistical Mechanics Approach
All the formulas above come from classical thermodynamics, where entropy is defined through heat and temperature. There’s an entirely different way to think about it, rooted in counting arrangements at the molecular level. Ludwig Boltzmann defined entropy as:
S = k ln Ω
Here, k is the Boltzmann constant (1.380649 × 10⁻²³ J/K) and Ω is the number of microstates, meaning the number of distinct ways the particles in a system can be arranged while still producing the same overall energy and volume you observe. More possible arrangements means higher entropy.
To find the change in entropy between two states using this approach, you calculate:
ΔS = k ln Ω₂ − k ln Ω₁ = k ln(Ω₂ / Ω₁)
This formula is most useful in contexts like mixing, expansion into a vacuum, or molecular-level simulations where you can count or estimate the number of microstates directly. For most practical chemistry and engineering problems, the classical formulas above are far easier to apply. But the Boltzmann equation reveals what entropy physically represents: it quantifies how many microscopic configurations are compatible with what you observe at the macroscopic level.
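As a consistency check between the two pictures, consider the free expansion again: doubling the volume gives each of the N molecules twice the space, so Ω₂/Ω₁ = 2^N. Computing 2^N directly would overflow for a mole of particles, so the sketch below uses the log identity instead:

```python
import math

k = 1.380649e-23     # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number

# Ω2/Ω1 = 2^N, so ΔS = k ln(2^N) = N k ln(2) — no huge numbers needed
N = N_A  # one mole of molecules
ds = N * k * math.log(2)
print(ds)  # ≈ 5.76 J/K, matching nR ln(2) from the classical formula
```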
Choosing the Right Formula
- Constant temperature, changing volume or pressure: ΔS = nR ln(V₂/V₁) or nR ln(P₁/P₂)
- Changing temperature, no phase change: ΔS = Cₚ ln(T₂/T₁) when heat capacity is constant
- Phase change at constant temperature: ΔS = ΔH_transition / T
- Chemical reaction: ΔS° = Σ(S° products) − Σ(S° reactants)
- Molecular-level counting: ΔS = k ln(Ω₂/Ω₁)
Many real problems combine several of these steps. Heating ice from −10°C to steam at 110°C, for instance, requires you to calculate ΔS for warming the ice, melting it, warming the liquid water, boiling it, and then warming the steam, each step using the appropriate formula. You then add all five contributions together. The total entropy change is the sum of each segment because entropy, as a state function, is additive along any path connecting the initial and final states.
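Putting the whole five-step calculation in code makes the bookkeeping explicit. The sketch below uses approximate molar values for water (Cₚ of ice ≈ 37.7, liquid ≈ 75.3, steam ≈ 33.6 J/(mol·K); ΔH_fus = 6.01 kJ/mol; ΔH_vap = 40.7 kJ/mol), so treat the result as illustrative rather than precise:

```python
import math

def heating(cp, t1, t2):
    """ΔS = Cp ln(T2/T1), assuming Cp is constant over the range."""
    return cp * math.log(t2 / t1)

def transition(dh, t):
    """ΔS = ΔH / T for a phase change at fixed temperature."""
    return dh / t

steps = [
    heating(37.7, 263, 273),   # warm ice from −10°C to 0°C
    transition(6010, 273),     # melt at 273 K
    heating(75.3, 273, 373),   # warm liquid water from 0°C to 100°C
    transition(40700, 373),    # boil at 373 K
    heating(33.6, 373, 383),   # warm steam from 100°C to 110°C
]
print(sum(steps))  # total ΔS ≈ 157 J/K per mole, dominated by vaporization
```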