In chemical thermodynamics, H stands for enthalpy, a measure of the total heat content of a system. It’s defined by a simple equation: H = U + PV, where U is the internal energy of the system, P is pressure, and V is volume. Enthalpy itself carries units of energy (joules); the reaction enthalpies you see in tables are reported per mole, in kilojoules per mole (kJ/mol). It’s one of the most commonly used quantities in chemistry because it tells you how much heat a reaction releases or absorbs under normal laboratory conditions.
What Enthalpy Actually Represents
Every chemical system has internal energy, which includes all the kinetic and potential energy of its molecules. But in a real lab setting, reactions don’t happen in sealed, rigid containers. They happen in open flasks and beakers at atmospheric pressure, where gases can expand or contract. When a reaction produces gas, for example, that gas pushes against the atmosphere and does work. Enthalpy wraps up internal energy and that pressure-volume work into a single, convenient number.
That’s why H = U + PV is so useful. The PV term accounts for the energy a system needs to “make room” for itself against its surroundings. For solids and liquids, the PV contribution is tiny. For gases, it matters a lot more. By bundling everything together, enthalpy gives chemists a single value that tracks heat flow in the conditions they actually work in.
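To get a feel for the size of that PV term, here’s a minimal sketch in Python, assuming one mole of an ideal gas at 298 K and 1 bar versus one mole of liquid water (molar volume about 18 mL):

```python
# Rough comparison of the PV term for a gas versus a liquid.
# Assumes ideal-gas behavior; all numbers are approximate.

R = 8.314    # gas constant, J/(mol·K)
T = 298.15   # temperature, K
P = 1.0e5    # pressure, Pa (1 bar)

# One mole of ideal gas: PV = RT
pv_gas = R * T            # ~2479 J/mol, i.e. ~2.5 kJ/mol

# One mole of liquid water: molar volume ~18 mL = 1.8e-5 m^3
pv_water = P * 1.8e-5     # ~1.8 J/mol

print(f"PV term, 1 mol ideal gas:    {pv_gas / 1000:.2f} kJ/mol")
print(f"PV term, 1 mol liquid water: {pv_water:.2f} J/mol")
```

The gas’s PV term comes out over a thousand times larger, which is why the gap between ΔH and ΔU only really matters when gases are produced or consumed.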
Why ΔH Equals Heat at Constant Pressure
The real power of enthalpy shows up when you look at changes rather than absolute values. When a reaction runs at constant pressure (which covers nearly every reaction on a benchtop), the math simplifies beautifully. The change in enthalpy, written ΔH, equals the heat transferred to or from the system. The derivation relies on the first law of thermodynamics: the change in internal energy equals heat plus work, ΔU = q + w. At constant pressure, the work of pushing back the atmosphere is w = −PΔV, so substituting into ΔH = ΔU + PΔV makes the work term cancel with the PΔV portion, leaving ΔH = q, where q is the heat exchanged at constant pressure.
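You can check this bookkeeping numerically. The sketch below assumes one mole of a monatomic ideal gas (Cv = 3R/2, Cp = 5R/2), heats it at constant pressure, and confirms that ΔU plus the PΔV term adds back up to the heat supplied:

```python
# Numerical check that ΔH = q at constant pressure,
# for 1 mol of a monatomic ideal gas (Cv = 3R/2, Cp = 5R/2).

R = 8.314                  # J/(mol·K)
Cv, Cp = 1.5 * R, 2.5 * R
dT = 10.0                  # heat the gas by 10 K at constant pressure

q_p = Cp * dT    # heat supplied at constant pressure
dU = Cv * dT     # change in internal energy
p_dV = R * dT    # PΔV = nRΔT for an ideal gas at constant P
dH = dU + p_dV   # ΔH = ΔU + PΔV

print(f"q_p = {q_p:.1f} J, ΔH = {dH:.1f} J")  # the two match
assert abs(dH - q_p) < 1e-9
```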
This is why you’ll see ΔH treated almost interchangeably with “heat of reaction” in textbooks. They’re the same thing, as long as pressure stays constant.
Exothermic vs. Endothermic Reactions
The sign of ΔH tells you which direction heat flows. A negative ΔH means the reaction releases heat into the surroundings. The products end up with less enthalpy than the reactants, so the reaction is energetically “downhill.” These are exothermic reactions: combustion, neutralization of acids and bases, and many oxidation reactions.
A positive ΔH means the reaction absorbs heat from the surroundings. The products carry more enthalpy than the reactants, making the reaction “uphill.” These endothermic reactions include dissolving certain salts in water, photosynthesis, and thermal decomposition. If you’ve ever held an instant cold pack, you’ve felt an endothermic process pulling heat out of your hand.
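The sign convention is easy to demonstrate. In this small sketch, the ΔH values are typical textbook figures: roughly −890 kJ/mol for burning methane and about +26 kJ/mol for dissolving the ammonium nitrate inside a cold pack:

```python
# Classify reactions by the sign of ΔH (kJ/mol).
# Values are approximate textbook figures.

reactions = {
    "combustion of methane": -890.3,
    "dissolving ammonium nitrate": +25.7,  # the cold-pack process
}

for name, dH in reactions.items():
    kind = "exothermic (releases heat)" if dH < 0 else "endothermic (absorbs heat)"
    print(f"{name}: ΔH = {dH:+.1f} kJ/mol -> {kind}")
```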
Enthalpy Is a State Function
One of enthalpy’s most important properties is that it’s a state function. This means its value depends only on the current conditions of a system (temperature, pressure, composition), not on how the system got there. If you start with reactants at 25°C and end with products at 25°C, the enthalpy change is the same regardless of whether the reaction happened in one step or ten.
This path independence is what makes Hess’s Law work. Hess’s Law says that the total enthalpy change for a reaction equals the sum of enthalpy changes for any series of steps that lead to the same overall reaction. If you can’t measure a reaction’s ΔH directly, you can piece it together from reactions you can measure. Two rules make this practical: reversing a reaction flips the sign of ΔH, and multiplying a reaction by a coefficient multiplies its ΔH by the same number.
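Here’s a minimal sketch of that bookkeeping, using a classic example: the formation of carbon monoxide from graphite, which is hard to measure directly because some CO always burns on to CO₂. Combining two combustion reactions that can be measured (the ΔH values are standard textbook figures) gives the answer:

```python
# Hess's Law: piece together C(s) + 1/2 O2 -> CO from two
# measurable combustions. Reversing a step flips the sign of ΔH;
# scaling a step scales ΔH by the same factor.

dH_C_to_CO2 = -393.5    # C(s) + O2 -> CO2,    kJ/mol
dH_CO_to_CO2 = -283.0   # CO + 1/2 O2 -> CO2,  kJ/mol

# Target = (step 1 as written) + (step 2 reversed)
dH_target = dH_C_to_CO2 + (-1) * dH_CO_to_CO2

print(f"ΔH for C(s) + 1/2 O2 -> CO: {dH_target:.1f} kJ/mol")  # -110.5
```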
Standard Enthalpy of Formation
To keep enthalpy values comparable across labs and textbooks, chemists report them under standard conditions: 25°C (298.15 K) and 1 bar of pressure, as defined by IUPAC. The standard enthalpy of formation (written ΔH°f) is the enthalpy change when one mole of a compound forms from its elements in their most stable forms under these conditions.
Pure elements in their standard state (oxygen gas, solid carbon as graphite, metallic iron) are assigned a formation enthalpy of zero. Everything else is measured relative to that baseline. For example, the formation enthalpy of liquid water is about −286 kJ/mol, and carbon dioxide gas comes in at roughly −394 kJ/mol. These negative values tell you that forming water or CO₂ from their elements releases a significant amount of heat.
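Formation enthalpies make whole-reaction calculations straightforward: the standard enthalpy change of any reaction is the sum of the products’ formation enthalpies minus the sum of the reactants’, each weighted by its stoichiometric coefficient. Here’s a minimal sketch for methane combustion, using standard tabulated values (kJ/mol, rounded):

```python
# ΔH°(reaction) = Σ ΔH°f(products) - Σ ΔH°f(reactants),
# weighted by stoichiometric coefficients.
# Example: CH4(g) + 2 O2(g) -> CO2(g) + 2 H2O(l)

dHf = {               # standard formation enthalpies, kJ/mol
    "CH4(g)": -74.8,
    "O2(g)": 0.0,     # element in its standard state
    "CO2(g)": -393.5,
    "H2O(l)": -285.8,
}

products = {"CO2(g)": 1, "H2O(l)": 2}
reactants = {"CH4(g)": 1, "O2(g)": 2}

dH_rxn = (sum(n * dHf[s] for s, n in products.items())
          - sum(n * dHf[s] for s, n in reactants.items()))

print(f"ΔH° for methane combustion: {dH_rxn:.1f} kJ/mol")  # about -890
```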
How Enthalpy Is Measured
In practice, enthalpy changes are measured with calorimeters. The simplest version, often called a coffee cup calorimeter, is exactly what it sounds like: nested Styrofoam cups with a thermometer. You run a reaction inside the cups at atmospheric (constant) pressure, measure the temperature change of the solution, and calculate heat flow using q = mcΔT, where m is the mass of the solution, c is its specific heat capacity, and ΔT is the temperature change. Since the process happens at constant pressure, the heat absorbed by the solution gives you the reaction’s ΔH directly, with the sign flipped: heat gained by the solution is heat lost by the reaction.
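The arithmetic is simple enough to sketch. The numbers below are made-up illustrative values: suppose neutralizing 0.05 mol of a strong acid warms 100 g of solution by 6.8°C, and take the solution’s specific heat to be that of water, 4.18 J/(g·K):

```python
# Coffee cup calorimetry: q = mcΔT, then scale to a per-mole ΔH.
# All input numbers here are hypothetical, for illustration only.

m = 100.0   # mass of solution, g
c = 4.18    # specific heat, J/(g·K), approximated as water's
dT = 6.8    # temperature rise, K
n = 0.05    # moles of limiting reactant

q_solution = m * c * dT      # heat absorbed by the solution, J
dH = -q_solution / n / 1000  # kJ/mol; sign flipped because heat
                             # gained by the solution was lost by
                             # the reaction

print(f"q absorbed by solution: {q_solution:.0f} J")
print(f"ΔH of reaction: {dH:.1f} kJ/mol")  # about -57 kJ/mol
```

That result lands near the well-known value of about −57 kJ/mol for a strong acid neutralizing a strong base, which is what the hypothetical inputs were chosen to mimic.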
Estimating ΔH From Bond Energies
When precise formation data isn’t available, you can estimate a reaction’s enthalpy change using average bond enthalpies. The idea is straightforward: breaking a chemical bond always requires energy, and forming a bond always releases energy. To estimate ΔH for a reaction, you add up the energy needed to break all the bonds in the reactants and subtract the energy released when new bonds form in the products.
ΔH ≈ (sum of bond energies of bonds broken) − (sum of bond energies of bonds formed)
This method gives a rough estimate rather than a precise value, because bond energies are averages that vary slightly depending on the molecule they’re in. A carbon-hydrogen bond in methane isn’t identical to one in ethanol. Still, bond energy calculations are useful for quick predictions when you need a ballpark answer rather than a calorimetry-grade measurement.
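As a quick sketch of the arithmetic, here’s the estimate for methane combustion using common average bond enthalpies (C–H 413, O=O 495, C=O 799, O–H 467 kJ/mol; tabulated averages vary slightly from source to source):

```python
# Bond-energy estimate: ΔH ≈ Σ(bonds broken) - Σ(bonds formed).
# Average bond enthalpies in kJ/mol; tables differ slightly by source.
# Example: CH4 + 2 O2 -> CO2 + 2 H2O (gas phase)

bond = {"C-H": 413, "O=O": 495, "C=O": 799, "O-H": 467}

broken = 4 * bond["C-H"] + 2 * bond["O=O"]  # reactant bonds
formed = 2 * bond["C=O"] + 4 * bond["O-H"]  # product bonds

dH_estimate = broken - formed
print(f"Estimated ΔH: {dH_estimate} kJ/mol")  # about -824
```

For comparison, the calorimetric value for this reaction (with gaseous water) is about −802 kJ/mol, so the estimate lands in the right ballpark without being exact, which is just what you’d expect from averaged bond energies.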