Every chemical change is accompanied by a transfer of energy. When atoms and molecules rearrange to form new substances, the breaking of existing bonds and the formation of new bonds either requires or releases energy. This energy transfer, typically in the form of heat, is a fundamental property of the reaction itself. Quantifying this thermal exchange is essential for understanding and controlling chemical processes. This specific measure of heat exchange during a reaction is known as the heat of reaction.
Defining the Heat of Reaction: Enthalpy Change
The heat of reaction is formally defined as the change in enthalpy (Delta H) that occurs when reactants are converted into products under constant pressure. Enthalpy is a thermodynamic property representing the total heat content of a system. Since most chemical reactions occur at constant atmospheric pressure, the heat gained or lost by the system is equivalent to the change in enthalpy.
Chemists use enthalpy rather than simply “heat” because it accounts for the internal energy of the system plus the energy required to make room for the reaction products against the external pressure. Delta H is a state function, depending only on the initial and final states of the reactants and products, not the path taken between them. The change is calculated by subtracting the total enthalpy of the reactants from the total enthalpy of the products (Delta H = H products – H reactants).
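The subtraction above can be sketched in a few lines of Python. The enthalpy values here are purely illustrative (absolute enthalpies are not measurable in practice; only the difference matters):

```python
# Hypothetical total enthalpies in kJ, chosen only to illustrate the arithmetic.
h_reactants = 120.0  # total enthalpy of the reactants
h_products = 30.0    # total enthalpy of the products

delta_h = h_products - h_reactants  # Delta H = H(products) - H(reactants)
print(delta_h)  # -90.0: products are lower in enthalpy, so heat is released
```

Because Delta H is a state function, this difference is the same no matter what intermediate steps the reaction passes through.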
For standardized comparisons, the heat of reaction is reported as the Standard Enthalpy of Reaction (Delta H standard). This value is determined under specific standard conditions: a pressure of 1 bar and, by convention, a temperature of 298.15 Kelvin (25 degrees Celsius). The standard units are kilojoules per mole (kJ/mol). This molar quantity ensures the heat of reaction is consistent regardless of the total amount of material used, relating the energy change directly to the reaction’s stoichiometry.
Classifying Reactions: Exothermic and Endothermic Processes
The sign of the enthalpy change (Delta H) classifies a reaction, indicating the direction of heat flow between the system and its surroundings. When the products have a lower enthalpy than the reactants, the difference in energy is released into the surroundings, resulting in an exothermic process. This heat release causes the surrounding temperature to increase, and the reaction is assigned a negative Delta H value.
A common example of an exothermic process is the combustion of fuels, such as burning wood or natural gas, which releases significant heat and light. Biological processes like cellular respiration are also exothermic, continuously releasing energy to power the body’s functions. In these reactions, the energy released in forming the stronger, more stable bonds of the products exceeds the energy required to break the bonds of the reactants.
Conversely, an endothermic process occurs when the products possess a higher enthalpy than the reactants. The system must absorb heat energy from the surroundings to drive the reaction forward. This heat absorption causes the surrounding temperature to decrease, characterizing an endothermic reaction by a positive Delta H value.
A practical example of an endothermic process is the dissolution of ammonium nitrate in water, used in instant cold packs. The system absorbs heat from the surrounding environment, making the pack feel cold to the touch. Photosynthesis is another well-known endothermic process, where plants absorb light energy to convert carbon dioxide and water into glucose. In endothermic reactions, the energy needed to break the reactant bonds is greater than the energy gained by forming the product bonds.
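The sign convention described above can be captured in a short Python helper. The example Delta H values are common textbook figures (roughly -890 kJ/mol for methane combustion and about +26 kJ/mol for dissolving ammonium nitrate):

```python
def classify_reaction(delta_h_kj: float) -> str:
    """Classify a reaction by the sign of its enthalpy change (kJ/mol)."""
    if delta_h_kj < 0:
        return "exothermic"   # heat released to the surroundings
    if delta_h_kj > 0:
        return "endothermic"  # heat absorbed from the surroundings
    return "thermoneutral"    # no net heat exchange

print(classify_reaction(-890.4))  # exothermic (e.g., burning methane)
print(classify_reaction(25.7))    # endothermic (e.g., dissolving ammonium nitrate)
```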
Methods for Determining Reaction Heat
The heat of reaction is defined by the difference between the initial and final energy states, but its numerical value is determined through experimental measurements or theoretical calculations. The most direct experimental method for finding Delta H is calorimetry, the science of measuring heat flow. In a laboratory, this often involves using a coffee-cup calorimeter, designed to minimize heat loss to the outside environment.
Calorimetry measures the temperature change (Delta T) of the reaction mixture and the surrounding medium, such as water. The amount of heat (q) absorbed or released is calculated using the equation q = mc(Delta T), where m is the mass, c is the specific heat capacity, and Delta T is the temperature change. Since the reaction occurs at constant atmospheric pressure, the measured heat flow (q) equals the change in enthalpy (Delta H).
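As a concrete sketch of this calculation, suppose a coffee-cup experiment warms 100.0 g of water by 3.5 K; all numbers here are illustrative, including the assumed 0.010 mol of reactant consumed:

```python
# Coffee-cup calorimetry sketch with made-up but plausible numbers.
m = 100.0       # mass of water, g
c = 4.184       # specific heat capacity of water, J/(g*K)
delta_t = 3.5   # observed temperature rise, K

q_water = m * c * delta_t   # heat absorbed by the water, J
q_reaction = -q_water       # heat released by the reaction (opposite sign)

n_mol = 0.010  # assumed moles of limiting reactant consumed
delta_h_kj_per_mol = q_reaction / n_mol / 1000.0
print(round(q_reaction, 1))          # about -1464.4 J
print(round(delta_h_kj_per_mol, 1))  # about -146.4 kJ/mol
```

The sign flip from q_water to q_reaction reflects that heat gained by the water is heat lost by the reacting system; dividing by the moles reacted converts the measurement into the molar Delta H.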
For reactions difficult to measure directly, theoretical methods are employed. One powerful tool is Hess’s Law of Heat Summation, which states that the total enthalpy change for a reaction is the same regardless of the number of steps or the pathway taken. This allows chemists to calculate an unknown Delta H by algebraically summing the known enthalpy changes of simpler reactions that add up to the overall target reaction.
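A classic illustration is the formation of carbon monoxide, C(s) + 1/2 O2 -> CO(g), which is hard to measure directly because some CO2 always forms alongside it. A short Python sketch using standard textbook combustion enthalpies:

```python
# Known steps (standard enthalpies of combustion, kJ/mol):
dH_C_to_CO2 = -393.5   # C(s) + O2 -> CO2(g)
dH_CO_to_CO2 = -283.0  # CO(g) + 1/2 O2 -> CO2(g)

# Reversing a step flips the sign of its Delta H. Reverse the second step
# (CO2 -> CO + 1/2 O2) and add it to the first to obtain the target reaction:
dH_target = dH_C_to_CO2 + (-dH_CO_to_CO2)
print(dH_target)  # -110.5 kJ/mol for C(s) + 1/2 O2 -> CO(g)
```

The algebra mirrors Hess's Law directly: because enthalpy is a state function, summing the steps' Delta H values gives the overall Delta H regardless of the pathway.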
Another theoretical approach uses bond energies, the energy required to break a particular chemical bond. The heat of reaction is estimated by summing the energy required to break all the bonds in the reactants and subtracting the total energy released when the bonds in the products are formed. Because tabulated bond energies are averages over many gas-phase molecules, this method yields a good approximation of Delta H rather than an exact value, but it is rooted in the fundamental energy changes associated with the rearrangement of atoms.
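For example, applying this bookkeeping to H2 + Cl2 -> 2 HCl with typical average bond energies (values vary slightly between tables):

```python
# Average bond energies in kJ/mol, typical textbook values.
bonds_broken = {"H-H": 436, "Cl-Cl": 242}   # one of each in the reactants
hcl_bond_energy = 431                        # two H-Cl bonds form in the products

energy_in = sum(bonds_broken.values())  # 678 kJ absorbed breaking bonds
energy_out = 2 * hcl_bond_energy        # 862 kJ released forming 2 H-Cl bonds

delta_h_estimate = energy_in - energy_out
print(delta_h_estimate)  # -184 kJ/mol, close to the measured value near -185
```

The estimate lands close to experiment here, but because the inputs are averaged bond energies, larger deviations are expected for molecules whose bonds differ markedly from the average.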