What Is an Ampere-Hour and What Does It Measure?

The Ampere-Hour (Ah) is a fundamental unit used to measure the capacity of a battery, indicating the total amount of electrical charge it can store. This metric helps consumers and engineers quantify how much current a battery can supply, and for how long, making it a primary consideration when selecting power sources for electronics and electric vehicles. Understanding the Ah rating is the first step toward accurately predicting a device’s runtime and comparing the storage capability of different energy systems.

Defining Ampere-Hour (Ah)

The Ampere-Hour unit specifically measures the electric charge capacity of a battery, essentially indicating the quantity of electrons available for use. This unit is calculated by multiplying the current flow (in Amperes) by the time the current is maintained (in hours). An Ampere, or Amp, is a measure of the rate of electrical current flowing through a circuit at a single moment in time. The distinction between Amps and Ampere-Hours is similar to the difference between the rate of water flowing from a tank and the total volume of water the tank holds. A battery rated at 1 Ampere-Hour (1 Ah) can theoretically deliver a continuous current of 1 Amp for exactly one hour. For example, a 1 Ah battery could also supply 2 Amps of current for 30 minutes, or a smaller current of 0.5 Amps for two hours. For smaller batteries, such as those found in cell phones, the capacity is often listed in milliampere-hours (mAh), where one thousand mAh equals one Ah.
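The relationship between capacity, current, and time described above can be sketched with a couple of small helper functions. These are illustrative names, not part of any library, and they assume an ideal battery with no losses:

```python
def hours_at_current(capacity_ah, current_a):
    """Ideal discharge time in hours for a battery of the given Ah
    capacity at a constant current draw, ignoring real-world losses."""
    return capacity_ah / current_a

def mah_to_ah(mah):
    """Convert milliampere-hours to ampere-hours (1,000 mAh = 1 Ah)."""
    return mah / 1000.0

# The 1 Ah examples from the text:
print(hours_at_current(1.0, 1.0))  # 1 A for 1.0 hour
print(hours_at_current(1.0, 2.0))  # 2 A for 0.5 hours (30 minutes)
print(hours_at_current(1.0, 0.5))  # 0.5 A for 2.0 hours
print(mah_to_ah(3000))             # a 3000 mAh phone battery is 3.0 Ah
```

The same arithmetic works in reverse: multiplying a measured current by the hours it was sustained gives the charge consumed in Ah.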

Connecting Ah to Total Stored Energy

While the Ampere-Hour rating is a measure of charge capacity, it does not fully describe the total energy stored in a battery because it omits the influence of voltage. Voltage represents the electrical potential difference, which is the “pressure” driving the current. To determine the true energy available, the voltage must be included in the calculation, which leads to the Watt-Hour (Wh) unit. Watt-Hours are the standard measure for total electrical energy storage, and the conversion is straightforward: Watt-Hours (Wh) equals Ampere-Hours (Ah) multiplied by the battery’s nominal Voltage (V). This formula makes Watt-Hours a much more comprehensive metric for energy comparison. Two batteries with the exact same Ah rating can store vastly different amounts of energy if their voltages differ. Consider two batteries, both rated at 10 Ah: a 12-Volt battery stores 120 Wh, while a 48-Volt battery stores 480 Wh, four times the energy despite having the identical Ah rating. The Wh rating is the definitive metric for evaluating a battery’s ability to perform work.
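The Wh conversion is a single multiplication, shown here as a minimal sketch using the two 10 Ah batteries from the example (the function name is illustrative):

```python
def watt_hours(capacity_ah, nominal_voltage_v):
    """Total stored energy: Wh = Ah x nominal voltage (V)."""
    return capacity_ah * nominal_voltage_v

# Two batteries with identical Ah ratings but different voltages:
print(watt_hours(10, 12))  # 12 V battery: 120 Wh
print(watt_hours(10, 48))  # 48 V battery: 480 Wh, four times the energy
```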

Calculating Battery Run Time

The most practical application of the Wh rating is estimating how long a battery will power a specific device. The ideal run time in hours is calculated by dividing the battery’s total energy capacity in Watt-Hours by the device’s constant power consumption in Watts. For instance, a 100 Wh battery powering a 20 Watt light would theoretically run for five hours.
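The ideal run-time formula can be written out the same way; this sketch assumes a constant load and no conversion losses:

```python
def ideal_runtime_hours(energy_wh, load_w):
    """Theoretical run time: total energy (Wh) / constant load (W)."""
    return energy_wh / load_w

# The example from the text: a 100 Wh battery powering a 20 W light
print(ideal_runtime_hours(100, 20))  # 5.0 hours
```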

However, this calculation provides only a theoretical run time, as real-world factors inevitably reduce the usable capacity. The Depth of Discharge (DoD) is a significant factor, representing the percentage of the battery’s total capacity that has been used. Repeatedly discharging a battery to a high DoD can significantly shorten its lifespan, so most battery systems are managed to operate at a lower DoD for longevity.
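A battery management system that caps DoD effectively shrinks the usable energy. The following sketch derates a pack's capacity by a chosen DoD limit; the 80% figure is a hypothetical example, not a universal recommendation:

```python
def usable_energy_wh(energy_wh, max_dod_fraction):
    """Energy actually available when the system limits discharge
    to max_dod_fraction of total capacity (0.0 to 1.0)."""
    return energy_wh * max_dod_fraction

# Hypothetical: a 100 Wh pack managed to an 80% DoD limit for longevity
print(usable_energy_wh(100, 0.80))  # 80.0 Wh usable per cycle
```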

Another consideration is the C-rate, which describes the rate of charge or discharge relative to the battery’s maximum capacity. Drawing a very high current (high C-rate) can cause internal resistance to increase, generating heat and reducing the usable Ah capacity of the battery.
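The C-rate relation can be sketched the same way: a discharge at 1C would drain the rated capacity in one hour, 2C in half an hour, and so on (function name is illustrative):

```python
def c_rate(current_a, capacity_ah):
    """Discharge (or charge) rate relative to rated capacity.
    1C drains the rated Ah in one hour; 2C in half an hour."""
    return current_a / capacity_ah

# A 10 Ah battery discharged at 20 A is running at 2C
print(c_rate(20, 10))  # 2.0, i.e. a 2C discharge
```

In practice, high C-rates deliver less than the rated Ah because of the heating and internal-resistance effects described above, so the rated capacity is usually specified at a particular (often low) discharge rate.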