Pressure is a fundamental physical quantity defined as the force exerted perpendicularly on a surface per unit area. The way pressure is measured often depends on the specific baseline chosen for the calculation. Relative pressure is a highly practical form of measurement that captures not the total force exerted, but the degree to which a system’s pressure deviates from the pressure of its surroundings. This method provides a focused reading of the pressure available to perform work.
Defining Relative Pressure (Gauge Pressure)
Relative pressure is formally defined as the difference between a system’s internal absolute pressure and the current atmospheric pressure of the environment outside the system. This specific measurement is nearly always referred to in technical fields as gauge pressure. The term “gauge” is used because most mechanical and digital pressure-measuring instruments are designed to display this differential reading. Since these instruments are open to the atmosphere, they automatically zero out the effect of the ambient air pressure, providing a reading that is relative to the surroundings. This difference-based measurement is useful when the force available to operate mechanical components is the primary concern.
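The definition above can be sketched as a simple calculation; the function name, units, and default value here are illustrative, not part of any standard library:

```python
def gauge_pressure(absolute_psi: float, atmospheric_psi: float = 14.7) -> float:
    """Gauge (relative) pressure: the system's absolute pressure
    minus the local atmospheric pressure.

    A gauge open to the atmosphere reads zero when the absolute
    pressure inside equals the atmospheric pressure outside.
    """
    return absolute_psi - atmospheric_psi

# A vessel at exactly atmospheric pressure reads zero on a gauge.
print(gauge_pressure(14.7))  # 0.0
```

Because the instrument itself subtracts the ambient pressure mechanically, this subtraction is implicit in any reading taken from a standard gauge.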
The Role of Atmospheric Pressure as a Reference Point
Relative pressure uses local atmospheric pressure as the zero reference point. Atmospheric pressure is the force exerted by the weight of the air column above a given point on Earth’s surface. When a relative pressure gauge is open to the atmosphere, it reads zero because the pressure on both sides of its sensing element is equal. This zero-setting simplifies measurements by isolating the pressure generated by a specific system from the constantly present pressure of the air. Although atmospheric pressure changes with altitude and weather, this fluctuating external pressure is effectively ignored, allowing the measurement to directly indicate the net force a fluid or gas can exert against the prevailing conditions.
Absolute Pressure vs. Relative Pressure
Relative pressure measurements contrast with absolute pressure measurements because they use different reference points. Absolute pressure is the total pressure measured relative to a perfect vacuum, a theoretical state of zero pressure, meaning it always includes the force of the atmosphere acting on the system. The relationship is straightforward: absolute pressure equals the sum of the relative pressure and the atmospheric pressure at that moment and location. For example, a car tire inflated to 32 pounds per square inch (psi) relative pressure will display 32 psi on a standard gauge. Since atmospheric pressure at sea level is approximately 14.7 psi, the total absolute pressure inside the tire is about 46.7 psi, but the relative measurement is more relevant as it determines the tire’s stiffness and load-bearing capacity.
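The tire example works out as follows; this minimal sketch assumes the approximate sea-level value of 14.7 psi quoted above:

```python
SEA_LEVEL_ATM_PSI = 14.7  # approximate atmospheric pressure at sea level

def absolute_pressure(gauge_psi: float,
                      atmospheric_psi: float = SEA_LEVEL_ATM_PSI) -> float:
    """Absolute pressure = relative (gauge) pressure + atmospheric pressure."""
    return gauge_psi + atmospheric_psi

tire_gauge = 32.0  # what a standard tire gauge displays, in psi
print(absolute_pressure(tire_gauge))  # approximately 46.7 psi
```

At a different altitude or under different weather, only the atmospheric term changes; the gauge reading that governs the tire’s stiffness stays at 32 psi.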
Common Uses and Measurement
Relative pressure is the most common form of pressure measurement used across commercial and industrial applications. It is used in systems where the performance of the equipment depends on the pressure difference relative to the surrounding environment. For instance, the pressure in pneumatic tools, hydraulic lines, and compressed air systems is monitored using relative pressure gauges. Instruments designed to measure this type of pressure, such as Bourdon tube gauges and many electronic pressure sensors, are constructed to be open to the atmosphere on one side of their sensing element. Measurements in refrigeration circuits, monitoring fluid flow in pipes, and ensuring correct pressure in air conditioning units all rely on this direct, differential reading.