Pressure is the force applied per unit of surface area. Pressing with a fingertip spreads a force over a relatively large area, producing low pressure, while a needle concentrates the same force onto a tiny point, producing high pressure. Understanding pressure is fundamental across diverse fields, from weather analysis to biological systems.
Common Pressure Units
Pressure is quantified using various units, each with specific applications. The Pascal (Pa), the SI standard, is defined as one Newton of force over one square meter (N/m²). Because a single Pascal represents a very small pressure, kilopascals (kPa), equal to 1,000 Pascals, are often used in practice.
The imperial system uses Pounds per Square Inch (PSI), representing one pound-force applied to a single square inch, most familiar from vehicle tire pressure. The Bar, another metric unit, equals 100,000 Pascals (100 kPa) and roughly matches atmospheric pressure at sea level. Millibars (mbar), one-thousandth of a bar, are frequently used in meteorology, where average sea-level pressure is about 1,013 mbar.
The Atmosphere (atm) is based on average atmospheric pressure at sea level, defined as exactly 101,325 Pascals, and serves as a reference standard. Historically, the Torr and Millimeters of Mercury (mmHg) emerged from early barometer designs. The Torr is 1/760th of a standard atmosphere and nearly identical to the mmHg; both remain common in vacuum technology, and blood pressure is still reported in mmHg.
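These relationships reduce to fixed multiplication factors. The short Python sketch below converts between the units above by routing every value through the Pascal; the factors are the definitions given here, and the PSI factor is the usual approximate value.

```python
# Convert between common pressure units via the Pascal. The factors follow
# the definitions in the text (1 bar = 100,000 Pa, 1 atm = 101,325 Pa,
# 1 Torr = 1/760 atm); the PSI factor is approximate.
PA_PER_UNIT = {
    "Pa":   1.0,
    "kPa":  1_000.0,
    "bar":  100_000.0,
    "mbar": 100.0,
    "atm":  101_325.0,
    "Torr": 101_325.0 / 760.0,  # ~133.322 Pa, essentially equal to mmHg
    "psi":  6_894.76,           # approximate
}

def convert(value, from_unit, to_unit):
    """Convert a pressure reading by routing through Pascals."""
    return value * PA_PER_UNIT[from_unit] / PA_PER_UNIT[to_unit]

print(convert(1, "atm", "kPa"))      # 101.325
print(convert(32, "psi", "bar"))     # ~2.21 (typical tire pressure)
print(convert(1013, "mbar", "atm"))  # ~1.00 (sea-level weather reading)
```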
Fundamental Pressure References
Pressure measurements gain meaning when referenced against a specific baseline. Different reference points define distinct pressure types, providing varied insights.
Absolute pressure is measured relative to a perfect vacuum, representing zero pressure. This reflects the total pressure within a system, including any atmospheric pressure. For example, pressure inside a sealed container, unaffected by external atmosphere, is measured in absolute terms.
Gauge pressure measures pressure relative to the local atmospheric pressure. A tire pressure gauge, for example, reads zero when a tire is flat, indicating that the internal pressure equals the surrounding atmosphere. Positive readings signify pressure above ambient atmosphere, while negative readings indicate a partial vacuum, a pressure below atmospheric.
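The two references are linked by a single addition: absolute pressure equals the gauge reading plus the local atmospheric pressure. A minimal sketch, assuming a standard sea-level atmosphere of 101.325 kPa (in reality the local value varies with weather and altitude):

```python
# Relate gauge and absolute pressure, assuming standard sea-level
# atmospheric pressure; the local value varies with weather and altitude.
ATMOSPHERIC_KPA = 101.325

def absolute_from_gauge(gauge_kpa, atmospheric_kpa=ATMOSPHERIC_KPA):
    """Absolute pressure = gauge reading + local atmospheric pressure."""
    return gauge_kpa + atmospheric_kpa

# A tire gauge reading of 220 kPa (about 32 PSI) corresponds to roughly
# 321 kPa absolute; a flat tire (gauge reads 0) still sits at ~101 kPa absolute.
print(absolute_from_gauge(220))  # 321.325
print(absolute_from_gauge(0))    # 101.325
```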
Differential pressure quantifies the difference between two distinct points within a system. This measurement focuses on the pressure gradient or drop across a component. For instance, differential pressure gauges monitor the pressure drop across a pipeline filter, indicating whether it is clean or clogged. Differential pressure is also used to infer fluid flow rates, via the drop across an orifice or venturi, and to manage aircraft cabin pressurization.
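As a concrete version of the filter example, the sketch below compares an upstream and a downstream reading and flags a clog once the drop exceeds a threshold; the 50 kPa limit is an illustrative value, not a standard.

```python
# Monitor the pressure drop across a filter; the threshold is illustrative.
CLOG_THRESHOLD_KPA = 50.0

def filter_status(upstream_kpa, downstream_kpa):
    """Return the pressure drop across the filter and a simple health flag."""
    drop = upstream_kpa - downstream_kpa
    return drop, ("check filter" if drop > CLOG_THRESHOLD_KPA else "ok")

print(filter_status(400.0, 385.0))  # (15.0, 'ok')
print(filter_status(400.0, 330.0))  # (70.0, 'check filter')
```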
Tools of Measurement
Instruments for measuring pressure vary in design and operating principle. Barometers measure atmospheric pressure, chiefly for weather forecasting. Mercury barometers balance a column of mercury against atmospheric pressure, with the column's height indicating the pressure. Aneroid barometers use a sealed, flexible metal capsule that expands or contracts with pressure changes, transferring the movement to a dial.
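The mercury column's height maps to pressure through the hydrostatic relation P = ρgh. A small sketch, using the conventional density of mercury and standard gravity:

```python
# Pressure exerted by a mercury column: P = rho * g * h, with the
# conventional density of mercury (13,595.1 kg/m^3) and standard gravity.
RHO_MERCURY = 13_595.1  # kg/m^3
G_STANDARD = 9.80665    # m/s^2

def column_to_pascals(height_mm, rho=RHO_MERCURY, g=G_STANDARD):
    """Pressure (Pa) exerted by a liquid column of the given height."""
    return rho * g * (height_mm / 1000.0)

print(column_to_pascals(760))  # ~101,325 Pa, one standard atmosphere
print(column_to_pascals(735))  # ~97,990 Pa, a typical low-pressure weather reading
```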
Manometers are classic instruments that measure pressure differences using a liquid column. A simple U-tube manometer, partially filled with a fluid such as water or mercury, indicates pressure, or a pressure difference, by the difference in liquid heights between its two arms. Manometers are often used for precise low- to medium-pressure measurements in laboratory and industrial settings.
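The same hydrostatic relation gives the U-tube's reading, except the quantity of interest is the height difference between the two arms; a low-density fill liquid such as water spreads small pressure differences over easily readable heights. A sketch under those assumptions:

```python
# Differential pressure indicated by a U-tube manometer: rho * g * delta_h,
# assuming a water fill at roughly room-temperature density.
RHO_WATER = 998.0     # kg/m^3
G_STANDARD = 9.80665  # m/s^2

def manometer_dp(delta_h_mm, rho=RHO_WATER, g=G_STANDARD):
    """Differential pressure (Pa) from the height difference between the arms."""
    return rho * g * (delta_h_mm / 1000.0)

print(manometer_dp(250))  # ~2,447 Pa (about 2.4 kPa) for a 250 mm water column
```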
Mechanical gauges, such as the Bourdon tube gauge, are recognized for durability and direct readability. The Bourdon tube, a C-shaped flattened tube, straightens when pressurized. This deformation is mechanically amplified through linkages and gears, moving a pointer across a calibrated dial to display pressure.
Electronic transducers, or pressure sensors, represent the modern approach to pressure measurement. These devices convert pressure into an electrical signal, typically using a flexible diaphragm that deforms under pressure. Strain gauges or capacitive elements detect this deformation and produce an electrical output. The signal is then transmitted to control systems, enabling automated monitoring and control in applications ranging from automotive systems to medical devices.
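How the electrical signal becomes a pressure number depends on the device's transfer function. The sketch below assumes a hypothetical linear, ratiometric sensor that outputs 0.5 V at zero pressure and 4.5 V at a 1,000 kPa full scale; real devices specify their own characteristics in the datasheet.

```python
# Map a hypothetical transducer's linear voltage output onto its pressure span.
V_MIN, V_MAX = 0.5, 4.5      # volts at zero and full-scale pressure (assumed)
P_MIN, P_MAX = 0.0, 1_000.0  # kPa span of this hypothetical sensor

def voltage_to_kpa(volts):
    """Convert the sensor's output voltage to pressure by linear scaling."""
    fraction = (volts - V_MIN) / (V_MAX - V_MIN)
    return P_MIN + fraction * (P_MAX - P_MIN)

print(voltage_to_kpa(0.5))  # 0.0 kPa
print(voltage_to_kpa(2.5))  # 500.0 kPa (mid-scale)
```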