How Is Suction Measured? Units and Instruments

Suction is a phenomenon commonly encountered in everyday life, from drinking through a straw to the operation of a vacuum cleaner. It describes the movement of liquids or gases driven by a pressure difference: when a region of lower pressure is created, the surrounding higher-pressure fluid pushes into that reduced-pressure zone, following the pressure gradient.

The Concept of Suction and Pressure

Suction relies on pressure differentials, the difference in pressure between two areas. Earth’s atmosphere exerts atmospheric pressure, approximately 14.7 pounds per square inch (psi) or 101.3 kilopascals (kPa) at sea level. Suction is achieved by reducing gas pressure within a confined space relative to this ambient atmospheric pressure. For instance, pulling a syringe plunger back lowers internal air pressure, allowing higher atmospheric pressure to push liquid into it.

A “perfect vacuum” is a theoretical space devoid of matter with zero pressure, unattainable on Earth. Suction instead creates a partial vacuum, where gas pressure is significantly less than atmospheric pressure. The greater the difference between the lower pressure inside the system and the higher external pressure, the stronger the suction force.
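The relationship described above can be put in concrete terms: the net force available from suction is the pressure difference multiplied by the area it acts on. The following is a minimal sketch of that calculation; the function name and the suction-cup scenario are illustrative, not from any particular device.

```python
import math

ATM_PA = 101_325  # standard atmospheric pressure at sea level, in pascals

def suction_force(internal_pa: float, diameter_m: float,
                  external_pa: float = ATM_PA) -> float:
    """Net force (N) pushing toward the low-pressure side: F = delta_P * A."""
    area = math.pi * (diameter_m / 2) ** 2  # circular opening
    return (external_pa - internal_pa) * area

# A suction cup 5 cm across holding a 50 kPa partial vacuum:
force = suction_force(internal_pa=50_000, diameter_m=0.05)
print(f"{force:.1f} N")  # roughly 100 N of holding force
```

Note how a deeper partial vacuum (a lower internal pressure) directly increases the force, matching the point that a greater pressure difference means stronger suction.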

Units for Quantifying Suction

Various units quantify suction, each representing a measure of pressure difference. Millimeters of mercury (mmHg) originated from early barometric experiments, measuring atmospheric pressure by the height of a mercury column. Inches of mercury (inHg) is another unit based on the same principle, frequently used in North America for rough vacuum measurements.

The kilopascal (kPa) is based on the pascal, the standard International System of Units (SI) measurement for pressure; one pascal equals one newton per square meter, and a kilopascal is 1,000 pascals. It is widely used in scientific and technical fields. Pounds per square inch (psi) measures force per unit area and is commonly used in the United States for applications like tire pressure.

Torr, named after Evangelista Torricelli, is defined as 1/760 of a standard atmosphere, which makes it nearly identical to one millimeter of mercury. For extremely low pressures, the micron (one-thousandth of a Torr) is often used. Millibar (mbar) is also a common unit, especially in Europe, with one millibar equaling 100 pascals.
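Because all of these units measure the same physical quantity, they can be interconverted through a common base unit. The sketch below routes every conversion through the pascal; the conversion factors are standard values, while the function and table names are illustrative.

```python
# Pascals per one unit of each pressure unit discussed above.
PA_PER_UNIT = {
    "kPa": 1_000.0,
    "psi": 6_894.757,
    "mmHg": 133.322,
    "inHg": 3_386.39,
    "torr": 101_325 / 760,            # defined as 1/760 of a standard atmosphere
    "micron": 101_325 / 760 / 1_000,  # one-thousandth of a torr
    "mbar": 100.0,
}

def convert(value: float, from_unit: str, to_unit: str) -> float:
    """Convert a pressure reading by passing through pascals."""
    return value * PA_PER_UNIT[from_unit] / PA_PER_UNIT[to_unit]

print(round(convert(14.7, "psi", "kPa"), 1))   # ~101.4, near 1 atmosphere
print(round(convert(760, "mmHg", "torr"), 2))  # ~760.0; the units nearly coincide
```

The second conversion illustrates why Torr and mmHg are used almost interchangeably in practice.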

Measuring Instruments and Their Principles

Measuring suction involves instruments designed to detect and quantify pressure differences. Manometers are among the oldest and simplest devices. A U-tube manometer, for example, consists of a U-shaped tube partially filled with liquid. When one end is exposed to the pressure being measured and the other to a reference, the difference in liquid levels indicates the pressure difference or suction.
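The manometer's principle reduces to a simple formula: the pressure difference equals the fluid's density times gravitational acceleration times the height difference between the two columns. A minimal sketch, using nominal densities for water and mercury:

```python
G = 9.80665  # standard gravity, m/s^2

DENSITY = {"water": 1_000.0, "mercury": 13_595.1}  # nominal values, kg/m^3

def manometer_delta_p(height_m: float, fluid: str = "mercury") -> float:
    """Pressure difference (Pa) indicated by a column-height difference:
    delta_P = rho * g * h."""
    return DENSITY[fluid] * G * height_m

# A 10 mm difference in mercury levels:
dp = manometer_delta_p(0.010)
print(f"{dp:.0f} Pa")  # ~1333 Pa, i.e. about 10 mmHg of suction
```

The density term also explains why mercury is the traditional fill fluid: it is dense enough that large pressure differences produce manageable column heights.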

Mechanical vacuum gauges, such as Bourdon tube and diaphragm gauges, operate on the principle of mechanical deformation. A Bourdon tube gauge uses a curved tube that straightens or bends in response to pressure changes, translating this movement to a needle on a dial. Diaphragm gauges utilize a flexible diaphragm that deforms under pressure, converting this deformation into a reading. These mechanical gauges provide direct readings of pressure by sensing the force exerted by gas molecules.

Advanced digital vacuum meters and gauges employ various sensors, including thermal conductivity and capacitance sensors. Thermal conductivity gauges, like Pirani gauges, measure pressure by assessing how well a gas conducts heat away from a heated element. Capacitance manometers measure pressure by detecting changes in electrical capacitance caused by the deformation of a diaphragm. These instruments provide highly accurate readings, particularly in deeper vacuum ranges.
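Because a thermal-conductivity sensor's response is nonlinear across decades of pressure, digital gauges typically map the raw sensor signal to a pressure reading through a stored calibration curve. The sketch below shows one plausible approach, log-linear interpolation between calibration points; the voltage/pressure pairs are invented for illustration and do not represent any real gauge's data.

```python
import bisect
import math

# (sensor voltage, pressure in torr) -- hypothetical calibration points,
# spaced logarithmically in pressure because the response is nonlinear.
CAL = [(0.5, 1e-3), (1.0, 1e-2), (1.8, 1e-1), (2.9, 1.0), (4.0, 10.0)]

def voltage_to_pressure(v: float) -> float:
    """Interpolate between calibration points, linear in log-pressure."""
    volts = [point[0] for point in CAL]
    i = min(max(bisect.bisect_left(volts, v), 1), len(CAL) - 1)
    (v0, p0), (v1, p1) = CAL[i - 1], CAL[i]
    frac = (v - v0) / (v1 - v0)
    return 10 ** (math.log10(p0) + frac * (math.log10(p1) - math.log10(p0)))

print(f"{voltage_to_pressure(1.0):.3f} torr")  # 0.010 at a calibration point
```

Real instruments also compensate for gas type and ambient temperature, since different gases conduct heat away from the element at different rates.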