What Units Do Spectrophotometers Measure In?

A spectrophotometer is an instrument designed to measure the interaction between light and matter. This device works by passing a beam of light through a sample and quantifying how much light is absorbed or transmitted. This measurement technique is foundational in modern science, providing researchers with a precise method for analyzing the composition of a sample. By revealing how a compound interacts with specific parts of the electromagnetic spectrum, the spectrophotometer makes it possible to identify unknown substances and determine the amounts of known chemicals present.

How Spectrophotometers Interact with Light

The first step in spectrophotometric measurement involves selecting a specific portion of the electromagnetic spectrum. Light entering the instrument is narrowed down by a component called a monochromator, which isolates a very narrow band of light measured in nanometers (nm). This selected wavelength is typically chosen because the substance of interest is known to absorb light most effectively at that point.

Once the specific wavelength is selected, the instrument measures the initial intensity of the light beam, referred to as \(I_0\). This measurement represents the total amount of light energy striking the sample, which is held in a transparent container called a cuvette. As the light passes through the solution, molecules within the sample absorb a portion of the light energy.

The remaining light that successfully exits the sample is then measured by a detector, recording its intensity, designated as \(I\). By comparing the initial intensity (\(I_0\)) to the transmitted intensity (\(I\)), the spectrophotometer calculates the extent of the light-matter interaction. This fundamental comparison of light intensities forms the basis for all the derived units reported by the instrument.

The Primary Units: Transmittance and Absorbance

The core output of a spectrophotometer is delivered in two interconnected, dimensionless units: Transmittance and Absorbance. Transmittance, symbolized by \(T\), is the ratio of the light intensity that passes through the sample (\(I\)) to the initial light intensity (\(I_0\)). This ratio, \(T = I/I_0\), expresses the fraction of incident light that is transmitted through the sample.

Transmittance is frequently expressed as percent transmittance (\(\%T\)), which is the ratio multiplied by 100. A sample that is perfectly transparent to the selected wavelength will have a \(\%T\) of 100. By contrast, a \(\%T\) of 10 signifies that only ten percent of the incident light passed through the sample.
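
As a minimal sketch of this calculation, the snippet below converts a pair of hypothetical detector readings into Transmittance and percent transmittance; the intensity values and the function name are illustrative only, not part of any instrument's software.

```python
# Convert hypothetical detector readings into Transmittance and %T.
# The intensity values below are illustrative placeholders in arbitrary
# detector units.

def transmittance(i_transmitted: float, i_incident: float) -> float:
    """Return T = I / I0, the fraction of incident light transmitted."""
    return i_transmitted / i_incident

i0 = 100.0  # incident intensity, I0
i = 10.0    # transmitted intensity, I

T = transmittance(i, i0)
print(f"T  = {T:.2f}")          # 0.10
print(f"%T = {T * 100:.0f}%")   # 10%
```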

Absorbance (\(A\)) is the unit favored for most quantitative analytical work. Absorbance is mathematically defined as the negative logarithm (base 10) of the Transmittance, \(A = -\log_{10}(T) = \log_{10}(I_0/I)\). This logarithmic relationship means the Absorbance scale is non-linear with respect to the measured light intensities.
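
For example, the \(\%T\) of 10 mentioned above corresponds to an Absorbance of exactly 1:

\[
A = -\log_{10}(T) = -\log_{10}(0.10) = \log_{10}\!\left(\frac{I_0}{I}\right) = \log_{10}(10) = 1.0
\]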

Both Transmittance and Absorbance are unitless quantities because they are derived from a ratio of two light intensities whose units cancel. Absorbance is preferred because it establishes a linear relationship with the concentration of the absorbing substance, unlike Transmittance, which decreases exponentially as concentration increases. This linearity simplifies the process of determining unknown concentrations.
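
A quick numerical illustration of this difference: each time the transmitted fraction is cut in half, the Absorbance grows by the same fixed step, and it is this additive behavior that tracks concentration linearly.

\[
T = 0.50 \Rightarrow A \approx 0.301, \qquad
T = 0.25 \Rightarrow A \approx 0.602, \qquad
T = 0.125 \Rightarrow A \approx 0.903
\]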

Converting Measurements to Concentration

The unitless Absorbance value is the essential link used to determine a sample’s concentration, which is the practical unit sought by most users. This conversion relies on a fundamental principle of spectroscopy known as the Beer-Lambert Law. The law states that the Absorbance (\(A\)) of a solution is directly proportional to both the concentration of the absorbing species (\(c\)) and the distance the light travels through the solution (\(b\)).

This relationship is summarized by the equation \(A = \epsilon b c\). The proportionality constant that completes the equation is \(\epsilon\), which is known as the molar absorptivity. This value is a unique physical property of the absorbing molecule at a specific wavelength and temperature, representing how strongly a substance absorbs light.
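
As a hypothetical worked example (the molar absorptivity and concentration below are illustrative placeholders, not values for any particular compound), a solution measured in a 1 cm cuvette would give:

\[
A = \epsilon b c = \left(5{,}000\ \mathrm{L\,mol^{-1}\,cm^{-1}}\right)\left(1\ \mathrm{cm}\right)\left(1.0\times10^{-4}\ \mathrm{mol\,L^{-1}}\right) = 0.50
\]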

The path length, \(b\), refers to the distance the light travels through the sample inside the cuvette, which is typically a standardized one centimeter (cm). Since Absorbance (\(A\)) is unitless and \(b\) is measured in centimeters, the molar absorptivity (\(\epsilon\)) must carry the necessary units to ensure the final concentration (\(c\)) is in a standard chemical unit. For concentration expressed in Molarity (moles per liter, or mol/L), \(\epsilon\) is expressed in units of \(L \cdot mol^{-1} \cdot cm^{-1}\).
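
Writing out the units shows how they cancel to leave Absorbance dimensionless:

\[
\left(\mathrm{L\,mol^{-1}\,cm^{-1}}\right)\times\left(\mathrm{cm}\right)\times\left(\mathrm{mol\,L^{-1}}\right) = 1
\]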

By measuring the Absorbance (\(A\)) of a sample and knowing the molar absorptivity (\(\epsilon\)) and the path length (\(b\)), the equation can be rearranged to solve for the concentration (\(c\)). The final result is then reported in standard chemical units, such as Molarity, or in mass-per-volume units like milligrams per milliliter (mg/mL) or parts per million (ppm).
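
A minimal sketch of that rearrangement is shown below; the absorbance, molar absorptivity, and molar mass values are hypothetical placeholders chosen only to make the arithmetic concrete.

```python
# Solve the Beer-Lambert Law, A = epsilon * b * c, for concentration.
# All numeric values are hypothetical placeholders, not data for any
# specific compound.

def concentration_molar(absorbance: float, epsilon: float, path_cm: float = 1.0) -> float:
    """Return c = A / (epsilon * b) in mol/L."""
    return absorbance / (epsilon * path_cm)

A = 0.50             # measured absorbance (unitless)
epsilon = 5_000.0    # molar absorptivity, L·mol^-1·cm^-1 (hypothetical)
b = 1.0              # cuvette path length, cm

c = concentration_molar(A, epsilon, b)   # 1.0e-4 mol/L

# Optional conversion to a mass-per-volume unit: mol/L * g/mol = g/L,
# and 1 g/L is numerically equal to 1 mg/mL.
molar_mass = 180.0                       # g/mol (hypothetical)
c_mg_per_ml = c * molar_mass             # 0.018 mg/mL

print(f"c = {c:.2e} mol/L")
print(f"c = {c_mg_per_ml:.3f} mg/mL")
```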