Thermal imaging provides a way to perceive the world by converting invisible heat energy into visible images. Also known as infrared thermography, it allows us to “see” temperature variations that the human eye cannot detect. All objects with a temperature above absolute zero emit infrared radiation, making thermal imaging broadly useful across many fields. The technology developed over more than two centuries, from fundamental physical discoveries to sophisticated imaging devices.
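As a rough illustration of the physics behind this universality (a sketch with assumed values, not a description of any particular camera), the Stefan-Boltzmann law states that the total power an object radiates grows with the fourth power of its absolute temperature. The short Python example below uses illustrative figures for emissivity and surface area to show how a warm body stands out against cooler surroundings.

# Illustrative sketch: total thermal power radiated by a surface,
# via the Stefan-Boltzmann law P = emissivity * sigma * area * T^4.
# Emissivity and area values are assumed for illustration only.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_power(temp_kelvin: float, area_m2: float = 1.8, emissivity: float = 0.98) -> float:
    """Total power in watts radiated by a surface at the given absolute temperature."""
    return emissivity * SIGMA * area_m2 * temp_kelvin ** 4

# Skin at roughly 305 K radiates noticeably more than a 293 K wall behind it;
# that contrast is what a thermal imager turns into a visible picture.
print(radiated_power(305.0))  # about 865 W
print(radiated_power(293.0))  # about 737 W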
The Dawn of Infrared Detection
The scientific understanding of heat beyond visible light began in 1800 with British astronomer Sir William Herschel. Investigating how much heat was carried by the different colors of sunlight, Herschel passed sunlight through a glass prism, separating it into its spectrum of colors.
He then placed thermometers in each color of the spectrum and observed the temperature changes. Intriguingly, when he positioned a thermometer just beyond the red end of the visible spectrum, where no light was apparent, the temperature continued to rise, even exceeding the readings taken in visible red light. This revealed an invisible form of radiation, which Herschel termed “calorific rays,” now known as infrared radiation. It was the first demonstration of radiation imperceptible to the human eye, laying the foundation for sensing heat patterns.
Early Attempts at Visualizing Heat
Following Herschel’s discovery, scientists began developing instruments to detect and measure this invisible heat. An important step was Leopoldo Nobili’s invention of the thermopile in 1829, building on Thomas Johann Seebeck’s discovery of the thermoelectric effect. By 1835, Nobili and Macedonio Melloni had constructed a thermopile sensitive enough to detect the heat of a person at a distance of 10 meters. These devices exploited the principle that a temperature difference across junctions of dissimilar metals produces a measurable electrical voltage.
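As a minimal sketch of that principle (assuming an illustrative junction count and Seebeck coefficient, not the specifications of Nobili’s instrument), a thermopile’s output is simply the per-junction Seebeck voltage multiplied by the number of junctions wired in series:

# Sketch of a thermopile reading: N thermocouple junctions in series, each
# producing a Seebeck voltage proportional to the temperature difference
# between its hot face (toward the source) and its cold reference face.
# The junction count and coefficient are illustrative assumptions.

def thermopile_voltage(delta_t_kelvin: float,
                       junctions: int = 50,
                       seebeck_uV_per_K: float = 40.0) -> float:
    """Output voltage in microvolts for a given face-to-face temperature difference."""
    return junctions * seebeck_uV_per_K * delta_t_kelvin

# A warm body a few meters away might warm the hot face by only ~0.01 K,
# yet the series stack still yields a measurable signal.
print(thermopile_voltage(0.01))  # 20.0 microvolts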
A further advancement came with the bolometer, invented by American astronomer Samuel Pierpont Langley in 1878. This instrument was remarkably sensitive, capable of detecting temperature differences as minute as one hundred-thousandth of a degree Celsius. Langley’s bolometer worked by measuring the change in electrical resistance of a thin metal strip as it absorbed radiant heat. While these early devices did not produce images, they were foundational in sensing and quantifying heat radiation, paving the way for imaging.
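The bolometer principle can be sketched just as briefly (again with assumed, illustrative values rather than Langley’s actual figures): the strip’s electrical resistance rises roughly linearly with the small temperature change caused by absorbed radiation, and a sensitive bridge circuit registers that shift.

# Sketch of the bolometer principle: absorbed radiant heat warms a thin
# conductor, and the resulting change in electrical resistance reveals the
# temperature change. Baseline resistance and coefficient are assumed values.

def resistance_after_heating(delta_t_kelvin: float,
                             r0_ohms: float = 100.0,
                             alpha_per_k: float = 0.004) -> float:
    """Resistance of a metal strip after warming by delta_t_kelvin (linear model)."""
    return r0_ohms * (1.0 + alpha_per_k * delta_t_kelvin)

# Even a change of one hundred-thousandth of a degree shifts the resistance slightly.
print(resistance_after_heating(1e-5) - 100.0)  # about 4e-6 ohms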
The Invention of Thermal Imaging Devices
Creating visual images from detected heat marked a significant milestone. Early efforts emerged in the early 20th century: in 1929, Hungarian physicist Kálmán Tihanyi developed an infrared-sensitive electronic television camera for British anti-aircraft defense, one of the first devices able to visualize thermal signatures.
Practical thermal imaging camera development accelerated during and after World War II, driven by military requirements for night vision and target detection. In the 1940s, advances in detector technology, such as the development of lead sulfide (PbS) as a practical infrared detector, proved crucial. In the 1950s, American companies such as Texas Instruments, Hughes Aircraft, and Honeywell developed single-element detectors that scanned a scene line by line to build up a thermal image. The same decade saw the first practical infrared cameras capable of recording thermal images on film, though the process was slow, sometimes taking an hour per image. This represented a substantial leap, enabling the visualization of heat signatures rather than mere point measurements.
Initial Applications and Early Evolution
Thermal imaging technology first found practical applications predominantly in military contexts. These early infrared cameras provided night vision, allowing forces to operate in darkness by detecting heat emitted by people and vehicles. They were also employed in missile guidance systems, surveillance, and target acquisition on the battlefield.
Beyond military uses, applications began to appear in other sectors. For instance, early thermal devices were explored for industrial purposes, such as monitoring machinery for overheating. The Royal Navy also adopted early thermal imagers for shipboard firefighting in the 1970s. As the technology matured through the 1960s and 1970s, devices became more refined, with improved sensitivity and the emergence of scanning systems that could render entire scenes as thermal images.