How Is the Temperature of a Star Determined?

A star’s temperature is a fundamental property influencing its appearance, energy output, and life cycle. Understanding it allows astronomers to classify stars, predict their behavior, and unravel their evolutionary paths.

The Fundamental Principle: Blackbody Radiation

Determining a star’s temperature relies on the principle of blackbody radiation. A blackbody is an idealized object that absorbs all electromagnetic radiation and emits radiation based solely on its temperature. Stars behave as approximate blackbodies, making this concept applicable to stellar studies.

Every object with a temperature above absolute zero emits electromagnetic radiation. A blackbody’s temperature directly influences both the total intensity and the distribution of its emitted light. Hotter objects emit more radiation at all wavelengths. As temperature increases, the peak wavelength shifts towards shorter, higher-energy wavelengths (blue or ultraviolet). Cooler objects emit most radiation at longer, lower-energy wavelengths (red or infrared).
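Both behaviors follow from Planck's law for blackbody radiation. The short sketch below (a simplified illustration, not a stellar-atmosphere model) evaluates Planck's spectral radiance at two temperatures to show that the hotter blackbody outshines the cooler one at every wavelength:

```python
import math

# Physical constants (SI units)
H = 6.626e-34      # Planck constant, J s
C = 2.998e8        # speed of light, m/s
K_B = 1.381e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of a blackbody (Planck's law), W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K_B * temp_k)
    return a / math.expm1(b)

# A hotter blackbody emits more at every wavelength than a cooler one:
for wl in (450e-9, 550e-9, 700e-9):   # blue, green, red light
    cool = planck_radiance(wl, 3000)    # red-star temperature
    hot = planck_radiance(wl, 10000)    # hot-star temperature
    assert hot > cool
```

The temperatures 3,000 K and 10,000 K are chosen to match the cool red and hot blue-white stars discussed in this section.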

Method One: Analyzing a Star’s Color

One primary method for determining a star’s surface temperature is analyzing its observed color, which serves as a direct indicator of that temperature. This relationship is rooted in the principles of blackbody radiation: hotter stars appear blue or blue-white, while cooler stars appear red.

This connection is described by Wien’s Displacement Law: the wavelength at which a star emits the most light is inversely proportional to its temperature, λ_max = b / T, where b ≈ 2.898 × 10⁻³ m·K. For example, a star emitting most of its light in the blue is hotter than one peaking in the red. Blue stars typically exceed 25,000 Kelvin, while red stars can be as cool as 2,000-3,000 Kelvin. Our Sun, a yellow star, has a surface temperature of approximately 5,800 Kelvin.
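Wien's law makes the color-temperature conversion a one-line calculation. The sketch below applies it to the temperatures mentioned above:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temp_k):
    """Wavelength of maximum blackbody emission at temp_k, in nanometers."""
    return WIEN_B / temp_k * 1e9

# The Sun (~5,800 K) peaks near 500 nm, in the visible band;
# a 25,000 K blue star peaks well into the ultraviolet.
print(round(peak_wavelength_nm(5800)))   # 500
print(round(peak_wavelength_nm(25000)))  # 116
```

In practice astronomers run this relation in reverse: measuring where a star's spectrum peaks (or comparing its brightness through two filters) yields its surface temperature.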

Method Two: Interpreting Spectral Lines

Another method for determining a star’s temperature involves interpreting its spectral lines. When starlight is dispersed into a spectrum, it reveals a series of dark absorption lines. These lines are formed when specific elements in the star’s outer atmosphere absorb light at particular wavelengths. The presence and strength of these spectral lines are dependent on the star’s temperature.

Temperature dictates the ionization state and excitation level of atoms in the stellar atmosphere. Hydrogen absorption lines are strongest in stars with intermediate temperatures, around 10,000 Kelvin, because a significant number of hydrogen atoms can readily absorb photons. In contrast, very hot stars have highly ionized hydrogen, resulting in weak or absent hydrogen lines. Very cool stars also lack strong hydrogen lines: the visible (Balmer) lines require electrons already in the first excited level, and at low temperatures collisions rarely supply enough energy to put them there, so nearly all hydrogen atoms sit in the ground state. Different elements and their ions exhibit characteristic lines at specific temperature ranges, providing a detailed thermal fingerprint.
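The excitation side of this argument can be quantified with the Boltzmann factor. The sketch below (a deliberate simplification: real line strengths also depend on ionization, which the Saha equation describes) estimates the fraction of hydrogen atoms in the first excited state, the population that produces visible hydrogen lines:

```python
import math

K_B_EV = 8.617e-5   # Boltzmann constant, eV/K
DELTA_E = 10.2      # energy gap between hydrogen n=1 and n=2 levels, eV
G_RATIO = 4.0       # statistical-weight ratio g2/g1 for hydrogen

def excited_fraction(temp_k):
    """Boltzmann ratio n2/n1: hydrogen atoms in n=2 relative to the ground state."""
    return G_RATIO * math.exp(-DELTA_E / (K_B_EV * temp_k))

# Balmer absorption needs atoms already in n=2. A cool star has
# essentially none; a ~10,000 K star has orders of magnitude more.
print(f"{excited_fraction(3500):.1e}")    # cool red star: vanishingly small
print(f"{excited_fraction(10000):.1e}")   # hot A-type star: far larger
```

Even at 10,000 K only a tiny fraction of atoms are excited, but that fraction is roughly ten orders of magnitude larger than in a 3,500 K star, which is why hydrogen lines peak at intermediate temperatures.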

Stellar Classification and Temperature

Temperatures derived from color analysis and spectral line interpretation classify stars into a systematic sequence. The OBAFGKM sequence, the most widely used system, arranges stars by decreasing surface temperature. O-type stars are the hottest, followed by B, A, F, G, K, and M-type stars, which are the coolest.

Each spectral type corresponds to a specific temperature range and exhibits characteristic spectral lines. For example, O-type stars are bluish-white with temperatures from 25,000 K to over 30,000 K and show lines of ionized helium, while M-type stars are red with temperatures around 3,000 K and prominently display titanium oxide. This classification system provides a concise way to categorize stars based on their thermal properties.
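The mapping from temperature to spectral type is essentially a threshold lookup. The sketch below uses approximate type boundaries; exact cutoffs vary between references, so treat the numbers as illustrative:

```python
# Approximate temperature floors for each spectral type (Kelvin).
# Boundaries differ slightly between classification references.
SPECTRAL_TYPES = [
    ("O", 30000),
    ("B", 10000),
    ("A", 7500),
    ("F", 6000),
    ("G", 5200),
    ("K", 3700),
    ("M", 2400),
]

def classify(temp_k):
    """Map a surface temperature to its OBAFGKM spectral type."""
    for letter, floor in SPECTRAL_TYPES:
        if temp_k >= floor:
            return letter
    return "M"  # coolest catch-all

print(classify(5800))   # G -- the Sun
print(classify(35000))  # O
print(classify(3000))   # M
```

In practice the assignment runs the other way: astronomers identify which spectral lines are present (ionized helium, hydrogen, titanium oxide) and read the temperature off from the type.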

Surface vs. Internal Temperature

A star’s surface temperature differs from its internal temperature. Methods like color analysis and spectral lines primarily determine the temperature of a star’s photosphere, its visible surface. This surface temperature typically ranges from 2,000 Kelvin for the coolest stars to over 40,000 Kelvin for the hottest.

In contrast, a star’s core temperature is vastly higher and cannot be directly measured. The Sun’s core, for example, is estimated at 15 million degrees Celsius. These internal temperatures are inferred through stellar models and the nuclear fusion processes that power stars. Models use a star’s observed mass, radius, and luminosity to calculate its internal structure and conditions.
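One piece of that modeling toolkit is the Stefan-Boltzmann relation, which ties a star's observed luminosity and radius to its surface temperature via L = 4πR²σT⁴. The sketch below inverts it for the Sun:

```python
import math

SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W / (m^2 K^4)
L_SUN = 3.828e26     # solar luminosity, W
R_SUN = 6.957e8      # solar radius, m

def effective_temperature(luminosity_w, radius_m):
    """Surface temperature implied by L = 4 * pi * R^2 * sigma * T^4."""
    return (luminosity_w / (4 * math.pi * radius_m**2 * SIGMA)) ** 0.25

print(round(effective_temperature(L_SUN, R_SUN)))  # 5772
```

The result, about 5,772 K, matches the Sun's measured surface temperature, a consistency check that gives astronomers confidence in the same models when they extrapolate inward to the unobservable core.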