Life expectancy at birth in 1900 was roughly 32 years globally and about 46 to 48 years in the United States, depending on sex. Those numbers sound shockingly low, but they don’t mean most people dropped dead in their 30s or 40s. The real story is more nuanced, and it has everything to do with how many children never made it past their first few years of life.
Why the Number Seems So Low
Life expectancy “at birth” is an average that includes every death, from newborns to the elderly. In 1900, that average was dragged down dramatically by infant and child mortality. For every 1,000 babies born in the U.S., about 100 died before their first birthday. In some American cities, that figure was closer to 30%, meaning nearly one in three infants didn’t survive to age one.
When that many deaths happen in the first year of life, the statistical average plummets even though plenty of adults were living into their 60s and 70s. Think of it this way: if one child dies at age one and another person lives to 70, their average life expectancy is 35.5, even though neither person actually died anywhere near that age.
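The arithmetic in that example scales up to a whole cohort. A minimal sketch of the effect (the figure of 65 years for surviving adults is an illustrative assumption, not a historical statistic):

```python
# Life expectancy at birth is simply the mean age at death across a cohort.
ages_at_death = [1, 70]
print(sum(ages_at_death) / len(ages_at_death))  # 35.5

# Scaled up: a cohort of 1,000 births in which 100 die in infancy
# (the ~10% U.S. infant mortality of 1900) and the other 900 live
# to an assumed 65. The average sits well below the adult lifespan.
cohort = [1] * 100 + [65] * 900
print(sum(cohort) / len(cohort))  # 58.6
```

Even with every adult in the cohort reaching 65, a 10% infant death rate pulls the headline number down by more than six years.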
How Long Adults Actually Lived
Once you survived childhood, your odds improved considerably. Data from the Virginia Commonwealth University School of Medicine shows that a white male who reached age 10 in 1900 could expect to live about 51 more years, putting him into his early 60s. A white female who reached 10 could expect roughly 52 additional years. For nonwhite Americans, the numbers were lower but still far above the headline figure: about 42 more years for males and 43 for females who made it to age 10.
So “life expectancy of 47” didn’t mean a 47-year-old was on death’s door. It meant the world was extraordinarily dangerous for the very young, and surviving childhood was the single biggest hurdle.
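The conditional point can be made concrete with the same kind of toy cohort: restrict the average to people who survived childhood and it jumps sharply. (The age of 63 for surviving adults is an illustrative assumption, loosely echoing the 1900 pattern described above, not a measured figure.)

```python
# Illustrative cohort: 100 infant deaths per 1,000 births, with the
# survivors living to an assumed 63.
cohort = [1] * 100 + [63] * 900

at_birth = sum(cohort) / len(cohort)              # the "headline" average
survivors = [age for age in cohort if age >= 10]  # made it past childhood
past_childhood = sum(survivors) / len(survivors)  # conditional average

print(at_birth)        # 56.8
print(past_childhood)  # 63.0
```

The gap between the two numbers is exactly the gap between “life expectancy at birth” and what someone who reached age 10 could actually expect.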
Who Lived Longer and Who Didn’t
In the U.S., women outlived men by a small margin in 1900: 48 years at birth compared to 46. That gap has widened over time (today it’s about five years), but the pattern of female survival advantage was already in place.
Race created a far larger divide. Nonwhite Americans faced dramatically shorter lives, reflected in the conditional survival data: even after reaching age 10, nonwhite males could expect about nine fewer remaining years than white males. This gap was driven by unequal access to clean water, nutrition, housing, and medical care, along with the effects of segregation and poverty concentrated in the South and in urban areas.
Geography mattered too. Living in a city carried what researchers at the National Bureau of Economic Research call a substantial “mortality penalty.” Mortality was higher in larger cities than smaller ones, and higher in southern cities than northern ones. By 1900, about 40% of the U.S. population was urban, meaning a large share of Americans lived in the most dangerous environments. Overcrowding, contaminated water supplies, and poor sewage systems made cities breeding grounds for infectious disease.
What Was Killing People
The causes of death in 1900 look almost nothing like today’s list. The three leading killers were pneumonia, tuberculosis, and diarrheal diseases. Together with diphtheria, these four conditions caused one third of all deaths. Tuberculosis alone killed 194 out of every 100,000 Americans, mostly in cities.
These were infectious diseases, not the chronic conditions (heart disease, cancer, stroke) that dominate mortality today. Diarrheal illness was especially lethal for infants and young children, who dehydrated quickly from contaminated water and milk. Pneumonia killed across all age groups but was particularly devastating without antibiotics or supportive care.
The medical tools we take for granted simply didn’t exist. Penicillin wasn’t discovered until 1928 and wasn’t widely used until the 1940s. Sulfonamides, the first effective antibacterial drugs, arrived in 1935. Insulin wasn’t isolated until 1921 and wasn’t used in a human patient until 1922. In 1900, a bacterial infection that a modern pharmacy could resolve in a week was frequently a death sentence.
What Changed Everything
The leap from a U.S. life expectancy of 47 in 1900 to 79 today is one of the most dramatic shifts in human history, an increase of more than 30 years in just over a century. Most of that gain came not from high-tech medicine but from public health infrastructure.
Water treatment was a turning point. In 1908, Jersey City, New Jersey became the first U.S. city to routinely disinfect its drinking water. The results were swift: typhoid fever dropped from about 100 cases per 100,000 people in 1900 to roughly 34 per 100,000 by 1920. The CDC credits this decline to water disinfection, better source water quality, and improvements in sanitation and hygiene. Similar patterns played out with other waterborne illnesses.
Pasteurized milk, improved sewage systems, childhood vaccination programs, and better nutrition all stacked on top of clean water to slash infant mortality. As fewer children died, the average life expectancy climbed rapidly, even before antibiotics or modern surgery became available. The biggest gains in the first half of the 20th century came from keeping children alive, not from extending old age.
The Global Picture
While the U.S. averaged 46 to 48 years in 1900, the global average was far lower at around 32 years. Many regions in Africa, South Asia, and parts of Latin America had even higher rates of infectious disease, less access to clean water, and virtually no modern medical infrastructure. Famine and malaria pushed life expectancy in some areas below 25. The wealthiest nations, particularly in Western Europe and North America, were already pulling ahead, but even their numbers would look alarming by today’s standards.
Today, global life expectancy sits above 70, meaning the world has more than doubled the 1900 figure. That transformation happened unevenly, with wealthy nations gaining ground first and lower-income countries catching up through the mid and late 20th century as vaccination, water treatment, and basic medical care spread.