People in old photographs genuinely did look older at the same age, and it wasn’t just the black-and-white film or outdated hairstyles. A combination of widespread smoking, unprotected sun exposure, poor dental care, harder physical lives, and virtually no skin care meant that a 40-year-old in 1960 had endured far more biological wear than a 40-year-old today. Some of these factors aged people on the inside. Most of them showed up on the outside first.
Smoking Aged Nearly Half the Population
In 1965, 42.4% of American adults smoked. That wasn't a subculture or a habit confined to certain groups: nearly half the adult population was routinely inhaling a substance that directly accelerates skin aging. By 2022, that number had fallen to 11.6%, according to the American Lung Association.
Smoking damages skin in two ways. It constricts blood vessels near the surface, starving skin cells of oxygen and nutrients, which thins the skin and dulls its color. It also breaks down collagen and elastin, the proteins that keep skin firm and elastic. The result is deeper wrinkles, sagging around the mouth and eyes, and a grayish complexion. Researchers have consistently found that long-term smokers look significantly older than nonsmokers of the same age, sometimes by a decade or more. When nearly half of adults smoked, that premature aging was everywhere, shaping what “normal” aging looked like.
Decades of Unprotected Sun Exposure
Sunscreen existed in some form as early as the 1930s, when L’Oréal founder Eugène Schueller marketed a product called Ambre Solaire. But these early sunscreens were crude, offered limited protection, and were treated as beach accessories rather than daily health products. For most of the 20th century, a tan was considered a sign of health and vitality. After World War II, tanned skin became so fashionable that advertisers actively promoted sun exposure as aspirational.
UV radiation is the single largest external contributor to skin aging. It breaks down collagen fibers, causes uneven pigmentation (age spots), and produces the leathery texture that people associate with looking old. This process, called photoaging, accumulates over years. Someone who spent decades working outdoors or sunbathing without protection would develop visible skin damage far earlier than someone using modern broad-spectrum sunscreen daily. Public health campaigns about UV protection didn’t gain real traction until the late 1980s and 1990s, meaning most people born before the 1970s accumulated significant unprotected sun damage during their most formative years.
Tooth Loss Changed Facial Structure
Before community water fluoridation became widespread in the United States (starting in the mid-1940s and expanding through the 1960s and 1970s), tooth decay and tooth loss were far more common. Fluoridated water reduces cavities by about 25% in both children and adults, according to the CDC. Modern dentistry, including routine cleanings, fillings, crowns, and implants, has made it possible for most people to keep their natural teeth well into old age.
This matters for appearance more than people realize. When you lose teeth, the jawbone beneath them gradually shrinks because it no longer has tooth roots to stimulate it. The lower third of the face collapses inward, the chin moves closer to the nose, the lips thin out, and the cheeks sink. This gives the classic “old person” look that was once common in people as young as their 40s and 50s. Dentures slow this process but don’t stop it entirely. Today, with better preventive care and dental implants that preserve bone, far fewer people experience that kind of dramatic facial change.
Skin Care Barely Existed
The concept of a daily skin care routine aimed at slowing aging is remarkably recent. Tretinoin, the gold-standard topical treatment for skin aging, was first approved by the FDA in 1971 for acne. Its anti-aging properties were discovered almost by accident: older acne patients using it began reporting that their skin looked generally better. Researchers followed up, and in 1995, the FDA approved tretinoin specifically for treating fine wrinkles, uneven skin tone, and rough texture from sun damage.
Before that, most people’s skin care consisted of soap and maybe a basic moisturizer. There were no widely available products containing ingredients that actually change how skin ages at a cellular level. Today, over-the-counter retinol, vitamin C serums, chemical exfoliants, and daily sunscreen are routine for millions of people. The cumulative effect of these products, used consistently over years, is that skin simply holds up better than it used to. Previous generations didn’t have these tools, and their skin showed it.
Harder Lives Left Visible Marks
Physical labor was far more common in earlier decades. More people worked outdoors or in physically demanding jobs, exposing them to sun, wind, pollution, and repetitive stress on their bodies. Manual labor doesn’t just tire you out. It creates visible wear: rougher skin on the hands and face, deeper expression lines, a more weathered appearance overall.
Nutrition also played a role. Year-round access to fresh fruits and vegetables is a modern convenience made possible by refrigeration, global shipping, and improved agriculture. Earlier generations ate more seasonally and, in many cases, less diversely. Diets lower in antioxidants, healthy fats, and key vitamins (particularly vitamins C and E, which protect skin from oxidative damage) contribute to faster visible aging. Chronic nutritional gaps don’t just affect how you feel. They show up in your skin, hair, and overall appearance.
Style Choices Made It Worse
It’s worth separating biological aging from aesthetic choices, because both contribute to the impression that past generations looked older. Clothing, hairstyles, and grooming standards in previous decades tended to push people toward a more mature presentation at younger ages. A 30-year-old man in the 1950s wore a suit and a conservative haircut, often with a mustache. A 30-year-old woman wore structured clothing and styled her hair in ways now associated with much older women. These weren’t aging effects. They were fashion norms that made young people look more like what we now consider “middle-aged.”
Today’s fashion and grooming trends skew younger. Casual clothing, less structured hairstyles, teeth whitening, and hair coloring all contribute to a more youthful overall look. When you combine these aesthetic shifts with the genuine biological differences (less smoking, more sun protection, better nutrition, active skin care), the gap between how a 45-year-old looked in 1965 and how one looks today is dramatic.
The Gap Is Real, Not Just Perception
This isn’t purely nostalgia or photographic bias. The factors that age skin and bodies are well understood, and nearly all of them were more prevalent in earlier decades. A population where 42% of adults smoke, almost nobody wears sunscreen, tooth loss is common by middle age, and skin care products don’t exist will simply look older at every age than a population where those conditions are reversed. The people in those old photographs weren’t a different species. They were living under conditions that aged them faster, and it showed.