Skin cancer can develop at any age, including childhood, though it becomes far more common with each passing decade. Most cases are diagnosed after age 50, but the damage that causes them often starts decades earlier. Understanding how age interacts with skin cancer risk helps explain why protection matters at every stage of life.
Skin Cancer in Children and Teens
Skin cancer in children is rare, but it does happen. About 300 cases of melanoma are diagnosed each year in patients younger than 20 in the United States, making up roughly 0.3% of all new melanoma cases. For children under 10, the rate is extremely low: approximately 1 to 2 cases per million. The rate climbs in the teenage years. Melanoma accounts for about 3% of all cancers in teens aged 15 to 19, and girls in that age group are diagnosed nearly twice as often as boys.
Melanoma in young children often looks different from melanoma in adults. Kids under 10 are more likely to develop growths that lack the dark pigmentation people typically associate with melanoma; instead, the tumor may appear as a flesh-colored or reddish bump that sometimes bleeds, which can delay diagnosis. These cases also tend to involve thicker tumors and a higher rate of spread to lymph nodes, partly because the unusual appearance makes them harder to catch early.
Genetic Conditions That Cause Early Skin Cancer
Certain inherited conditions can trigger skin cancer unusually early. Gorlin syndrome, also called nevoid basal cell carcinoma syndrome, is one of the best known. People with this condition carry a mutation in a gene called PTCH1, which normally acts as a brake on cell growth. When that brake fails, basal cell carcinomas can begin appearing in adolescence or early adulthood, decades before they typically would in the general population.
Another rare condition, xeroderma pigmentosum, leaves the body unable to repair UV damage to DNA. Children with this disorder can develop skin cancers before age 10 if their skin isn’t rigorously protected from sunlight. These genetic conditions are uncommon, but they illustrate that age alone doesn’t determine risk. Biology plays a role too.
Why Young Adults Are Not Immune
Melanoma is one of the most common cancers in people under 40, particularly young women. While the overall incidence of skin cancer rises steeply after 50, younger adults are far from safe. Indoor tanning has been a significant contributor to melanoma diagnoses in this age group, and cumulative sun exposure through outdoor sports, beach trips, and everyday life adds up faster than most people realize.
The pattern is worth noting: skin cancer diagnosed in your 30s or 40s often traces back to UV exposure during childhood and adolescence. Many physicians believe melanoma has a latency period of 30 to 40 years, meaning sun exposure from decades earlier can drive the disease long after the sunburns have faded. A diagnosis at 45 may reflect damage done at 10.
How Childhood Sun Damage Shapes Adult Risk
The connection between early sunburns and later skin cancer is one of the strongest findings in cancer prevention research. Having five or more sunburns over your lifetime more than doubles your risk of developing melanoma. A history of painful, blistering sunburns in childhood is also significantly associated with squamous cell carcinoma later in life.
This happens because UV radiation damages DNA in skin cells. Young skin is especially vulnerable because it’s still developing, and the body’s repair mechanisms don’t always catch every error. Most of the time, damaged cells are destroyed or repaired. But occasionally, a mutation slips through and sits quietly in the skin for years or decades. Additional UV exposure, aging, or immune changes can eventually allow that damaged cell to begin growing out of control. This is why dermatologists emphasize sun protection for children so strongly. The sunscreen you put on a five-year-old is protecting a 45-year-old.
When Risk Peaks
The median age for a melanoma diagnosis is 66, and most basal cell and squamous cell carcinomas appear after 50. The risk curve rises steadily with age for a straightforward reason: the longer you live, the more UV exposure accumulates, and the more time mutations have to develop. An 80-year-old has had six decades more sun exposure than a 20-year-old, plus an aging immune system that’s less effective at eliminating abnormal cells.
That said, “most common after 50” doesn’t mean “only after 50.” Melanoma is regularly diagnosed in people in their 20s and 30s, and non-melanoma skin cancers increasingly appear in younger adults with heavy sun or tanning bed exposure. Age shifts the odds, but it never eliminates risk entirely at any point in life.
Current Screening Recommendations
There is no universally recommended age to begin routine skin cancer screening. The U.S. Preventive Services Task Force currently states that the evidence is insufficient to assess the benefits and harms of routine visual skin exams by a clinician for people without signs or symptoms of skin cancer. No major professional organization in the U.S. recommends population-wide screening at a specific age.
This doesn’t mean skin checks are useless. It means the data hasn’t yet shown that screening everyone on a fixed schedule prevents deaths at a population level. In practice, dermatologists still perform skin exams, and many recommend them for people with higher-risk profiles: a family history of melanoma, a large number of moles, a history of blistering sunburns, fair skin that burns easily, or a personal history of any skin cancer. If you have any of those risk factors, getting a baseline skin exam in your 20s or 30s is reasonable. And at any age, paying attention to new or changing spots on your skin remains the single most practical way to catch problems early.