Eating disorders arise from a convergence of genetic, biological, psychological, and social factors, not any single cause. No one chooses to develop an eating disorder, and no single factor is enough on its own. Twin studies estimate that genetics account for roughly 58% to 82% of the risk for disordered eating; in other words, biology loads the gun, while environment, life experiences, and brain chemistry pull the trigger.
Genetics Play a Larger Role Than Most People Expect
Research on identical and fraternal twins has consistently shown that eating disorders run in families for biological, not just behavioral, reasons. Studies from the Minnesota Center for Twin and Family Research found that anorexia nervosa has a heritability of roughly 58%, and broader forms of disordered eating show genetic influences ranging from 59% to 82%. The remaining risk comes from individual environmental experiences, meaning the unique things that happen to one person but not their sibling.
A large genome-wide study identified eight specific regions of DNA that differ between people with and without anorexia. What surprised researchers is that anorexia doesn’t look purely psychiatric at the genetic level. Some of the same genetic variations linked to lower risk of type 2 diabetes and altered insulin signaling are linked to higher risk of anorexia. This overlap suggests that anorexia involves metabolic differences in how the body processes energy, not just psychological traits around food and weight. The National Institute of Mental Health has described this as evidence of “metabo-psychiatric origins,” meaning the disorder sits at the intersection of metabolism and mental health.
How Brain Chemistry Gets Involved
Dopamine, the brain chemical tied to reward and motivation, plays a central role in how eating disorders develop and become entrenched. Researchers have proposed a two-stage model for anorexia that helps explain why the disorder is so hard to reverse once established.
In the first stage, restricting food (especially combined with exercise) triggers a surge in dopamine activity. This creates an escalating loop: the restriction feels rewarding, which reinforces more restriction. The brain essentially learns to treat weight loss behaviors the way it would treat any pleasurable activity. In the second stage, after prolonged starvation, dopamine drops or becomes impaired. This shift makes the brain rigid and inflexible, locking a person into patterns of restriction even when they want to change. Animal studies support this model. Mice that increase their running during food restriction show changes in dopamine receptors in the brain’s reward center, and manipulating those receptors can make animals more or less vulnerable to anorexia-like behavior.
This two-stage process helps explain something families often notice: early in the illness, the person may seem energized and driven, but over time they become increasingly stuck and unable to change course, even when they recognize the danger.
Hormones and Puberty Create a Window of Vulnerability
Most eating disorders first appear during adolescence, and that timing is not a coincidence. The hormonal shifts of puberty directly affect appetite, body composition, and emotional regulation in ways that can trigger disordered eating in someone who is already genetically vulnerable.
Estrogen has a direct suppressive effect on food intake. In animal studies, removing the ovaries increases meal size, and replacing estrogen reverses that effect. Eating also fluctuates with the menstrual cycle: food intake drops right after estrogen peaks. Progesterone complicates this picture by counteracting estrogen’s appetite-suppressing effects. When both hormones are elevated at the same time, binge eating tends to increase. Among women with a history of binge eating, low estrogen levels predict more dysregulated eating, and low progesterone amplifies this effect.
These hormonal dynamics mean that puberty doesn't just change how a young person looks. It changes how their brain and body regulate hunger and fullness, creating biological conditions where disordered eating patterns can take root.
Social Media and Cultural Pressure
Cultural ideals around thinness have long been recognized as a risk factor, but social media has intensified the exposure. A meta-analysis of studies involving 1,830 female participants aged 10 to 46 found a consistent link between social media use and internalization of thin-ideal body standards. The more someone used social networking platforms, the more likely they were to adopt thinness as a personal goal rather than just an abstract cultural message.
The type of engagement matters. Using appearance-focused features on social media, such as photo sharing, filters, and image-based browsing, had a meaningfully stronger association with thin-ideal internalization than general platform use. This suggests it’s not simply time spent online that poses the risk. It’s specifically the visual, comparison-driven content that reshapes how a person sees their own body.
Internalization is the key step. Many people are exposed to unrealistic body images without developing an eating disorder. The risk increases when someone moves from noticing the ideal to believing they should achieve it, and that shift is more likely in someone who already carries genetic vulnerability or is going through a stressful period.
Childhood Experiences and Trauma
Adverse childhood experiences, including abuse, neglect, household dysfunction, and bullying, are more common among people with eating disorders than in the general population. A systematic review found higher rates of these experiences across all eating disorder diagnoses, with binge-type disorders (binge eating disorder and the binge-purge form of bulimia) showing the strongest connection to childhood adversity. Restrictive anorexia had the lowest rates of reported trauma among eating disorder subtypes, though still higher than in the general population.
Weight-related bullying stands out as a specific and potent risk factor. People who have been teased or shamed about their weight by peers, family members, coaches, or even healthcare providers are more likely to develop eating problems. The source of the shaming matters less than the fact that it happened: a coach's offhand comment about a teenager's body can be as damaging as years of teasing from classmates.
Family Environment and Dieting Culture
Families influence eating disorder risk through both direct and indirect channels. Direct influences include comments about a child’s weight or body, restricting certain foods, or using food as a reward or punishment. Indirect influences are often more powerful: a parent who constantly diets, criticizes their own body, or treats certain foods as morally good or bad teaches their child a framework for relating to food that can become disordered.
Frequent dieting itself is a well-established risk factor, especially the pattern of cycling through restrictive diets followed by weight regain. Each cycle can reinforce the idea that normal eating has failed and that more extreme control is needed. For adolescents, watching a parent go through this cycle normalizes it. The Mayo Clinic specifically recommends that parents avoid dieting in front of their children and refrain from criticizing their own bodies, because children absorb these messages even when they aren’t directed at them.
Why It’s Never Just One Thing
The reason eating disorders are so difficult to predict and prevent is that no single factor is sufficient. A person can carry significant genetic risk and never develop an eating disorder if their environment is supportive and they avoid major stressors during vulnerable developmental windows. Conversely, someone with moderate genetic risk might develop a severe eating disorder after a combination of puberty, social pressure, a traumatic experience, and a well-intentioned diet that spirals out of control.
The lifetime prevalence of eating disorders in adolescents and young adults ranges from less than 1% to as high as 27% depending on how broadly the condition is defined and which population is studied. In U.S. adolescents, roughly 0.3% meet criteria for anorexia, 0.9% for bulimia, and 1.6% for binge eating disorder. These numbers likely undercount the real burden, since many people with eating disorders never receive a formal diagnosis. Understanding that the causes are layered, biological as much as psychological, and never a matter of personal choice helps explain both why these disorders develop and why recovery requires addressing the whole picture.