Why Was Baby Formula Invented? The Real History

Baby formula was invented because thousands of infants died every year when breastfeeding wasn’t possible and the alternatives were dangerous. Before the mid-1800s, a baby who couldn’t nurse from its mother faced grim options: cloths soaked in animal milk, direct suckling from goats, or hiring a wet nurse. Each carried serious risks, from bacterial contamination to disease transmission. The first commercial formula, created in 1865, was a chemist’s attempt to build a safer substitute using the tools of modern science.

How Babies Were Fed Before Formula

For most of human history, breastfeeding was the only reliable way to nourish an infant. When a mother couldn’t nurse, whether due to illness, death in childbirth, or insufficient milk supply, families turned to wet nurses. But wet nurses were expensive and carried their own risks. In 16th-century France, many mothers refused them entirely out of fear that syphilis could pass to their babies through nursing.

The other common option was animal milk, delivered in ways that seem alarming today. Cloths soaked in goat or cow milk could be squeezed into a baby’s mouth, but as Columbia University historian Richard Bulliet has noted, this method was “often a bacterial bomb.” In some cases, babies nursed directly from animals. French mothers let infants suckle from goats, and by the 18th century, orphanages kept animals on site specifically for feeding. Some of those orphans had syphilis themselves, which limited their access to human wet nurses even further.

These methods kept some babies alive, but mortality rates were staggering. Without refrigeration, sterilization, or any understanding of bacteria, animal milk spoiled quickly and carried pathogens that infant immune systems couldn’t handle.

The Dangerous State of Cow’s Milk

Even as cow’s milk became more widely available in growing cities, it was far from safe. In the late 1800s, urban milk was routinely skimmed, diluted with water, then doctored with dyes, caramel, and salt to disguise the tampering. Consumers had no way to tell what they were actually buying. If contaminated water had been added to the milk and a baby got sick, tracing the infection back to its source was nearly impossible.

The toll was enormous. Public health officials at the time reported that thousands of children perished annually from what amounted to starvation, because the milk they received had been so diluted it had little nutritional value. Breastfeeding was not the norm among American mothers at the turn of the 20th century. Instead, most fed their infants a gruel made from water and cow’s milk, weaning babies at very young ages. This meant contaminated milk was a direct, daily threat.

The first organized response came in 1893, when New York philanthropist Nathan Straus opened a milk station on the Lower East Side, providing subsidized pasteurized milk to low-income mothers. Within a decade, similar stations opened in Chicago and Rochester. City milk inspections followed and became the first widespread effort to reduce infant mortality in crowded tenements. After five years, those inspections were associated with a 12 percent drop in deaths from waterborne diseases. After a decade, the reduction reached 19 percent.

The Chemist Who Built the First Formula

The German chemist Justus von Liebig created the first commercial infant food in 1865. Liebig used his knowledge of organic chemistry to analyze human breast milk, then tried to replicate its nutritional profile using ingredients he could standardize and manufacture. His formula mixed one part wheat flour with ten parts powdered skim cow’s milk, with potassium bicarbonate added to reduce acidity.

His key innovation was adding malt, which is grain that has begun to sprout. Sprouting grain contains amylase, an enzyme that breaks starches down into simple sugars, making them far easier for a baby's immature digestive system to process. Liebig wasn't just guessing. He was applying a real understanding of digestion, even if the science was still rudimentary by modern standards.

The product wasn’t perfect, but it represented something genuinely new: a feeding option designed from chemistry rather than improvised from whatever animal milk was available. It launched an industry. Within a few decades, dozens of commercial infant foods flooded the market, though quality and safety varied wildly.

Why Cow’s Milk Alone Doesn’t Work

Part of the scientific challenge that Liebig and later researchers faced was that cow's milk and human milk are surprisingly different. Cow's milk contains about 3.3 grams of protein per 100 milliliters, more than double the 1.3 grams in mature human breast milk. More protein might sound like an advantage, but the excess strains a newborn's kidneys and digestive system.

The type of protein matters too. Human milk is rich in whey proteins, particularly one called alpha-lactalbumin, which makes up 28 percent of total breast milk protein but only 3 percent of cow’s milk protein. Cow’s milk is dominated by casein, a heavier protein that forms tough curds in the stomach and is harder for infants to break down. The ratio is essentially flipped: human milk is roughly 60 percent whey and 40 percent casein, while cow’s milk runs about 20 percent whey and 80 percent casein.

Cow’s milk also contains roughly half the levels of certain amino acids that are critical for brain development in newborns. Early formula makers didn’t understand these specifics, but they could see that straight cow’s milk made babies sick. Diluting it, adding sugar, and adjusting the fat content were all attempts to bridge a gap that scientists wouldn’t fully map for another century.

Industrialization Changed Everything

The demand for formula didn’t come only from mothers who couldn’t breastfeed for medical reasons. The Industrial Revolution reshaped family life in ways that made breastfeeding harder. As women entered factory work and moved into cities, they were separated from their infants for long stretches of the day. Extended family networks that might have provided wet nursing or childcare were disrupted by urbanization. A shelf-stable, commercially available infant food solved a practical problem that millions of families faced.

By the early 20th century, the medical profession had become deeply involved. Doctors began prescribing specific "formula recipes" tailored to individual babies, mixing evaporated milk with water and corn syrup in precise ratios. This medicalization of infant feeding gave commercial products an air of scientific authority, and hospital-based marketing by the infant food industry accelerated the shift. By 1960, fewer than 20 percent of infants in some countries received any breast milk at all.

How Formula Became Regulated

For decades, there was no legal standard for what baby formula had to contain. The first federal regulations in the United States didn’t go into effect until 1941. Even then, oversight was limited. In 1954, the American Academy of Pediatrics established its Committee on Nutrition, which began setting standards for infant nutritional requirements and evaluating commercial products. Its early reports tackled issues like advertising ethics and how much water babies need relative to the concentration of their food.

The modern regulatory framework came with the Infant Formula Act of 1980, which requires every formula sold in the U.S. to meet minimum levels for 29 specific nutrients, including protein, fat, 12 vitamins, and 9 minerals. The law sets both floors and ceilings for key nutrients. Protein, for example, must fall between 1.8 and 4.5 grams per 100 calories. Iron must be present but capped to prevent toxicity. The calcium-to-phosphorus ratio is legally required to stay between 1.1 and 2.0 to ensure proper bone development.
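The floor-and-ceiling structure of the 1980 Act can be sketched in a few lines of code. The protein range and calcium-to-phosphorus ratio below are the figures cited above; the checking function and the sample label values are hypothetical illustrations, not the FDA's actual compliance procedure.

```python
# Sketch of the floor-and-ceiling checks described above.
# The protein range (g per 100 kcal) and Ca:P ratio range come from
# the Infant Formula Act figures cited in the text; the function and
# sample label values are hypothetical.

PROTEIN_RANGE = (1.8, 4.5)     # grams of protein per 100 calories
CA_P_RATIO_RANGE = (1.1, 2.0)  # calcium-to-phosphorus ratio

def check_label(protein_g_per_100kcal, calcium_mg, phosphorus_mg):
    """Return a list of violations for a hypothetical formula label."""
    problems = []
    lo, hi = PROTEIN_RANGE
    if not lo <= protein_g_per_100kcal <= hi:
        problems.append(f"protein {protein_g_per_100kcal} g outside {lo}-{hi}")
    ratio = calcium_mg / phosphorus_mg
    lo, hi = CA_P_RATIO_RANGE
    if not lo <= ratio <= hi:
        problems.append(f"Ca:P ratio {ratio:.2f} outside {lo}-{hi}")
    return problems

# A label with 2.2 g protein per 100 kcal and a 1.4 Ca:P ratio passes:
print(check_label(2.2, calcium_mg=70, phosphorus_mg=50))  # → []
```

The point of the sketch is that the law constrains nutrients in both directions: too little protein fails, but so does too much, which is exactly the lesson drawn from a century of under- and over-concentrated milk mixtures.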

These regulations exist because the history of infant feeding is, in large part, a history of babies dying from products that were unsafe, nutritionally incomplete, or fraudulently made. Every major advancement in formula, from Liebig’s malt-based mixture to pasteurized milk stations to modern nutrient standards, came as a direct response to infant suffering and death. Formula was invented not as a convenience but as a lifeline, and the century-long effort to make it safe reflects how high the stakes have always been.