How Was Dementia Treated in the Past?

Dementia is not a single disease but a collection of symptoms that result from damage to the brain, affecting memory, thinking, and social abilities. Throughout history, these changes in cognitive function were understood and managed very differently from how they are today. The approach to cognitive decline has shifted dramatically over time, moving from philosophical interpretation to institutional warehousing and, eventually, to targeted biological and pharmacological interventions. This evolution reflects a changing understanding of the brain, as theories of spiritual or humoral imbalance gave way to the modern study of neuropathology. This article traces the progression of dementia treatment across these historical periods.

Ancient and Pre-Scientific Interpretations

For millennia, the decline in memory and reason associated with advanced age was largely considered an inevitable aspect of the human condition and was known simply as “senility.” Physicians in ancient Greece, such as Hippocrates, attributed these mental changes to imbalances in the body’s four humors: blood, phlegm, yellow bile, and black bile. Intellectual decline was often linked to an excess of cold, dark “black bile,” suggesting a physical cause for the mental state.

Treatments focused on restoring this internal balance through non-invasive methods like dietary changes, herbal remedies, and lifestyle adjustments. More aggressive interventions, such as bloodletting or purging, were also employed to remove excess humors believed to be poisoning the system.

During this long pre-scientific period, care for those with cognitive decline was primarily a family or community responsibility. The symptoms were not viewed as a distinct medical disease requiring specialized intervention. Societal responses ranged from tolerance and support to neglect, depending on the individual’s social standing and the severity of their symptoms.

The Era of Institutional Care and Moral Treatment

A significant shift occurred in the 19th century with the rise of large state institutions, or asylums, intended for the care of the mentally ill. Individuals displaying symptoms of senility, agitation, and confusion were increasingly moved into these specialized facilities. This movement was initially spurred by the philosophy of “moral treatment,” which advocated for humane conditions, occupational therapy, and a structured environment.

Moral treatment, however, proved largely ineffective for patients whose symptoms stemmed from organic, irreversible cognitive decline. As institutions became overcrowded, the humane principles of moral treatment gave way to simpler, more controlling measures.

Staff relied heavily on physical restraints to manage patients who wandered or became aggressive. Devices like straitjackets and leather straps became common tools for controlling behavior. The distinction between mental illness and senile dementia blurred, and management often defaulted to the most restrictive means available to ensure order.

Early 20th Century Physiological Interventions

The early 1900s marked the beginning of a search for specific biological causes, driven by neuropathological discoveries such as Alois Alzheimer’s 1906 description of the disease that now bears his name. This focus led to a range of aggressive physiological treatments aimed at controlling behavioral symptoms. One prominent, though scientifically unfounded, hypothesis was the “focal infection theory.”

This theory proposed that mental illness, including cognitive decline, was caused by toxins released from a localized infection, most commonly in the teeth, tonsils, or colon. Treatments included radical surgery to eliminate the supposed source of toxins, such as performing colectomies or extracting all of a patient’s teeth. These invasive procedures were applied in the hope of reversing symptoms that, in fact, stemmed from irreversible neurodegeneration.

Interventions originally developed for severe psychiatric conditions were also adapted to manage extreme behavioral symptoms. Treatments like electroconvulsive therapy (ECT) were employed to control severe agitation, aggression, or depression in patients with dementia. The prefrontal lobotomy was occasionally performed to reduce debilitating agitation and emotional distress, often resulting in profound personality changes.

Mid-to-Late 20th Century Management Strategies

In the second half of the 20th century, care shifted away from large state hospitals toward nursing homes and other community-based settings. This deinstitutionalization changed the primary tool for controlling challenging behaviors from mechanical restraint to chemical sedation. Heavy reliance on pharmaceutical agents to manage agitation became the standard of care.

Psychotropic medications, particularly antipsychotics and sedatives, were widely administered off-label to control wandering, aggression, and nighttime disturbances. This practice became known as the “chemical straitjacket,” subduing patients and making them easier to manage in understaffed facilities. The use of these drugs prioritized staff convenience and institutional order over the patient’s quality of life.

This period lacked effective disease-modifying treatments, leaving symptom management as the sole focus of care. The widespread use of sedating medications continued largely unchecked until the 1990s, when increased regulatory scrutiny and the approval of the first modern cognitive-enhancing drugs, the cholinesterase inhibitors, began to change the landscape of dementia care.