Mental health treatment in the 1950s was defined by large-scale institutionalization, experimental physical therapies, and a major turning point: the arrival of the first psychiatric medications. More than 500,000 people lived in state psychiatric hospitals at mid-decade, many confined for years or even decades. The treatments they received ranged from talk therapy influenced by Freudian ideas to insulin-induced comas, lobotomies, and electroconvulsive therapy. By the end of the decade, new drugs were beginning to reshape what was possible, but the system patients lived inside remained harsh and largely beyond their control.
Life Inside Psychiatric Hospitals
State-run psychiatric institutions were the centerpiece of mental health care in the 1950s. These facilities housed enormous populations, often in overcrowded and underfunded conditions. Patients diagnosed with schizophrenia, severe depression, or other serious conditions could be admitted and effectively forgotten, spending the rest of their lives behind hospital walls. The institutions functioned more as warehouses than treatment centers, with limited staff and few options beyond containment.
Getting someone committed was disturbingly easy by modern standards. States relied on broad legal criteria that gave families, doctors, and courts wide latitude. A person could be involuntarily hospitalized based on “dangerousness,” a perceived “need for care and treatment,” or even the vague standard of “welfare of self or others.” Once inside, patients had almost no legal protections. Hospital administrators generally operated on the assumption that committed patients were mentally incapable of giving consent, so treatment decisions were made for them. Even in states like California, where laws technically allowed patients to refuse procedures like shock therapy or lobotomy, those rights could be overridden “for good cause” by the person running the facility. In practice, patients were wards of the state, and the state could do what it saw fit.
How Doctors Classified Mental Illness
The first edition of the Diagnostic and Statistical Manual of Mental Disorders, published by the American Psychiatric Association in 1952, gave psychiatrists their first standardized system for classifying conditions. It framed mental disorders as “reactions,” reflecting the dominant view of the era: that psychiatric illness represented the personality’s reaction to psychological, social, and biological stressors. This language carried the heavy influence of Adolf Meyer, a psychiatrist who saw mental illness not as a fixed brain disease but as a response to life circumstances.
The practical effect was a diagnostic system that was broad, subjective, and inconsistent. Two doctors could evaluate the same patient and arrive at very different conclusions, because the manual offered no clear, measurable criteria and diagnosis often turned on the clinician's theoretical leanings. The manual also codified the era's prejudices: homosexuality, for example, was classified as a mental disorder. This looseness made it easier to institutionalize people who simply didn't fit social norms.
Lobotomy and Insulin Coma Therapy
Two of the most notorious treatments of the era were already in decline by the mid-1950s, but both were still in active use at the start of the decade.
Lobotomy involved severing connections in the brain's frontal lobes, typically through the eye socket using an instrument resembling an ice pick. The procedure was championed by neurologist Walter Freeman, who performed thousands of them as he toured state hospitals across the country in a van, often operating without gowns, gloves, or an operating room. The goal was to calm severely disturbed patients, but the results were devastating. Many patients were left emotionally flat, cognitively impaired, or unable to function independently. The procedure fell out of favor as psychiatric drugs emerged, but not before tens of thousands of Americans had undergone it.
Insulin shock therapy, introduced in the 1930s, was still a standard treatment for schizophrenia in the early 1950s. Patients were injected with escalating doses of insulin until they fell into a hypoglycemic coma, then revived with a concentrated sugar solution. A full course of treatment involved inducing 30 to 40 separate comas. The theory was that the repeated shock to the nervous system could somehow reset disordered thinking. The procedure carried serious risks, including prolonged coma and death, and it was eventually abandoned as evidence grew that it was no more effective than other treatments.
Electroconvulsive Therapy Gets Safer
Electroconvulsive therapy had been used since the late 1930s, but its early form was brutal. Patients were fully conscious when electrical current was applied to the brain, and the resulting seizures were so violent they could fracture bones. The 1950s brought a critical improvement: the widespread adoption of "modified ECT," which combined general anesthesia with a muscle relaxant. Physicians had experimented with curare, a plant-derived paralytic, in the 1940s, but switched in the early 1950s to the safer, shorter-acting synthetic succinylcholine. The anesthesia meant patients were unconscious during the procedure, and the muscle relaxant prevented the full-body convulsions that had caused injuries.
This modified version made ECT more acceptable to both patients and doctors, and it remained one of the most effective treatments available for severe depression throughout the decade. But the public image of ECT was already deeply negative, shaped by its earlier, unmodified form and by portrayals in media that emphasized its use as punishment or control rather than therapy.
The First Psychiatric Medications
The single biggest shift in 1950s mental health care was the introduction of chlorpromazine, sold in the United States as Thorazine. Approved by the FDA in 1954, it was the first drug that could genuinely reduce psychotic symptoms like hallucinations and delusions. Before chlorpromazine, a schizophrenia diagnosis often meant permanent hospitalization. The drug didn't cure the condition, but it made symptoms manageable enough that many patients could leave the hospital and live in their communities.
The impact on institutionalization was enormous. From a peak of more than 500,000 patients in state psychiatric hospitals in the mid-1950s, the population would eventually drop to around 60,000 by 2000, a decline that chlorpromazine helped set in motion. The drug also opened the door for an entire class of psychiatric medications. Within a few years, the first antidepressants and anti-anxiety drugs followed, fundamentally changing what psychiatry could offer.
The arrival of these drugs also had a darker side. They were sometimes used to sedate and control patients rather than to treat them, and the rush toward deinstitutionalization that followed left many former patients without adequate community support. But in the context of the 1950s, when the alternatives were lobotomy, insulin comas, or indefinite confinement, the drugs represented a genuine revolution.
The Push for National Reform
By the middle of the decade, there was growing recognition that the system was broken. Exposés by journalists and former patients had revealed the squalid conditions inside state hospitals, and pressure mounted on the federal government to act. In 1955, Congress passed the Mental Health Study Act, which called for "an objective, thorough, and nationwide analysis and reevaluation of the human and economic problems of mental illness." The law funded that nationwide study through the National Institute of Mental Health, and the work was carried out by the newly formed Joint Commission on Mental Illness and Health, which was charged with developing recommendations.
The commission’s work would eventually lead to the Community Mental Health Act of 1963, which aimed to replace large institutions with local treatment centers. But the 1955 law was significant on its own: it marked the first time the federal government formally acknowledged that mental health care in America needed to be fundamentally rethought, not just expanded.
Stigma and Social Attitudes
The social experience of mental illness in the 1950s was defined by shame and secrecy. Families often hid the fact that a relative had been hospitalized. A psychiatric diagnosis could cost someone a job, a marriage, or custody of their children. The legal system treated mentally ill people as fundamentally incompetent, stripping them of decision-making power the moment they were committed.
Popular culture reinforced fear. Movies and newspapers depicted people with mental illness as dangerous and unpredictable. The Cold War era’s emphasis on conformity made any deviation from “normal” behavior suspect. For women especially, behaviors that challenged social expectations, such as refusing domestic roles or expressing anger, could be pathologized and used as grounds for institutionalization. The 1950s were a decade of enormous change in psychiatric medicine, but the stigma surrounding mental illness would take far longer to erode than the treatments themselves.