Anesthesia was invented to solve one of medicine’s oldest and most brutal problems: surgery was performed on fully conscious patients. Before 1846, the best a patient could hope for was an herbal sedative or a dose of alcohol while a surgeon cut as fast as humanly possible. At the start of the 19th century, up to 80% of surgical patients died, often from infection or shock in the days after the operation. Eliminating pain during surgery wasn’t just a matter of comfort. Pain was a barrier that limited what surgeons could attempt, how carefully they could work, and whether patients would even agree to go under the knife.
What Surgery Looked Like Before Anesthesia
For most of human history, surgery meant agony. Patients were held down by assistants while surgeons worked at extraordinary speed, because every second of cutting meant another second of screaming. As far back as 400 BCE, Assyrians compressed the carotid arteries in the neck to briefly knock patients unconscious for procedures like circumcision or cataract surgery. Egyptians used the same technique for eye operations. These methods were crude, unreliable, and dangerous in their own right.
By the 15th century, the standard approach was a mixture of opium, mandragora, and henbane applied to a sponge and held under the patient’s nose. This “soporific sponge” could dull the senses but rarely eliminated pain entirely, and dosing was guesswork. Too little and the patient remained in agony. Too much and the sedative itself could be fatal. This basic method persisted for centuries, essentially unchanged, well into the 1800s. Surgeons were left with a narrow window of operations they could realistically perform: amputations, surface tumor removals, and other procedures that could be completed in minutes. Anything requiring slow, precise work inside the body was effectively off the table.
The Problem That Forced Innovation
Speed was the defining constraint. A surgeon’s reputation often rested not on precision but on how quickly the operation could be finished. This meant surgical techniques were limited to brute-force approaches. There was no time to carefully explore tissue, tie off blood vessels methodically, or operate on internal organs. The body’s involuntary responses to pain, including thrashing, muscle tension, and shock, made delicate work nearly impossible even with assistants restraining the patient.
Pain also kept people away from surgery altogether. Patients who might have survived a tumor removal or a corrective procedure chose to live with their condition rather than endure the ordeal. Surgeons themselves recognized that the inability to control pain was the single greatest limitation of their craft. The motivation behind anesthesia wasn’t academic curiosity. It was a desperate, practical need to make surgery survivable and humane.
The First Attempts and Failures
The road to modern anesthesia started with nitrous oxide, commonly known as laughing gas. In 1845, a Connecticut dentist named Horace Wells attempted a public demonstration at Harvard, hoping to prove that nitrous oxide could eliminate surgical pain. It went badly. Wells failed to administer a sufficient dose to fully anesthetize the patient, and the audience dismissed the entire concept as a hoax. Wells himself later recalled that observers called it “a humbug affair,” which was “all the thanks I got for this gratuitous service.” The failure set back public confidence in pain-free surgery and personally devastated Wells.
But the idea didn’t die. Wells’ former partner, a Boston dentist named William Morton, shifted his attention to ether, a compound with stronger and more reliable anesthetic properties than nitrous oxide. Morton spent months experimenting with ether on animals and on himself before he felt confident enough to attempt a public demonstration of his own.
October 16, 1846: The Day Surgery Changed
On October 16, 1846, Morton administered ether to a patient named Edward Gilbert Abbott at Massachusetts General Hospital. The operation was the removal of a tumor under Abbott’s jaw, later identified as a congenital lymphovascular malformation. Abbott remained unconscious throughout, and the surgery was completed without the screaming, restraint, and chaos that had defined every prior operation. The demonstration was witnessed by a room full of surgeons and medical faculty, and the impact was immediate. Word spread across the United States and then to Europe within weeks.
This single event, now known as “Ether Day,” is widely considered the birth of modern anesthesia. It proved that a patient could be rendered completely unconscious, safely, for the duration of an operation. For the first time, surgeons could work slowly, carefully, and on parts of the body that had previously been untouchable.
Early Resistance to Pain Relief
Not everyone welcomed the breakthrough. Some physicians believed pain actually served a useful purpose during surgery, arguing that it helped patients survive by keeping the body alert and responsive. Others raised moral objections, comparing the state of anesthesia to severe drunkenness and questioning whether it was ethical to render someone completely unconscious.
The most heated debate surrounded the use of anesthesia during childbirth. When Scottish obstetrician James Simpson began using chloroform (a newer alternative to ether) to relieve labor pain, critics cited Genesis 3:16, which describes pain in childbearing as part of a divine curse on women. From pulpits across Britain, Simpson’s use of chloroform was “denounced as impious and contrary to Holy Writ,” with clergy arguing that relieving birth pain meant defying God’s will. Other objections were more practical: some worried that a drowsy or unconscious mother couldn’t respond to instructions during delivery, or that numbing the pain might interfere with the contractions needed to deliver the baby.
How Queen Victoria Settled the Debate
Public opinion shifted dramatically in 1853, when Queen Victoria chose to use chloroform during the birth of her eighth child, Prince Leopold. Her physician, Dr. John Snow, administered the anesthetic by holding a chloroform-saturated handkerchief over her face. The results were so satisfactory that Victoria requested chloroform again for her next delivery. The practice quickly became known in Britain as “anesthésie à la reine,” anesthesia in the manner of the queen.
Royal approval carried enormous cultural weight. Opposition to chloroform, both religious and medical, largely evaporated. As one contemporary account put it, approval by the Queen was “as close as you could get to approval by God.” The use of chloroform and ether spread rapidly through hospitals across Europe and North America in the years that followed.
What Anesthesia Made Possible
The invention of anesthesia didn’t just remove pain. It fundamentally transformed what surgery could be. Before 1846, operations were limited to what could be done in minutes on the body’s surface. Afterward, surgeons could take their time. They could operate on abdominal organs, repair internal injuries, and eventually develop the intricate cardiac, neurological, and reconstructive procedures that define modern medicine. Anesthesia also made elective surgery possible for the first time, allowing patients to choose corrective operations they would never have endured while conscious.
The soporific sponge that had been used since the Middle Ages gave way, within a few decades, to precisely controlled anesthetic delivery systems. Joseph Lister, the pioneer of antiseptic surgery, initially used the old cloth-over-the-face method with a few drops of ether or chloroform. But the field professionalized quickly, and by the late 19th century, anesthesia was becoming a medical specialty in its own right, with dedicated practitioners focused solely on keeping patients safely unconscious during increasingly complex operations.