Celiac disease (CD) is a chronic autoimmune condition in which the ingestion of gluten damages the small intestine. In genetically predisposed individuals, gluten triggers an immune attack on the lining of the small bowel, interfering with nutrient absorption. Tracing the history of this disorder reveals a long struggle by physicians to understand a condition that has affected humans for millennia. The journey from ancient observations to the modern understanding of the specific protein trigger and today's diagnostic tools spans nearly two thousand years.
Early Descriptions and Clinical Naming
The earliest documented observations of a condition resembling celiac disease date back to the second century AD. The Greek physician Aretaeus of Cappadocia provided the first clinical description, noting a chronic affliction characterized by severe diarrhea, abdominal distention, and wasting. He named the condition “The Coeliac Affection,” deriving the term from the Greek word koilia, meaning “abdomen.” This name highlighted the patients’ primary suffering and their inability to retain food, which passed through undigested.
For centuries, this digestive disorder remained a mystery; various treatments were attempted, but no clear cause was identified. It was not until 1888 that the condition received its first modern clinical description, from the English physician Samuel Gee. Gee described the symptoms in detail, noting in particular children between one and five years old who suffered from chronic indigestion and wasting, and he documented the pale, bulky, frothy, and foul-smelling stools that indicated severe malabsorption.
Dr. Gee stated that if the patient could be cured at all, it would have to be through diet, though he was unable to identify the specific toxic component. His experimental diets included such items as a quart of Dutch mussels daily, which temporarily improved one child's condition and illustrates the trial-and-error nature of treatment at the time. Gee's comprehensive description, which revived the ancient Greek name for the disorder, provided the foundation for future clinical research.
Identifying the Dietary Trigger
The breakthrough connecting the symptoms to a specific dietary component came during a period of great hardship in the Netherlands. The Dutch pediatrician Willem-Karel Dicke had suspected even before World War II that wheat played a role in the disorder, having observed that relapses in his young patients often coincided with the consumption of bread.
Dr. Dicke’s suspicions were confirmed during the “Dutch Hunger Winter” of 1944–1945, when a German blockade led to a severe famine and scarcity of wheat products. Paradoxically, the mortality rate among children with celiac disease in his care dropped from over 35% to nearly zero, and their symptoms improved significantly. This improvement was followed by a sharp relapse when Allied air drops brought bread back into the diet.
These wartime observations provided the evidence needed to establish the link between wheat and the condition. Dicke's 1950 doctoral thesis formally demonstrated that removing wheat, rye, and oats from the diet produced dramatic clinical improvement. Further research with his colleagues Jan van de Kamer and Harmen Weijers traced the toxicity to a protein component of wheat, gluten, and specifically to its gliadin fraction. This work established the gluten-free diet as the only effective treatment, transforming the prognosis from a potentially fatal childhood disease into a manageable chronic condition.
The Development of Modern Diagnostics and Screening
With the cause of celiac disease now known, the focus shifted to finding reliable methods for diagnosis beyond clinical observation. The first significant advance came in the mid-1950s with the introduction of the small intestinal biopsy. Pioneers such as Dr. Margot Shiner developed a suction-capsule technique for obtaining tissue samples from the jejunum. This allowed physicians to visually confirm the characteristic internal damage, known as villous atrophy (the flattening of the small intestine's lining).
This invasive biopsy procedure remained the diagnostic gold standard for decades, until the rise of serological testing. The first generation of blood tests, introduced in the 1960s, detected anti-gliadin antibodies (AGA) and offered a less invasive screening option. A major diagnostic leap came in the 1980s and 1990s with the identification of highly specific autoantibodies.
The discovery of endomysial antibodies (EMA) in 1984, followed by the identification of tissue transglutaminase (tTG) as the target autoantigen in 1997, revolutionized screening. The tTG-IgA assay is now widely recognized as the most sensitive and specific screening test and is typically used as a first-tier method, with positive results confirmed by further testing. The identification in the 1990s of the genetic markers HLA-DQ2 and HLA-DQ8, which are necessary but not sufficient for the disease, provided a tool to reliably rule out the condition in individuals who carry neither gene. These modern diagnostic tools have driven a significant increase in awareness and in the diagnosed prevalence of celiac disease.
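To see why even an excellent first-tier test is followed by confirmatory testing rather than treated as a final diagnosis, it helps to work through Bayes' rule. The short Python sketch below computes the positive predictive value of a screen; the sensitivity, specificity, and prevalence figures are round illustrative assumptions, not published clinical values.

```python
# Illustrative only: Bayes' rule applied to a hypothetical first-tier screen.
# The sensitivity, specificity, and prevalence numbers are assumed round
# figures for demonstration, not actual clinical performance data.

def positive_predictive_value(sensitivity: float, specificity: float,
                              prevalence: float) -> float:
    """Probability that a positive screening result reflects true disease."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

if __name__ == "__main__":
    # Assumed values: 95% sensitivity, 95% specificity, ~1% prevalence.
    ppv = positive_predictive_value(sensitivity=0.95,
                                    specificity=0.95,
                                    prevalence=0.01)
    print(f"PPV of a positive first-tier screen: {ppv:.1%}")  # roughly 16%
```

Under these assumed numbers, a positive result from a test with 95% sensitivity and specificity, applied at a population prevalence of about 1%, carries only a roughly 16% probability of true disease. This arithmetic is why a positive serological screen is generally followed by confirmatory testing such as biopsy.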