Dr. Anthony Allison, a medical scientist and geneticist, investigated a major puzzle in the 1950s: the surprisingly high prevalence of a gene that, when inherited in two copies, caused a deadly blood disorder. His work revolutionized the understanding of human evolution by proposing a direct link between the sickle cell trait and resistance to a major infectious disease. Allison hypothesized that carrying a single copy of the sickle cell gene (being a heterozygote) conferred a powerful survival advantage. He set out to test this idea, which implied that a genetic mutation could be maintained in a population because the protection it gave carriers against an environmental threat balanced its fatal consequences in those who inherited two copies.
Formulating the Hypothesis: The Geographic Clues
The initial observations that sparked Allison’s hypothesis came from a 1949 Oxford University expedition to Mount Kenya. He was tasked with collecting and analyzing blood samples from various East African populations. During this survey, he noted a striking non-uniform distribution of the sickle cell trait across the region.
The frequency of individuals carrying the trait was extremely high, sometimes exceeding 20%, in populations living in hot, humid areas like the coastal regions and around Lake Victoria. Conversely, the trait was nearly absent, often less than 1%, in people inhabiting the drier, high-altitude Kenyan highlands. This sharp contrast suggested a powerful environmental factor was at play.
Allison recognized that the areas with the highest sickle cell trait frequency were also known to be hyperendemic for Plasmodium falciparum malaria, the most lethal of the human malaria parasites. He theorized that the sickle cell gene, despite causing severe disease in the homozygous state, persisted because carriers of a single copy (heterozygotes) gained a selective advantage through protection from malaria. This is a state of balanced polymorphism, in which selection favors the heterozygote over both the malaria-susceptible normal homozygote and the severely affected sickle cell homozygote.
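The logic of balanced polymorphism can be sketched with the standard textbook model of heterozygote advantage (this formulation and the numbers below are illustrative assumptions, not figures from Allison's own papers). Assign relative fitnesses to the three genotypes,

\[ w_{AA} = 1 - s, \qquad w_{AS} = 1, \qquad w_{SS} = 1 - t, \]

where \(s\) is the selective cost of malaria to normal homozygotes and \(t\) is the cost of sickle cell disease to sickle homozygotes. At equilibrium the two alleles have equal marginal fitness, which gives a stable frequency of the sickle allele of

\[ \hat{q} = \frac{s}{s + t}. \]

With illustrative values such as \(s = 0.1\) and \(t = 0.9\), \( \hat{q} = 0.1/(0.1 + 0.9) = 0.1 \), and the expected carrier (AS) frequency is \( 2\hat{p}\hat{q} = 2 \times 0.9 \times 0.1 = 0.18 \), on the order of the roughly 20% trait frequencies observed around Lake Victoria and on the coast.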
Designing the Human Study: Testing the Blood
To test this correlation rigorously, Allison needed to compare the incidence of malaria in people who carried the trait with that in people who did not. His field research in East Africa, spanning Kenya, Uganda, and Tanganyika, involved collecting approximately 5,000 blood samples from local populations.
A practical challenge was identifying carriers in the years before genetic sequencing was available. Allison relied on a simple, on-site laboratory technique: he induced a low-oxygen environment in each blood sample, causing the red blood cells of trait carriers to sickle visibly under a microscope. He lit his traveling microscope with a small bulb run off a car battery, enabling him to determine the hemoglobin status of thousands of individuals in remote locations.
The next step involved assessing the level of malarial infection in the same individuals. This was achieved by preparing thin and thick blood smears, staining them, and meticulously counting the number of P. falciparum parasites present within the red blood cells.
To ensure the results reflected genetic resistance rather than acquired immunity, Allison focused his analysis on young children, with those between six months and four years old forming the most informative cohort. Children in this age range had lost their maternal antibodies but had not yet built up a robust acquired immune response to the parasite.
Analyzing the Data: Evidence of Protection
The systematic comparison of blood samples yielded the definitive evidence Allison sought. In the data from children in malaria-endemic regions, a clear relationship emerged between a child's hemoglobin type and parasite load: individuals with the sickle cell trait (AS heterozygotes) carried significantly fewer P. falciparum parasites in their blood than those with normal hemoglobin (AA homozygotes).
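As a minimal sketch of how such a comparison can be checked, the snippet below tabulates parasitemia by hemoglobin genotype and applies a chi-square test of association. The counts are purely hypothetical, and the chi-square test is simply a standard choice for a two-by-two comparison of this kind, not necessarily the statistic Allison used.

# Purely illustrative counts -- hypothetical, not Allison's data.
from scipy.stats import chi2_contingency

# Rows: hemoglobin genotype; columns: [children with detectable P. falciparum
# parasitemia, children without parasitemia] in a malaria-endemic area.
observed = [
    [12, 31],    # AS heterozygotes (sickle cell trait carriers), hypothetical
    [113, 134],  # AA homozygotes (normal hemoglobin), hypothetical
]

# Proportion of parasitemic children in each genotype group.
as_rate = observed[0][0] / sum(observed[0])
aa_rate = observed[1][0] / sum(observed[1])

# Chi-square test of association between genotype and parasitemia.
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"AS parasitemia rate: {as_rate:.0%}, AA parasitemia rate: {aa_rate:.0%}")
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")

In this toy example the trait carriers show a markedly lower parasitemia rate, which is the direction of effect described above; real analyses would of course rest on the actual field counts.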
This reduction in parasite density was directly correlated with a lower incidence of severe clinical malaria. Children who carried the trait were far less likely to be hospitalized or suffer from life-threatening complications, such as cerebral malaria or severe malarial anemia. Subsequent studies confirmed this finding, demonstrating that the sickle cell trait provided substantial protection from severe and fatal malaria infection.
The statistical findings confirmed the hypothesis: the sickle cell trait conferred a powerful selective advantage where P. falciparum was common. This evidence established the sickle cell gene as the first documented example of a single-gene trait maintained in a human population by natural selection imposed by an infectious disease. The research demonstrated that the genetic “cost” of the disorder was balanced by the immense survival “benefit” it offered to carriers in malarious regions.