When Did Genetic Testing Start? A Brief History

Genetic testing examines an individual’s genes, chromosomes, or proteins to identify changes associated with inherited conditions. The results can help diagnose genetic disorders, predict disease risk, or determine ancestry. Genetic testing has a rich history, evolving from early observations of inherited traits to sophisticated genomic analyses.

Foundational Discoveries

The groundwork for understanding genetic inheritance began long before direct DNA analysis was possible. In the mid-19th century, Gregor Mendel’s experiments with pea plants revealed fundamental laws of heredity, demonstrating that traits are passed down in discrete units. Published in 1866, his work provided the first systematic framework for understanding how characteristics are inherited.

The early 20th century saw the identification of chromosomes within cells as the carriers of this genetic information. A major breakthrough occurred in 1953, when James Watson and Francis Crick, building on the work of Rosalind Franklin and Maurice Wilkins, described the double-helix structure of DNA. This discovery explained how genetic information is stored, copied, and transmitted, and it laid the foundation for modern molecular biology.

Early Diagnostic Applications

Genetic testing for human conditions began in the mid-20th century. An early milestone was the development of karyotyping, a technique that allows scientists to visualize and analyze an individual’s full set of chromosomes under a microscope.

In 1959, Jérôme Lejeune and his colleagues Marthe Gautier and Raymond Turpin made a landmark discovery using karyotyping, identifying an extra copy of chromosome 21 in individuals with Down syndrome, then known as “mongolism.” This discovery established the first link between a specific chromosomal abnormality and an intellectual disability, marking a significant moment in diagnostic cytogenetics.

Protein-based tests also emerged for diagnosing metabolic disorders. An early example is the Guthrie test, introduced in the early 1960s to screen newborns for phenylketonuria (PKU). The test detected elevated levels of phenylalanine in a newborn’s blood, an indicator of PKU, without directly analyzing the gene itself. Another early application was the electrophoretic detection of abnormal hemoglobin, used to identify sickle cell anemia.

Technological Leaps

Technological advances in the 1970s and 1980s transformed genetic testing, making it more precise and accessible. Restriction enzymes, discovered in the 1970s, allowed scientists to cut DNA at specific sequences, enabling targeted manipulation of genetic material. This paved the way for recombinant DNA technology, in which DNA from different sources could be combined.
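
To make the idea of sequence-specific cutting concrete, here is a minimal Python sketch (not from the original article) that scans a made-up DNA string for the six-base recognition site of the EcoRI enzyme, GAATTC, and splits the strand at each site. The example sequence and the single-strand cut model are simplifying assumptions for illustration only.

    # Minimal illustration: locate EcoRI recognition sites (GAATTC) in a DNA string.
    # The sequence below is made up; real digestion involves enzyme chemistry
    # not modeled here (EcoRI cleaves between the G and the AATTC of its site).

    SITE = "GAATTC"   # EcoRI recognition sequence
    CUT_OFFSET = 1    # EcoRI cuts after the first G of the site

    def find_cut_positions(dna: str) -> list[int]:
        """Return 0-based positions where the strand would be cut."""
        positions = []
        start = 0
        while (i := dna.find(SITE, start)) != -1:
            positions.append(i + CUT_OFFSET)
            start = i + 1
        return positions

    def digest(dna: str) -> list[str]:
        """Split the strand at every cut position, yielding the fragments."""
        cuts = find_cut_positions(dna)
        bounds = [0] + cuts + [len(dna)]
        return [dna[a:b] for a, b in zip(bounds, bounds[1:])]

    if __name__ == "__main__":
        example = "ATTCGGAATTCTTAGCGAATTCAA"   # hypothetical sequence
        print(find_cut_positions(example))     # [6, 17]
        print(digest(example))                 # ['ATTCGG', 'AATTCTTAGCG', 'AATTCAA']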

Sanger sequencing, developed by Frederick Sanger in 1977, was a significant leap. This method provided the first widely adopted way to determine the exact order of nucleotides in DNA, allowing direct analysis of individual genes. Automated sequencing instruments introduced in 1987, and later capillary electrophoresis systems, further increased its efficiency.

The polymerase chain reaction (PCR), invented by Kary Mullis in 1983, revolutionized genetic analysis. PCR allowed scientists to rapidly generate millions of copies of a specific DNA segment from a tiny starting amount. This amplification was crucial for genetic testing, making it possible to study DNA sequences that would otherwise be too scarce to analyze.
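
To give a sense of that scale, the short Python sketch below works out the idealized copy count, assuming every template molecule doubles in every cycle; real reactions are less efficient, and the starting copy number and cycle counts used here are arbitrary illustrative values.

    # Idealized PCR amplification: each cycle doubles the number of template copies.
    # Real reactions fall short of perfect doubling, so this is an upper bound.

    def pcr_copies(initial_copies: int, cycles: int) -> int:
        """Copy number after a given number of perfectly efficient PCR cycles."""
        return initial_copies * 2 ** cycles

    if __name__ == "__main__":
        # Even a single starting molecule exceeds a billion copies after 30 cycles.
        for n in (10, 20, 30):
            print(f"{n} cycles: {pcr_copies(1, n):,} copies")
        # 10 cycles: 1,024 copies
        # 20 cycles: 1,048,576 copies
        # 30 cycles: 1,073,741,824 copies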

The Genomic Era

The early 21st century ushered in the genomic era, marked by large-scale genetic analysis. The Human Genome Project, an international scientific endeavor launched in 1990 and completed in 2003, aimed to sequence the entire human genome. This ambitious project mapped approximately 3 billion base pairs of human DNA, providing a foundational reference for understanding human genetics and disease.

The project’s success led to a deeper understanding of genetic diseases and accelerated new diagnostic and therapeutic approaches. Following the Human Genome Project, Next-Generation Sequencing (NGS) technologies emerged in the early 2000s. NGS significantly reduced the cost and time for DNA sequencing, enabling analysis of entire exomes (protein-coding regions of genes) or whole genomes more efficiently. This technological shift expanded genetic testing applications, leading to widespread use in research, clinical diagnostics, and direct-to-consumer services.
