What Is the Rabbit Test and How Did It Work?

The “rabbit test” was a historical method for detecting pregnancy, widely employed in the mid-20th century. This early biological test, known scientifically as a bioassay, relied on the physiological response of a live animal to a woman’s urine. The phrase “the rabbit died” became a common, albeit grim, euphemism for a positive pregnancy result during this era. The test’s history illustrates both the ingenuity and the limitations of diagnostic methods before modern laboratory techniques.

How the Test Was Performed

The “rabbit test,” also known as the Friedman test, was developed in 1931. The procedure involved injecting a woman’s urine into a female rabbit. The scientific basis for the test was the detection of human chorionic gonadotropin (hCG), a hormone produced by the placenta shortly after a fertilized egg implants in the uterus. If the woman was pregnant, her urine would contain significant levels of hCG.

Upon injection, hCG in the urine would stimulate specific changes within the rabbit’s reproductive system. If the hormone was present, the rabbit’s ovaries would show observable reactions within 36 to 48 hours: enlargement, the formation of hemorrhagic follicles, and the development of corpora lutea.

To determine whether these changes had occurred, the rabbit had to be euthanized and dissected so its ovaries could be visually examined. This was necessary regardless of the result, meaning the rabbit always died; despite the euphemism, a positive result was signaled by the ovarian changes, not by the animal’s death. The test was effective for its time, with a reported accuracy of around 98 percent.

From Rabbit Test to Modern Diagnostics

The rabbit test declined due to ethical concerns about animal welfare, the slow turnaround for results, and the expense of maintaining rabbit colonies. Each test required a live animal, and tens of thousands of rabbits were sacrificed over the years for this diagnostic purpose. This labor-intensive, animal-dependent method spurred the search for more humane and efficient alternatives.

Other animal-based bioassays worked on the same principle: the earlier Aschheim–Zondek test used mice, while the later Hogben test used African clawed frogs and did not require killing the animal, since a frog injected with hCG-containing urine simply laid eggs. The shift to immunoassay-based tests began in the 1960s. The first such test, the hemagglutination inhibition test, emerged in 1960, offering a faster and less costly alternative that required no live animals.

Further advances in the 1970s, including radioimmunoassays (RIA), provided even greater sensitivity and specificity for detecting hCG. These breakthroughs paved the way for the development and widespread availability of home pregnancy tests, the first of which was approved by the FDA in 1976. Modern pregnancy tests are rapid, non-invasive, and animal-free: antibody-based immunoassays that detect hCG in urine or blood and return results within minutes.