DNA evidence is not wrong in the way most people imagine. A clean, single-source DNA sample matched to a suspect is extraordinarily accurate, with the chance of a random match often cited as one in billions or more. But DNA evidence can and does produce misleading results, and it happens more often than the public realizes. The errors rarely come from the DNA itself. They come from how samples are collected, handled, mixed, interpreted, and presented in court. Of the 375 people exonerated by DNA testing through the Innocence Project, 43% of the original convictions involved misapplication of forensic science.
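The “one in billions” figure comes from the product rule: the frequency of the suspect’s genotype at each tested genetic marker is multiplied across all markers, on the assumption that the markers are inherited independently. Here is a minimal sketch in Python; the locus names are real STR markers, but the frequencies are invented placeholders, not population data:

```python
# Minimal sketch of the "product rule" behind random match probabilities.
# The genotype frequencies below are invented placeholders, not real
# population statistics.

# Hypothetical frequency of the suspect's genotype at each tested locus.
genotype_freqs = {
    "D8S1179": 0.06,
    "D21S11":  0.04,
    "TH01":    0.08,
    "FGA":     0.03,
    "vWA":     0.07,
}

rmp = 1.0
for locus, freq in genotype_freqs.items():
    rmp *= freq  # loci are assumed to be statistically independent

print(f"Random match probability: about 1 in {1 / rmp:,.0f}")
# Five loci already give roughly 1 in 2.5 million; the 20 loci of a
# full modern profile push the figure into the billions and beyond.
```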
Where DNA Evidence Actually Goes Wrong
The phrase “DNA evidence” covers a wide spectrum of reliability. At one end, you have a blood sample from a crime scene that contains a single person’s complete genetic profile. That kind of evidence is about as close to certain as forensic science gets. At the other end, you have a tiny trace of skin cells lifted from a doorknob, degraded by heat and time, mixed with DNA from several other people, and analyzed by software that requires analysts to guess how many contributors are in the sample. That kind of evidence can be deeply unreliable, yet both types get presented in court under the same umbrella of “DNA evidence.”
The problems fall into a few major categories: contamination during collection or storage, secondary transfer (your DNA ending up somewhere you never touched), errors in interpreting complex mixtures, software limitations, and human bias in the analysts reading the results.
Contamination Can Create False Trails
One of the most dramatic examples of DNA contamination played out in Europe over 15 years. Beginning in 1993, police across Germany, Austria, and France found the same female DNA profile at more than 40 crime scenes, including brutal murders and thefts. They spent eight years actively hunting this “Phantom of Heilbronn,” logging over 16,000 hours of overtime and roughly 2 million euros in investigation costs. In March 2009, investigators finally realized the phantom criminal did not exist. The DNA had been present on the cotton swabs before they were ever used at crime scenes. All the swabs came from the same factory, where a worker’s skin cells, sweat, or saliva had contaminated them during manufacturing. The swabs had been properly sterilized to kill bacteria and viruses, but sterilization doesn’t destroy human DNA.
This case exposed a systemic vulnerability. Any consumable used at a crime scene or in a forensic lab, from swabs to tubes to gloves, can carry trace amounts of human DNA from the manufacturing or packaging process.
Secondary Transfer Puts Your DNA Where You’ve Never Been
You can leave your DNA on an object you never touched. This is called secondary transfer, and it happens when DNA moves from person to object to another object, or from person to person to object. You shake someone’s hand, they pick up a knife, and your DNA is now on that knife.
A study published in Genes tested how often DNA transfers between items stored in the same evidence package, the kind of bag police use to hold multiple pieces of evidence from a crime scene. DNA transferred between items in 39% of the samples tested. More troubling, 10% of all items had their source attribution changed by the transfer, meaning the DNA profile shifted enough to point toward the wrong person. Cotton gloves were the worst offenders, with 88% showing transferred DNA from their packaging partner. Resealable bags picked up transferred DNA 83% of the time, and latex gloves 67% of the time. Hard, smooth surfaces like lighters transferred DNA only about 10% of the time.
This means that even after evidence is properly collected, the simple act of packaging two items together can introduce DNA that changes who the evidence points to, especially when the original DNA on an item was present in tiny quantities.
Mixed Samples Are a Major Weak Point
When a DNA sample contains genetic material from more than one person, the analysis gets dramatically harder. Crime scene samples frequently contain mixtures: a steering wheel gripped by multiple people, clothing worn and then handled by investigators, a weapon touched during a struggle. Once a sample contains DNA from three or more people, interpretation becomes genuinely uncertain.
The core problem is separating one person’s genetic markers from another’s when the signals overlap. Analysts have to estimate how many people contributed to the mixture, which is itself a source of error. The software most commonly used for this task requires the analyst to specify the number of contributors up front, and research has shown that the standard counting methods used to arrive at that number are frequently wrong. An incorrect contributor count cascades through the entire analysis.
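The standard counting approach is easy to sketch: each person carries at most two alleles at any genetic marker (locus), so the locus showing the most distinct alleles sets a floor on the contributor count. A minimal illustration in Python, with hypothetical allele calls (the locus names are real STR markers; the numbers are invented, not from any real case):

```python
# Minimal sketch of the "maximum allele count" estimate for the number
# of contributors in a DNA mixture. Allele calls are invented for
# illustration only.
import math

# Distinct alleles observed at each locus of a hypothetical mixed sample.
observed_alleles = {
    "D8S1179": {10, 12, 13, 14, 15},
    "D21S11":  {28, 29, 30, 31},
    "TH01":    {6, 7, 9.3},
    "FGA":     {20, 22, 23, 24, 25, 26},  # six alleles: the busiest locus
}

def min_contributors(alleles_by_locus):
    """Each person contributes at most two alleles per locus, so the
    locus with the most distinct alleles sets a floor on the count."""
    max_alleles = max(len(alleles) for alleles in alleles_by_locus.values())
    return math.ceil(max_alleles / 2)

print(min_contributors(observed_alleles))  # -> 3
```

The catch is that this number is only a floor. Contributors can share alleles, and alleles present at very low levels can fail to show up at all, so a sample that counts as three people may really contain four or five. Feed the wrong number into the software and every downstream statistic inherits the error.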
Different software tools produce very different error rates. One widely used system showed false positive rates of roughly 1 in 1,200 individuals for three-person mixtures, while a competing system reported rates closer to 1 in 20,000. A NIST interlaboratory study called MIX13, which sent identical DNA mixture samples to forensic labs across the United States and Canada, found large variation in how different analysts and different labs interpreted the exact same evidence. The same sample could lead one lab to identify a suspect and another to exclude them.
Touch DNA Pushes the Limits of Reliability
Touch DNA, also called low-template DNA, comes from the skin cells you shed when you handle an object. It’s become a go-to source of evidence because it’s available on almost anything a person has touched. But the amount of DNA recovered is often vanishingly small, sometimes just a few dozen picograms (a picogram is one trillionth of a gram), and that creates serious accuracy problems.
When the starting amount of DNA is extremely low, the amplification process that makes it readable introduces random effects. Some genetic markers get amplified while others drop out entirely, a phenomenon called allele dropout. This means the resulting profile may be incomplete or distorted. Labs set minimum signal thresholds to filter out background noise, but with low-template samples, the true DNA signal can be so faint that it’s nearly indistinguishable from noise.
Research on optimizing these thresholds found a fundamental tradeoff. Lowering the threshold catches more real DNA signals but also labels more noise as real DNA, creating false inclusions. Raising the threshold reduces false inclusions but causes more real markers to be missed, potentially excluding the true contributor. For extremely low-template samples (below about 8 picograms of DNA), even optimized thresholds produced total error rates near 1.0, meaning roughly one error per profile analyzed. Labs working with such tiny quantities also face contamination rates between 8% and 11%, so the DNA in the sample may not accurately reflect who actually touched the object.
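A toy simulation makes the tradeoff concrete. The peak-height distributions below are invented for illustration (real low-template electropherograms are far messier), but they show why no single threshold can suppress both error types when signal and noise overlap:

```python
# Toy simulation of the analytical-threshold tradeoff. Peak heights
# (relative fluorescence units, RFU) are drawn from invented
# distributions purely for illustration.
import random

random.seed(1)

# Faint true-allele peaks from a hypothetical low-template sample.
true_peaks = [random.gauss(60, 25) for _ in range(20)]
# Background noise peaks: weaker on average, but overlapping the signal.
noise_peaks = [random.gauss(35, 15) for _ in range(20)]

for threshold in (20, 40, 60, 80):
    dropouts = sum(1 for h in true_peaks if h < threshold)    # real alleles lost
    drop_ins = sum(1 for h in noise_peaks if h >= threshold)  # noise kept as real
    print(f"threshold {threshold} RFU: {dropouts} dropouts, "
          f"{drop_ins} false inclusions")
```

Raising the threshold drains one error column and fills the other. With a signal as faint as a sub-8-picogram sample, the two distributions overlap so heavily that every threshold choice leaves substantial error, which is why even the optimized thresholds in the research above still produced roughly one error per profile.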
Analyst Bias Shapes Ambiguous Results
DNA interpretation is not fully objective. When a profile is clean and complete, there’s little room for judgment. But when profiles are partial, degraded, or mixed, analysts make subjective calls at multiple steps: deciding how many contributors are present, determining which peaks represent real genetic markers and which are noise, and choosing whether a suspect can be included or excluded.
Research has demonstrated that cognitive bias influences these decisions. When analysts know that a suspect confessed, or that an eyewitness identified someone, their interpretations of ambiguous DNA profiles shift. This kind of contextual bias has been documented across forensic disciplines including DNA mixture interpretation, fingerprint comparison, and toxicology. The bias can also compound over time. Documented patterns suggest that prior case outcomes can create self-reinforcing cycles, where earlier biased decisions shape the baseline expectations analysts bring to future cases.
What This Means in Practice
A complete, single-source DNA profile from a well-collected sample remains one of the most powerful forms of evidence in criminal justice. The chance of a coincidental match between two unrelated people is genuinely astronomical for a full profile. The problems arise at the margins, and an increasing share of forensic DNA work happens at those margins. Touch DNA from handled objects, mixed samples from busy environments, degraded samples exposed to weather or time: these are now routine in criminal cases, and they carry real risks of error.
The 375 DNA exonerations recorded by the Innocence Project represent cases where later, better DNA testing proved someone innocent. They don’t capture cases where flawed DNA interpretation contributed to a conviction that was never re-examined. The actual scale of DNA-related errors in the justice system is almost certainly larger than what exoneration numbers alone suggest, because exonerations require preserved evidence, legal resources, and years of effort that most convicted people never receive.