Forensic Science Isn’t Science.
Why juries hear—and trust—so much biased, unreliable, inaccurate evidence.
Nine days before death row inmate Earl Washington’s scheduled execution, his lawyers informed the state of Virginia that it was about to murder an innocent man. Forensic analysis of semen introduced at trial had convinced the jury that Washington, whose mental abilities matched those of a 10-year-old, had brutally raped and murdered a young woman in 1982. Washington’s lawyers uncovered evidence that the analysis was faulty. The state halted the impending execution, and following a gubernatorial pardon, Washington was released from prison in 2001. He had been there for 17 years.
How could forensic evidence, widely seen as factual and unbiased, nearly send an innocent person to his death? The answer is profoundly disturbing—and suggests that for every Earl Washington freed, untold others are sent to their deaths. Far from an infallible science, forensics is a decades-long experiment in which undertrained lab workers jettison the scientific method in favor of speedy results that fit prosecutors’ hunches. No one knows exactly how many people have been wrongly imprisoned—or executed—due to flawed forensics. But the number, most experts agree, is horrifyingly high. The most respected scientific organization in the country has revealed how deeply, fundamentally unscientific forensics is. A complete overhaul of forensic analysis is desperately needed. Without it, the number of falsely convicted will only keep growing.
Behind the myriad technical defects of modern forensics lie two extremely basic scientific problems. The first is a pretty clear case of cognitive bias: A startling number of forensics analysts are told by prosecutors what prosecutors expect the result of any given test to be. This isn’t mere prosecutorial mischief; analysts often ask for as much information about the case as possible—including the identity of the suspect—claiming it helps them know what to look for. Even the most upright analyst is liable to be subconsciously swayed when she already has a conclusion in mind. Yet few forensics labs follow the standard blind-testing model used elsewhere in science to eliminate bias. Instead, they reenact a small-scale version of Inception, in which analysts are unconsciously convinced of their conclusion before their experiment even begins.
The second flaw that plagues forensics is even more alarming: For decades, nobody knew how accurate forensic analyses were, or whether they were accurate at all. There’s no central agency that evaluates each test for precision or reliability before approving its use, and most were developed with barely a gesture toward the scientific method and with little input from the scientific community. Nor did the creators of forensics tests publish their methods in peer-reviewed scientific journals. And why should they? Without a government agency overseeing the field, forensic analysts had no incentive to subject their tests to stricter scrutiny. Groups such as the Innocence Project have continually put pressure on the Department of Justice—which almost certainly should have supervised crime labs from the start—to regulate forensics. But until recently, no agency had been willing to wade into the decentralized mess that hundreds of labs across the country had unintentionally created.
It might sound astonishing that undependable forensic tests have been able to slip through the cracks for so many decades. But in light of their origin and use, it’s really no surprise at all. Unlike medical diagnostic tools—which undergo rigorous testing by government agencies—forensic analyses are developed exclusively for law enforcement. Almost everybody will need a cancer screening some day, but you won’t need a semen analysis unless you’re accused of rape. So long as forensics remains the sole province of law enforcement, police and prosecutors have no incentive to screen their tests for accuracy once they’ve been adopted. Exposing a bad test might mean exposing a wrongful prosecution, and few prosecutors are eager to admit that they might be sending innocent people to prison.
In 2009, a National Academy of Sciences committee embarked on a long-overdue quest to study typical forensics analyses with an appropriate level of scientific scrutiny—and the results were deeply chilling. Aside from DNA analysis, not a single forensic practice held up to rigorous inspection. The committee condemned common methods of fingerprint and hair analysis, questioning their accuracy, consistent application, and general validity. Bite-mark analysis—frequently employed in rape and murder cases, including capital cases—was subject to special scorn; the committee questioned whether bite marks could ever be used to positively identify a perpetrator. Ballistics and handwriting analysis, the committee noted, are also based on tenuous and largely untested science. The report amounted to a searing condemnation of the current practice of forensics and an ominous warning that death row may be filled with innocents.
Given the flimsy foundation upon which the field of forensics is based, you might wonder why judges still allow it into the courtroom. The rather depressing answer is a combination of ignorance and laziness. In 1993, the Supreme Court announced a new test, dubbed the “Daubert standard,” to help federal judges determine what scientific evidence is reliable enough to be introduced at trial. The Daubert standard was meant to separate the judicial process from the quest for scientific truths—but it wound up frustrating judges and scientists alike. As one dissenter griped, the new test essentially turned judges into “amateur scientists,” forced to sift through competing theories to determine what is truly scientific and what is not.
The Daubert court clearly had little understanding of the complex, often contentious trial and error that goes into the establishment of an accepted scientific technique. It instructed judges to look to “the existence and maintenance of standards controlling [the technique’s] operation,” a strikingly opaque command tossed off with little explanation. Even more puzzlingly, the new standards called for judges to ask “whether [the technique] has attracted widespread acceptance within a relevant scientific community”—which, as a frustrated federal judge pointed out, required judges to play referee between “vigorous and sincere disagreements” about “the very cutting edge of scientific research, where fact meets theory and certainty dissolves into probability.”
Faced with this unenviable chore, most judges have simply trusted prosecutors not to introduce anything that wouldn’t roughly fit the Daubert standard. The conventional wisdom is that, if a prosecutor introduces any truly egregious pseudoscience, the defense can introduce its own expert to refute it or can undermine it through aggressive questioning. It’s a comforting idea: Presented with conflicting scientific findings, jurors will sift out the truth.
Unfortunately, it is also entirely false. American jurors today expect a constant parade of forensic evidence during trials. They also refuse to believe that this evidence might ever be faulty. Lawyers call this the CSI effect, after the popular procedural that portrays forensics as the ultimate truth in crime investigation.
“Once a jury hears something scientific, there’s a kind of mythical infallibility to it,” Peter Neufeld, a co-founder of the Innocence Project, told me. “That’s the association when a person in a white lab coat takes the witness stand. By that point—once the jury’s heard it—it’s too late to convince them that maybe the science isn’t so infallible.”
If judges can’t be trusted to keep spurious forensic analysis out of the courtroom, and juries can’t be trusted to disregard it, then how are we going to keep the next Earl Washington off death row? One option would be to permit anybody convicted on the basis of biological evidence to subject that evidence to DNA analysis—which is, after all, the one form of forensics that scientists agree actually works. But in 2009, the Supreme Court ruled that convicts had no such constitutional right, even where they can show a reasonable probability that DNA analysis would prove their innocence. (The ruling was 5–4, with the usual suspects lining up against convicts’ rights.)
That leaves one last option: reforming the sprawling field of forensics itself. For years, this job proved nearly impossible, due in large part to what the National Academy of Sciences called the “extreme disaggregation” of forensics. Until lab technicians follow some uniform guidelines and abandon the dubious techniques glamorized on shows like CSI, forensic science will barely qualify as a science at all. As a recent investigation by Chemical & Engineering News revealed, little progress has been made in the five years since the National Academy of Sciences condemned modern forensic techniques.
There is, however, some hope on the horizon. The Obama administration has started to aggregate forensics practices, creating a National Commission on Forensic Science to develop uniform standards to be used across the country. The National Institute of Justice has also given out more than $1 million to fund research into the efficacy and consistency of commonly used forensics techniques. More and more, labs across the country will know which methods of forensic analysis are trustworthy—and which are glorified pseudoscience.
But what about those thousands of people who’ve already been put behind bars based on evidence analysis that we know today to be utterly unreliable? Even those inmates fortunate enough to obtain capable counsel will often need access to biological evidence, which the Supreme Court has refused to guarantee them. Every state provides post-conviction DNA access in theory, but many states restrict access after a few years—which is sure to leave some innocent prisoners marooned on death row. Our national experiment in untested forensics may soon be coming to a close. But it hasn’t ended in time to prevent a few more people like Earl Washington from being sacrificed on the altar of pseudoscience.