In the three decades since DNA emerged as a forensic tool, courts have rarely been skeptical about its power. When the technique of identifying people by their genes was invented, it seemed like just the thing the justice system had always been waiting for: bare scientific fact that could circumvent the problems of human perception, motivation, and bias.
Other forensic sciences had taken a stab at this task. Lie detector tests, ballistics, fingerprinting, arson analysis, hair examinations — all aim to provide evidence independent of the flawed humans wrapped up in an investigation. But those methods were invented by law-enforcement agencies eager for clues; it is now well established that their results are not always sound.
But DNA was different. It came up through science, which began, in the 1950s, to unravel the ways the double helix drafts our existence. By the time DNA profiling led to its first conviction in a U.S. courtroom in 1987, it had already vaulted through the validating hoops of the scientific method. Soon it was accompanied by odds with enough zeros to eliminate reasonable doubt.
Today, most of us see DNA evidence as terrifically persuasive: a 2005 Gallup poll found that 85 percent of Americans considered DNA to be either very or completely reliable. Studies by researchers at the University of Nevada, Yale, and Claremont McKenna College found that jurors rated DNA evidence 95 percent accurate and between 90 and 94 percent persuasive, depending on where the DNA was found. That faith could be shaken, but only when lawyers made a convincing case that a lab had a history of errors.
The GLP aggregated and excerpted this blog/article to reflect the diversity of news, opinion and analysis. Read full, original post: Can DNA testing be trusted? The shockingly imprecise science of a proven courtroom tool