Written in a crisp and engaging style, free of legal and scientific jargon, Failed Evidence (NYU Press, 2012), by David A. Harris, explains to police and prosecutors, as well as anyone else who cares about how law enforcement does its job, why the criminal justice system has resisted science for so long and where we should go from here. Only if we understand why law enforcement resists the best that science has to offer will we be able to convince those in power to adopt it. In this excerpt from the introduction, Harris tells of miscarriages of justice in which traditional investigative methods failed to place blame on the guilty and, in turn, punished the innocent.
In 2010, and for the previous nine years running, CSI: Crime Scene Investigation ranked among the most popular shows on television in the United States. The program became a hit so quickly after its premiere in 2000 that the original series, set in Las Vegas, spawned two clones: CSI: Miami and CSI: New York. These shows put a new twist on the old police procedural drama. The CSI officers solved crimes with high-tech forensics: gathering DNA, lifting fingerprints with revolutionary new techniques, and using science to reconstruct the paths of bullets. Watching these programs, the viewer knows that policing has changed. For every member of the CSI team using a gun, more wield test tubes, DNA sampling equipment, and all manner of futuristic gizmos designed to track down witnesses and catch the bad guys. The show signals a break with the past, because it revolves around the way police use modern science to find the guilty and bring them to justice.
CSI reflects the emergence of DNA evidence as a powerful tool since it first appeared in American criminal courts in the late 1980s. With DNA and other formidable forensic techniques on our side, little could escape our scientific police work. In this new world, in which science could tell us definitively that the police had the right guy, with a probability of millions or even billions to one, the game had changed for good. The “just the facts, ma’am” approach of Sergeant Joe Friday, and the slow and inexact old-school ways that might or might not turn up evidence, began to seem like quaint relics of a bygone era. Sure, some real-world police protested that CSI raised unrealistic public expectations of both forensic science and the police, but CSI simply put a drama-worthy sheen on the way that police departments liked to portray themselves in the age of DNA: using the best of what science had to offer to construct air-tight criminal cases. Police frequently announced that they had used DNA to catch guilty people, sometimes for crimes far in the past, attracting wide public notice and bolstering law enforcement’s science-based image. With headlines like “State, City Police Laud Increase in Arrests Using DNA” in Baltimore, “Georgia DNA Solves 1,500 Cases” in Atlanta, “DNA Databanks Allow Police to Solve at Least Four Murders” in Memphis, and “With Added Lab Staff, DNA Tests Resolve String of Old Killings” in Milwaukee, the direction and approach of police work now seem woven together with the latest scientific advancements. Science has given police and prosecutors an enormous, unbeatable advantage.
But this all-too-common view of modern police work using science to move into a gleaming, high-tech future turns out to be a myth. When we strip away the veneer of television drama and the news stories about how DNA has helped police catch another killer or rapist, the real picture concerning law enforcement and science actually looks much different. With the exception of DNA (and then, only sometimes), most of our police and prosecutorial agencies do not welcome the findings of science; they do not rush to incorporate the latest scientific advances into their work. On the contrary, most police departments and prosecutors’ offices resist what science has to say about how police investigate crimes. The best, most rigorous scientific findings do not form the foundation for the way most police departments collect evidence, the way they test it, or the way they draw conclusions from it. Similarly, most prosecutors have not insisted upon evidence collected by methods that comply with the latest scientific findings in order to assure that they have the most accurate evidence to use in court. Like police, most prosecutors have resisted. And this resistance comes despite a nearly twenty-year drumbeat of exonerations: people wrongly convicted based on standard police practices, but proven irrefutably innocent based on DNA evidence. These DNA exonerations, now numbering more than 250 nationwide, prove that traditional techniques of eyewitness identification, suspect interrogation, and forensic testing contain fundamental flaws that have resulted in miscarriages of justice.
Yet the resistance continues. At best, police and prosecutors have used advances in science selectively, when it helps their cases. At worst, they have actively opposed replacing questionable investigative methods with better, empirically proven techniques, sometimes even insisting on retaining flawed methods. As a matter of principle and logic, this indifference to improved practices that will avoid miscarriages of justice seems puzzling and irresponsible, since we know for certain that we can do better than we used to. As a matter of concrete cases, when we see that the failure to use our best methods sometimes leads to both the punishment of the innocent and the escape of the guilty, indifference can become a catastrophe for our system of justice. It is this resistance to sound, science-based police investigative methods that forms the heart of this book.
Brandon Mayfield’s case makes a striking example. In March of 2004, terrorists bombed four commuter trains in Madrid, killing 191 people and wounding approximately eighteen hundred. Spanish police soon found a partial fingerprint on a plastic bag in a car containing materials from the attack. Using a digital copy of the fingerprint sent by the Spanish police, a senior FBI fingerprint examiner made “a 100% identification” of Brandon Mayfield, an Oregon attorney, whose prints appeared in government databases because of his military service and an arrest years earlier. Three other fingerprint experts confirmed the match of Mayfield to the print found on the bag: FBI supervisory fingerprint specialist Michael Wieners, who headed the FBI’s Latent Print Unit; examiner John Massey, a retired FBI fingerprint specialist with thirty years of experience; and Kenneth Moses, a leading independent fingerprint examiner. The FBI arrested Mayfield, who was then held for two weeks at the Bureau’s request, despite the fact that he did not have a valid passport on which he could have traveled to Spain; he claimed he had not left the United States in ten years. When the FBI showed the Spanish police the match between the latent print from the bag and Mayfield’s prints, the Spanish police expressed grave doubts. The FBI refused to back down, even declining the request of the Spanish police to come to Madrid and examine the original print. Only when the Spanish authorities matched the print with an Algerian man living in Spain did the FBI admit its mistake, releasing Mayfield. The Bureau issued an apology to Mayfield—an action almost unprecedented in the history of the FBI—and later paid him millions of dollars in damages in an out-of-court settlement.
The extraordinary apology and the payment of damages may help to rectify the injustice done to Mayfield and his family. But for our purposes, what happened after the FBI admitted its mistakes and asked the court to release Mayfield shows us something perhaps more important. The Mayfield disaster occurred because, among other things, the verification of the original FBI match of Mayfield’s print—a procedure performed by three well-regarded fingerprint experts—ignored one of the most basic principles of scientific testing: the verification was not a “blind” test. The three verifying examiners knew that an identification had already been made in the case, and they were simply being asked to confirm it. No scientific investigation or basic research in any other field—a test of the effectiveness of a new over-the-counter medicine, for example—would ever use a non-blind testing procedure; yet non-blind verification is still routine in fingerprint identification. Further, the FBI conducted proficiency testing of all of the examiners involved in the Mayfield case—but only after revelation of the errors, not before. At the time of Brandon Mayfield’s arrest, the FBI did no regular proficiency testing of its examiners to determine their competence, even though such testing routinely occurs in almost any laboratory that uses quality-control procedures. Further, and perhaps most shocking of all, the fingerprint comparison in the Mayfield case relied not on rigorously researched data and a comparison made under a well-accepted set of protocols and standards but on the unregulated interpretations of the examiners.
Yet, confronted by an undeniable, publicly embarrassing error that highlighted the crying need for fingerprint analysts to adopt standard practices used in every scientific discipline, the experts refused. Their answer was resistance and denial: resistance to change, and denial of the existence of a problem. Months after the humiliating exposure of the Mayfield debacle, some of those involved continued to insist that the matching of prints to identify unknown perpetrators could not produce mistakes—ever. In an article on the Mayfield case and other instances of mistaken forensic identification, Agent Massey, who had verified the print as belonging to Mayfield, told the Chicago Tribune that he and his fellow analysts had just done their jobs—nothing more. He acknowledged that when he verified Mayfield’s print, he knew that another examiner had already declared the print a match; in other words, he had not performed a blind verification test. Nevertheless, he said, “I’ll preach fingerprints till I die. They’re infallible.” Another examiner interviewed about the Mayfield case made an almost identical, unequivocal statement: “Fingerprints are absolute and infallible.” When another false fingerprint match led to the two-year incarceration of a man named Rick Jackson, CBS News correspondent Lesley Stahl confronted another FBI agent on the news program 60 Minutes. The agent’s words eerily echoed Agent Massey’s declarations of fingerprint infallibility. After a demonstration of fingerprint identification by the agent, Stahl asked, “What are the chances that it’s still not the right person?” Without hesitation, the agent replied, “zero,” because “[i]t’s a positive identification.”
As an institution, the FBI did no better at accepting its error and changing its practices. The Bureau announced that it would conduct an investigation of the practices of its Latent Fingerprint Unit, with an eye to “adopting new guidelines.” (The Latent Fingerprint Unit conducted this investigation itself.) As these words are written, more than six years after a mistaken fingerprint match almost sent Brandon Mayfield to prison for the rest of his life, the FBI laboratory’s fingerprint identification division does not use standard blind testing in every case. The laboratory widely considered to have the best fingerprint identification operation in the country continues to resist change and remains in denial, and has refused to move toward practices and safeguards that the scientific world has long considered standard.
To understand how we got to this point, we must start with DNA. DNA analysis did not develop in the context of police-driven forensic investigation, but rather as a wholly scientific endeavor. This helps explain why DNA testing has always included fully developed standard protocols for its use and the ability to calculate the probability of its accuracy based on rigorously analyzed data. This made courts willing to allow its use as proof. Despite its obvious complexity, DNA analysis had been thoroughly tested and was well grounded in scientific principles. As long as forensic scientists followed proper protocols for handling and testing the evidence, DNA could “individualize”—indicate whether a particular person had or had not supplied a tiny piece of tissue or fluid—with a degree of precision unimaginable before. The potential for solving crimes, particularly murders and rapes by strangers in which police might find some fragment of the assailant’s DNA left behind, seemed limitless. Perpetrators who might have escaped detection and conviction got the punishment they deserved. Even decades-old “cold cases” would yield to this marvelous new tool, provided that enough testable biological material still existed, and advances in testing rapidly made accurate analysis of ever smaller samples possible.
Soon enough, though, police and prosecutors found that the great sword of DNA had two edges: it could confirm guilt like nothing else, but it could also exclude a suspect that the authorities believed had perpetrated the crime. Sometimes the prosecution had already tried the suspect and obtained a guilty verdict. DNA could convict, but it could also throw police investigations, charges, and even convictions into the gravest doubt. A pattern emerged: many of the cases upended by DNA rested on well-accepted types of evidence, like identifications by eyewitnesses, confessions from suspects, or forensic science producing a “match” with a perpetrator. Thus DNA began to demonstrate that these traditional methods actually did not have anything like the rock-solid basis everyone in law enforcement had always imagined. The very basis for trusting these standard types of evidence began to erode.
By early 2010, DNA had resulted in the exoneration of more than 250 previously convicted people, some of whom had spent years on death row. By far, the single most common factor, found in 75 percent of these cases, was incorrect eyewitness identifications; the second most common type of error was inaccurate (or sometimes downright fraudulent) forensic testing. Perhaps most surprisingly, DNA also proved that some suspects did something most people considered unimaginable: they confessed to serious crimes that they had not committed. All in all, the DNA exoneration cases showed, beyond any doubt, that we simply had to rethink some of our fundamental assumptions about the most basic and common types of evidence used in criminal cases. An eyewitness who expressed absolute certainty when identifying the perpetrator could actually be wrong. A person who confessed to a crime might not actually have done it. And forensic analysis, with fingerprint matching long regarded as its gold standard, was not invariably correct.
With DNA exonerations continuing every year in the 1990s and 2000s, more and more research on traditional police investigative methods began to come to prominence. The research had earned acceptance in the scientific community, sometimes decades before, through peer review, publication, and replication by other scientists, but most of it had remained obscure except to a small circle of researchers. With the advent of DNA exonerations, the science became important to anyone interested in the integrity of the criminal justice system. Decades of these studies, it turned out, pointed out flaws in the ways that police conducted eyewitness identifications. Other research showed that the most widely used method of interrogating suspects rested upon assumptions shown to be deeply flawed, and that common interrogation methods created real risks of false confessions.
DNA’s precision and scientifically sound foundation effectively raised the bar for all other forensic techniques and investigative methods. Experts and researchers began to call traditional (i.e., non-DNA) investigative methods into question. The full scope of damage to the credibility of police investigative tactics became visible in 2009, with the National Research Council’s report on forensic sciences, Strengthening Forensic Science in the United States: A Path Forward. In this landmark report, a large group of the most knowledgeable people in forensic science and related fields declared that, aside from DNA and a few other solidly scientific disciplines such as toxicology, almost none of the forensic science disciplines could claim any real scientific basis for their results. Most of the forensic work done in the United States did not follow the standard scientific precautions against human cognitive biases. Striking at the core of forensic science, particularly fingerprint analysis, the report stated that (again with the exception of DNA and some other disciplines based firmly in the hard sciences) none of the common forensic disciplines could proclaim themselves rigorously reliable, let alone infallible.
But this sudden exposure of the shortcomings of traditional police investigation tactics also had another, more positive side. The same bodies of research that demonstrated the failings of traditional eyewitness identification testimony, interrogation methods, and forensics also revealed better, more accurate methods to solve crimes—or, at the very least, improved ways to investigate that would greatly reduce the risks of incorrect charges and convictions. These new methods could help guard against mistakes, both by producing more reliable evidence and by eliminating common human cognitive biases from police and forensic investigation. Many of these improved methods would cost very little—sometimes nothing. Thus, the research on traditional investigative methods did not just point out flaws; it pointed the way to better, more reliable tactics. A few examples make this plain.
• Research by cognitive psychologist Gary Wells and others demonstrated that eyewitness identification procedures using simultaneous lineups—showing the witness six persons together, as police have traditionally done—produce a significant number of incorrect identifications. This is the case because showing the six persons to the witness simultaneously encourages witnesses to engage in relative judgment: they make a selection by asking themselves, “Which of the people in the lineup looks most like the perpetrator, even if I can’t say for sure that the perpetrator is there?” Wells discovered that if he showed the persons in the lineup to the witnesses sequentially—one at a time, instead of all six together—a direct comparison of each individual person in the lineup to the witness’s memory of the perpetrator replaces the flawed relative judgment process, reducing the number of false identifications significantly.
• Research has demonstrated that interrogations that include threats of harsh penalties (“Talk, or we’ll ask for the death penalty.”) and untruths about the existence of evidence proving the suspect’s guilt (a false statement by police asserting that they found the suspect’s DNA at the scene) significantly increase the prospect of an innocent person confessing falsely. By eliminating these tactics, police can reduce false confessions.
• Fingerprint matching does not use probability calculations based on collected and standardized data to generate conclusions, but rather human interpretation and judgment. Examiners generally claim a zero rate of error—an untenable claim in the face of publicly known errors by the best examiners in the United States. To preserve the credibility of fingerprint examination, forensic labs could use exactly the kinds of proficiency testing and quality assurance standards scientists have crafted for other fields. These methods have become widely available; scientists, engineers, and researchers all use them for work that requires high levels of reliability.
In light of all of the challenges that science now poses to established methods of police investigation, highlighted by what DNA tells us about the (in)accuracy of the procedures police have long used, we ought to have seen wholesale changes by now in the basics of the procedures used by police to investigate crimes. We might also have expected to see at least the beginnings of changes in forensic science practices—a willingness to embrace proficiency testing, for example, or a wholesale reexamination of some of the disciplines, such as bite mark analysis, that seem to have little scientific basis and a notable track record of producing convictions of innocent people.
But as the discussion of the Mayfield case shows, we have seen very little change at all. To be sure, in a growing but still relatively small group of police departments and prosecutors’ offices, one now sees an openness to new, scientifically proven methods of investigation that minimize the risk of sacrificing the truth. But the reaction in law enforcement overall has been disappointing. Most police departments and prosecutors’ offices have ignored the new science on forensics, eyewitnesses, and interrogation, preferring the status quo. Others have actively resisted change, even fought against it. Some agencies have proclaimed the soundness of discredited methods even in the face of undeniable failures, just as those who mistakenly matched Brandon Mayfield to the Madrid bombings continued to proclaim fingerprints infallible. This represents not just a missed opportunity to do better, but a likely source of future cases in which the train of justice derails and the wrong people pay the price for crimes they did not commit while the real predators and perpetrators remain free to strike again.
The resistance to these new approaches takes different forms. First, those in policing or prosecution sometimes see the new science behind eyewitness identification, interrogation, and forensics as just a way in which slick defense lawyers can help guilty defendants avoid punishment. These new methods might interfere in some way with the constant battle to arrest criminals, law enforcement says, and this means society cannot afford to accept these new approaches. They may lead to more guilty people escaping justice—something no society should tolerate, let alone encourage. Second, police and prosecutors sometimes distrust or deny outright the correctness of the scientific findings and their implications for investigative work as police currently do it. Scientists may find these new methods proven and sound, police say, but that means nothing; a clever academic can make statistics say anything. Science remains too unsettled to allow law enforcement to rely on it, too fluid to build a conviction on, and too laboratory-centric and divorced from the realities of the street, where real police work takes place. Science that criticizes police work seems fundamentally elitist to many in law enforcement, overvaluing experiments and undervaluing the lived experience of police officers. The need to prove everything in terms of hard data, they often say, fails to appreciate the special intuitive skills of experienced police officers, which allow them to spot lies, identify suspicious activity, and see potentially criminal behavior that the rest of us do not recognize. Third, some law enforcement officials take the new information science has provided as a personal attack on them—“you’re saying that police are corrupt”—or as an attack on the law enforcement profession and its abilities as a whole.
“We know how to do police investigation,” the argument goes; “we’ve been doing this for years, and no bunch of pointy-headed ivory tower types will convince us of anything different.” Fourth, some police and prosecutors simply do not understand science and the scientific method. Therefore, they do not recognize the power of the scientific method to help us appreciate how well the investigative procedures police use actually work—as opposed to how well police and prosecutors think these procedures work. They also object to the costs of these new ways of investigating cases, both the direct costs of implementing these changes and the indirect costs of greater manpower and new training.
But none of this explains—let alone justifies—the refusal to accept solid evidence that has shown that serious flaws exist in the ways we have always investigated crimes, and the battle on the part of many—not all, to be sure, but many—in law enforcement to resist improved and tested methods. Given the solid evidence that exists, we can no longer pretend that our methods of interrogation, our ways of using eyewitness testimony, and our forensic testing all work as well as we have always thought they did. We can no longer credibly contend that innocent people never confess, that eyewitnesses always provide generally reliable evidence, and that forensic matching of things like hair, shoe prints, bite marks, or even fingerprints has a scientific basis. The facts simply do not support these and so many other assumptions, no matter how strongly, or for how long, police, prosecutors, and others have believed them. Yet much of the law enforcement establishment still holds fast to those views. And none of the reasons why they continue to believe could outweigh the obligation to make determinations of guilt and innocence in the most reliable fashion that we can.
Thus, if neither the feared acquittals of the guilty, nor costs, nor distrust of science, nor limiting police officers’ ability to utilize their intuition justify this resistance, we must ask why the resistance occurs and continues. Surely, no one enters the police academy or takes a job as a prosecutor with the aim of arresting or convicting the wrong people. Why, then, would law enforcement leaders and prosecutors resist changes in investigation procedures in the face of a steadily growing body of evidence that the ways they do and have long done these basic law enforcement tasks produce a noticeable number of miscarriages of justice, with the wrong people punished and the guilty free to victimize others? Why resist change when the data we have proves not just that the old ways work poorly at times but also that new ways can work better, without costing the innocent their liberty? Why has law enforcement not only failed to embrace advances that could help improve police work but actively fought against acceptance of these improvements?
At least two sets of explanations account for this resistance. The first set focuses on cognitive obstacles to change: the ways that human beings think. First, consider police culture. Police officers tend to regard each other as members of a closed fraternity; those who do not wear the badge cannot fully understand what it means to do so, and therefore can never claim membership in the clan. Police culture remains notoriously insular, and those who belong regard outsiders with suspicion. This culture does not welcome change or new ideas that challenge established ways of thinking or operating. Cognitive science tells us that this kind of situation will induce group polarization: those who associate only with like-minded others will tend toward an extreme version of their beliefs. They hear only one side of the argument; even more importantly, agreeing with other group members becomes a mark of group loyalty and identity. And group identity is especially strong in both police departments and prosecutors’ offices.
Strong group identity also breeds an intense us-versus-them mentality; therefore, anything that helps “them” (suspects, defendants) must hurt “us” (the law enforcement fraternity). When new knowledge coming from outsiders appears to undermine the tried-and-true approaches police have used for decades, this constitutes a direct threat to the social status of police. Police view themselves as having special talents that laypersons do not have: the ability to spot a lie, to see suspicious activity the rest of us would not recognize, and to identify potentially criminal conduct in what looks like innocuous behavior. This special type of expertise gives police officers claims to higher social status, and with it entitlement to greater authority and autonomy. Thus the new science that challenges these supposed special abilities stiffens any natural hesitation to accept change. The new science on police investigation may also cause cognitive dissonance. Police officers and prosecutors see themselves as the good guys, making the world safe by arresting, charging, and trying criminals. The research on police investigation that has emerged, including the documented cases of wrongful convictions, could appear to point in a very different direction: the possibility that individuals in law enforcement may have had some role in convicting innocent persons in the past. This would create a strong cognitive conflict: people who believe themselves fundamentally good and associated with the right side of the struggle would have to confront the possibility that, perhaps inadvertently, they have participated in grave injustices. Thus, to avoid this cognitive dissonance, police and prosecutors will filter out information that indicates this could be true, and will resist change. Change from the status quo also brings wealth effects into play: the natural sense that movement away from the tried and true will bring loss of what one has in the present. Since individuals feel losses more painfully than they feel denied gains, they resist change.
The second set of reasons for resistance to change involves professional and institutional barriers. First and foremost, we must consider the institutional and professional incentives built into the jobs of both police officers and prosecutors. In the United States, police officers live by the imperative of arrest. Their careers thrive or fail to prosper according to how many cases they can close by arresting perpetrators. Arrests are measurable—one can easily count their numbers (if not their quality). Arresting criminals, and thereby closing cases, has therefore become the single most important measurement within police departments of whether an officer does his or her job well. For prosecutors, convictions constitute the parallel metric. Those with the best conviction rates advance; their peers regard them as stars of the office staff. Obviously, police want to arrest the right people—the real perpetrators. Similarly, prosecutors want high conviction rates, though presumably only of the truly guilty. The problem is that any reform that looks as though it might disturb the police officer’s or prosecutor’s chances of achieving arrests or convictions runs counter to the strong incentives built into the very core of the way they perform their work. Thus, if the traditional ways in which police interrogate suspects, work with eyewitnesses, and utilize forensic testing bring them results by which they benefit, they may be, at best, indifferent to the new scientific knowledge that suggests better ways to perform these tasks. If they perceive these new methods as a threat to what has always worked for them, they may be not just indifferent but actually hostile to the idea of moving in those new directions and away from the tried and true. Police unions and other groups concerned with police working conditions may resist change because they fear that changing the status quo in any way may endanger gains they have already secured for their members. They will resist giving up such gains in any context; trading them away for possible gains that accrue to someone else—a criminal suspect, no less—seems unthinkable. The law-and-order orientation of the media also plays a role. While the media have covered numerous stories of wrongful convictions in ways that help the public understand the reality of these miscarriages of justice, most news reports on criminal justice issues still err on the side of a vastly oversimplified presentation of local crime and criminals. This creates the impression of an increasingly dangerous environment for everyday citizens. Media organizations, especially television outlets, have presented this impression for years, even in the face of irrefutable evidence that crime has declined. All of this can make bringing about reform in existing police procedures devilishly difficult. Political ambition may also play a role in generating resistance to reform, along with lack of representation of voices for reform in legislative settings.
Why focus on police and prosecutors? Police seek out the evidence, collect it, and, at least initially, interpret it. They decide in the most direct way whether a suspect has committed a crime, and if so of what nature. Prosecutors take the evidence police bring them and make independent determinations concerning whether cases will or will not proceed, and with what charges. Thus most of what happens in the criminal courts originates with the actions of police and prosecutors, and those actions shape every individual case and the system as a whole. Given the importance of their input, police and prosecutors have the ability to shift their own efforts and the entire criminal justice system toward better practices that science has shown will produce fewer erroneous convictions and miscarriages of justice. To be sure, other actors share some of the blame when things have gone wrong. Defense lawyers have failed to challenge the basic science behind police procedures. Judges have failed to act as “gatekeepers” against the use of junk science in criminal courts. Nevertheless, police and prosecutors have the dominant position and the most leverage in shaping the way criminal justice works in our country, whether focusing on individual cases or on systemic issues. For that reason, this book focuses primarily on them.
This does not mean police or prosecutors have intentionally caused wrongful convictions by using discredited forensic testing or investigative methods. Some cases of outright fraud or knowing creation of false evidence have occurred, but the vast majority of police do not want to bring cases against the wrong suspects. Prosecutors have no reason to want to convict those who have done nothing, instead of the actual guilty parties. But even if most use of poor investigative or forensic methods does not constitute intentional wrongdoing, the actions of police and prosecutors still deserve special scrutiny. They serve as the collectors (police) and presenters (prosecutors) of evidence in our adversary system of criminal justice. The object is not a perfect criminal justice system that never makes an error; no system created and run by human beings could function perfectly. Rather, we should strive to do the best that imperfect human beings can do, using all of the knowledge, energy, and creativity at our disposal, to minimize the risk of the conviction of the innocent. If our police and prosecutors do their jobs according to the best practices we have, according to the strongest evidence and the most current scientific research, the criminal justice system as a whole can do the best possible job of shielding the innocent from accusation, trial, and punishment. Wrongful conviction is the ultimate nightmare, because an undeserving person would suffer and because the guilty party would remain free to harm other victims. The public has every right to expect that those who carry the awesome responsibility of arresting, charging, and trying suspects will do their jobs using the best possible tactics and methods we have, in order to steer clear of the mistakes that imperfect human beings can avoid.
If, instead, they use methods that researchers have found to have no scientific basis, that do not work as advertised, or that actually create significant risks of producing faulty evidence, when better practices can become part of law enforcement’s arsenal without significant cost, police and prosecutors do not serve the public properly.
Doing justice may be hard, but no task has greater importance to our nation’s values, and for that reason alone our public officials must at least attempt to carry out their difficult responsibilities in the most accurate ways available. And that responsibility leaves no room for ignoring new methods scientifically proven to produce more accurate evidence. The integrity of the criminal justice system must, at all times, remain paramount, and we should not tolerate anything that subverts it.
Accurate proof of facts has always been the lifeblood of this process. And we depend on our police and prosecutors to do the best that they can to assure that only accurate evidence comes before judges and juries. In the end, we cannot be absolutely certain that any of us—members of juries, police officers, judges, or prosecutors—know the ground truth; we cannot know for certain whether the conclusions we draw from the evidence we bring to court are in fact accurate. But we can know whether we have obtained the evidence we use by methods and processes that minimize the risk of bringing into the record false confessions, or incorrect identifications by witnesses, or misleading or mischaracterized or incorrect forensic test results. And we can also know whether we have used methods that our best scientific minds have proven will minimize the chances that the wrong person will go to jail, with the real criminal remaining free. That is what is at stake in this debate. If we choose to let the resistance to new methods stand, we will not have done what we as human beings can to ensure that we are right when we judge the guilt or innocence of our fellow men and women. And when we can see the result of that—miscarriages of justice, lost years in prison for someone undeserving of punishment, and the destruction of the trust and confidence citizens need to have in police, prosecutors, and the rule of law itself—we see there is simply no excuse. We must persuade our law enforcement officers and our prosecutors of the necessity of doing what is right, and if we cannot persuade them, we must bring them along anyway in the march toward better practices.
What must we do to make this happen? At this point, scientists and researchers have given us a reasonably good map of how the system must change. In the forensic sciences, the most questionable fields, such as hair analysis, bite mark comparisons, and shoe and tire impressions, must become unusable in courts until they can establish their reliability through empirical research. For more established disciplines, such as fingerprint comparisons or tool marks comparisons involving firearms, we must begin constructing databases and a set of protocols to standardize comparisons, and we must use these tools to render a true picture of the accuracy of these disciplines. Basic processes to eliminate human cognitive bias, proficiency testing for individual evidence examiners, and quality assurance procedures for laboratories must become standard. For interrogation, we must make electronic recording a requirement. We must prohibit certain tactics—threats, promises of leniency, and lies about test results—that dramatically raise the risk of false confessions. For the handling of eyewitnesses, the list of scientifically proven approaches includes blind administration of lineups, sequential presentation of live suspects and photos, prohibition on any statements by the lineup administrator before or after the lineup, the taking of a “level of confidence” statement from a witness who makes an identification immediately afterward, and electronic recording of the process.
If that is what we must do, the tougher question remains how to accomplish it. After all, the research recommending these changes is not new, and yet the improved practices the research supports have not spread across the entire field. Making it happen will take a multipronged approach. Above all, we will need the right leadership, which must usually come from the law enforcement establishment and from the political Right. A movement for the reform of police and prosecutorial methods led by criminal defense lawyers and civil liberties advocates may sometimes succeed, but will often find itself undermined by criticism that it is self-interested; a group advocating reform led by prosecutors will, by contrast, seem to have the kind of disinterested credibility that will persuade many others in law enforcement to join. As a persuasive strategy, we will also have to focus more on the future than on the past, while at the same time institutionalizing mechanisms for review of old miscarriages of justice that cry out for correction. The key concept is the integrity of the criminal justice system. Wrongful convictions eat away at the public confidence that the legal system must have in order to survive.
We must also concentrate on three other things: creating best practice standards that all law enforcement agencies would need to follow; using public money as leverage; and sharpening the contours of judicial control over evidence while creating the judicial spine to use it. As for best practices, the federal government and professional law enforcement organizations should set best practices in all three of these areas—forensic testing, identification by witnesses, and suspect interrogation. In some cases, this work has already advanced; the framework for these practices already exists. The Department of Justice issued guidelines setting out best practices in witness identification in 1999, followed by an academic white paper two years later; a distinguished group of researchers did the same for interrogation practices in 2010. For the forensic sciences, the National Research Council’s report, issued in 2009, provides more than a starting point. Professional groups, such as the International Association of Chiefs of Police and the National District Attorneys Association, should become part of this process, and studies by the National Institute of Justice, such as one focusing on improving the reliability of fingerprint examination and comparison, can make vital contributions to this process. Money will help motivate almost any set of actors and institutions, and law enforcement agencies in the United States receive considerable funding from both state and federal governments. Receipt of all of this funding should become contingent on adopting all of the best practices in this area; in other words, federal standards should require that if an agency wants any federal funding for its operations, it must comply with best practices regarding the interrogation of suspects, the making of witness identifications, and the conduct and use of forensic testing.
In addition, federal funding should be made available for any department that wishes to retrain its officers in the new best practices to be adopted. In the courts, the beginning point is the U.S. Supreme Court’s decision in Daubert v. Merrell Dow Pharmaceuticals from 1993, which designated judges as the gatekeepers in their courts to guard against junk science. They have carried out this responsibility vigorously in civil cases, but for whatever reason, they have failed miserably in this role in criminal cases. To put it simply, this must change.
There is hope that these new, science-driven methods will overcome the resistance of most police and prosecutors. In a small but growing number of states, police departments, prosecutors’ offices, and forensic laboratories, new ways of doing things have taken root. For example, in decades past, no jurisdictions required audio or video recordings of the interrogation of suspects. Now, a small number of states require it, and an increasing (though still relatively small) number of police departments and prosecutors swear by it—even some of those who resisted it mightily in the past. Similarly, a number of jurisdictions have imposed changes in the way their personnel conduct eyewitness identification procedures— requiring blind administration of the process, and even using sequential as opposed to simultaneous lineups. All of this tells us that, with the right push forward, we can indeed hope for better tactics and methods, and overall better evidence.
The purpose of this book is not to condemn police officers or policing, or to paint prosecutors as intransigent. Police officers do a very difficult and often thankless job, frequently under the most trying circumstances. Prosecutors work in the public sector, sacrificing the high incomes they might make in private law practice in order to serve the public by putting criminals in jail. Neither job is easy. But the institutions involved—police departments and prosecutors’ offices or, more generally speaking, the law enforcement establishment—must become more open to the advances that scientists and professionals in a variety of fields have brought to bear on the work of criminal investigation. There is much to lose by resistance; there is much to gain by embracing the future.
This excerpt has been reprinted with permission from Failed Evidence: Why Law Enforcement Resists Science, by David A. Harris, published by NYU Press, 2012.