Opinion | A D.C. judge's much-needed opinion on junk science

Last September, the D.C. Superior Court restricted the testimony of a prosecution ballistics expert in a felony case. I want to draw some attention to the opinion, which I haven’t seen written up elsewhere, because it is one of the best decisions I have read in response to a challenge to the scientific validity of forensic evidence, particularly in a criminal case.

As I’ve written here ad nauseam, judges are entrusted to be the gatekeepers of good and bad science in the courtroom. By and large, they’ve performed poorly. Judges are trained to perform legal analysis, not scientific analysis, and law and science are two very different fields. Science is forward-looking, always changing and adapting to discoveries and new empirical evidence. The law, by contrast, puts a premium on consistency and predictability. It relies on precedent, so courts look to previous courts for guidance and are often bound by prior decisions.

By and large, judges have approached their task of scientific analysis just as we might expect them to: They have tried to fit it into a legal framework. This means that when assessing whether a given field of forensics is scientifically reliable, judges tend to look to what previous courts have already determined. And when confronted with a new field, they tend to err on the side of relying on our adversarial system — they let the evidence in but also let the defense call its own experts to dispute the prosecution’s witness. The problem here is that by simply admitting the evidence, the courts lend it an air of legitimacy. Once the evidence is allowed in, whether jurors find it convincing tends to come down to which witness is most persuasive. State’s witnesses are often seen as unbiased and altruistic, while jurors tend to see defense witnesses as hired guns. And the set of skills it takes to persuade a jury isn’t necessarily the same as the skill set of a careful and cautious scientist. Indeed, the two are often in conflict.

This is why a field such as bite-mark analysis, which has been found to be unreliable by multiple scientific bodies, has yet to be disallowed by any courtroom in the country. Every time it has been challenged, the court has upheld its validity.

This brings me to the September D.C. opinion in United States v. Marquette Tibbs, written by Associate Judge Todd E. Edelman. In this case, the prosecution wanted to put on a witness who would testify that the markings on a shell casing matched those made by a gun discarded by a man who had been charged with murder. The witness planned to testify that, after examining the marks on the casing under a microscope and comparing them with the marks on casings test-fired from the gun in a lab, he had concluded that the shell casing was a match to the gun.

This sort of testimony has been allowed in thousands of cases in courtrooms all over the country. But this type of analysis is not science. It’s highly subjective. There is no way to calculate a margin of error. It involves little more than looking at the markings on one casing, comparing them with the markings on another and determining whether they’re a “match.” Like other fields of “pattern matching” analysis, such as bite-mark, tire-tread or carpet-fiber analysis, there are no statistics that analysts can produce to back up their testimony. We simply don’t know how many other guns could have created similar markings. Instead, the jury is simply asked to rely on the witness’s expertise in declaring a match.

Because this sort of testimony has been accepted by courts thousands of times over, it would have been easy and relatively unremarkable for Edelman to cite those decisions and allow the evidence. He could have reasoned that any doubts about the evidence could be addressed by the defense during cross-examination or by putting on its own expert. Instead, Edelman held a thorough evidentiary hearing, known as a Daubert hearing (named for a Supreme Court case on the admissibility of scientific evidence), personally reviewed the testimony and scientific literature, and reached a conclusion.

Here’s the heart of the opinion:

After conducting an extensive evidentiary hearing in this case—one that involved detailed testimony from a number of distinguished expert witnesses, review of all of the leading studies in the discipline, pre- and post-hearing briefing, and lengthy arguments by skilled and experienced counsel—this Court ruled on August 8, 2019 that application of the Daubert factors requires substantial restrictions on specialized opinion testimony in this area. Based largely on the inability of the published studies in the field to establish an error rate, the absence of an objective standard for identification, and the lack of acceptance of the discipline’s foundational validity outside of the community of firearms and toolmark examiners, the Court precluded the government from eliciting testimony identifying the recovered firearm as the source of the recovered cartridge casing. Instead, the Court ruled that the government’s expert witness must limit his testimony to a conclusion that, based on his examination of the evidence and the consistency of the class characteristics and microscopic toolmarks, the firearm cannot be excluded as the source of the casing. The Court issues this Memorandum Opinion to further elucidate the ruling it made in open court.

Note that Edelman did not rule that the witness couldn’t testify at all. He ruled that the witness could testify only to conclusions backed by scientific research. The witness could tell the jury that he could not exclude the gun as the weapon that produced the casing. But he could not say it was a match, because such a conclusion could not be proved.

This is an important distinction. Even the most strident critics of these fields of forensics don’t claim that they’re useless. Even bite-mark analysis can have some (minimal) investigative value. If there are clear bite marks all over a victim, for example, and the main suspect has no teeth, it seems safe to say that the suspect isn’t the source of the bites.

But it’s useful to compare fields like this with single-source DNA evidence, which is backed by science. DNA analysts don’t tell jurors that a suspect is a match. Instead, they use percentages. Because we know the frequency with which specific DNA markers are distributed across the population, analysts can calculate the odds that anyone other than the suspect was the source of the DNA in question. We can’t do that with marks on shell casings, or bite marks, or pry marks on a door because there is no way of knowing how many different guns or teeth or crowbars might, under the right conditions, produce identical marks.
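To make that contrast concrete, here is a minimal sketch of the kind of calculation DNA analysts can do and toolmark examiners can’t. The marker frequencies below are hypothetical, purely illustrative numbers, not real population data:

```python
# Illustrative only: how a DNA "random match probability" is built from
# known population frequencies. The numbers are hypothetical, not real data.

# Assumed frequency of the observed genotype at each tested DNA marker (locus)
locus_frequencies = [0.10, 0.08, 0.12, 0.05, 0.09]

# Under the product rule, the loci are treated as independent, so the chance
# that a random, unrelated person shares the whole profile is the product
# of the per-locus frequencies.
random_match_probability = 1.0
for freq in locus_frequencies:
    random_match_probability *= freq

print(f"Random match probability: about 1 in {1 / random_match_probability:,.0f}")
# -> about 1 in 231,481 for these made-up numbers
```

No comparable arithmetic exists for shell casings or bite marks: there is no database telling us how often a given toolmark or dentition pattern appears in the population, so there is nothing to multiply.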

What is remarkable about Edelman’s opinion is that he acknowledges that his ruling will be unusual and that it cuts against nearly every court to rule on the question before him, including appellate courts. But he issues it anyway, because it happens to be correct.

Judges across the United States have considered similar challenges to firearms and toolmark identification evidence. Of course, “for many decades ballistics testimony was accepted almost without question in most federal courts in the United States.” Based on the pleadings in this case, as well as the Court’s own research, there do not appear to be any reported cases in which this type of evidence has been excluded in its entirety. Earlier this year, the United States District Court for the District of Nevada also surveyed the relevant case law and concluded that no federal court had found the method of firearms and toolmark examination promoted by AFTE—the method generally used by American firearms examiners and employed by Mr. Coleman in this case—to be unreliable.

Nevertheless, he determines that the guiding principle here should not be precedent. It should be science.

In evaluating the persuasive weight of these decisions, however, the undersigned could not help but note that, despite the enhanced gatekeeping role demanded by Daubert, see 509 U.S. at 589, the overwhelming majority of the reported post-Daubert cases regarding this type of expert opinion testimony have not engaged in a particularly extensive or probing analysis of the evidence’s reliability. In 2009, the National Research Council (“NRC”) specifically criticized the judiciary’s treatment of issues relating to the admissibility of firearms and toolmark evidence and the judiciary’s failure to apply Daubert in a meaningful fashion. In the NRC’s view, “[t]here is little to indicate that courts review firearms evidence pursuant to Daubert’s standard of reliability.” …

Without disparaging the work of other courts, the NRC’s critique of our profession rings true, at least to the undersigned: many of the published post-Daubert opinions on firearms and toolmark identification involved no hearing on the admissibility of the evidence or only a cursory analysis of the relevant issues.

Yet, the case law in this area follows a pattern in which holdings supported by limited analysis are nonetheless subsequently deferred to by one court after another. This pattern creates the appearance of an avalanche of authority; on closer examination, however, these precedents ultimately stand on a fairly flimsy foundation. The NRC credited Professor David Faigman—one of the defense experts who testified at the Daubert hearing in this matter—with the observation that trial courts defer to expert witnesses; appellate courts then defer to the trial courts; and subsequent courts then defer to the earlier decisions.

As someone who has been beating this drum for years, I can’t tell you how satisfying it is to see this in a court opinion. It’s just remarkable.

In Daubert v. Merrell Dow Pharmaceuticals Inc., the Supreme Court laid out markers that judges should look for when assessing scientific evidence, such as whether the methods in question are subject to peer review and whether the expert’s methods are generally accepted in the scientific community. Consequently, Daubert spawned cottage industries of forensic boards, certifying organizations and quasi-academic journals, all aimed at conferring legitimacy on dubious fields. When assessing a challenge to the scientific reliability of an entire discipline of forensics, such as ballistics analysis or bite-mark analysis, too many judges have simply looked to these bogus boards and journals and concluded that the state’s expert and his or her methods are “generally accepted.”

But they’re accepted only by other experts within those same suspect fields. These judges neglect to consider how the entire field is viewed by actual scientists. It’s like assessing the scientific validity of astrology by citing astrology journals or by consulting other astrologers.

In this case, the prosecution cited a publication called the Association of Firearm and Tool Mark Examiners (AFTE) Journal, which it claimed had published “peer-reviewed” studies concluding that ballistics analysts had a low rate of error. In his opinion, Edelman deftly slices through this noise:

Overall, the AFTE Journal’s use of reviewers exclusively from within the field to review articles created for and by other practitioners in the field greatly reduces its value as a scientific publication, especially when considered in conjunction with the general lack of access to the journal for the broader academic and scientific community as well as its use of an open review process. …

Other courts considering challenges to this discipline under Daubert have concluded that publication in the AFTE Journal satisfies this prong of the admissibility analysis. …

It is striking, however, that these courts devote little attention to the sufficiency of this journal’s peer review process or to the issues stemming from a review process dominated by financially and professionally interested practitioners, and instead, mostly accept at face value the assertions regarding the adequacy of the journal’s peer review process. …

In the undersigned’s view, if Daubert, Motorola, and Rule 702 are to have any meaning at all, courts must not confine the relevant scientific community to the specific group of practitioners dedicated to the validity of the theory—in other words, to those whose professional standing and financial livelihoods depend on the challenged discipline. As Judge Jon M. Alander of the Superior Court of Connecticut aptly stated, “[i]t is self evident that practitioners accept the validity of the method as they are the ones using it. Were the relevant scientific community limited to practitioners, every scientific methodology would be deemed to have gained general acceptance.”

Edelman’s opinion is the Platonic ideal of a Daubert analysis. It ought to be the norm. But we should also be careful not to conclude that because Edelman did it correctly, other judges will too. Again, it’s just not realistic to expect people trained in law to accurately assess the validity of scientific evidence that sometimes gets quite complicated.

One additional item worth noting: In 2016, President Barack Obama nominated Edelman to be a federal district court judge. Despite his impressive résumé, the Republican-controlled Senate never voted on the nomination; it expired eight months later.

In 2017, President Trump nominated Matthew S. Petersen to fill the vacancy that Edelman had been denied. Petersen had no previous trial or criminal law experience. In a viral video taken during his confirmation hearing, Sen. John Neely Kennedy (R-La.) asked Petersen about basic concepts in criminal law with which any judge should be familiar. One of those concepts was the Daubert standard. Clearly flustered, Petersen responded, “I don’t have that readily at my disposal. But I would be happy to take a closer look at that.” Petersen later withdrew from consideration for the position.

Read more:

Radley Balko: Journalists need to stop enabling junk forensics

Harry T. Edwards and Jennifer L. Mnookin: A wake-up call on the junk science infesting our courtrooms

Radley Balko: We need to fix forensics. But how?

Dana Milbank: Conservatives’ junk science is having real consequences

Letter to the Editor: Junk science doesn’t belong in court
