
Firearm Forensics Has Proven Reliable in the Courtroom. And in the Lab

Over 10 days in early March 2022, five homeless men were shot in Manhattan and Washington, D.C. Two died. With the extraordinary tool of firearms identification analysis, law enforcement linked every shooting to the same gun.

Firearms identification analysis involves the microscopic examination and comparison of fired ammunition samples (typically fired bullets and spent cartridge cases recovered at crime scenes), in relation to each other and to test fires produced from recovered firearms. Qualified firearms examiners can identify a particular firearm as having fired a specific bullet or cartridge case. Investigators can then connect firearms to shootings, and even one shooting to another. From New York City to Los Angeles, hundreds of shooting investigations benefit every day from this analysis. As such, firearms identification evidence is critical to maintaining public safety and to holding shooters accountable.

Unknown to many, firearms identification analysis has a long scientific history. In 1925 Calvin Goddard, a physician, established the Bureau of Forensic Ballistics in New York City. At this independent laboratory, colleagues Charles E. Waite and Philip O. Gravelle adapted the comparison microscope for use in the identification of fired bullets and cartridge casings. On the strength of this pioneering work, Goddard went on to found the Scientific Crime Detection Laboratory at Northwestern University and was instrumental in the development of the FBI Technical Laboratory.

Nevertheless, firearms identification analysis has more recently faced criticism. A report from the President’s Council of Advisors on Science and Technology (PCAST) in 2016 concluded that there was only one appropriately designed study, known as Ames I, that validated firearms examination. The report indiscriminately dismissed several other such studies. Two years later, PCAST’s co-chair, Eric Lander, wrote in the Fordham Law Review that “PCAST judged that firearms analysis fell just short of the criteria for scientific validity, which requires reproducibility. A second study would solve this problem.”

That second study has since been done, along with several others that meet PCAST’s prescribed standards and vindicate firearms identification. The time has arrived for the scientific and legal communities to recognize its reliability in shooting investigations.

Building on the solid foundation of the Ames I study, the latest studies show remarkable accuracy for firearms identification. False positive error rates are less than 1 percent, and that is without technical review or verification to screen for errors. In casework, where a second set of trained eyes examines the evidence, those study error rates would be vanishingly low.

And the recent studies were intentionally challenging. In the 2022 Ames II study, 173 trained firearm examiners compared a total of 8,640 fired cartridge cases and bullets. The firearms and ammunition were carefully chosen for their “propensity to produce challenging and ambiguous test specimens.” Study ammunition, for example, had “steel cartridge cases and steel-jacketed bullets (steel, being harder than brass, is less likely to be marked).” With fewer microscopic markings, the comparison’s difficulty increases. Even faced with these stacked odds, the overall false positive error rate was less than 1 percent.

A study with even more participants, led by Arizona State University’s Max Guyll, is noteworthy both for its results and for its principal authors, who were nonpractitioners, not forensic examiners, and had no vested interest in the outcome. In the courtroom, we call those types of witnesses “independent” and “unbiased.” They asked 228 trained firearm examiners from across the United States, working in private, county, state and federal laboratories, to perform 1,811 microscopic comparisons of fired cartridge cases. The authors concluded that “the results equally revealed a very low false-negative rate and a very low false-positive rate.” Of some 1,429 conclusive decisions, just one was a false negative and five were false positives, and no single examiner made more than one error. Again, the overall false positive error rate was less than 1 percent.
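To make that arithmetic concrete, here is a back-of-the-envelope check (not the study’s formal calculation, which counts only conclusive comparisons of different-source samples in the denominator):

\[
\frac{5 \text{ false positives}}{1{,}429 \text{ conclusive decisions}} \approx 0.0035, \text{ or about } 0.35 \text{ percent}
\]

Because the formal denominator is a subset of those 1,429 decisions, the study’s published rate is somewhat higher than this simple quotient, but, as the authors report, it remains below 1 percent.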

Study after study demonstrates the same reality: examiners are remarkably accurate when they identify casings and bullets.

It is worth noting that a measure of the field’s integrity is its honesty about when it cannot link fired ammunition to a firearm. Inconclusive decisions are common both in the studies and in casework. Despite critics’ complaints on this point, that is a feature, not a bug. As the Ames II study explained: “As with any instrument (the examiner being the instrument), there are limits on their ability to the interpretation of the quality/quantity of the data/information presented.” Fired bullets and cartridge cases simply do not always carry definitive marks supporting inclusion or exclusion of a firearm.

But inconclusive decisions do not send people to jail—identifications do. Even PCAST judged error rates based on conclusive examinations. “When reporting a false positive rate to a jury, it is scientifically important to calculate the rate based on the proportion of conclusive examinations, rather than just the proportion of all examinations,” said the report. “This is appropriate because evidence used against a defendant will typically be based on conclusive, rather than inconclusive, examinations.” (Emphases in original.) In other words, when judging reliability, the false positive error rate is paramount.
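Stated as a formula (our paraphrase of PCAST’s guidance, not language drawn from the report itself):

\[
\text{false positive rate} = \frac{\text{false identifications}}{\text{conclusive examinations of different-source samples}}
\]

Inconclusive responses fall out of the calculation entirely, which is why a field that often reports inconclusive results can still be judged fairly on the accuracy of the conclusions it does reach.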

Applying this rationale to firearms identification is reassuring. When an examiner opines that a fired casing came from a particular firearm, they are accurate more than 99 percent of the time. And firearms identification evidence never stands alone in a criminal case. It’s only one brick in a wall of evidence that may include eyewitness testimony, video surveillance, electronic locating data, DNA evidence and more. Further, unlike some DNA analysis, ballistic evidence is never consumed and is, therefore, always available to be reexamined.

In the wake of PCAST’s report, a small number of critics have emerged. Some have testified in pretrial admissibility hearings, attempting to preclude or dilute the opinions of firearms experts. These critics are not firearms examiners or even forensic science practitioners, and they conduct no studies of their own. If they succeed where PCAST failed, convincing judges nationwide to exclude firearms identification evidence, countless homicide victims killed by firearms may be denied justice.

Nearly 100 years after Goddard’s work, there are over 200 accredited laboratories in the United States performing firearms identification analysis. Analysts must follow validated standard operating procedures framed around quality assurance systems and undergo rigorous training that includes regular proficiency testing.

As members of the National District Attorneys Association, we advocate for the use of reliable forensics to exonerate the innocent and inculpate the guilty. NDAA prosecutors, who are the “boots on the ground” in courtrooms throughout this country, know from experience that firearms identification evidence is scientifically sound and withstands rigorous testing in the crucible of the courtroom.

As John Adams, both a U.S. president and a defense attorney, once said: “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of the facts and evidence.” The facts, based on scientific studies, are that forensic firearms analysis is a reliable science that hones the accuracy of the justice system.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.


