Friday, September 11, 2009

Uncertain science


Sometimes it is impossible for science to produce certain knowledge about matters on which we would dearly love to have definite answers. Consider the challenge the Environmental Protection Agency faces in setting exposure limits for substances in water supplies that are known or strongly suspected to be carcinogenic. The concentrations of such substances in a water supply are typically very low, in the parts-per-million range. Still, such low levels could be harmful, at least to a fraction of people. We can’t get all of the carcinogen out of the water, but we could reduce its concentration at some cost. At what concentration would a particular carcinogen be expected to cause no more than a certain very low level of added cancer, and what would it cost to get to that level? Toxicologists can’t do useful experiments on test animals with the water as it is found: at such low pollutant concentrations, it would take a huge number of test animals to produce a statistically meaningful result. Instead, much higher concentrations of the pollutant are used in the laboratory, far greater than would ever occur in the natural situation, with a manageable population of animals. Various concentrations of the carcinogen are given to groups of test animals, and the incidence of excess cancers is monitored after a period of exposure at those levels. A graph is then constructed of excess cancers vs. the concentration used. We might then get the red data points, as shown in the figure above.
All the investigator sees, of course, are the red data points: they show that the more carcinogen, the more cancers. But what happens when we extrapolate backward, into the region of very low concentrations typical of natural water? The so-called linear dose-response assumption is that the data would fall along the straight line shown. But is that a good approximation to what happens at those low concentrations? Many scientists believe the model is not a good one, and there are cases where it is known to be wrong. One could argue that the body has mechanisms for dealing with very low concentrations of carcinogens, and that at some low level a carcinogen is not a threat at all. The actual response thus might look more like the curved line shown. This is called the threshold model.
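The difference between the two extrapolations can be sketched in a few lines of code. This is a minimal illustration with invented numbers (the slope and threshold values are arbitrary, not fit to any real study): both models agree closely at the high doses where lab data exist, but diverge sharply at the low doses relevant to drinking water.

```python
# Illustrative only: invented slope and threshold, not data from any study.

def linear_risk(dose, slope=2.0):
    """Linear no-threshold model: excess cancer risk proportional to dose."""
    return slope * dose

def threshold_risk(dose, slope=2.0, threshold=1.0):
    """Threshold model: no excess risk at all below the threshold dose."""
    return 0.0 if dose <= threshold else slope * (dose - threshold)

# Extrapolating downward from high lab doses to low environmental doses:
for dose in [100.0, 10.0, 1.0, 0.1]:
    print(f"dose={dose:6.1f}  linear={linear_risk(dose):7.2f}  "
          f"threshold={threshold_risk(dose):7.2f}")
```

At a lab-scale dose of 100 the two models are nearly indistinguishable (200 vs. 198 in these made-up units), but at a dose of 0.1 the linear model still predicts a nonzero risk while the threshold model predicts none, which is the kind of low-dose overestimation the trout study examined.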
To test the model in a specific case, George S. Bailey and colleagues at Oregon State University studied the effects of extremely low dosages of a known carcinogen, dibenzo[a,l]pyrene, on more than 40,000 rainbow trout. With such a large population of fish, small excess cancer levels could be detected. These studies extended earlier work on this compound to concentrations a thousand times lower than had been tested before. The results showed that the linear dose-response model greatly overestimates the cancer risk from this compound at these low concentrations, by a factor of between 500 and 1,500. In other words, their data are consistent with the threshold model; they would fall somewhere around the blue line in the graph (my drawing is just an approximation, not an accurate representation of the fish study). While this study applies to just one substance and its effect on one species of fish, it is important because it shows that the linear dose-response model can greatly overestimate the dangers associated with very low concentrations of known carcinogens or other toxic substances. The study is thus significant for those in the EPA responsible for setting exposure limits.
Most people long for certainty in the affairs of their lives. They want assurance that their jobs are secure for the foreseeable future, that their children are safe at school, that their spouse or significant other is faithful to agreements they have made. Yet we know that many aspects of life are uncertain. We’re unsure of the future of the housing market, of the impacts of climate change, of – well, of an awful lot of things! So we look for pillars of certainty, things that we know are true and will stay true regardless. The foundational documents of governance, such as the Constitution; religious dogma; fundamental scientific laws – all these promise certainty of a kind in particular domains of our concerns. Our reliance on these certainties is often so deep-seated that we instinctively react against evidence that they may not be as immutable as we have been led to believe.
This nearly universal need for certainty poses a continuing challenge for science in its attempts to inform the society outside the scientific community about how the world is. A great many ideas, theories and laws are very widely accepted in science and taken for all practical purposes to be true. Many of these form the basis of the technologies that undergird our modern life. If what science has to say about the operation of lasers were not true, how could lasers be effective in the many uses made of them, from removal of cataracts to reading the contents of CDs and DVDs? If very complex theories of combustion and turbulent gas flow were not reliable descriptions of how jet fuel burns and propels an aircraft, how likely is it that jet aircraft would ever get off the ground? These examples, and thousands of others like them, promote the notion that science is a fountain of rigorously true statements and ideas. Yet a great deal that concerns scientists in the course of their everyday work, and that necessarily influences opinions they must deliver on matters consequential to the public good, is clouded by uncertainty. Just how much of a particular carcinogen can we have in our water supply before it constitutes a significant health risk? Scientists can attempt to narrow the range of uncertainty, but there is no such thing as a single, true answer.

Saturday, September 5, 2009

Scientific witness in the courtroom



This past June the U.S. Supreme Court issued a decision with fascinating ramifications for the notion of scientific authority. The Court ruled that reports from crime laboratories may not be used at trial against a criminal defendant unless the analysts responsible for creating the data are present to give testimony and submit to cross-examination. This decision is an interpretation of the Sixth Amendment, which provides in part that an accused has the right “to be confronted with the witnesses against him.”

The case brought before the Court arose from the conviction of Luis E. Melendez-Diaz on cocaine trafficking charges. Part of the evidence against him was a laboratory report stating that bags of white powder allegedly belonging to Melendez-Diaz contained cocaine. The lab report was submitted by prosecutors with an analyst’s certificate, but no analyst appeared to give testimony.

Adam Liptak, reporting in the New York Times on the decision, notes the unusual way in which the Court divided on the 5-to-4 vote. In the majority were Justices Scalia, Thomas, Stevens, Souter and Ginsburg; dissenting were Justices Kennedy (who wrote vigorously for the dissenters), Roberts, Alito and Breyer. This was no ordinary cleavage along conservative/liberal lines! Scalia, writing for the majority, took the view that defendants have the same right to confront adverse expert testimony as they enjoy with respect to any other form of testimony. Kennedy, in dissent, pointed to the huge disruptions that might occur in court cases if every laboratory report needed to be brought to the court by a bona fide representative of the laboratory making the report; that is, a real analyst who could as needed provide expert background testimony. However, according to Jeffrey L. Fisher, a law professor at Stanford who represented Mr. Melendez-Diaz, about a third of states already follow procedures that comply with the new decision.

The Court’s decision raises several questions that are of importance for science’s relationship with society. The first level at which to approach this is to ask whether there is a matter here at all of science’s epistemic, or expert, authority. Justice Scalia, in his majority opinion, wrote that the Constitution would require allowing defendants to confront witnesses even if “all analysts always possessed the scientific acumen of Mme. Curie and the veracity of Mother Teresa.” In other words, even assuming that the crime lab results are of the highest scientific quality, and that the reporter is a person of impeccable moral standards, the Constitution requires that the defendant have the opportunity to confront the witness. But of course not all crime lab personnel are fully competent, free from errors of various kinds, or always above suspicion of reporting results tilted toward the prosecution’s case. In these respects, cross-examination of a witness reporting forensic results is of a kind with cross-examination of any other witness. It does not really go to the question of whether the scientific principles and applied technology that undergird the reported results are sound and relevant to the evidence being presented. Nor does it cover the question of whether the laboratory has obtained the results through full and competent observance of all the required protocols.

There is plenty of reason to be concerned about the quality and veracity of much forensic evidence. The National Academy of Sciences in February of this year issued a report on the state of forensic science in the US, and on what steps might be taken to strengthen it. Quoting from the report’s executive summary: “…in some cases, substantive information and testimony based on faulty forensic science analyses may have contributed to wrongful convictions of innocent people. This fact has demonstrated the potential danger of giving undue weight to evidence and testimony derived from imperfect testing and analysis. Moreover, imprecise or exaggerated expert testimony has sometimes contributed to the admission of erroneous or misleading evidence.” Defense counsel thus may have plenty of grounds for questioning the technical witness who brings forth the forensic evidence. It may be a good strategy to question the basic scientific assumptions underlying the methods employed. On the other hand, when the matter at hand is fairly simple, raising too many questions can be a poor strategy, in that it merely serves to call attention to the results. In any event, the defense now possesses a power it did not previously have. Justice Kennedy and the other dissenting justices seem to be very concerned about cases where the analyst may not be available, or where, over the passage of time, the analyst may have retired, changed jobs and so forth. These are legitimate concerns, but they seem to me to pale in comparison with the prospect of defendants facing a written laboratory report without any means of cross-examining the person or persons responsible for the analyses.


It is not always easy to keep the non-scientific world attuned to the notion that the doing of science, even at the level of often mundane analysis of forensic materials, is a human activity. Science is not really about objective truth in some abstract and disembodied sense. It is about kinds of truth found in the course of looking at aspects of the world with a certain kind of eye, with a certain ethic of disinterestedness. That sort of work is done by humans. Even given the best of intentions, errors may be committed, omissions may occur. When scientists report to the larger society on what they have done, they should not be perceived as oracles, presenting something drafted by gods. It is one of science’s shortcomings that it has not engaged the larger society as fully as it should, that science is not seen as the product of human endeavor. Yes, the social structure of science does go a long way toward weeding out errors and falsifications during the formation of new knowledge, but in the day-to-day applications of science, as in a forensic laboratory, human nature is at work. When someone’s future may hang on the outcome of courtroom deliberations, the human who has generated scientific evidence that bears on the case should be there to testify to it.