Abstract
An important part of the work of forensic scientists is communicating accurate information to lay factfinders under conditions of uncertainty. It is an ethically demanding role, as it obliges scientists to disclose information that may call their own authority into question. Similar issues arise in other areas of applied science, for example climate science. This article builds on the ethical framework for scientific communication under uncertainty proposed by Keohane, Lane and Oppenheimer and argues that, with some modifications, their work provides useful guidance for forensic scientists. It also questions whether the current system of Streamlined Forensic Reporting is compatible with that framework.
‘Forensic ethics’, a term hitherto used mainly in relation to forensic psychiatry, addresses the relation between the ethics of professions that provide evidence to the courts and the demands placed on them by the law. 1 This article discusses the adaptation to legal contexts of the ethics of scientific communication.
The duty to communicate their findings clearly to police, defendants, the courts and other stakeholders is arguably the most important and most onerous ethical responsibility of forensic scientists. 2 While the public have a right to expect scientists in general – or at least those whose research is publicly funded – to communicate their findings clearly and accurately to non-scientists who have an interest in them, forensic science is unusual in the degree to which the communication of information figures as a core skill of the profession, 3 as well as in the dire consequences that can follow from misleading or unclear communication. 4
Scientific communication should aim not only to be accurate but, as Onora O’Neill argues, to be both accessible to and assessable by its audience. 5 These aims are particularly important where scientific knowledge is uncertain and where non-scientists have to make important practical decisions in the light of this uncertain information. This is true in the forensic context, and it is also true in many areas of public policy, such as responses to pandemics and climate change. Citizens or representative bodies must make judgments as to whether the claims of the experts are sufficiently persuasive to justify (or forestall) action. 6 Experts have a responsibility to explain their conclusions, and any uncertainty that attends them, in a way that makes such judgment possible. The implications of this responsibility for the ethics of communication in policy contexts have been explored by a trio of scholars at Princeton: political scientist Robert O. Keohane, philosopher Melissa Lane and environmental scientist Michael Oppenheimer. 7
This article will use the five ‘core principles’ developed by Keohane et al. as the basis for a discussion of the ethics of communicating forensic science (I take much the same principles to apply to other kinds of expert evidence, 8 but I shall not discuss these here). In a somewhat optimistic vein, I shall argue that adherence to these ethical precepts can result in expert evidence that is ‘accurate, accessible and assessable’ – the ‘three As’ discussed in the first section below. Such effective communication depends, however, on the willingness of judges and juries to approach expert evidence with an attitude of ‘critical trust’. 9 The second and longest section of the article considers how to adapt the five core principles in the forensic context, and the third considers the ethical challenges presented by ‘streamlined forensic reporting’.
The Three ‘A’s
I have taken the liberty of adding another ‘A’, for accuracy, to the already alliterative principle formulated by O’Neill: that communication of any kind ‘is ethically acceptable only when it aims to be accessible to and assessable by its audiences’. 10 In the absence of those aims there would not be any genuine act of communication at all, but only some kind of manipulation or self-expression. Some acts of communication, and particularly of scientific communication, are intended to be understood and assessed only by a restricted audience. According to O’Neill, however, it is a feature of the scientific enterprise that its ‘intended audiences’ embrace ‘the world at large’. Anyone (of ordinary communicative capacity) should be able to understand what is claimed to be scientific truth and should be able to assess the reasons that science offers to ‘the world at large’ for believing those claims 11 – reasons that may differ from the reasons scientists provide to their fellow scientists.
Accessibility to, and assessability by, the world at large are also important legal values. 12 One of the advantages of trial by jury is that it compels lawyers, witnesses and judges to convey their arguments and evidence in ways that can be understood and accepted (or criticised) by the general public. 13 What is conveyed to the jury must not, in the words of a famous Scottish judgment, be mere ‘ipse dixit’, but must ‘furnish the Judge or jury with the necessary scientific criteria for testing the accuracy of their conclusions, so as to enable the Judge or jury to form their own independent judgment by the application of these criteria to the facts proved in evidence’. 14
The ‘scientific criteria’ demanded by the Davie principle – as I shall term this legal version of the three ‘A’s – will generally differ from the criteria by which other scientists judge a colleague's work. They must be ‘scientific’ in the sense that they accord with the knowledge and methods of the relevant scientific community, but they must also be criteria that can be understood and applied by a lay audience. As a Royal Society report (to which Baroness O’Neill contributed) puts it, ‘assessable’ information is information about which ‘judgments can be made as to the data or information's reliability…. Data must therefore be differentiated for different audiences’. 15 It must allow ‘those who follow it not only to understand what is claimed, but also to assess the reasoning and evidence behind the claim. If scientific communication is not assessable by its audiences they will be unable to judge when or why its claims are trustworthy’. 16 In the legal context, ‘assessment’ involves both assessment by the judge of whether the evidence is ‘relevant and sufficiently reliable to be admitted’, 17 and assessment by the jury in light of the factual evidence in the case.
On O’Neill's account, some degree of trust on the part of the audience, as well as trustworthiness on the part of the expert, is required in order for scientific communication to succeed:

Unless speakers and audiences adhere to certain mutually accepted epistemic and ethical norms, and take one another to adhere to those norms, communication cannot succeed.…If an audience is suspicious, or mistrusting, and assumes that a speaker has violated certain epistemic or ethical norms – e.g., is lying, or misleading, or irrelevant – they may not follow, let alone accept, what the speaker says: a speaker cannot inform a radically mistrusting audience. 18
What is needed is not ‘blind trust’, which would preclude any independent assessment of the expert's conclusions, but rather ‘critical trust’. 19 What O’Neill's remarks point up, however, is that a certain initial trust in the honesty of experts is necessary for critical trust to get off the ground. 20 A critical audience will trust the experts to arrive at reliable conclusions only to the extent that the experts can justify that trust by pointing to second-order reasons for belief such as their track record and an intelligible account of the basis of their conclusions. 21 That account of why their testimony should be trusted as being scientifically valid will normally itself form part of their testimony (including hearsay testimony about the research of others). 22 If the expert can be trusted to provide an honest account of the scientific basis of their testimony, that account can form the basis of a judgment of what weight to give to the expert's conclusions. This makes possible an attitude of ‘weak deference’ – deferential inasmuch as the expert's conclusions are accepted without fully understanding or assessing the reasoning behind them, but only weakly so inasmuch as the weight accorded to the evidence depends on an assessment of the second-order reasons for believing the expert is likely to be right. 23 An attitude of ‘radical mistrust’, by which every claim to expertise is suspected of being self-serving and dishonest, would undermine the conditions for this weak form of deference. The Court of Appeal has shown an awareness of the danger of ‘radical mistrust’ both in its strictures against unfounded attacks on an expert's integrity, 24 and in its condemnation of experts whose behaviour is seen as corrosive of trust. 25
Forensic science evidence is often characterised by uncertainty. Even the most reliable techniques can provide only probabilistic conclusions, rather than the certain individualisation of a unique source. 26 Probabilities and error rates often cannot be specified with precision, or can only be specified relative to a reference class of uncertain appropriateness. 27 Even where the source of trace evidence can be identified with a high degree of probability, there is often uncertainty as to how the trace came to be deposited where it was found (in other words, as to ‘activity-level’ propositions that might be inferred from the evidence). 28 Consequently, making evidence accessible and assessable is often a matter of conveying accurately the limited degree of accuracy of which the science is capable. This is something that forensic science has in common with many other kinds of scientific evidence relevant to civic decision-making.
Keohane, Lane and Oppenheimer's ‘Core Principles’
In ‘The Ethics of Scientific Communication under Uncertainty’, Keohane et al. focus on the problems of communicating uncertain science (in particular, climate science) to what they call ‘attentive publics’: that is, to citizens who take a close interest in policy questions which require the consideration of scientific evidence. 29 They build on O’Neill's dictum about ‘accessible and assessable’ science, together with what they take to be key scientific values (including accuracy or precision) and the ‘social contract’ by which scientists are accorded a measure of deference in exchange for producing intelligible and relevant information. By justifying the core principles in this way they avoid committing themselves to any overall ethical theory, such as deontology, consequentialism or virtue ethics. 30 Their appeal to scientific values should resonate with the values of forensic scientists who like to see themselves as embodying scientific objectivity and integrity rather than serving the interests of the police, the defence or other criminal justice agencies. 31
A central theme of Keohane et al.'s article is the need for ‘trade-offs’ between competing values: because the information that can be communicated effectively to lay publics is inevitably less comprehensive than that which can be communicated to fellow scientists, some degree of accuracy may have to be sacrificed to achieve accessibility, or vice versa. They propose a framework of five ‘core principles’ of scientific communication: honesty, precision, audience relevance, process transparency and specification of uncertainty about conclusions. Honesty must never be ‘traded off’ against any of the other four principles, but trade-offs among those four, for example between precision and audience relevance, are permissible.
All of the core principles can be derived from the obligation to communicate accurate information in ‘accessible and assessable’ ways:

Honesty is a condition of making a speech-act one of communication rather than manipulation; precision is a value of communicating science in particular; audience relevance is a condition of making a communicative act successful, in that it succeeds in actually making sense to (being ‘accessible to’) its audience; while process transparency and specification of uncertainty about conclusions are both conditions of making communication ‘assessable by’ its audience. 32
Considering the five principles in turn will show that they are a good fit for the work of forensic scientists (although in this context the last three principles on the list may seem to overlap a good deal), but that they are also quite demanding.
Honesty (Candour)
It is uncontroversial that scientists, like anyone else, should tell the truth in the witness box, or in a written witness statement. 33 Honesty, however, is a more demanding virtue than that. According to Keohane et al., it also precludes ‘presentation of deliberately incomplete information in a biased way’ and ‘statements that fail to take due account of criticism’, 34 and it entails an obligation to be careful with the truth.
These wider requirements of honesty can also be described, in the terminology of the draft statutory Code of Practice issued by the Forensic Science Regulator, as the ‘duty of candour’. 35 The application of the phrase ‘duty of candour’ to expert witnesses appears to originate in the case of Gardiner & Theobald LLP v Jackson, 36 where Holgate J stated that experts owe a similar duty of candour to that of solicitors, and could be subject to a similar procedure of being called before the court to explain why the issue should not be referred to their professional body. 37 The duty of candour owed by solicitors is a duty ‘to disclose all material facts to the judge, even if they are not of assistance to [their client's] case’. 38 A similar duty also rests on public authorities in judicial review proceedings, and those who make witness statements on their behalf, ‘to assist the court with full and accurate explanations of all the facts relevant to the issues which the court must decide’ and to be careful not to mislead the court by ambiguity, omission or ‘spin’. 39 Although not defined in the draft code, as applied to experts the duty is evidently one to assist the court with a full and accurate explanation of the scientific (or other) basis of the expert's conclusions, including any factors which tend to undermine the reliance placed on those conclusions by the party calling the expert.
Some of the most important elements of the duty of candour are encapsulated in the ‘Statement of Understanding Declaration of Truth’ that expert witnesses are required to append to their reports:
I have exercised reasonable care and skill in order to be accurate and complete in preparing this report. I have endeavoured to include in my report those matters, of which I have knowledge or of which I have been made aware, that might adversely affect the validity of my opinion. I have clearly stated any qualifications to my opinion. 40
As Holgate J stressed in Gardiner & Theobald, signing such a declaration must not be considered ‘a mere formality’. 41 Indeed, where such a declaration is flatly untrue it may expose the witness to severe penalties for contempt of court. 42

Further, the Criminal Procedure Rules state that an expert's report must, where there is a range of opinion on the matters dealt with in the report—

summarise the range of opinion, and

give reasons for the expert's own opinion[.] 43
So an expert in a controversial field such as gait analysis 44 should, as a matter of both candour and compliance with the rules, set out the criticisms that have been made of the validity of their methods and explain why they nevertheless regard such methods as yielding reliable evidence. This is also required to comply with the duty to ‘include such information [in the report] as the court may need to decide whether the expert's opinion is sufficiently reliable to be admissible as evidence’. 45 An expert who knowingly omits this information acts dishonestly by signing the declaration of truth, and may well be guilty of an offence under the Criminal Justice Act 1967, s 89, and also of professional misconduct, as in Kumar v GMC. 46 In Kumar, Ouseley J upheld a number of findings of professional misconduct against a consultant psychiatrist who prepared a medical report in support of a defence of diminished responsibility in a murder case. One of those findings was that, by failing ‘to mention’ that the diagnosis he relied on was a controversial one, Kumar had acted recklessly, in that he created an unacceptable risk that the defence solicitors would rely on his diagnosis without realising that because of its controversial nature it would be likely to be challenged. 47
Because the principle of honesty sometimes requires experts to undermine the authority of their own opinions and factual conclusions, it has not been free from controversy in the wider debate about scientific communication. Stephen John argues that sometimes, scientific statements that are dishonest in one of the above senses may nevertheless be ethically justified if they are effective in getting the audience to believe something that is true and that it is in the audience's interest to believe. 48 In other words, scientists do not, at least in contexts such as climate change where false claims about scientific uncertainty are commonplace, have a duty to disclose information that undermines conclusions which they confidently take to be true. Rather than being misleading, John argues that such partial information may be ‘well-leading’. 49
The notion of ‘well-leading’ testimony is unacceptable in a criminal trial, 50 because it disregards the importance of assessability. What is important in a criminal trial is not simply that the jury arrives at the truth, but that they arrive at it for reasons which they as ‘peers’ of the defendant and representative members of the community can understand and rationally accept. 51 A jury that is led by the nose to a verdict which reflects what the experts regard as true has not fulfilled its function. The expert owes the jury an honest account, not simply of the conclusions that the expert's methods support, but of the reasons why the jury should accept those conclusions. This explains why a demanding conception of honesty is important and it also explains the importance of some of the other core principles in the legal context.
Precision
Keohane et al. observe that precision is always desirable in communications among working scientists, but in communications with the public it may not be desirable if it confuses the audience and is only marginally relevant to what they want to know. 52 Beth Bechky gives a simple example in her ethnography of American forensic scientists. For scientific purposes it was standard practice to record separately the levels of gunshot residue found on each hand of a suspect or victim, but scientists felt that this information might confuse investigators or jurors by leading them to assume that it was relevant to which hand or hands had been used to hold the gun – in the scientists’ view, no such inference could be safely drawn and it was more helpful to report a single combined figure. 53
Precise communication with fellow scientists is an ethical rather than merely a technical standard because scientists have a responsibility to make their full results available for scrutiny by their peers. In the forensic context, it is important that experts make the precise methods and any calculations behind their conclusion available for scrutiny by lawyers and experts for other parties in the case. 54 Subject to any issue of public interest immunity, making this information available is one of the obligations of the prosecution in the disclosure process. 55
When it comes to giving evidence to a jury, the value of precision is not always so clear. It may not serve to make the evidence ‘accessible and assessable’. We shall return to the issue of accessibility, or intelligibility, in the next section. The question whether ostensibly precise information is susceptible to assessment by juries has troubled the Court of Appeal in a number of cases. For example, in Slade, the Court was concerned about

how evidence from a then novel form of Automatic Speaker Recognition technology should or could appropriately be presented to a jury…. [T]he system simply produces a result, expressed in a mathematical formula (with the attendant danger of a potentially misleading appearance of certainty), but without any explanation of which features of similarity or dissimilarity have contributed to that result. 56
Consistently with the Davie principle, the court places more emphasis on an intelligible explanation of the findings (an aspect of ‘process transparency’) than on their precision.
The difficult relation between precision and assessability is illustrated by a case which caused some consternation a decade ago, T (Footwear Mark Evidence). 57 The prosecution's footwear expert expressed the view that there was ‘a moderate degree of scientific evidence to support the view that the [Nike trainers recovered from the appellant] had made the footwear marks’. 58 He omitted to mention the calculations by which he had arrived at an estimated likelihood ratio of 100:1, which was at the top end of the range that a group of forensic science providers had agreed to label as ‘moderate support’. That omission led to defence counsel cross-examining the expert on a statistical basis significantly less favourable to the defendant than the expert's own assumptions. The non-disclosure of the calculations was ethically and legally indefensible, ‘an elementary and catastrophic failure of transparency’. 59 Unfortunately, rather than simply insisting that the calculations should have been revealed, the Court of Appeal, worried that the apparent precision of the 100:1 figure would create an unwarranted ‘verisimilitude of mathematical probability’, 60 said that the expert should not have used such calculations at all, and suggested that he should have limited himself to saying that the shoes in question ‘could have made’ the marks. While the court was right to think that the most precise statement possible is not necessarily the most helpful to the jury, it is difficult to see the advantage of a conclusion so imprecise that the jury has to make a wholly uninformed guess as to how many other shoes ‘could have made’ the marks. It is also very strange, from an ethical point of view, to demand that a scientist refrain from making as precise an estimate as possible of the probative value of the evidence before deciding how best to convey it to the jury.
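For readers unfamiliar with the statistics, what a figure such as 100:1 means can be sketched as follows. This is a simplified illustration of the standard likelihood-ratio framework used in evaluative forensic reporting, not a reconstruction of the expert's actual calculation in T:

```latex
% E  = the observed correspondence between mark and shoe
% Hp = the proposition that the defendant's shoe made the mark
% Hd = the proposition that some other shoe made the mark
\[
\mathrm{LR} \;=\; \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}
\]
% An LR of 100 means the findings are 100 times more probable if the
% defendant's shoe made the mark than if an unknown shoe did. It does
% NOT state the probability that the shoe made the mark; by Bayes'
% theorem it merely scales whatever odds the other evidence supports:
\[
\underbrace{\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}}_{\text{posterior odds}}
\;=\;
\mathrm{LR}
\;\times\;
\underbrace{\frac{\Pr(H_p)}{\Pr(H_d)}}_{\text{prior odds}}
\]
```

On this view the figure of 100:1 quantifies the strength of the footwear evidence alone, which is precisely the information the verbal label ‘moderate support’ was meant to summarise.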
What both precision and honesty demanded was that the expert explained the calculation while acknowledging its imprecision: ‘Admit when you don’t know, when you’re guessing, and when your opinion is only a reasonable estimate’. 61 Luckily the Court of Appeal's attempt to place a ban on statistical estimates seems to have been quietly forgotten – the case is listed in Westlaw as having attracted ‘no significant judicial treatment’.
The T case also illustrates the Royal Society's point that data must ‘be differentiated for different audiences’. 62
It was essential that the defence lawyers, and any experts they might wish to consult, had full access to the calculations, but not that the same information be presented in full to the jury. A group of forensic scientists and statisticians commenting on the case observed:

candour and full disclosure in court can undermine comprehensibility when scientific evaluations involve technicalities. Pretrial hearings should be used to explore the basis of expert opinions and to resolve if possible any differences between experts. 63
This is consistent with the guidance given in Reed 64 about the need for experts in their pre-trial report to set out ‘with precision’ any issue that might be in dispute; for judges in appropriate cases to exercise the power under what is now Crim PR 19.6 to direct the experts to hold a pre-trial discussion and prepare an agreed statement; 65 and for the court to vet the terms in which it is proposed the evidence should be given and ensure that it avoids giving an undue ‘verisimilitude of scientific certainty’ to evaluative opinions. 66
Audience Relevance
Keohane et al.'s principle of ‘audience relevance’ has two aspects: intelligibility and ‘policy relevance’. The main point they make about intelligibility is that, as we have already noted, it necessarily involves some simplification and therefore some loss of information, but that any simplification that is either deliberately or negligently misleading will breach the cardinal principle of honesty. This is closely related to the problems that will be discussed under ‘specification of uncertainty’ below.
It is beyond the scope of this article to consider which forms of presentation will in fact succeed in making scientific information intelligible. 67 The important ethical point is that the expert witness should strive to maximise the amount and precision of information that is effectively communicated to the jury and that while to some extent precision may need to be sacrificed for the sake of clarity, this should never permit the use of terms with a clear potential to mislead, like describing a sample as a ‘match’ to a particular source. 68
‘Policy relevance’ is not, of course, the relevant kind of relevance in a forensic context. What is needed here might be termed ‘narrative relevance’. An abundance of literature attests to the importance of narratives in legal factfinding. 69 In a criminal trial, the prosecution is generally trying to establish that there is only one coherent and plausible narrative consistent with the whole body of evidence in a case, and it is one that includes all the elements of criminal culpability on the defendant's part. 70 The defence, on the other hand, needs to establish that the evidence could fit one or more coherent and plausible narratives consistent with innocence. What the jury needs to know, therefore, is how plausibly any piece of scientific evidence can be fitted in to any of the competing stories.
This is why ‘activity-level’ propositions (e.g., about the likelihood that DNA was deposited when the defendant handled a knife) will typically be more helpful to juries than source-level propositions alone. Knowing that a trace is highly likely to originate from a particular source is of limited assistance if one has no idea of how the trace may have been deposited, and may be misleading if a jury assumes that a trace must result from direct contact because they are not informed about other possibilities. It is important in this context to bear in mind that part of what makes scientific evidence assessable by juries is that whether certain possibilities can be ruled out often depends on ‘common sense’ judgments which the jury is as well-equipped to make as the scientists. For example in Iqbal, 71 the defendant ran a car valeting service, and it was suggested that his DNA might have been transferred to the magazine and ammunition of a loaded gun via a cloth or glove that he had left in a car – as, he said, customers sometimes requested him to do. The Court of Appeal thought it was obviously implausible that a ‘professional criminal’ would use a glove or cloth provided in this way. 72 Whether that is really so obvious may be debatable, but it is not a matter on which science is likely to shed light, and such contestable judgments as to what is or is not a plausible story are impossible to eliminate from criminal proof. The important point is that audience relevance is not a licence for ‘epistemic trespassing’ 73 – the audience needs to understand that on some questions, science has nothing relevant to contribute.
Process Transparency
In Keohane et al.'s scheme, ‘process transparency’ is a principle that refers largely to communication with specialised audiences. They initially formulate it as follows:

providing a clear description of the scientific process of inference, and the process of peer review, in such a way that scientifically qualified members of the audience could check the validity of the conclusion for themselves. 74
The duty to make records of tests, etc., and the material on which they were carried out available for inspection is recognised in the Crim PR and in case law. 75 Process transparency, however, involves providing juries and defendants, as well as scientists, with the information about scientific processes that they need in order to assess the expert evidence.
The importance of process transparency in this wider context stems from the importance of explanatory narratives to the assessment of testimony, including expert testimony. A lay audience will probably have a limited ability, if any, to assess critically an expert's reasons for their conclusions. What they can assess is whether there are good second-order reasons to believe that the experts have good reasons for their beliefs – for example, that they have been trained in a technique that has been shown to be reliable, and have applied it to the case at hand. The Davie principle discussed above requires that experts provide fact-finders with an explanation of the process of testing, inference and peer-review (if any) by which they reached their conclusions. It is this explanation which, if ‘intelligible, convincing and tested’ (by cross examination and in some cases by independent expert evidence) provides a basis for the jury to rely on the expert evidence. 76
Admittedly, process transparency as a basis for the assessment of expert testimony has significant weaknesses. One is that, as Harry Collins puts it, ‘distance lends enchantment’ – in the narratives of scientific discovery that are accessible to the public, what is ‘nuanced and unclear’ to those within the relevant expert community ‘becomes, paradoxically, sharp and clear to those outside it’. 77 On the other hand, the scientific process is vulnerable to ‘the manufacture of doubt’ 78 – artificially created scientific controversies designed to convince the public that the science is uncertain – which is considered a particular danger in adversarial litigation. 79 The adversarial process can also create the opposite danger: that because the opponents of the party calling an expert witness have the opportunity to test the evidence, the jury will take a failure to challenge the evidence effectively as equivalent to a test that the evidence has ‘passed’, rather than reflecting a lack of skill, motivation, or funding on the part of the relevant legal team. 80
If forensic scientists are committed to making their science accessible and assessable, they have an ethical obligation to counter these drawbacks by being transparent from the start about those aspects of the scientific process that give rise to ambiguities or potential unreliability in its results. This also follows from the ‘narrative relevance’ aspect of the principle of audience relevance: it is highly important that the audience be aware of those aspects of the process which are consistent (in the case of prosecution evidence) with the defence narrative, for example that information given to the scientist might have led to cognitive bias. This ethically demanding aspect of the duty of candour is consistent with the ‘FoRTE’ (Forensic Reconstruction of Trace Evidence) model developed by Ruth Morgan and her colleagues. In this approach it is an essential part of the process of reconstruction to ‘understand the role of human decision making and expertise in the production of inferences, and identify any assumptions being made that impinge upon those inferences and the significance assigned to the conclusions drawn’. 81
It is essential that (for example):

the verification process should be well documented and be disclosed as a matter of normal practice. It needs to be clear whether the verifier was blind to the extraneous case details, decisions and notes of examiners before the verification, and whether the decisions or confidence levels on decisions made by verifier were different than that of the examiner. 82
These points may be important in an admissibility hearing, in determining how the findings can be presented to the jury without being misleading, and might also be worth highlighting in the judge's summing up where the possibility of scientific error is important to the defence.
The requirements for reporting in the Crim PR do not explicitly require experts to state whether their conclusions have been verified by another expert or, if so, whether the verifier was aware of the expert's identity or the conclusion reached. Rule 19.4(h), however, requires a report to ‘include such information as the court may need to decide whether the expert's opinion is sufficiently reliable to be admissible as evidence’, and the criteria applied to determine whether the evidence is sufficiently reliable include ‘the extent to which any material upon which the expert's opinion is based has been reviewed by others with relevant expertise … and the views of those others on that material’. 83 The extent of review within the laboratory appears to fall within this requirement, and ‘extent’ can be interpreted as including the extent to which the review was independent. The Crim PR also requires a statement of ‘the substance of all facts given to the expert which are material to the opinions expressed in the report, or upon which those opinions are based’. 84 Facts which are not strictly relevant to the opinions expressed (e.g., about the previous convictions of someone associated with trace evidence) could be considered ‘material to’ those opinions if they were capable of creating a cognitive bias that affected those opinions. From an ethical point of view, it would be appropriate to adopt this broad interpretation.
Unfortunately, these legal requirements do not apply to ‘streamlined’ forensic reports. We will return to this issue after considering the last of Keohane et al.'s core principles.
Specification of Uncertainty
What is required under this heading is well summarised in the influential report by the US National Academies of Science, Strengthening Forensic Science in the United States: [L]aboratory reports…should describe, at a minimum, methods and materials, procedures, results, and conclusions, and they should identify, as appropriate, the sources of uncertainty in the procedures and conclusions along with estimates of their scale (to indicate the level of confidence in the results). Although it is not appropriate and practicable to provide as much detail as might be expected in a research paper, sufficient content should be provided to allow the nonscientist reader to understand what has been done and permit informed, unbiased scrutiny of the conclusion. 85
This recommendation is in harmony with the Davie principle and is also echoed in the Forensic Science Regulator's guidance. Staff providing expert evidence should be ‘able to explain their methodology and reasoning, both in writing and orally, concisely in a way that is comprehensible to a lay person and not misleading’, and this includes explaining the extent to which the body of specialised literature supports or undermines their conclusions, and ‘any outstanding concerns’ their peers may have about their methodology and reasoning. 86
The Criminal Procedure Rules contain a number of provisions requiring experts to reveal sources of uncertainty about their conclusions. 87 These are, however, subject to an enormous loophole, which will be considered in the next section on ‘streamlined forensic reports’.
In the case of forensic science, where the certainty or uncertainty of the results is largely a product of the processes by which those results were obtained, and by which the methods used to obtain them have been validated (or not), it is hard to draw a sharp distinction between ‘specification of uncertainty’ and ‘process transparency’. In some cases the degree of uncertainty can be specified with some precision, as a known error rate or random match probability, and the process by which those specifications were arrived at can be explained. In other cases the sources of uncertainty can be specified but the degree of uncertainty is unknown. For example, in Atkins the prosecution's facial mapping expert ‘identified nine different factors which could affect the reliability of his exercise’. 88 Although this appears exemplary from an ethical point of view, what was more controversial (and was unsuccessfully challenged in the Court of Appeal) was his specification of the degree of support his work provided for an identification of one of the defendants ‘at the top of 3 and into 4’ on a five-point scale – that is, between ‘lends support’ and ‘lends strong support’. 89 Arguably, this was too specific – it implied a degree of certainty about how uncertain the identification was which could not be scientifically justified. 90
Cases like this raise a difficult problem for the ethics of scientific communication. In this kind of situation, where the sources of uncertainty can be specified but the degree of uncertainty cannot, is it possible to convey the uncertainty in a way that is ‘accessible to, and assessable by’ the jury? In an important contribution to the ethics of forensic science, Edmond and colleagues argue that it cannot. They defend an ethical view not very different in its practical consequences from the one advanced here, based on principles of ‘disclosure, transparency, epistemic modesty and impartiality’; but they also deny that adherence to these principles can solve the problems inherent in epistemically weak forensic methods of the type used in Atkins: While greater disclosure, transparency, epistemic modesty and impartiality might signal limitations to the technically literate these might not be appreciated by less methodologically sophisticated audiences. Mere disclosure of oversights and limitations does not necessarily enable the tribunal of fact to evaluate the evidence …. Knowing about unknowns does not necessarily facilitate rational decisionmaking. It may produce too much caution or, more problematically given the burden and standard of proof, may lead to the over valuation of evidence. 91
The problem, however, is that in such cases nobody knows what is ‘too much caution’ or an ‘over valuation of evidence’. There is likely to be a range of views which could be considered ‘rational’, in the sense that they respond both to the reasons for caution and to the reasons for attaching some evidential weight to the relevant piece of testimony. This is not just a problem about forensic science but about the evaluation of most kinds of evidence, in particular witness testimony. We know, for example, that eye-witnesses are less than fully reliable, but we do not know with any precision just how unreliable they are. Different jurors may, without any irrationality, 92 balance the reasons for credulity or scepticism in different ways. In a case like Atkins, for example, some may be more impressed than others by the unlikelihood that any of the nine factors set out by the witness would coincidentally create a false resemblance to a man against whom there was considerable circumstantial evidence; or some may be more disposed than others to think that a witness who appears to be scrupulously setting out the weaknesses of his own evidence can be accorded a degree of trust when he estimates how strong the evidence nevertheless is. 93 Jurors construct or accept explanatory narratives that fit the evidence as a whole, not just the scientific evidence in isolation, and this is a rational process of inference, even if the probative value of any one piece of evidence can rarely be specified with mathematical precision. 94 What the jury needs to know (as discussed under ‘Audience Relevance’ above) is, in the case of prosecution evidence, whether there are sources of uncertainty that cannot be ruled out on scientific grounds alone; 95 and in the case of defence evidence, whether the existence of the evidence can be explained in a way consistent with the prosecution case.
The Ethics of Streamlined Reporting
Complying with the legal standards for the report that must be submitted where a party wants to introduce expert evidence other than as admitted fact will also secure a large measure of compliance with the principles of process transparency and specification of uncertainty.
We saw in respect of process transparency that the information required under Crim PR r. 19.4(h) to determine whether the evidence is admissible implicitly includes the matters which Crim PD 19A.5 recommends should be taken into account as part of the admissibility decision. These include:
the extent and quality of the data on which the expert's opinion is based, and the validity of the methods by which they were obtained; if the expert's opinion relies on an inference from any findings, whether the opinion properly explains how safe or unsafe the inference is (whether by reference to statistical significance or in other appropriate terms); if the expert's opinion relies on the results of the use of any method (for instance, a test, measurement or survey), whether the opinion takes proper account of matters, such as the degree of precision or margin of uncertainty, affecting the accuracy or reliability of those results….
Unfortunately, these salutary requirements to explain the sources of uncertainty are gravely undermined by the exemption of reports admitted under r 19.3(1), which applies where a party ‘wants another party to admit as fact a summary of an expert's conclusions’. Such a summary is known as a ‘Streamlined Forensic Report’, more specifically ‘SFR1’, as distinct from ‘SFR2’ which contains responses to any questions raised by the defence. Under r. 19.3(2)(a), on receiving SFR1 the other party must serve a response stating—
which, if any, of the expert's conclusions are admitted as fact, and
where a conclusion is not admitted, what are the disputed issues concerning that conclusion[.] 96
One might think that the onus on the other party (most often the defendant) to identify disputed issues would make it particularly important to inform them of the process by which the conclusions were reached and to specify carefully any uncertainty in those conclusions. If the other party is being asked to agree to conclusions that are contrary to their interests, they ought to be conclusions to which it is reasonable to expect them to agree, so one might expect conclusions to be stated as a ‘lowest common denominator’ to which no reasonable objection could be made. Yet it appears from Gary Edmond, Sophie Carr and Emma Piasecki's study of SFRs that at least in some cases the conclusion is stated in more categorical terms than forensic scientists today usually adopt – namely as categorical identifications of the defendant. They also point out that the only justification provided for exempting streamlined reports from the requirements of these rules is considerations of cost and efficiency. 97
From an ethical point of view SFRs are clearly troubling. In terms of the five core principles, preparing such reports meets the requirement of audience relevance inasmuch as it provides a particular audience, typically the CPS, with information in a form that is relevant to their practical purposes. But ethical scientists cannot disregard the effect of their statements on audiences other than their immediate clients 98 – in this case, on defendants and their lawyers. To give the information relevant to the decisions they have to make, and to avoid ‘misleading incompleteness’, the report must comply with the principles of process transparency and specification of uncertainty.
A possible escape from this dilemma is suggested in a highly significant footnote in the Forensic Science Regulator's Codes of Practice and Guidance, advising that ‘where those preparing the SFR1 are aware of further information that might meet the test for common-law disclosure set out above, that information should be communicated to the investigator and by the investigator to the prosecutor’. 99 The prosecutor is then under a legal obligation to disclose the material to the defence as soon as is reasonably practicable. 100 The test that is ‘set out above’ forms part of the guidance as to the duty (not just at common law but under the Criminal Procedure and Investigations Act 1996) to disclose any failure to comply with required standards, as this is ‘information that could significantly detract from the credibility of a witness and may have a bearing on reliability’. 101 Not only non-compliance but other matters that significantly affect reliability fall within this test: for example the Regulator mentions the case of Kumar which, as we have seen, upheld a finding of professional misconduct where a witness failed to mention the controversial nature of the diagnosis used in his report. 102 By analogy, to rely on a controversial method, or one that had not been scientifically validated, without either making this clear in the report or informing the investigator as the Regulator recommends, could be considered reckless (and contrary to the inviolable principle of honesty) because it creates an unacceptable risk that defence solicitors on whom the SFR is served will fail to realise that the results are ones that they ought to challenge. This is not strictly speaking a matter of common-law duties of disclosure but rather of professional ethics, combined with the statutory duty of the prosecution to disclose the expert's reservations to the defence.
An ethical scientist should also ensure that the report itself does not use misleading or unduly categorical language. It would not detract from the time- and cost-saving goals of streamlined reporting to eschew categorical identifications in favour of terms such as ‘associated with’, or, where appropriate, to add ‘boilerplate clauses’ pointing out, for example, that an association between an individual and a trace on an object does not necessarily indicate direct contact between the individual and the object.
Conclusion
Although developed in a very different context from that of forensic science, Keohane et al.'s core principles are based on values of accessibility and assessability that are highly pertinent in the forensic context, and they should resonate with forensic specialists who cherish their identity as scientists rather than servants of law enforcement or of defence lawyers. With some modification, they provide both a good fit with existing legal standards and professional guidance, and a salutary reminder of the importance of openness and ‘epistemic humility’ in reporting forensic results.
The two significant departures I have proposed from Keohane et al.'s framework reflect the differences between forensic and policy-related communication, and the importance of explanatory narratives in legal fact-finding and the evaluation of testimony in general. 103 Thus I have replaced ‘policy relevance’ with ‘narrative relevance’ and have placed greater emphasis than Keohane et al. do on making processes of research, testing, inference and peer review transparent not just to scientifically qualified audiences but to all those to whom scientific communication is addressed – particularly, in this context, the parties and the jury. Because Keohane et al. focus on communication to ‘attentive audiences’ such as parliamentary committees and NGOs, they can reasonably assume that many members of the audience will be scientifically qualified, and those who are not will rely on the judgment of those who are. In general, however, as Keohane's co-author Lane points out in a related article, 104 the ability of experts to explain the basis of their conclusions is crucial to establishing their legitimacy from the perspective of a lay audience.
The explanation that a lay audience needs is one that not only justifies the expert's claim to authority but also points out the parts of the process where there are possibilities of error, bias, disagreement or alternative explanation. To expect experts to highlight areas of uncertainty is a demanding standard – particularly given the pressures to depart from it in the interests of efficiency and ‘streamlining’ – but it is one that is endorsed in the forensic science literature. 105 If experts live up to it they can deliver evidence that is reliable, accurate, accessible and assessable.
Footnotes
Acknowledgements
The author has benefitted greatly from discussions both of Keohane et al.'s work and of a draft of this paper in an online Reading Group on the Philosophy of Experts and Expertise organised (via Facebook) by Gabriele Contessa. He thanks especially Gianluca Andresani, Rachel Herdy, Charlie Lassiter and Jamie Watson.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship and/or publication of this article.
