Abstract
Risk Assessment Technologies (‘RAT’) are currently used by criminal courts to evaluate defendants’ risk of reoffending. Despite criticism concerning opacity, complexity, non-contestability and discriminatory impact, some courts have favoured RAT-considerations, in light of alleged accuracy, effectiveness and efficiency. This contribution assesses whether and the degree to which RAT can comply with evidence-related rules and fundamental human rights. After detecting key legal challenges in the United States (US), it makes recommendations for the proper regulation and consideration of RAT. It then moves on to the European regime with a view to assessing whether and how it can better tackle RAT-implementations in the criminal justice system.
Introduction to RAT
A smart map-app works like this: the user gives destination and location data (input); the map-app processes the input and considers relevant factors that may affect the trip (weather conditions or traffic, for example); and it gives an output, an optimal route that the user can follow. The user may not know which factors the map-app considers, whether all relevant factors are properly taken into account, how they are weighed, how the output is reached, how reliable/accurate the map-app is (compared with other map-apps) or with whom the map-app might share the user’s location or other data.
Do we want this decision-making-rationale in criminal justice settings?
In many criminal justice systems, judges must consider and evaluate evidential materials that are relevant and sufficiently reliable, often in conjunction with other corroborating elements, that they properly weigh. On this basis, they make decisions that need be adequately transparent, fair, justifiable and challengeable. These important dimensions of judicial decision-making can be affected today, where judges consider ‘Risk Assessment Technologies’ (‘RAT’), whose relevance and reliability may be questionable and whose legal status and technological modus operandi can resist transparency, fairness and justifiability.
Trained on group data and applied to individual defendants, RAT input various items of information (like education, gender or criminal history), analyse these items through an algorithmic process and give an output, a score, labelling the defendant as being at a certain (low/high) risk-level to reoffend. In the US, RAT have long been used at various stages of the criminal process, including pretrial-, parole-, prison-, probation- and sentence-related contexts. 1 In Europe, RAT may, in the near future, be actually introduced, since national lawmakers have recently been urged to take measures to regulate artificial intelligence in criminal justice. 2
This contribution analyses the US experience on RAT with a view to identifying key legal challenges and, ultimately, assessing whether and the extent to which Europe is prepared to implement such technologies. More concretely, this paper considers the current (US) legal regime on RAT inadequate. The overarching narrative is this: most regulators have not yet expressly recognised the legal status of RAT as an integral part of criminal procedure (via introducing specific legal provisions in, for example, codes of criminal procedure or rules on evidence); as a result, they have failed to guarantee, first, that judges can fully access and comprehend how RAT work and thus properly consider/review them and, second, that the defence can fully access and effectively challenge RAT. To overcome these failures, this contribution recommends the recognition by law of the legal status of RAT and, in particular, their subjection to evidence rules and rigorous human rights-compliance scrutiny. It further proposes: to make the RAT-source code visible to and reviewable by the judge, the defence and its expert; to exclude certain insufficiently reliable RAT and scrutinise all RAT; to set standards on relevant risk factors to avoid undue discrimination; and to enact concrete laws regulating RAT-implementations.
The discussion proceeds as follows. After analysing key functions of RAT, their promises and their limitations, this paper addresses whether and the degree to which RAT may comply with US evidence-related rules. Thereafter, it examines whether and how RAT can respect fundamental human rights and, in particular, due process, equal protection and privacy; and it makes recommendations to regulators. Moreover, it analyses the European legal scheme (vis-à-vis the US-discussion and findings) and makes comments that might be useful for European audiences, who may in the near future see RAT-implementations in national criminal courts. Last, this paper summarises and draws conclusions.
Modus operandi and key challenges
RAT are trained on group data, a reference group of offenders. They further consider certain risk factors, such as criminal history, education, economic situation, family, gender, age or employment status. 3 Thereafter, they evaluate an individual defendant via risk scales and scoring; for example, from 0 to 10 points, where ‘0-3’ may mean low risk, ‘4-6’ can refer to moderate risk and ‘7-10’ may be seen as high risk. 4
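As an illustration only, the banding logic just described can be sketched in a few lines of Python; the 0-10 scale and the cut-offs (0-3 low, 4-6 moderate, 7-10 high) are the example values from the text, not those of any actual tool.

```python
def risk_band(score: int) -> str:
    """Map a 0-10 risk score to the illustrative bands from the text.

    The cut-offs (0-3 low, 4-6 moderate, 7-10 high) are hypothetical
    example values; real RAT use tool-specific scales and thresholds.
    """
    if not 0 <= score <= 10:
        raise ValueError("score must be between 0 and 10")
    if score <= 3:
        return "low"
    if score <= 6:
        return "moderate"
    return "high"
```

The banding step is trivial by design; as the sections below argue, the legally contested part is everything that happens before it, that is, how the underlying score is computed.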
The factors considered to assess the risk of reoffending have been selected as (allegedly) accuracy-enhancing criteria after long empirical studies in the field; and, in general, RAT are believed to promote efficiency in the criminal process, as well as effectiveness of the overall evaluation of a given defendant. Given these promises, which, according to RAT-advocates, can offer a more complete picture of the accused, state courts in the US consider such technologies when making determinations on pretrial-, parole-, prison-, probation- and sentence-related issues.
Despite these promised benefits, the literature has thus far reported a number of serious challenges. First, there can be opacity and complexity. 5 RAT may be proprietary and access to the way in which they function may be denied; and, even where access is granted, judges may lack the expertise to understand the modus operandi of contemporary sophisticated RAT, engaging in non-human-guided processing. 6 Second, there is the unfair discrimination-argument: sorting defendants into groups on the basis of certain protected grounds (from gender to wealth) may raise constitutional concerns. 7 Third, there is critique for error rates, bias or lack of review/validation. 8 In fact, at least one court considered a RAT that had not been validated for the state’s own population. 9
In light of the above critique, it may be fair to claim that reliability of a given RAT is not to be taken for granted. This is particularly the case with private RAT that may be subject to proprietary rights not allowing for access to the source code; the latter is an essential element in assessing reliability, because it reveals what tasks are performed by a technology and in which way and sequence it performs these tasks. 10 That certain RAT may not be fully accessible and reviewable by the judge and the defence calls for an assessment of RAT’s compliance with evidence law and fundamental human rights.
Compliance with evidence law
In general, fact-finding has been seen as a human-oriented process, aimed at giving transparent and reliable information. 11 To satisfy these basic demands of the fact-finding process (a ‘human-oriented’ process favouring ‘transparency’ and ‘reliability’), RAT-assessments must involve a human expert, capable of accessing and scrutinising the technology and its reliability. This is of utmost importance, given that RAT, as stressed earlier (in section 2), may not be fully human-driven or entirely intelligible to humans. 12
In what follows, this section focuses on the US, where RAT are applied in practice. It examines the US legal regime on evidence, in general, and expert evidence, in particular, with a view to assessing whether and the degree to which RAT can comply with general evidence-rules and qualify as expert evidence. 13
In the US legal regime, the Federal Rules of Evidence are aimed at, among others, establishing the truth. 14 Here, information can be admitted as evidence, if it is relevant. 15 Relevant evidential materials can be excluded, if their probative value is outweighed to a considerable degree by a possible harm, stemming from their admission, including unfair prejudice or the risk of misleading the factfinder. 16 In the RAT-context, this can mean that RAT may be excluded, insofar as their probative value is outweighed by potential unfair prejudice, in the sense that they unfairly disadvantage defendants on certain grounds –especially in light of the aforementioned discrimination-critique, as well as possible error rates, bias and lack of review and/or validation. 17
Concerning expert evidence (see RAT-assessment below), witnesses can testify, provided: they are qualified; their expertise will assist the court in comprehending the evidence or determining a matter; their testimony relies upon adequate data and stems from ‘reliable principles and methods’; and they have ‘reliably applied the principles and methods’ to the particularities of a given case. 18 Furthermore, experts can in principle rely ‘on facts or data in the case that the expert has been made aware of or personally observed’. 19 In addition, experts can testify without having to analyse ‘underlying facts or data’; but they can be obliged to reveal such data when cross-examined. 20
Settled case law has offered useful insights into ‘when and how’ to consider expert evidence. When determining whether to accept such evidence, judges need consider certain minimums (in a non-exclusive fashion), known from the Daubert-test:
- ‘whether the evidence can be (and has been) tested’;
- ‘whether the theory or technique has been subjected to peer review and publication’;
- ‘the known or potential rate of error’;
- ‘the existence and maintenance of standards controlling the technique’s operation’; and
- ‘the degree of acceptance within the relevant scientific community’. 21
To qualify as expert evidence, RAT need involve a person (expert to testify on the RAT-output), who (according to the US legal provisions analysed above): 22 is qualified; has expertise that can assist judges in understanding the RAT or in determining a matter (like parole- or sentence-related issues); has based the testimony on adequate data and has used ‘reliable principles and methods’; and has ‘reliably applied the principles and methods’ to the case at hand. In this regard, ‘adequate data’ would demand large training datasets and accurate information; this requirement might not be met where samples are unrepresentative or where biased or manipulated data are involved. In fact, representativeness is crucial, since the choice of the reference/training group can affect the meaning/interpretation of the risk scoring. For example, if a RAT is trained only on high-risk offenders, then the ‘0-3’ scoring (in a ‘0-10’ scale) will not be ‘low’ risk. In addition, the ‘reliable methods’-demand, in light of the Daubert-test, might not be fulfilled, especially where error rates of a given technology are particularly high or the RAT has not been validated for a state’s own population. The other Daubert-criteria (described above) can be satisfied, if a given RAT is sufficiently tested/peer-reviewed, uses operation-standards and is accepted by the scientific community. 23 Last, as mentioned above, experts can testify without having to analyse ‘underlying facts or data’. 24 This is relevant to alleged opacity of RAT. Still, experts, requested to reveal such data in cross-examination, would probably need to disclose essential items of information, such as the source code.
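The representativeness point can be illustrated with a small, purely hypothetical sketch: the same raw score occupies very different positions in different reference groups, so a ‘low/moderate/high’ label depends on the training sample. The two groups and all numbers below are invented for illustration.

```python
from bisect import bisect_left

def percentile(score: float, reference_scores: list[float]) -> float:
    """Fraction of the reference group scoring strictly below `score`."""
    ranked = sorted(reference_scores)
    return bisect_left(ranked, score) / len(ranked)

# Hypothetical reference groups (illustrative numbers only):
general_population = [1, 2, 2, 3, 4, 4, 5, 6, 7, 8]
high_risk_only = [6, 7, 7, 8, 8, 9, 9, 10, 10, 10]

# A raw score of 3 sits near the middle of the general group but below
# everyone in the high-risk-only group: the same number carries a
# different meaning depending on the reference sample.
print(percentile(3, general_population))  # 0.3
print(percentile(3, high_risk_only))      # 0.0
```

This is why, as the text notes, a ‘0-3’ score from a tool trained only on high-risk offenders cannot simply be read as ‘low’ risk.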
To sum up, to fully comply with evidence-related law, RAT must, at least: be reviewable by a human expert, who is capable of accessing the RAT and rigorously scrutinising their reliability; meet adequate levels of accuracy; have minimised bias; be validated for a state’s own population (where it is applied); be regularly tested; deploy operation-standards; be accepted by the scientific community; and have source codes and/or use methods/methodologies that are visible to the judge and the defence.
Compliance with fundamental human rights
This section aims to assess whether and the degree to which RAT-implementations can respect fundamental human rights and, in particular, the right to due process, equal protection and privacy, which appear the most relevant, in light of RAT-limitations (detected in section 2) and, especially: opacity and complexity; 25 unfair discrimination (on the basis of protected grounds); 26 as well as error rates, bias and lack of adequate review and validation. 27
The right to due process
The above critique on RAT appears to call for a rigorous assessment of whether and how these technologies can comply with due process. 28
This subsection analyses:
- due process in general (in pre-trial, trial and post-trial phases), requiring high-levelled fairness and reliability (in pre-trial phases), unbiased decision-making (in trial and post-trial settings) and other crucial minimums (like accessibility) that contemporary RAT might fail to satisfy (for instance, due to alleged error-rates or lack of transparency);
- due process in sentencing, as addressed by the Loomis-court, offering useful insights into what types of RAT could be considered and how they should be taken into account by judges;
- the right to confrontation, focusing on the right of the defence to have alleged reliability questioned; and
- the compulsory clause, requiring accuracy and reliability.
Due process in general: Pre-trial, trial and post-trial procedures
In general, the right to due process can apply to numerous procedures, 29 including pretrial, trial and post-trial settings.
On pre-trial phases, the US Supreme Court has stressed the need for fairness and reliability, including, among others, fairness in prosecution, 30 justness and reliability of identification conducted by the police 31 or true notice and voluntariness of plea-bargain. 32
On trial, the same court has offered useful insights into:
- the presumption of innocence: instructions on this presumption need be given upon request; 33
- the proof beyond reasonable doubt: the accused can only be convicted where such proof is established; 34
- evidence-related issues, including: the right to confrontation and cross-examination, 35 the right to adversarial proceedings, 36 the defence’s genuine chance to be heard and present its own evidence, 37 the protection against convictions relying on false testimonies that the prosecution knew or should have known were false, 38 and access to evidence that is ‘material’ to sentencing and punishing (in the sense that the result of the case would not have been the same had such evidence been disclosed); 39 and
- the requirement that judges and juries be unbiased. 40
On post-trial stages (sentencing is addressed below in more detail), like parole-revocation hearings, there are some minimums, including disclosure of relevant evidence, the genuine chance to be heard and bring witnesses or other evidence, the right to confrontation and cross-examination or impartiality of parole-boards. 41 It is added that, on prison-issues, similar minimums are demanded in relevant hearings, such as the right to prepare for transfer-hearings and bring evidence or have other evidence disclosed, the chance to bring, confront or cross-examine witnesses and impartiality of decision-makers. 42
To sum up, the above analysis seems to suggest that RAT with particularly high error rates or bias could seriously affect fairness and reliability (in pre-trial phases) or run counter to the demand that decision-making be unbiased (in trial and post-trial settings). Moreover, private RAT, offering no access to the source code, could have a hostile impact on various due process-aspects, including: confrontation and cross examination (further discussed below); the adversarial nature of proceedings; reviewing reliability of the expert testimony; or accessibility of evidence (in trial and post-trial stages).
Due process in sentencing
In the (pre-) sentencing phase, due process can entail various rights, such as the right to be sentenced on the basis of accurate information and the right to individualised sentencing. Whether the use of RAT respects these rights has been addressed in Loomis, a case favouring the use of RAT at the presentencing phase.
In this case, the Supreme Court of Wisconsin found no violation of the right to be sentenced on the basis of accurate information: it was satisfied by the fact that the defendant (Mr Loomis) had access to and could challenge the input (such as the public records) and the output (the bar charts) of a private RAT (namely, COMPAS, protected by trade secrecy). Even though Mr Loomis had the chance to access/contest inputs and outputs, he was given no chance at all to challenge the black box between these inputs and outputs, which was protected by trade secrecy and which could affect accuracy of the output (and, thus, accuracy of the content of the presentence investigation report upon which the sentence-decision was based). Furthermore, denial of access to the modus operandi (including the proprietary source code as the means to assess accuracy) 43 can prejudice effectiveness of the challenging. Moreover, it is noted that the Supreme Court of Wisconsin, when rejecting Mr Loomis’ claim on accuracy, made explicit reference to the designer’s guide; 44 yet, it may be doubted whether a for-profit firm, benefiting from the use of proprietary COMPAS by courts and, thus, biased, can be relied upon to assess accuracy of its own product. Last, in this case, the court permitted the consideration of COMPAS, despite lack of validation for Wisconsin’s own population; 45 this appears to run counter to the need, highlighted by the court itself, for continuous monitoring and testing of the technology 46 (admittedly, testing on the population to which it is applied; in that case, the population of Wisconsin).
Thereafter, the Supreme Court of Wisconsin found compliance with the right to individualised sentencing. The court was satisfied by the fact that COMPAS was neither the sole nor the determinative criterion to decide upon the sentence (it was only used as a corroborating tool). Moreover, the court referred to effectiveness and the need for more complete information; and it concluded that judges, when adjudicating on a concrete case, are ‘expected’ to take into due account that COMPAS-like tools process/consider group data. Still, it may be doubtful whether, how and to what extent judges should consider RAT, especially in light of automation-bias that could lead to over-reliance upon the technology (probably without judges being aware of their over-trust in the technology). 47
On the positive side, the Loomis-court permitted the consideration of (the risk portion of) COMPAS at the presentencing stage and conditioned it upon certain key-disclaimers that could to a considerable extent tackle (the above-mentioned) critique on RAT’s opacity, complexity, unfair discrimination, bias, inaccuracy or insufficient review/validation. More concretely, it conditioned this RAT-consideration on the provision by the presentence investigation report of certain items of information on the tool’s opacity (due to possible subjection to intellectual property rights), lack of validation for Wisconsin, alleged bias and the need for review. 48
Although there have been cases benefiting RAT-implementations, often with the main argument that RAT serve effectiveness and accuracy and are given little weight by judges, 49 a rigorous due process-compliance-test (for pre-sentencing, in light of the Loomis-court's findings) would, in my view, enhance legal certainty, if it required that: judges and the defence be granted access to the RAT (and its source code, if this is necessary to understand its decision-making process) and, thus, be enabled to comprehend and effectively review RAT; RAT be validated and regularly reviewed (for specific populations, to which they are applied); RAT be used as mere corroborating tools, only in conjunction with other materials that can independently support the judicial decision; and RAT be given little weight by courts.
The right to confrontation
The right to confrontation (protected under the Sixth Amendment of the US Constitution) 50 appears particularly relevant for RAT, given the latter’s possible protection by intellectual property rights and, hence, potential resistance to accessibility that is, in turn, demanded to scrutinise reliability and credibility of a given technology.
More precisely, according to settled case law of the US Supreme Court, the right to confrontation is aimed at enhancing fairness in proceedings and providing minimum safeguards for the defence to effectively challenge allegations against it, including the genuine opportunity for cross-examining witnesses, as well as assessing their credibility and reliability. 51 The US Supreme Court has stressed that consideration of others’ interests can create no exceptions to ‘the irreducible literal meaning of the clause’; 52 and that reliability is something that has to be assessed, not presumed, not taken for granted (even if a testimony is ‘obviously’ reliable, the right to confrontation cannot be reduced). 53 Still, in some cases it can be limited. Of particular relevance for the purposes of the analysis is the trade secrecy privilege, which allows the secret-holder to deny disclosure of its secret.
More concretely, under the US regime, ‘trade secret’ is information, including programs or codes, whose holder has made reasonable efforts to keep it hidden and whose value stems from its secrecy. 54 Thereafter, ‘misappropriation’ of trade secrets is prohibited, when, among others, conducted by ‘improper means’; this does not, however, include ‘reverse engineering (…) or any other lawful means of acquisition’. 55 Other exceptional, non-prohibited activities on trade secrets refer to lawful actions of the government, as well as disclosures in judicial proceedings, provided confidentiality is respected. 56
Despite the above trade secret-provisions, excluding ‘reverse engineering’, ‘any other lawful means of acquisition’ 57 or disclosures in judicial proceedings under certain circumstances, 58 and even though the US Supreme Court has called for narrow interpretation of privileges ‘because they impede the search for the truth’, 59 historical developments 60 and recent case law have supported the trade secret privilege in criminal proceedings. 61 A prime example is Chubbs, 62 a 2015 (unpublished) judgement of the California Court of Appeals that denied disclosure of the source code of a technology, after considering it a trade secret and applying the privilege-provision (of the California Evidence Code). 63 To this appellate court, Mr Chubbs failed to demonstrate that disclosure was relevant and necessary to assess reliability of relevant information. 64 It is added that case law on RAT, including Loomis, has also seen these technologies as trade secrets, not subject to disclosure. 65
In a thorough analysis, Wexler has argued that hiding ‘information from the accused because it is a trade secret mischaracterizes defense advocacy as a business competition’. 66 The author has further detected serious challenges that may be raised by the application of the trade secrecy privilege in criminal proceedings. More precisely, to Wexler, trade secret privileges can exclude information with high probative value 67 (that could include the source code in the case of RAT). It could be further claimed that such exclusion could prejudice fundamental principles of the fact-finding process, especially transparency and the need for enhanced reliability. Moreover, as Wexler analyses, when applying the trade secret privilege, judges follow a three-step-test: whether there is a valid trade secret and what harm may be caused by its disclosure; whether it is relevant and necessary to assess reliability (whether it can prove or disprove something); and a balancing task: potential harm of disclosure (to the party, not the public) versus the need for information to find the truth. 68 The author points out that this balancing task may be troublesome, because it risks striking a balance between intellectual property rights (admittedly of economic nature) and defence rights – and it often favours the former over the latter. 69 In addition, Wexler doubts proportionality of this privilege, since there are other means, less harmful for the defence, to achieve the goal of deciding on disclosure of a trade secret (such as court orders or subpoenas). 70 Furthermore, the author stresses that the trade secret privilege may not comply with some aspects of trade secret- and privilege-law. 71 On trade secrecy-rules, the privilege may run counter to the law’s exemptions on ‘reverse engineering (…) or any other lawful means of acquisition’ 72 or disclosures in judicial proceedings; 73 this is particularly problematic, because, in this way (Wexler argues), trade secrets are unjustifiably given much more protection in the criminal justice setting than in other areas, like business and competition. 74 On privilege-law, as stressed by the US Supreme Court, privilege rules need be narrowly interpreted, because they have a hostile impact on truth-finding. 75
To Wexler’s critique one could add: that privilege rules apply to all stages of criminal proceedings, 76 which may not be the case with fundamental rights of the accused that may be limited to concrete phases (like the confrontation right that has been seen as a trial-right); 77 that the trade secret privilege may run counter to the criminal procedure’s (and fact-finding’s) demand for fairness, access and effective challenging, creating a presumption of RAT-reliability that the defence must rebut, by demonstrating that, for example, the source code-scrutiny is relevant and necessary to assess reliability; and that existing discovery- or inspection-instruments, especially protective orders, may not suffice. Although RAT can be subject to disclosure, under the Federal Rules of Criminal Procedure: 78 first, it is the defence that must demonstrate that, for example, the source code is relevant and necessary for assessing reliability; second, judges enjoy discretion and may reject the disclosure-request; 79 and, third, even if they accept it, they may only grant access to the summary of the RAT-report, whose content might be limited to the opinions (and their grounds) and the qualification of the RAT-expert (not, for instance, the source code). 80 The same can be the case with legal provisions at state level that can still put the burden of proof on the shoulders of the defence: it must demonstrate that the information, sought for disclosure, is relevant and necessary to assess reliability. 81
In light of the above considerations, it would be fair to argue that a rigorous confrontation-compliance test would demand, not balancing financial interests of a RAT-designer against fundamental rights of the accused but, striking a fair balance between, on the one hand, the need for enhanced accuracy and effectiveness, served by the use of allegedly reliable RAT, and, on the other hand, the right of the defence to have alleged reliability questioned. This would perhaps mean, in light of judicial discretion in rejecting disclosure-requests or the risk that judges only grant access to the summary of the RAT-report, the preference of free source code technologies in criminal settings and the introduction of a presumption of unreliability for all proprietary RAT: judges could, on their own motion, order disclosure of proprietary RAT –provided, of course, that confidentiality-minimums are respected.
The compulsory clause
Protected under the Sixth Amendment of the US Constitution, 82 this clause is aimed at guaranteeing that evidence favouring the defence be included and at promoting the adversarial nature of proceedings, 83 allowing the defence to present its own version of the facts. Although this can ensure that the accused can bring witnesses in its defence, it is not an absolute right; it may be limited in light of the public interest, such as where discovery-provisions are intentionally breached to gain ‘tactical advantage’. 84 It is added that the US Supreme Court has found that this right can offer no more protection than the right to due process; it is the latter, as a broader right, that applies when determining whether, for example, to demand the government to present exculpatory materials. 85
In light of these considerations, one could argue that a stringent compulsory-clause-compliance test would demand disclosure of the RAT and its source code to the defence, so that the latter can, upon expert review, present its own interpretation of a given RAT-output. Whether this disclosure could be reduced, in light of the public interest, may be doubtful: what is at stake seems to be, not private interests, like trade secrecy analysed above but, accuracy and reliability of the process. Such accuracy and reliability may only be enhanced by expert-review; something that would hardly be regarded as breach of discovery-related law to gain ‘tactical advantage’.
The right to equal protection and non-discrimination
The right to equality and non-discrimination is protected by the Fourteenth Amendment of the US Constitution. 86 In general, the US Supreme Court has interpreted the equal protection clause as prohibiting, not all forms of discrimination but, only unfair discrimination. 87 Unfairness has been found in cases where discrimination leads to ‘total arbitrariness’ and lacks any reasonable grounds; the defence may bear the burden to demonstrate such absence of any reasonable basis. 88 While, in some cases, judges assess whether a practice or law has a permissible purpose, 89 in other cases, the test aims to find reasonableness (or, conversely, arbitrariness) of a discriminating practice or law and whether there is a basis substantially connected with the subject-matter of the law or practice. 90 As the analysis will demonstrate, depending on the discrimination-ground, judges can apply an intermediate test (focusing on rationality of the classification) or a more rigorous scrutiny, demanding a heavier burden of justification and necessity of the discriminating measure/law.
In light of the factors considered by RAT, in what follows, this section addresses undue discrimination on the basis of race, gender, wealth/poverty, age and undue discrimination that may affect other fundamental rights or freedoms.
Race
In case of race discrimination, the US Supreme Court, in general, applies a rigorous test 91 demanding ‘a far heavier burden of justification’, as well as necessity of the measure/law at hand (not only its rational linkages to the achievement of the goal pursued). 92 Although there have been cases, where racial discrimination may be deemed undue as such, regardless of intent, 93 in other cases the US Supreme Court has demanded, not only discriminatory impact but also discriminatory intent; 94 something that the defence need demonstrate. 95 Race discrimination has been addressed in various contexts, 96 including the judicial system, where race-discriminating convictions 97 or racially discriminatory sentencing can be prohibited (provided intent and effect are demonstrated by the defence). 98
In the RAT context, it seems that the consideration of race (or proxies for race, like residence-related information) can be subject to the above-analysed rigorous test demanding heavy burden of justification and necessity. However, the intent-requirement may place too high a burden on the defence. For example, although it can be argued that race-discriminating sentencing (after the consideration of a given RAT) should be prohibited, it may be particularly hard for the defence to demonstrate that race (or a race-proxy) was included with the objective of discriminating.
Gender
In the past, gender discrimination against females 99 (or in favour of males) 100 was tolerated, although such differential treatment was later (in the 1960s) abandoned. 101 Today, judges apply an intermediate scrutiny test, under which gender discrimination must be aimed at a crucial goal, as well as strongly linked to achieving this goal. 102 Moreover, there have been cases where the US Supreme Court has demanded both discriminatory effect and discriminatory purpose, 103 as well as cases demanding ‘exceedingly persuasive justification’. 104
In the RAT-context, the above intermediate-level scrutiny could focus on the importance of the goal served (like the accuracy of the defendant’s evaluation) and the discrimination’s linkage to this goal. In this regard, case law on RAT has favoured the inclusion of gender as an accuracy-enhancing element (at least, at the presentence phase). 105
Poverty/wealth
On wealth discrimination, the US Supreme Court has, on the one hand, demanded careful scrutiny 106 and, on the other hand, been satisfied with a rational relationship between the discriminating measure/law and the goal pursued. 107 In any event, it seems that, where the area affected is sensitive, the discriminator need demonstrate that the measure/law is sufficiently justifiable. 108 In the concrete context of criminal justice (which may be deemed sensitive, demanding fair treatment of defendants), the US Supreme Court has repeatedly opposed wealth-based discrimination and has demanded: open and equal access; 109 adequate and effective challenging of judicial decisions (including post-trial hearings); 110 the provision of free transcripts in certain cases (including post-conviction proceedings); 111 full respect for the poor’s right to counsel, 112 including effective (active) legal aid by counsel (on appeal); 113 prohibition of sentencing to a prison-term beyond the statutory maximum on the basis of indigency; 114 as well as prohibition of the imposition of imprisonment for a crime punishable only by fine, due to the sole fact that the poor defendant cannot pay. 115
In the case of RAT, it may be fair to argue that, given the sensitive nature of criminal proceedings, the discriminator, not the defence, would bear the burden to prove adequate justifiability of the RAT-implementation. Here, of special importance appear to be: equal access to the technology (by the parties and, probably, the judge); sufficient and effective challenging by the defence (that might demand access to the source code); or active and effective legal aid when challenging the RAT (this could include expert assistance in evaluating the source code).
Age
The US Supreme Court has found that age-based discrimination is not suspect and therefore not subject to rigorous scrutiny; here, a rationality check may suffice. 116 This could admittedly favour RAT consideration, in light of the rather intermediate-level testing seeking mere rationality; for instance, age may rationally be regarded as an accuracy-enhancing factor, given empirical evidence that younger individuals tend to re-offend more often than older people.
Impact on other fundamental rights
Where other fundamental rights are affected by a classification, the US Supreme Court has demanded rigorous scrutiny. Not only must the discrimination at hand be justified by a compelling interest; it must further be deemed necessary to achieve the desired goal. Importantly, in such cases there is no presumption of constitutionality of the discriminating measure. 117 For instance, the US Supreme Court has found that some laws on mandatory sterilisation of certain (habitual) offenders can seriously affect the right to have a family and has subjected them to rigorous scrutiny. 118 In the RAT context and in light of possible prejudice to other fundamental rights, scrutiny could refer to key issues, including: the existence of a compelling public interest (such as the need for accuracy); or the necessity of the RAT-implementation to achieve the goal pursued. In any event, it would be fair to demand that no presumption of constitutionality be applied to favour the RAT-consideration by judges.
The right to privacy
From the outset, it need be stressed that any criminal court must have access to the profile of the defendant to more accurately adjudicate on an individual case. 119 Exceptions to this open-trials-rule can include confidentiality of some items of information, anonymity of certain witnesses, as well as secret proceedings in special cases (for example, to protect national security). 120 However, privacy becomes relevant, because RAT process personal data and there may be a need for concrete laws regulating, among others, data integrity, quality, reliability, security, storage, retention or communication.
Whether the US Constitution expressly protects privacy remains debatable. Some aspects of privacy could be protected under the First Amendment (for example, privacy of beliefs) 121 or the Fourth Amendment (on privacy and protection against unreasonable searches and seizures). 122 In general, the US has adopted a piecemeal approach toward privacy, with specific legal instruments targeted at specific technological uses. 123 For example, the US Privacy Act of 1974 includes fair information practices on personal data processing by federal agencies; and the Fair Information Practice Principles (the ‘FIPP’) can be seen as soft law with a long-standing influence on privacy-related regulations. 124 Many state laws have implemented these principles. 125
Of relevance in the future might be the proposed Data Protection Act of 2021. First introduced in 2020, 126 this initiative aims to create an independent federal agency (the Data Protection Agency) to tackle high-risk data practices and the ‘collection, processing, or sharing of personal data’. 127 Such practices would include activities linked to the processing of data referring to protected characteristics of individuals. 128 It may, however, be questionable whether those involved in RAT-data-processing would qualify as ‘data aggregators’, a term demanding large-scale and for-profit processing. 129
At state level, one may detect legal provisions reflecting some privacy aspects, for example, when treating certain items of information as confidential, as part of the record of a criminal case. Although, as mentioned above, it can be argued that privacy concerns are not of much relevance in the context of RAT (for it may be preferable that criminal courts enjoy access to the personal data of the accused to better and more accurately evaluate her/him), detailed laws governing the consideration of RAT would be desirable, both to enhance the integrity and reliability of relevant data and to address other processing-issues, ranging from the supervision of RAT to the training of personnel. In this regard, a prime example is California. Its 2022 Rules of Court, expressly influenced by Loomis, offer a particularly useful standard-setting on RAT. Aimed at minimising sentencing-bias and recidivism rates, these rules: provide concrete definitions of the technologies, their inputs and their outputs; expressly demand validation; require consideration of RAT only in conjunction with other evidential materials that can independently support the decision; oblige judges to take into due account experts’ comments on RAT-limitations (for example, reliance on group data, subjection to property rights, potentially unfair and discriminatory labelling on the basis of protected characteristics, or non-validation); explain how judges should interpret the output; and demand training. 130
To sum up, it seems desirable to have detailed rules on the processing of personal data in RAT-contexts, not so much to protect against unauthorised communication or inadequate supervision and monitoring, but primarily to offer minimum guarantees on the integrity, reliability and accuracy of relevant data that have an impact on the fact-finding process.
Recommendations
Recommendation 1: Recognise by law the legal status of RAT
The analysis thus far has demonstrated that the subjection of RAT to evidence-related law and the application of a rigorous human-rights-compliance-test (at least, for due process, equality and privacy) could to a considerable extent resolve uncertainties and challenges posed by the consideration of such technologies by judges. Indeed, if the law expressly recognised (for instance, via concrete provisions in the code of criminal procedure) the legal status of RAT as evidence (in the sense that a human expert gives an opinion on the RAT-output) and as an integral part of the criminal process, which must respect human rights, then the defence and the judge would be given a genuine chance both to access and to review the RAT, their accuracy and their reliability. This is hardly the case with contemporary RAT, which, as mentioned in the introduction, are in practice considered by judges but remain un- or under-regulated.
Recommendation 2: Make the source code visible
What both the evidence- and the human rights-related analyses have shown is, first of all, the need for considering RAT whose source code is visible to and reviewable by the judge, the defence and its expert. More concretely, on evidence, an adequate reliability-assessment by a human expert would admittedly presuppose accessible source codes. On due process, access to the source code would: promote the right to confrontation and cross-examination; respect the compulsory process clause, by allowing the defence to present its own view on the RAT-output; enhance the adversarial nature of proceedings; guarantee the reliability of the expert testimony; ensure the openness of evidence at trial and post-trial stages; offer the defence a genuine chance to effectively review and challenge the RAT; and give both the defence and the judge the opportunity to truly comprehend the technology. On non-discrimination, access to the source code is supported by the US Supreme Court’s call for minimum safeguards (especially on wealth-based discrimination), including equal access to the RAT, sufficient and effective contesting of the RAT, and active and effective legal aid when challenging the RAT. On privacy, unimpeded access to the source code can be supported by the need for concrete laws on RAT to safeguard the reliability and accuracy of data whose processing can have an impact on the fact-finding process.
It is expressly noted that, in the case of private RAT, as the right to confrontation suggests, a presumption of unreliability for all proprietary RAT can be introduced to allow disclosure of the secret source code. Here, there can be confidentiality-minimums (like closed hearings), limiting the circle of persons authorised to access and review the source code; but this circle should include, at least, the judge, the defence and its expert. In this way, unimpeded access by the judge, the defence and its expert would enhance the transparency, objectivity, reliability and validity of fact-finding and, at the same time, respect trade secrecy-law.
At this point, it need be added that access to the source code, though desirable as offering key insights into technological performance and decision-making rationale, might present the defence with concrete difficulties. More precisely, contemporary complex technologies may use particularly lengthy and unintelligible source codes, whose scrutiny by experts might hardly offer any useful insight into the technological decision-making rationale. Hence, it might be advisable that, in such complex cases, information be given on the methods/methodologies used and relied upon by the relevant RAT (for example, on the mathematics underpinning relevant simulations and/or calculations). 131 This could work as an alternative to disclosing the proprietary source code; and, to this end, support could be demanded from the party providing the RAT in question (which would otherwise be required to disclose its source code).
Recommendation 3: Exclude and scrutinise
The discussion on the admissibility of evidence suggests the exclusion of RAT with particularly high error rates and bias, or RAT that are insufficiently reviewed or non-validated, since their probative value may be outweighed by the danger of unfair prejudice or of misleading the court. In this regard, as shown by the evidence-reliability-analysis, a human expert can review and comment on the RAT and further be required to: demonstrate actual expertise; be given data adequate in terms of quality and quantity; base the review upon reliable methods; and reliably apply them in individual cases. In assessing the reliability of a given RAT, the expert can focus on, among others: possible error-rates or bias; the representativeness of training data; whether the RAT has been sufficiently tested/peer-reviewed and validated; whether it uses operation-standards; or the extent to which it is accepted by the scientific community.
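By way of illustration only (this contribution does not prescribe any particular metric), part of such an expert reliability-review can be quantified. Given a RAT’s binary ‘high risk’ labels and the observed reoffending outcomes of past cases, an expert might compare error rates across protected groups; a sizeable gap in false-positive rates, for example, would be one warning sign to flag to the court. The following sketch uses hypothetical data and function names:

```python
from collections import defaultdict

def group_error_rates(records):
    """For each group, compute the false-positive rate (share of
    non-reoffenders labelled high risk) and the false-negative rate
    (share of reoffenders labelled low risk)."""
    counts = defaultdict(lambda: {"fp": 0, "neg": 0, "fn": 0, "pos": 0})
    for group, predicted_high_risk, reoffended in records:
        c = counts[group]
        if reoffended:
            c["pos"] += 1
            if not predicted_high_risk:
                c["fn"] += 1  # reoffender missed by the tool
        else:
            c["neg"] += 1
            if predicted_high_risk:
                c["fp"] += 1  # non-reoffender wrongly labelled high risk
    return {
        g: {
            "fpr": c["fp"] / c["neg"] if c["neg"] else None,
            "fnr": c["fn"] / c["pos"] if c["pos"] else None,
        }
        for g, c in counts.items()
    }

# Hypothetical records: (group, labelled high risk?, reoffended?)
records = [
    ("A", True, False), ("A", True, True), ("A", False, False), ("A", False, True),
    ("B", True, False), ("B", True, False), ("B", True, True), ("B", False, True),
]
rates = group_error_rates(records)
# Disparity in false-positive rates across groups — one possible bias indicator.
fpr_gap = abs(rates["A"]["fpr"] - rates["B"]["fpr"])
```

Such a check captures only one facet of the reliability-assessment described above; representativeness of the training data, validation and peer review would still need separate, qualitative evaluation.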
It is, moreover, advisable that an additional expert entity be introduced by law to perform the above reliability-assessment. This can be a public authority (an individual expert or a group of experts), bound by judge-like duties (such as impartiality and independence) and enjoying a veto-authority preventing the judge from considering the RAT-output, if it finds failures, including non-representativeness of samples during training, biased or manipulated data within the processing, high error rates or lack of validation.
The above rigorous scrutiny of RAT can be further supported by the discussion on due process; for the use of RAT with particularly high error rates or bias may seriously affect fairness and reliability in pre-trial contexts or prejudice impartiality of decision-making in trial and post-trial phases.
Recommendation 4: Standard setting to avoid undue discrimination
The discussion on non-discrimination seems to demonstrate the need for standard setting on the factors considered by RAT. Experts from various disciplines, including criminal law and psychology, can collaborate and agree upon the factors that serve accuracy, instead of having an unfairly discriminatory impact. It is through the collaboration of such experts that a set of standard accuracy-enhancing risk factors could be introduced. This standard setting could minimise legal uncertainty; for it would absolve judges from determining on a case-by-case basis whether race-, gender- or poverty-based discrimination can be justified by the goal pursued. In any event, it may be desirable that there be no presumption of constitutionality regarding the inclusion of critical factors, such as race-proxies interfering with other human rights.
Recommendation 5: Make concrete laws on RAT
As the privacy-analysis revealed, there appears to be an acute need for concrete laws regulating RAT; this, not so much because the consideration of RAT by criminal courts is a matter of privacy, but primarily because people may wish to have detailed laws regulating technologies that may interfere with fundamental rights, as well as laws safeguarding the security, integrity, reliability or accuracy of data whose processing can affect fact-finding. In this regard, useful lessons could be learned from California’s legal scheme on RAT.
European preparedness in introducing RAT in the criminal justice system
This contribution now moves on to Europe, which, as mentioned in the introduction, has recently been urged to regulate technologies in the criminal justice area. In light of the tendency to adopt risk-based perspectives in various legal areas, from environmental regulations to privacy laws and (proposed) artificial intelligence acts 132 (more on this below), European audiences may in the near future see RAT implemented at national level. When regulating RAT, it is important not to overemphasise the accuracy, effectiveness and efficiency promised by a risk-based approach at the expense of the fairness and justice counter-promised by a human rights-based perspective.
Before proceeding to the comparative analysis, it is noted that one can distinguish between two ‘Europes’, namely: the ‘Europe of the Council of Europe’; and the ‘Europe of the European Union’. The former has a larger audience, involving many Contracting Parties. Its primary (human rights-related) tool is the European Convention on Human Rights, extensively interpreted by the case law of the European Court of Human Rights. On the other hand, the ‘Europe of the European Union’ has a smaller number of addressees (the Member States); its main (human rights-related) legal instrument is the Charter of Fundamental Rights of the European Union; and the Court of Justice of the European Union is the entity responsible for overseeing compliance with Union law. As will be observed (see subsequent footnotes referring to relevant case law and legislation of both ‘Europes’), the Council of Europe and the European Union have followed a more or less similar approach to human rights protection. 133 In some instances, however, it can be claimed that the European Union can regulate in more detail and provide for more specific rules (through Directives or Regulations, for instance), 134 compared with the Council of Europe’s legal instruments, which cannot go into detail and mainly regulate by imposing broad, general principles that need be respected by the numerous addressees.
In general, one may observe some commonalities between the European and the US legal schemes. This seems to be the case with the following.

First, evidence-related law and the concepts of ‘admissibility’ (a matter of national law in Europe), 135 ‘relevance’, ‘excludability’ and ‘reliability’, 136 as well as ‘expert opinions’; indeed, the key requirements are similar and include: competence (actual expertise); independence and objectivity (the expert need be unbiased and have no interest in the outcome of the case); assistance (the expert must give the court helpful information); and reliability (with a non-exhaustive list of demands, referring to the quality and quantity of data, the validity and accuracy of the methods followed, whether there has been review, the completeness of the data available or whether the practice is established in the field). 137

Second, Europe’s fair trial-regime (Article 6 ECHR) 138 presupposes adherence to some minimums, similar to the US’ requirements, including: the need for sufficient time and facilities to prepare the defence (ECHR, art 6 para 3 lit b); the defence’s genuine chance to present the court with its arguments (including the opportunity to challenge the admissibility and reliability of evidence and contest its use); 139 the principle of equality of arms, requiring that the defence be granted access to the file, 140 that expert evidence for or against the defence be disclosed to the defence, 141 that the defence enjoy the chance to present counter-expertise 142 and that court-appointed experts be neutral; 143 the right to adversarial process, demanding that both parties be given the opportunity to become aware of and comment on evidential materials that may influence the judge; 144 the obligation imposed on judges to give adequate reasons for their decision-making; 145 and the presumption of innocence, demanding, among others, that any doubt favour the accused. 146

Third, Europe’s approach to privacy (though more abstract than the US’ concrete, targeted and piecemeal approach) appears to require, same as in the US, concrete laws, in light of the legality-test regularly applied by the ECtHR. More concretely, the legality assessment requires a national legal basis 147 that is adequately accessible 148 and foreseeable. 149 Given the ECtHR’s requirement of fulfilment of certain foreseeability-minimums, addressed below, 150 adequate RAT-rule-making could focus on: the categories of people liable to be subjected to the measure (for instance, limiting RAT applications to those convicted of serious offences); the nature of the offences that may trigger application of the RAT (such as RAT-uses only for felonies); limitations on the duration of the measure (for instance, limiting RAT-implementations to concrete stages of criminal procedure); the procedures of data processing (such as safe and secure processing of personal data by authorised entities); precautions regarding the communication of data (for example, limiting the circle of entities authorised to access such data); and the circumstances for the erasure of data (via, among others, setting out concrete deletion-periods). Moreover, the legality test demands precision and detailedness of the law, in the sense that the law sufficiently indicates ‘the scope and manner of exercise of the discretion conferred on the relevant authorities’; 151 this could mean that RAT-related provisions should expressly refer to the way and the extent to which judicial authorities may consider RAT (for instance, consideration of RAT only as corroborating tools, in conjunction with other materials that can independently support their decision).
It is added that such bright-line rule-making, preferably through the introduction of legal provisions in codes of criminal procedure (or evidence-related rules), 152 could allow for a targeted regulation of RAT in a privacy-shielding fashion that sufficiently respects the EU’s General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED). 153 More precisely, in the EU legal regime, the accountability principle requires that ‘data controllers’ 154 comply with the principles of ‘lawfulness, fairness and transparency’, ‘purpose limitation’, ‘data minimisation’, ‘accuracy’, ‘storage limitation’ and ‘integrity and confidentiality’. 155
The principle of lawfulness requires that the processing be based on a legitimate ground. This principle could be respected, if concrete legal provisions (e.g., in codes of criminal procedure or evidence-related rules) were in place and specified that RAT-related processing is necessary ‘for compliance with a legal duty’ 156 (e.g., to accurately and effectively evaluate the defendant) or ‘for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller’ 157 (e.g., judicial authority to sentence after considering RAT).
Thereafter, the principle of fairness demands that processing be conducted in an ethical way and that data subjects be able to react to it effectively; and the principle of transparency requires full disclosure of relevant information before, during and after the processing operation. 158 These principles could be complied with, if RAT-related legal provisions required that special weight be given to the implementation of freely accessible, non-proprietary RAT, letting the defence know the modus operandi of a given technology.
In addition, the principle of purpose limitation demands that data be ‘collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes’. 159 RAT-related legal provisions could, in this regard, impose a duty to, for example, take technical/organisational measures to limit the purposes pursued (in the case of initial collection) or provide for effective safeguards (in relation to further processing); this would allow only for concrete RAT-implementations, supported by well-defined mechanisms (e.g., limiting the processing-goals to the evaluation of the defendant).
The principle of data minimisation demands that data be ‘adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed’. 160 In fact, RAT-related rules could expressly define which data (risk factors) are necessary in relation to the evaluation of the defendant. In addition to data quantity, due attention should also be paid to data quality, in light of the accuracy-principle: data must be ‘accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay’. 161 Here, RAT-related provisions could impose accuracy-assessments as well as rectification-rules (e.g., in the form of adding extra information, as a ‘supplementary statement’). 162
The storage limitation principle (or the delete-principle) requires that data be ‘kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed’. 163 RAT-related legal provisions could specify concrete data-storage/retention/deletion-periods (depending on the context of a given RAT-implementation).
Last, the principle of data security (or ‘integrity and confidentiality’) refers to the processing ‘in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures’. 164 RAT-related legal provisions could, to this end, only allow for RAT-implementations that adhere to integrity-enhancing schemes or high confidentiality standards. 165
Remarkably, and despite the above similarities (on evidence, fair trials and privacy), Europe appears more defence-protective than the US in at least two respects. First, Europe’s non-discrimination framework focuses on the effect and expressly does not require discriminatory intent. 166 This could favour the defence, which would be absolved from demonstrating that, for instance, a RAT-developer included a protected ground with the purpose of discriminating. Second, and more importantly, in Europe, trade secrecy-protection is limited by disclosure-under-confidentiality in criminal proceedings. More concretely, at the European Union (EU) level, trade secrets are protected by the Trade Secrets Directive. 167 This Directive follows an approach that appears similar to that of the above-discussed US regime; for example, on terminology, 168 prohibitions 169 or exceptions. 170 Still, when dealing with trade secrets in criminal proceedings, the vast majority of EU Member States do grant the defence access to such secrets, provided certain confidentiality-standards are respected (like the exclusion of the public via closed hearings). 171
That European states tend to favour such access/disclosure seems to show that they properly see what is at stake. The fundamental principles governing the fact-finding process, as well as the rights to fair trial (due process), privacy and non-discrimination, need be balanced against something heavier than the mere economic interests of private actors. This ‘something heavier’ seems to be not trade secrecy-rights of an economic nature but the promised accuracy of risk assessments, effectiveness and efficiency. If this is true, then Europe could show how to truly respect: the need for reliability-checks in the fact-finding process; the dimensions of human rights analysed above; and the effectiveness, accuracy and efficiency of procedure.
If European regulators want RAT in the criminal justice setting because they serve accuracy and effectiveness; if the fact-finding or other process where RAT may apply must be transparent, reliable and explainable; and if RAT subject to trade secrecy by their nature negate transparency and explainability; then it makes sense to: first, recognise the legal status of RAT as part of the criminal procedure; second, favour the use of RAT whose source code is freely accessible and, in the case of private RAT, opt for disclosure of the source code in the criminal setting (making this disclosure conditional upon confidentiality requirements); third, exclude certain (for instance, biased) RAT and scrutinise all RAT; fourth, set standards on the factors considered by RAT; and, fifth, make concrete laws regulating their consideration by judges. If the claims and findings of this contribution are acceptable, then, in the European scheme, the proper technological means to more accurately, efficiently and effectively evaluate the defendant could be purposed toward the right ends: fairness and justice.
Before drawing conclusions, a last remark need be made on the proposed AI Act, 172 which could be particularly relevant for RAT-implementations. Admittedly, RAT-regulation in criminal procedure appears to be primarily a matter of criminal justice, not so much an issue of AI-regulation; still, since more and more market rules and private tools tend to enter the public arena, 173 it probably makes sense to refer to the above AI Act.
Though seemingly prohibitive and AI-targeted, the proposed Act recommends a rather permissive scheme, adopting a risk-based approach, paying more attention to the market and placing less focus on AI itself and its impact, as well as on the logic of the ‘fundamental’. 174 More precisely, the Act has a crucial Title II prohibiting some AI-practices. Article 5 of the proposed AI Act suggests a ban on: the marketing or other uses of AI which can have an impact on someone’s conduct in a way that can result in individual harm; the marketing or other use of AI by public actors for the purposes of assessing or categorising the trustworthiness of individuals, where the scoring results in unfair treatment; and the use of specific biometrics-related AI technologies in public spaces for law enforcement purposes (save for situations where these uses are deemed necessary to find potential victims or suspects of certain crimes or to prevent concrete risks, like terrorism).
While the phrasing appears prohibitive, the proposed AI Act seems rather permissive in various respects. For instance, it is not clear whether it permits or bans contracting between EU-sellers and non-EU-buyers; this might be the case with RAT, given the global reach of many AI-implementations. Thereafter, the proposed text refers to and demands ‘physical or psychological harm’ of a concrete ‘someone’; this seems to leave uncovered certain implementations that can have an adverse impact at a collective, rather than individual, level, or AI-uses whose individual risk/harm is hardly identifiable (for instance, when using RAT to evaluate defendants, is it the ‘individual’ defendant that might be ‘harmed’, or the society as a whole that may or may not wish to subject the defence to such tech-evaluation? Relevant to this, RAT focus on groups, not specific persons). 175 Besides, experience from the environmental field has taught us that citizens or society as a whole may be affected by implementations and interventions that do not target a concrete individual.
Notably, the above permissive aspects of the proposed AI Act were criticised by the European Data Protection Board and the European Data Protection Supervisor, who commented on the absence of bright-line bans and stressed the need for prohibitions on certain AI uses, including ‘automated recognition of human features’, as well as on AI uses by both public and private entities that discriminate on the basis of protected grounds. 176 Whether constructive criticism will lead to negating and banning RAT-uses in Europe or, on the contrary, to diligent and careful implementations fully respecting fundamental human rights, remains an open question. Still, in light of the previous discussion on trade secrecy and non-discrimination, it appears that Europe could provide for a more defence-protective regime when regulating RAT.
Conclusions and discussion
The discussion thus far has revealed key functions of RAT, including their pros and cons, and has highlighted important compliance-challenges in particular relation to evidence-law and fundamental human rights. After delving into the US regime on due process, equal protection and privacy, this paper made some defence-shielding recommendations to regulators, namely: recognising by law the legal status of RAT (subjection to evidence rules and rigorous human rights-compliance scrutiny); making the RAT-source code visible to and reviewable by the judge, the defence and its expert; excluding certain insufficiently reliable RAT and scrutinising all RAT; standard-setting on relevant risk factors to avoid undue discrimination; and making concrete laws regulating RAT-implementations. Moreover, the analysis of the European regime demonstrated Europe’s superiority in at least two key areas: the non-discrimination framework, which does not require discriminatory intent; and the trade secrecy-regulation/practice, which favours disclosure under confidentiality standards. Last, the discussion on the proposed AI Act revealed some permissive aspects that have raised concerns and that could result either in prohibitions and bans or, on the contrary, in diligent and proper RAT-implementations.
To conclude, any technology can help human decision-makers, thanks to promised accuracy, effectiveness and efficiency. In some domains, such promises may be of utmost importance and, as such, may need to be prioritised over reason-giving. For instance, when predicting natural disasters, an AI-tool may demonstrate particularly high accuracy in foretelling future floods. Here, human decision-makers using that AI-model may be satisfied by its proven accuracy, when intervening with a view to saving human and animal lives and, in general, the environment; it might not make much sense to demand high-level explainability, given the urgency of the situation and the proven accuracy of the AI-model: if the technology has shown 99% reliability and if one thousand people’s lives are in danger in light of the predicted flood, then it makes sense to intervene fast and save these lives (without explaining in detail, or even fully understanding, how the technology reached the 99% accurate prediction). However, in other areas, like the criminal justice system, we may have to place heavy emphasis on the fairness and justifiability of decision-making, to the benefit of the defence and society as a whole. Insofar as certain RAT have been subjected to evidence rules and stringent human rights-compliance scrutiny, are sufficiently visible to and reviewable by the judge/defence/defence-expert, have demonstrated sufficient levels of reliability and have been adequately regulated (including standard-setting), 177 it may be desirable to have these RAT assist human adjudicators, in an optimally explainable and understandable fashion and always for the benefit of society as a whole.
Footnotes
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
