Abstract
Risks posed by the presence of radionuclides in the environment require an efficient, balanced, and adaptable assessment for protecting exposed humans and wildlife, and for managing the associated radiological risk. Exposure of humans and wildlife originates from the same sources releasing radionuclides to the environment. Environmental concentrations of radionuclides serve as inputs to estimate the dose to man, fauna, and flora, and the transfer processes involved are essentially similar, which calls for common use of transport models. Dose estimates are compared with radiological protection criteria for humans and wildlife, such as those developed by the International Commission on Radiological Protection. This indicates a similarity in the approaches for impact assessment in humans and wildlife, although some elements differ (e.g. the protection endpoint for humans is stochastic effects on individuals, whereas for wildlife, it is deterministic effects on species and ecosystems). Human and environmental assessments are consistent and complementary in terms of how they are conducted and, where appropriate, in terms of the underlying databases. The lack of an integrated approach may cause difficulties for operators and regulators, complicate communication to stakeholders, and may even hamper decision making. For optimised risk assessment and management, the impact of non-radiation contaminants and stressors should also be considered. The European Radioecology Alliance (ALLIANCE) upholds that integration of human and ecological impact and risk assessment, both in terms of the underlying philosophy and its application via appropriate tools, is recommended from several perspectives (e.g. chemical and radiological risks).
1. Introduction
The risks posed by the presence of radionuclides in the environment require an efficient, balanced, and holistic assessment approach for protecting exposed humans and environments, including natural ecosystems and the wildlife living in them. The specific contaminant–medium–individual–pathway paradigm is evolving rapidly towards a more integrated view of the environment as a whole, of which man is only a part. This paradigm shift not only concerns the direct effects of contaminants, but also how contaminated environments can indirectly affect human health and well-being, as well as how these environments can be returned to a state of net benefit to society. In order to address these changes, radioecology needs to embrace the concept of integration, both in terms of harmonising the underlying approaches and methods of human and environmental protection, and integrating radioecology with other risk assessment frameworks and relevant scientific disciplines. This paper highlights the benefits and challenges posed by this integration.
The World Health Organization’s International Programme on Chemical Safety has developed a framework for integrated human health and ecological risk assessment in collaboration with the US Environmental Protection Agency and the Organisation for Economic Cooperation and Development (Suter et al., 2005). The integrated risk framework was developed for two fundamental reasons: (1) to improve the quality and efficiency of assessments through exchange of information between human health and ecological risk assessors; and (2) to provide more complete and coherent inputs to decision-making processes.
The need for explicit demonstration of the protection of the environment from the effects of radioactive contaminants has been recognised within the radiation protection community over the last two decades (ICRP, 2007). Significant effort has been expended, and a system of environmental protection has emerged, supported by the tools required to estimate exposure, analyse effects, evaluate risk, and demonstrate protection (e.g. Beresford et al., 2008; Larsson, 2008). However, there are often differences in how human and environmental assessments are conducted (e.g. regarding protection endpoints, dispersion and transfer models, or dosimetric models and assumptions). This can cause difficulties for operators, stakeholders, and regulators, even when the reasons for these differences are communicated clearly. An integration of the two radiation protection approaches – both in terms of the underlying philosophy and the practical application via appropriate tools – offers benefits. Such integration of both frameworks has been initiated by the International Commission on Radiological Protection (ICRP) (see Section 2.1).
Additionally, radionuclides and the risk or impacts posed by them to humans and the environment typically occur as part of a complex suite of co-contaminants and other stressors, as exemplified by waste streams from nuclear and non-nuclear industries, and complex legacy contamination. There is a clear gap in our understanding of contaminant mixtures that include radioactive materials. The integration of radioecological research with other disciplines, and subsequent direction towards better understanding of mixture effects, as well as adapted risk assessment methods aimed at predicting mixture effects, will make it possible to determine if radiation protection criteria are robust in a multiple contaminant context.
Radioactive contamination can occur as a result of different scenarios, disparate in character and often specific in their actual or potential impacts. Routine operations of nuclear facilities and contamination from non-nuclear industries are often of public concern. Analysis of the technical capacity and resources required to prevent, mitigate, or remediate impacts and ensure recovery of any contaminated area after a release (routine, accidental, or malevolent) must take into account the disparities and specificities inherent in the exposure scenarios. This is because they play a significant role in the assessment of consequences, in terms of economic considerations and from the societal perspective. In the case of accident/legacy sites for example, effects on the management of radiological risks include societal concerns, varying degrees of economic impact or loss of societal benefit, administrative disruption, health impacts or potential loss of (human or non-human) life, habitat damage, and impact on ecosystem services. In addition to these impacts, the measures taken to address them may, in turn, incur societal and environmental side effects. This complex interplay has been well demonstrated in the aftermaths of both the Chernobyl and Fukushima accidents.
ALLIANCE is convinced that the scientific foundation for a holistic integration of human and environmental protection, and of the associated management approaches, should be developed further.
2. Integrating human and environmental protection frameworks
2.1. Introduction
The development of risk assessment frameworks for chemical pollutants has undergone considerable advancement over the last three decades. It initially focused on human health protection (e.g. US NRC, 1983), and subsequently expanded to include ecological risk assessments (e.g. US EPA, 1992). Risk assessments for radiation have followed a similar, if delayed, evolution, as reflected in the 2007 ICRP Recommendations (ICRP, 2007). However, the development of a full framework for integration of human and ecological risk assessments for radionuclides, for any specified exposure situation, is still at an early stage (Copplestone et al., 2010), and remains a significant challenge for radioecology, as suggested by Pentreath (2009) in the context of the existing ICRP approach.
This paper will propose and discuss the need for and feasibility of an integrated and coherent system of radiation protection for both humans and the environment. This paper will also highlight a number of challenges related to the integration of human and environmental protection.
2.2. Transport and transfer
The same source term has the potential to expose both humans and the environment, depending on where radionuclides (and potentially other contaminants) are transported or dispersed; at these receptor points, radiation can impact humans and wildlife. The most obvious benefit of integrating humans and wildlife in one assessment scheme comes from the sharing of base data (source term, site and environmental characteristics, environmental monitoring, receptors, transport, and exposure pathways), contributing to efficiency and consistency of approach, such as by using the same transport and transfer models.
The physicochemical and biological processes underlying environmental transfers of radionuclides are independent of the receptor; as such, the transfer/dispersion models applied should be the same. As an example, the International Atomic Energy Agency (IAEA) SRS-19 dispersion model (IAEA, 2001), originally developed for low-tier impact assessment for humans, is embedded in the Tier 1 and 2 levels of the ERICA Tool (Brown et al., 2008). In order to avoid inconsistencies, it follows that environmental transfer modelling should use the same underlying databases up to the point where the biological transfer of radionuclides occurs. Here, differences begin: whole-body transfer factors are routinely considered in assessments for wildlife, whilst transfer factors to tissues, derived from specific intake/excretion data and pharmacokinetic models, are used for humans. TRS472 (IAEA, 2010) also presents concentration ratios for wildlife (e.g. deer, freshwater fish) entering the human food chain, but these do not use the same underlying data as TRS479 (IAEA, 2014) for wildlife.
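To make the shared transfer step concrete, the sketch below applies the same measured medium concentration to both a wildlife whole-body estimate (via an equilibrium concentration ratio) and a human ingestion intake. All numeric values (soil concentration, concentration ratio, consumption rate) are hypothetical illustrations, not reference data.

```python
def wildlife_whole_body(c_medium_bq_per_kg: float, cr: float) -> float:
    """Whole-body activity concentration (Bq/kg) estimated from a
    reference-medium concentration and an equilibrium concentration
    ratio (CR), as used in wildlife assessment tools."""
    return cr * c_medium_bq_per_kg

# Hypothetical 137Cs example: soil at 500 Bq/kg, assumed CR of 0.2
# for a herbivorous mammal.
c_soil = 500.0
c_deer = wildlife_whole_body(c_soil, cr=0.2)   # 100 Bq/kg whole body

# The same whole-body concentration can then feed the human food-chain
# pathway (here, an assumed 10 kg of venison consumed per year),
# illustrating why a consistent underlying CR database matters.
annual_intake_bq = c_deer * 10.0               # 1000 Bq ingested per year
```

The point of the sketch is that a single, shared parameter set serves both assessments up to the biological transfer step, after which the human and wildlife pathways diverge.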
The conceptual model differs for humans and wildlife. For instance, for wildlife, the whole-body activity concentrations are related to concentration of the radionuclide in a reference medium (soil, water, or air) with no direct link to foodweb transfer, whereas for human exposure assessment, food chain modelling is considered.
An integrated approach for human and wildlife risk assessment would recognise the similarity in environmental processes and assessment context, and would ensure consistency in databases, parameter values, and representation of data whilst respecting the need to represent the intake (and resulting internal exposure pathway) differently for biota and humans.
2.3. Exposure and dose assessment
For humans, biokinetic models employ a well-defined ‘Reference Person’ to simulate intake/retention of a given radionuclide. This is combined with dosimetric models (e.g. Monte Carlo radiation transport codes) considering the elemental composition of the Reference Person; the masses, dimensions, and densities of the different tissues; and the radiation weighting factors accounting for the quality of radiation (in causing biological damage) and the differential sensitivity of organs. This allows an intake of activity to be converted into a committed effective dose (in Sv).
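As a minimal illustration of this conversion, the sketch below multiplies an intake by a pre-computed dose coefficient e(g) (Sv/Bq), which already folds in the biokinetic and dosimetric (Reference Person) modelling. The coefficient value used is assumed for illustration only.

```python
def committed_effective_dose(intake_bq: float,
                             dose_coeff_sv_per_bq: float) -> float:
    """Committed effective dose (Sv) from an intake of activity,
    using a pre-computed age-group dose coefficient e(g)."""
    return intake_bq * dose_coeff_sv_per_bq

# Hypothetical example: 1000 Bq ingested, assumed ingestion dose
# coefficient of 1.3e-8 Sv/Bq for an adult.
e_sv = committed_effective_dose(1000.0, 1.3e-8)   # 1.3e-5 Sv = 13 µSv
```

In practice the coefficient is tabulated per radionuclide, intake route, and age group; the calculation itself is this simple only because the biokinetic and dosimetric complexity is pre-computed into e(g).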
For assessments of exposed plants and animals, the ERICA Tool (Beresford et al., 2007) can be used as an example of a simplified system involving concentration ratios to characterise the transfer, combined with a simplified dosimetric model which assumes that internally incorporated radionuclides are distributed homogeneously within the organism, with no differentiation between organs or tissue types. The model also assumes that the organism is immersed in a uniformly contaminated medium, and that the dose rate (considered to be aggregated over all transfer pathways) is averaged over organism volume. The radiation dose to different species of biota is calculated by using dose conversion coefficients, which are derived from absorbed fractions obtained by Monte Carlo simulations, which use an idealised representation of the body as an ellipsoid of stated mass and dimensions.
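The simplified wildlife calculation described above can be sketched as follows; the dose conversion coefficient (DCC) values are placeholders for illustration, not outputs of the ERICA Tool.

```python
def dose_rate_ugy_per_h(c_organism_bq_per_kg: float,
                        c_medium_bq_per_kg: float,
                        dcc_int: float,
                        dcc_ext: float) -> float:
    """Total unweighted dose rate (µGy/h), assuming a homogeneous
    internal radionuclide distribution and immersion in a uniformly
    contaminated medium, with contributions simply summed."""
    internal = c_organism_bq_per_kg * dcc_int   # internal exposure
    external = c_medium_bq_per_kg * dcc_ext     # external exposure
    return internal + external

# Hypothetical example: whole-body 100 Bq/kg, soil 500 Bq/kg, with
# assumed DCCs of 4e-4 and 1e-4 (µGy/h per Bq/kg).
total = dose_rate_ugy_per_h(100.0, 500.0, dcc_int=4e-4, dcc_ext=1e-4)
# 100*4e-4 + 500*1e-4 = 0.04 + 0.05 = 0.09 µGy/h
```

The contrast with the human sketch above is the point: here the organism is a single homogeneous volume, with no organ-level or biokinetic detail.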
The dosimetric method for wildlife is, therefore, a simplification of the more complex dosimetric approach employed for humans. This simplification is necessary, given the large range of shapes, sizes, and masses of organisms, from microscopic bacteria to the largest plants and animals. In theory, methods (such as biokinetic models) used for calculating internal exposure to man could be transferred to provide a common approach for biota; but, although conceptually simple, this approach has a high data demand, with diminishing returns as the number of parameters increases. Nevertheless, biokinetic models are, to some extent, being applied for wildlife (e.g. Vives i Batlle et al., 2008; Bertho et al., 2010), and are required if a more detailed mechanistic understanding of the effects observed is desired.
The physical processes of interaction of radiation with living matter are the same for all organisms, human or non-human, but, as mentioned, the general approach to modelling the dose differs for humans and wildlife (consideration or not of organ dose, relative biological effectiveness, and biokinetic modelling). However, the approach is evolving. More complex anatomical phantoms, termed ‘voxel phantoms’, have been developed to test the validity of the simplistic geometric models applied in wildlife dose assessment. Voxel models for small animals were first developed in the context of medical radiation protection (Stabin et al., 2006), but are now also being developed for environmental radiation protection, particularly for some of the ICRP Reference Animals and Plants. Published results cited by Caffrey et al. (2016) include phantoms of rabbit, mouse, rat, frog, two dogs, crab, and rainbow trout. These voxel models allow accurate organ dose rates to be calculated, and aid in validating the use of ellipsoidal models for regulatory purposes. However, such voxel phantoms are unlikely to be integrated within tools like the ERICA Tool, due to their high data demand and lack of underpinning data, and because the simplified ellipsoidal representation provides a reasonably conservative estimate of the dose (Ruedig et al., 2015).
External dose calculations are more amenable to harmonisation. To explore possible implications of the simplifications made within the system of environmental radiological protection in terms of the efficacy and robustness of dose rate predictions, Brown et al. (2013) compared human radiological assessment and environmental radiological assessment using a human surrogate in a non-human-biota dose assessment methodology to assess external doses from 137Cs, 90Sr, 60Co, and 210Po. External dose assessment for environmental and human systems provided results for the human surrogate that corresponded quite closely. Similarly, Vives i Batlle et al. (2015) made a successful comparison of biota dose conversion coefficients for a human-size ellipsoid geometry with dose conversion factors for 41Ar, 85,88Kr, and 131m,133Xe as calculated for humans, highlighting consistency in the human and wildlife dosimetric methodologies for external exposures to these noble gases.
An additional difficulty for integration arises from different life stages and hence different exposure rates. Feeding habits, physiology, and sensitivity vary with life stage, but associated parameter values and underpinning data are generally not available; the most commonly used data are those for the adult stage. For human risk assessment, standard tables of human habits as a function of age class allow assessment of the lifelong committed dose; no equivalent approach for calculating a committed dose is currently available for wildlife, although one could, in principle, be developed.
From the foregoing, full integration of human and wildlife dose assessment is unlikely to be effective; the effort would likely exceed the benefit, although the wildlife method has been demonstrated to be a reasonably conservative approximation. Once the necessary simplifying assumptions and the differences in protection endpoint are understood, there is no inconsistency between the two approaches at the physical level.
2.4. Protection endpoints
Protection endpoints for human and wildlife protection are as different as the protection goals. For humans, protection of individual people is considered, whereas for non-human biota, the goal is to protect at the level of populations (Copplestone et al., 2007; ICRP, 2008; Andersson et al., 2009). In addition, the endpoints of the risk analysis differ. For humans, radiological protection is focused on preventing deterministic effects and reducing stochastic effects (primarily cancers), wherein the incidence of the effect but not its severity is correlated with dose or dose rate. For biota, endpoints vary, but are generally deterministic and are related to population viability (e.g. fecundity, fertility, mortality, and morbidity).
ICRP (2007) sets out separate protection goals for the environment and for human health, reflecting these differences in endpoints.
2.5. Benchmarks for radiological protection
Human radiation protection has established dose limits, dose constraints, and reference levels for different types of exposure situations (planned, existing, and emergency). These benchmarks have a biological basis (the dose–response curve), are individual based, and aim to restrict individual doses. Conversely, there is no unique benchmark for protection of wildlife comparable to, for instance, the 1 mSv year−1 benchmark to protect the public from planned releases as set by ICRP (2007). The ICRP dose limits for humans are intended to serve as a boundary condition that will prevent deterministic effects and limit the probability of stochastic effects.
The various benchmarks proposed for the protection of wildlife are difficult to compare as they are not limits and are generally stated to be indicative for different organism groups. In their graded approach, the US Department of Energy (US DoE, 2002) used a dose rate limit of 10 mGy day−1 (≈400 µGy h−1) for native aquatic animals, and dose rate limits of approximately 400 and 40 µGy h−1 for terrestrial plants and terrestrial animals, respectively. These dose rate benchmarks are based upon the values of 40 µGy h−1 for terrestrial animals and 400 µGy h−1 for terrestrial plants and all aquatic species proposed by IAEA (IAEA, 1992) and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR, 1996).
Benchmarks for non-human biota are considered threshold values below which, based on reviews of the scientific literature, populations are unlikely to be significantly harmed. Andersson et al. (2009) and Garnier-Laplace et al. (2006, 2010) used a statistical approach, adopted from methods used to set chemical benchmarks, to estimate a predicted no-effects dose rate of 10 µGy h−1, which is used in the ERICA Tool as a no-effect benchmark. In reality, this is a screening benchmark, in that it is intended to screen out scenarios of no radiological concern from undergoing further radiological risk assessment.
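The species sensitivity distribution (SSD) style of derivation referred to above can be sketched as follows, assuming a log-normal distribution of chronic effects dose rates; the input dose rates and the assessment factor are purely illustrative, not the data behind the published 10 µGy h−1 value.

```python
import math
import statistics

# Hypothetical chronic effects dose rates (e.g. EDR10 values, µGy/h)
# for a set of species, one value per species.
edr10_ugy_h = [120.0, 350.0, 800.0, 45.0, 2000.0, 600.0, 90.0, 1500.0]

# Fit a log-normal SSD: take log10, estimate mean and sample stdev.
logs = [math.log10(x) for x in edr10_ugy_h]
mu = statistics.mean(logs)
sigma = statistics.stdev(logs)

# HDR5: the dose rate hazardous to 5% of species, i.e. the 5th
# percentile of the fitted distribution.
hdr5 = 10 ** statistics.NormalDist(mu, sigma).inv_cdf(0.05)

# Dividing by an assessment factor (assumed here to be 2) gives a
# screening predicted no-effect dose rate (PNEDR).
pnedr = hdr5 / 2.0
```

The choice of distribution, percentile, and assessment factor are all methodological decisions; the published chemical-benchmark literature discusses how each is justified.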
In its 2008 report, ICRP proposed ‘derived consideration reference levels’ (DCRLs) as intervals or ‘bands’ of dose rate intended to serve as points of reference for assessing the potential effects of ionising radiation on non-human biota. ICRP suggested that, in considering the potential effect of ionising radiation, context should be provided by comparing estimated dose rates with multiples of the dose rates experienced by the various biota in their natural environment. A DCRL can be considered as a band of dose rate within which there is likely to be some chance of deleterious effects of ionising radiation occurring to individuals of a given type of Reference Animal and Plant (RAP). In deriving these DCRLs, ICRP compiled available effects data for 12 RAP categories; the DCRLs hence refer to the protection of various organism groups defined at the family level. The DCRLs for the RAPs fall into three main groups: 0.1–1 mGy day−1 (Deer, Duck, Pine tree, and Rat); 1–10 mGy day−1 (Brown seaweed, Flatfish, Frog, Grass, and Trout); and 10–100 mGy day−1 (Bee, Crab, and Earthworm) (ICRP, 2008). The lower band can be considered a no-effect dose rate. The DCRL bands can provide context to help an assessor make decisions regarding the optimisation of effort expended on protection of the environment, in combination with additional information about the specific situation under assessment.
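As a minimal illustration of how the DCRL bands quoted above might be used for screening, the sketch below assigns each RAP to its band (mGy day−1) and classifies an estimated dose rate against it. The `screen` helper and its wording are hypothetical, not part of any ICRP tool or guidance.

```python
# DCRL bands (mGy/day) per RAP, following the three groupings in the text.
DCRL_MGY_DAY = {
    "Deer": (0.1, 1.0), "Duck": (0.1, 1.0),
    "Pine tree": (0.1, 1.0), "Rat": (0.1, 1.0),
    "Brown seaweed": (1.0, 10.0), "Flatfish": (1.0, 10.0),
    "Frog": (1.0, 10.0), "Grass": (1.0, 10.0), "Trout": (1.0, 10.0),
    "Bee": (10.0, 100.0), "Crab": (10.0, 100.0), "Earthworm": (10.0, 100.0),
}

def screen(rap: str, dose_rate_mgy_day: float) -> str:
    """Classify an estimated dose rate against the RAP's DCRL band."""
    lo, hi = DCRL_MGY_DAY[rap]
    if dose_rate_mgy_day < lo:
        return "below DCRL band: unlikely to warrant further consideration"
    if dose_rate_mgy_day <= hi:
        return "within DCRL band: consider optimisation of protection"
    return "above DCRL band: detailed assessment warranted"

print(screen("Frog", 0.3))   # an estimated 0.3 mGy/day is below the Frog band
```

This illustrates the intended role of the bands as points of reference rather than limits: the output is a prompt for further consideration, not a compliance verdict.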
Integration of human and environmental protection at the level of benchmarks may be difficult, if not ultimately futile, since the protection goals in both frameworks are quite different.
2.6. Integrated impact and risk assessment
Risk assessment is usually organised in four steps: (i) formulation of the problem (or hazard identification); (ii) exposure assessment; (iii) effects characterisation; and (iv) risk characterisation, which integrates information from the previous steps. Although risk assessors may work independently, providing two versions of the risk posed by radiation to humans and the environment, decision makers must make one decision that provides an acceptable result for all involved. Hence, it is vital for assessors to present a coherent set of results. Integrated risk assessment can provide better input to decision making by addressing impacts and risks to human welfare. While environmental contaminants may affect human health directly, they also affect human welfare indirectly (Suter et al., 2005). Nature provides a large variety of products and services to man, including wild and domestic food, and a large part of the contribution of ecosystem services to human well-being is purely public goods in nature; in many cases, people are not even aware of them (e.g. clean air and water, climate regulation) (Costanza et al., 1997). All these contributions to human well-being can be jeopardised by contamination, and are not considered in routine risk assessment. Decisions concerning the management of radioactive contamination (and other contaminants) in the environment will be inadequately informed until assessments are sufficiently integrated to address environmentally mediated effects on human welfare. However, beyond human interest, it is necessary to protect wildlife, not only because we depend on the environment, but because the environment has an intrinsic value that deserves protection in its own right.
Since radionuclides are generally present in a multi-pollutant context, other contaminants (and generally other stressors) are to be considered since their combined effect can be additive or even multiplicative. Consistency between frameworks for chemicals and radiation facilitates the mutual understanding between assessors and the exchange of methods and tools. In turn, this should help to facilitate stakeholders’ understanding of risk from various sources, including radiation. Taking this even further, cumulative risk assessments address combined risks from exposures to multiple chemical and non-chemical stressors, and may focus on vulnerable communities or populations.
Significant contributions have been made to the development of concepts, methods, and applications for cumulative risk assessments over the past decade. Work in both human health and ecological cumulative risk has advanced in two different contexts. The first context is the effects of chemical mixtures that share common modes of action, or that cause common adverse outcomes. In this context, two primary models are used for predicting mixture effects, dose addition, or response addition. The second context is evaluating the combined effects of chemical and non-chemical stressors (e.g. radiation, biological, nutritional, economic, psychological, habitat alteration, land-use change, global climate change, and natural disasters) (Fox et al., 2017).
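The two mixture models named above can be sketched as follows, with hypothetical exposures, EC50 values, and per-stressor response probabilities.

```python
def hazard_index(exposures, ec50s):
    """Dose addition: sum of toxicity-normalised exposures for
    stressors sharing a common mode of action. An index above 1
    conventionally flags potential concern."""
    return sum(e / t for e, t in zip(exposures, ec50s))

def response_addition(responses):
    """Response addition: combined probability of effect for
    stressors with independent modes of action."""
    combined = 1.0
    for r in responses:
        combined *= (1.0 - r)   # probability of no effect from each
    return 1.0 - combined

# Hypothetical two-stressor mixture.
hi_val = hazard_index([2.0, 5.0], [10.0, 25.0])   # 0.2 + 0.2 = 0.4
p_mix = response_addition([0.1, 0.2])             # 1 - 0.9*0.8 = 0.28
```

The choice between the two models is the key assumption: dose addition presumes a shared mode of action, response addition presumes independence; real mixtures, particularly those including radiation, may follow neither exactly.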
Integrated risk assessment (man–environment; radioactive, chemical, and non-chemical stressors) clearly has the benefit of providing better input for decision making.
2.7. Multi-criteria perspective in support of optimised decision making and risk management
In handling existing, planned, and emergency exposures, a range of well-planned and effectively integrated management approaches is required. Although the primary driver in choosing management options for radiation exposure situations will likely always be the reduction or prevention of dose to humans, the problem has many aspects. There are significant needs in other spheres – environmental, economic, infrastructural, and social – that should be considered when selecting management options. Thus, there is a need to optimise management approaches for radioactive contamination beyond simple consideration of radiation dose vs. economic cost. Optimisation may require expertise in areas such as radioecology, urban planning, social and economic sciences, information technology, waste handling, environmental and agricultural sciences, and risk perception and communication. From a practical viewpoint, the optimisation process could be based on the integration of decision support systems associated with radiological sciences with knowledge databases and decision-aid tools from other disciplines (e.g. urban planning, economics, and sociology) to maximise the benefit to society.
Multi-criteria analysis (Linkov and Moberg, 2012) provides a suitable theoretical framework that can be used to combine quantitative and qualitative factors, and to guide the decision process towards a satisfactory solution (since no global optimum exists in the presence of multiple, often conflicting, criteria). By using decision tools based on a multi-criteria decision analysis, all environmental and anthropocentric parameters that either exacerbate or mitigate the consequences of the contamination can be considered together.
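A weighted-sum comparison, one of the simplest multi-criteria decision analysis methods, can be sketched as follows; the remediation options, criteria, scores (0–1, higher is better), and weights are all hypothetical.

```python
CRITERIA = ["dose reduction", "cost", "ecological impact", "social acceptance"]
WEIGHTS  = [0.4, 0.2, 0.2, 0.2]   # assumed stakeholder weights, summing to 1

# Hypothetical normalised scores per option, one per criterion.
OPTIONS = {
    "topsoil removal": [0.9, 0.2, 0.3, 0.7],
    "deep ploughing":  [0.6, 0.7, 0.6, 0.6],
    "no action":       [0.1, 1.0, 0.9, 0.2],
}

def weighted_score(scores, weights):
    """Aggregate one option's criterion scores into a single value."""
    return sum(s * w for s, w in zip(scores, weights))

# Rank options from best to worst under the chosen weights.
ranking = sorted(OPTIONS,
                 key=lambda o: weighted_score(OPTIONS[o], WEIGHTS),
                 reverse=True)
```

Note that the outcome is entirely driven by the weights, which is precisely why stakeholder involvement in setting them matters: the option that best reduces dose need not rank first once other criteria carry weight.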
Multi-criteria decision analysis is often employed for the analysis of complex problems involving non-commensurable, conflicting criteria, which form the basis within which alternative decisions are assessed. At the same time, multi-criteria decision aid methods overcome the shortcomings of traditional decision support tools used in economics, such as cost–benefit analysis, especially when dealing with values that cannot easily be quantified (e.g. environmental issues) or translated into monetary terms due to their intangible nature.
3. Conclusions
An integrated approach for human and wildlife risk assessment recognises the similarity in environmental processes and assessment context, and ensures consistency in databases, parameter values, and representation of data.
Dose assessment, endpoints, and benchmarks are parts of the assessment process that follow the same physical principles but are not amenable to full integration. However, all aspects of environmental dispersion and transport of radionuclides are amenable to integration using the same dispersion models and databases.
Integration of human and ecological health provides consistent assessment results, incorporating the interdependence of humans and the environment, and improving the efficiency and quality of assessments. Integrated risk assessment will better inform stakeholders and decision makers, allowing for fully-informed decisions.
Multi-criteria decision support is a good basis for integrated decision making in the context of an integrated approach for the protection of humans and the environment.
Efforts should continue to determine where the harmonisation of approaches for humans and the environment is justifiable and beneficial, focusing on developing integrated methods for assessment in the areas of transfer, exposure, dosimetry, and risk. Future research initiatives of ALLIANCE in this area need to establish good links with the work being carried out by international radiation protection bodies such as ICRP.
