Abstract
Radiation therapy of cancer patients involves a trade-off between a sufficient tumour dose for a high probability of local control and dose to organs at risk that is low enough to lead to a clinically acceptable probability of toxicity. The International Commission on Radiological Protection (ICRP) reviewed epidemiological evidence and provided updated estimates of ‘practical’ threshold doses for tissue injury, as defined at the level of 1% incidence, in ICRP
1. INTRODUCTION
The International Commission on Radiological Protection (ICRP) issued new recommendations on radiological protection in
ICRP subsequently issued
The Commission now believes that at acute or accumulated doses >0.5 Gy, the risk of a tissue reaction becomes increasingly important at very long times (i.e. >10 years) after radiation exposure, particularly for the lens of the eye and the circulatory system. Medical practitioners should recognise that the nominal absorbed dose threshold for cataracts and circulatory disease may be as low as 0.5 Gy to the lens of the eye, heart, or brain, although some uncertainty remains. There is no change in the Commission’s previous judgement that acute doses up to approximately 0.1 Gy produce no functional impairment of tissues. For medical imaging and therapy, radiation protection should be optimised for exposures to relevant specific tissues and not just for whole-body exposures.
This paper examines the implications of the new thresholds for tissue reactions in RT, and briefly discusses how tissue reactions are managed in RT from the perspectives of both patients and staff. Where appropriate, research needs are identified.
2. Tissue reactions and threshold dose
The Commission initially introduced the term ‘stochastic’ to describe single-cell effects such as mutagenesis (ICRP, 1977). In
2.1. Tissue reactions to ionising radiation
After high doses of radiation, substantial cell killing can result in evident tissue reactions. These reactions typically have a latent period and may occur hours, days, months, or years after irradiation, depending on the tissue in question (ICRP, 2012). Non-lethal effects of radiation on cells and tissues, with associated disturbances in molecular cell signalling, also play a role in tissue responses. The manifestations of tissue injury vary and depend on several tissue-specific factors, including cellular composition, proliferation rate, damage to blood vessels or elements of the extracellular matrix, and the mechanism of response to radiation. The dose at which damage might be detected depends on the level of injury as well as individual radiosensitivity (ICRP, 2012). In some early tissue reactions (occurring hours to a few weeks after irradiation), an inflammatory response may dominate, and subsequent reactions may be due to both cell loss and non-cytotoxic effects on tissues. Late tissue reactions (occurring months to years after irradiation) may result from injury directly to the target tissue or from severe early reactions (Dorr and Hendry, 2001).
2.2. ICRP threshold dose concept
For general radiological protection, the Commission’s recommended dose limits for tissue reactions are based on threshold doses for morbidity in specific organ systems and for mortality. A threshold dose for a given effect can be defined as a dose below which the effect does not occur. However, this dose is often difficult to determine.
While it is still unclear whether the threshold dose is the same for acute, fractionated, or chronic exposures, in the absence of overwhelming evidence, ICRP assumed that the threshold dose is the same in all cases. Therefore, for cataract induction, and cardiovascular and cerebrovascular diseases, ICRP has determined the annual (chronic) dose rate for a multi-year period (Gy year−1) to be 0.5 Gy divided by the duration of exposure in years. ICRP emphasises that great uncertainty is attached to these values and that ongoing evaluation is important (ICRP, 2012). The relevance of these low threshold doses for RT patients is discussed below.
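The arithmetic behind the annual dose-rate figure is simple division of the cumulative threshold by the exposure duration. A minimal sketch (the function name and the 25-year example are illustrative assumptions, not ICRP values):

```python
def chronic_dose_rate_threshold(total_threshold_gy: float, exposure_years: float) -> float:
    """Annual dose rate (Gy/year) corresponding to a cumulative threshold
    dose spread over a multi-year exposure, following the ICRP assumption
    that the threshold is the same for acute, fractionated, and chronic
    exposures."""
    if exposure_years <= 0:
        raise ValueError("exposure duration must be positive")
    return total_threshold_gy / exposure_years

# e.g. the 0.5-Gy threshold accumulated over an assumed 25-year period
rate = chronic_dose_rate_threshold(0.5, 25)  # 0.02 Gy/year (20 mGy/year)
```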
3. Management of tissue reactions in radiotherapy
RT is one of the three principal modalities used in the treatment of malignant disease (cancer), the other two being surgery and chemotherapy. The initial objective of RT, for both curative and palliative treatment, is the ‘uncomplicated loco-regional control of cancer by radiation therapy’ (Emami et al., 1991). There is substantial evidence that local control may also impart additional benefit with regard to relapse-free survival and distant metastases-free survival (Zelefsky et al., 2008). External photon beam treatment is usually performed with more than one radiation field in order to achieve a uniform, tumouricidal dose distribution inside the target volume and as low a dose as possible in the healthy tissues surrounding the target (Johns and Cunningham, 1983). Skilful use of modern treatment techniques, primarily ‘conformal’ RT, including intensity-modulated RT (IMRT) and image-guided RT (IGRT), permits a reduction in, or favourable redistribution of, the volume of healthy tissue irradiated, with a consequent reduction in side effects and complications, and permits dose escalation within the target. Hypofractionation, stereotactic body radiotherapy, and stereotactic radiosurgery are being employed with greater frequency. Both dose escalation (e.g. prostate cancer) (Zelefsky et al., 2008) and hypofractionation (e.g. lung and liver cancers) (Timmerman et al., 2010; Hong et al., 2016) have been shown to increase the local control rate significantly in some tumours.
As in
3.1. Clinical treatment planning
When planning a treatment, a clearly defined prescription or reporting point and detailed information regarding total dose, dose per fraction, and total elapsed treatment time allow for proper radiation delivery and appropriate comparison of outcome results. Treatment planning in RT typically utilises the concept of various ‘volumes’ (ICRU, 1993, 1999), including the gross tumour volume (the gross palpable or visible demonstrable extent and location of malignant growth), the clinical target volume (the tissue volume that contains a demonstrable target and/or subclinical microscopic malignant disease that has to be treated adequately to achieve the aim of RT cure or palliation), an internal target volume (including an internal margin designed to account for variations due to organ motions), and the planning target volume (with an additional margin for set-up uncertainties, as well as machine tolerances and intratreatment variations).
An organ at risk (OAR) is an organ whose sensitivity to radiation is such that the dose received from a treatment plan may be significant compared with its tolerance. Organs with low tolerance doses and potentially significant clinical complications (e.g. the lens of the eye during nasopharyngeal or brain tumour treatments) are important to consider as OARs, even when not immediately adjacent to the treated volume. An understanding of tolerance doses in whole-organ, partial-organ, and general non-uniform organ irradiation is essential for optimising target doses, perhaps with a change in the field arrangements or a change in the dose, in order to spare an OAR or minimise the dose it receives.
3.2. Tolerance dose concept
The classification of radiation effects for dose limitation purposes has changed over time (Hamada and Fujimichi, 2014). Threshold doses for tissue reactions can be reached in some patients during RT procedures. However, radiation oncologists and clinical physicists typically do not utilise the concept of a sharp threshold dose for complications, but consider complications as probabilistic. This is primarily because the complication studies that inform the clinicians are based on populations that are often highly inhomogeneous in terms of patient-specific risk factors (e.g. genetics, age, co-morbidity, concurrent medication, and many others). In radiation oncology, tissue reactions are managed through an earlier concept of ‘tolerance dose’ (Hamada and Fujimichi, 2014). In contrast to the current ICRP definition of a threshold dose, the tolerance dose may depend on details of the therapy such as the number of fractions and prior or concurrent treatment. The tolerance dose is preferably based on evidence-based studies of similarly treated patient populations. In radiation oncology, tolerance doses are associated with an acceptably low complication probability, but ‘acceptable’ includes a broad range of incidence, depending on the trade-off between tumour control and complication severity. An example is radiation myelitis, a clinically devastating complication. Clinicians go to great lengths to keep spinal cord doses ‘low enough’ to limit the expected myelitis rate below 1% (Kirkpatrick et al., 2010; Sahgal et al., 2013). For radiation myelitis, keeping the complication rate at less than 1% usually takes priority over tumour control. In other cases, clinically acceptable rates of non-lethal complications can be 20–30% or more, depending on the trade-off with local tumour control or cure. Thus the tolerance dose for a specified complication is usually higher than the threshold dose. Most RT clinicians evaluate both dose and dose distribution (‘dose–volume effects’). 
As a further example, because of the nature of lung complications, clinicians often assess the mean lung dose or the percentage of the lung that would receive more than a particular dose level (Marks et al., 2010a).
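Metrics such as the mean lung dose and Vx (the percentage of an organ receiving more than x Gy) can be computed directly from the planned dose to each voxel of the organ. A minimal sketch, assuming equal-volume voxels (the dose values below are purely illustrative, not clinical data):

```python
def mean_dose(voxel_doses):
    """Mean organ dose in Gy, assuming equal-volume voxels."""
    return sum(voxel_doses) / len(voxel_doses)

def v_x(voxel_doses, threshold_gy):
    """Vx: percentage of the organ volume receiving more than
    threshold_gy (again assuming equal-volume voxels)."""
    above = sum(1 for d in voxel_doses if d > threshold_gy)
    return 100.0 * above / len(voxel_doses)

# illustrative lung voxel doses (Gy)
doses = [0.5, 1.2, 5.0, 18.0, 22.0, 35.0, 40.0, 2.0]
mld = mean_dose(doses)   # mean lung dose, 15.4625 Gy here
v20 = v_x(doses, 20.0)   # % of lung receiving > 20 Gy, 37.5% here
```

In practice these quantities are read off the dose–volume histogram produced by the treatment planning system; the sketch only shows the underlying computation.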
The scoring of adverse radiation effects has usually relied on relatively coarse ‘grading schemes’ of clinical manifestations of severity (NCI, 2010). Hence, ‘tolerance’ does not simply imply that subclinical or relatively minor effects are absent. As many late radiation effects progress with time, tolerance doses for a specified level of damage are not absolute, but decrease with increasing follow-up time. Such values should therefore typically be identified as pertaining to a specified time after exposure (e.g. 5 years). In short, a tolerance dose is highly dependent on time–dose–volume factors. Actual values depend on the treatment protocol, irradiated volume, concomitant therapies, clinical status of the organ/patient, and follow-up time.
3.3. Historical approach
The importance of time–dose–volume factors in radiotherapy has long been recognised (Ellis, 1963, 1969), but clinical factors were not always taken into account. Attempts were made to understand the relationship between time–dose data and patterns of radiation damage in normal tissue (Rubin and Casarett, 1968a,b; Mendelsohn, 1969; Wara et al., 1973, 1975; Mah et al., 1986; Mah and Van Dyk, 1988). RT fields and doses were often selected empirically, based largely on experience and clinical intuition, and may not have fully reflected the underlying normal anatomy, physiology, and dosimetry. Treatments had to be designed without volumetric imaging and fast computers. Therefore, in many early clinical situations, the radiation oncologist used simple field arrangements, and treated to tolerance doses supported by the experience of that era.
An urgent need to address this deficiency became apparent with the increasing use of computed tomography (CT), three-dimensional treatment planning, and improved dose delivery. In the late 1980s, a task group sponsored by the National Cancer Institute pooled available clinical experience, judgement, and information regarding the dose–volume dependence of major complications, and introduced the concept of partial organ tolerance doses (Emami et al., 1991). This provided a practical summary of tolerance doses for 21 complications from conventionally fractionated treatments (20–40 fractions), including their dependence on the dose distribution within the organ. Some of the tolerance doses were based on hard experimental and clinical investigational data, some were based on less firm data that were considered reliable at the time, and some were based purely on the clinical experience of the authors. With the understanding of these uncertainties and limitations, the pooled information on normal tissue tolerance to therapeutic irradiation proved useful for RT treatment planning purposes, and is still widely used for certain complications.
Several other extensive reviews have addressed radiation effects in normal tissues (Potten and Hendry, 1983; UNSCEAR, 1988; Alberti et al., 1991; AGIR, 2009; Shrieve and Loeffler, 2011), and in specific organ systems such as skin (Potten, 1985; ICRP, 1991), intestine (Potten and Hendry, 1995), bone marrow (Hendry and Lord, 1995), and the immune system (UNSCEAR, 2006).
3.4. Quantitative analysis of normal tissue effects in the clinic
In an effort to summarise more recent organ-specific outcome data, the American Association of Physicists in Medicine and the American Society for Radiation Oncology funded the QUANTEC (Quantitative Analysis of Normal Tissue Effects in the Clinic) project, which began in 2007. This effort resulted in an update and refinement of the earlier estimates of organ tolerance doses that has provided valuable practical clinical guidance for physicians and treatment planners (Marks et al., 2010b). Several limitations in the available data are still recognised, such as low statistical power, differing endpoint definitions, and variable RT applications (e.g. fractionation schedules, number of fields utilised, total dose delivered). Outcomes studies continue to the present day, including investigations of normal tissue complications following hypofractionated treatment. However, complications that result from very low (<2 Gy) doses remain poorly studied in RT. Although it is considered good practice to keep out-of-field and exit doses as low as practicable, out-of-field and on-treatment imaging doses are poorly reported, if at all, by conventional planning systems. Detailed patient-specific measurements of such doses are uncommon. The incidence of these adverse effects is low, and would not be apparent in typical single institution or protocol outcomes.
3.5. Normal tissue complication probability models
Several efforts have been made in order to quantify the relationship between the severity of tissue damage, total dose and dose distribution, dose per exposure, number of exposures, overall duration of exposure, and other factors (e.g. chemotherapy, co-morbidities). Such models have been useful in radiation oncology practice and research, and in clinical oncology, such as those developed to estimate normal tissue complication probability (NTCP) for partial volume irradiations and inhomogeneous dose distributions (Marks et al., 2010b). Models typically include, at a minimum, parameters describing the dose for a given probability of damage and the steepness of the dose–response relationship.
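One widely used example of such a model is the Lyman probit formulation, in which NTCP is a cumulative normal function of dose: TD50 is the dose giving a 50% complication probability and m sets the steepness of the dose–response curve. A minimal sketch for uniform whole-organ irradiation (the parameter values shown are illustrative assumptions, not clinical recommendations):

```python
import math

def ntcp_lyman(dose_gy: float, td50_gy: float, m: float) -> float:
    """Lyman probit NTCP for uniform whole-organ irradiation:
    NTCP = Phi(t), with t = (D - TD50) / (m * TD50),
    where Phi is the standard normal CDF (computed via erf)."""
    t = (dose_gy - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# by construction the model returns 0.5 when D equals TD50
p = ntcp_lyman(45.0, 45.0, 0.18)  # 0.5
```

For partial-volume or inhomogeneous irradiation, the dose argument is usually replaced by an effective or equivalent uniform dose derived from the dose–volume histogram.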
3.6. Ocular effects in radiotherapy
Radiation exposures of the eye may result in lens changes, including cataract formation (Cogan and Donaldson, 1951; ICRP, 1969, 2000; Merriam and Worgul, 1983; Kleiman, 2007). The initial stages of lens opacification may not result in visual disability, but the severity of these changes may increase progressively with dose and time until vision is impaired and cataract surgery is required (Merriam and Worgul, 1983; Lett et al., 1991; NCRP, 2000; Neriishi et al., 2007). The latency period for these changes appears to be inversely related to dose.
As noted previously, the Commission has concluded that there is a dose threshold of approximately 0.5 Gy for cataracts induced by acute exposures, with 90–95% confidence intervals that include zero dose (ICRP, 2012). For fractionated and protracted exposures, the Commission has determined that values of approximately 0.5 Gy are appropriate. These conclusions are based on recent studies where formal estimates of threshold doses have been made after long follow-up periods. This apparent dose threshold is lower, by a factor of approximately 10, than was deduced from earlier studies, likely because the earlier studies had shorter follow-up periods, did not address latency and dose dependence, were not sensitive to early lens changes, and had low statistical power.
Several of the early cataract studies were specifically associated with RT (Cogan and Dreisler, 1953; Merriam and Focht, 1957; Qvist and Zachau-Christiansen, 1959), with a wide range of ages (infants to 84 years), time to follow-up (1–40 years), and doses (0.23–69 Gy). Most were small case series or had short follow-up. Several epidemiological studies support a lower or zero threshold for cataracts from RT. Albert et al. (1968) studied young children irradiated for
At about the same time as
A recent study evaluated available literature reporting cataract incidence after haematopoietic stem cell transplantation with total body irradiation (Hall et al., 2015), and found that dose (with predicted dose–response curve), dose times dose per fraction, paediatric status, and regimented follow-up care by an ophthalmologist were predictive of 5-year cataract incidence. The odds ratio for paediatric patients was 2.8 (95% confidence interval 1.7–4.6) relative to adults, and fractionation appeared to have a substantial protective effect (Hall et al., 2015). The risks of cataract among survivors of childhood cancer were recently evaluated in 13,902 5-year survivors in the Childhood Cancer Survivor Study (Chodick et al., 2016). Patients receiving radiotherapy had mean doses to the lens of 2.2 Gy (0–66 Gy) as estimated by radiotherapy records, and a total of 483 (3.5%) cataract cases were identified. Cataract prevalence was positively associated with the dose to the lens in a manner consistent with a linear dose–response relationship (excess odds ratio per Gy 0.92, 95% confidence interval 0.65–1.20), and the risk of cataract increased with increasing exposure, beginning at doses to the lens as low as 0.5 Gy (Chodick et al., 2016). The pooled reviews and recent studies further demonstrate that cataract formation can be associated with the lowest RT doses, and that the lens should be given specific attention, both in planning optimised (lowered dose to the lens) treatments wherever possible and in postirradiation management.
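The linear excess odds ratio of 0.92 per Gy reported by Chodick et al. (2016) implies an odds ratio of 1 + 0.92 × D at lens dose D. A minimal sketch of that relationship (illustrative only, not a clinical risk calculator):

```python
def cataract_odds_ratio(lens_dose_gy: float, eor_per_gy: float = 0.92) -> float:
    """Odds ratio for cataract under the linear excess-odds-ratio model
    OR(D) = 1 + EOR * D, with EOR = 0.92 per Gy as reported by
    Chodick et al. (2016)."""
    return 1.0 + eor_per_gy * lens_dose_gy

# at the cohort's mean lens dose of 2.2 Gy
or_mean = cataract_odds_ratio(2.2)  # about 3.0
```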
Effects on the lens of the eye are particularly important considerations in the treatment of head and neck and brain cancers. Here, high-quality treatment planning will minimise the dose to the lens whenever possible without compromising target coverage. Cataracts are also a risk of total body irradiation, but little can be done without jeopardising disease control. IGRT, which is being utilised increasingly, introduces an additional risk because it can deliver additional on-treatment and post-treatment dose to the eye (Murphy et al., 2007). Some protocols include imaging daily or at every fraction, and could deliver significant doses (Kim et al., 2013). With respect to diagnostic imaging, Vañó et al. (2015) noted that image optimisation [and implementation of lens-dose-sparing procedures (when possible)] should be undertaken for patients receiving several head CTs, as these patients might receive doses to the lens greater than 0.5 Gy. Similar considerations apply to IGRT imaging, which often utilises cone beam CT along with other imaging modalities.
Increasing knowledge of the pathogenesis, prevention, and treatment of radiation-induced damage to the orbital contents will allow a reduction in risks of RT. Optimal treatment planning and improved treatment delivery techniques, including the use of eye shielding (when possible and when it is felt that it will be beneficial), informed by knowledge of overall ocular effects (including the new understanding of lowered cataract thresholds), can help minimise the risk of toxicity in RT patients by reducing ocular doses where possible. This is important for both external-beam RT modalities. Radiation oncologists and ophthalmologists should work collaboratively to avoid ophthalmic complications where possible, and to recognise and treat them promptly when they arise (Jeganathan et al., 2011).
Patients are typically given information on risks to the lens of the eye, and an informed consent discussion is held. Few patients fail to consent to potentially life-saving treatment in order to avoid cataract surgery in the future. If a cataract impairs visual function, lens replacement surgery, although an invasive procedure, is usually highly successful (Vasavada et al., 2012). While 90% of patients achieve fully corrected vision, complications occur in a low percentage of patients, including retinal detachment, oedema, formation of secondary cataracts on the replaced lens, and others. Stein (2012) reviewed the available literature on serious adverse events after cataract surgery: posterior capsule rupture occurred in 1.9–3.5% of patients, retinal detachment in 0.4–3.6%, endophthalmitis in 0.05–0.3% (with a collective rate of 0.128%), and suprachoroidal haemorrhage in 0.03–0.13% (Stein, 2012).
3.7. Cardiovascular and cerebrovascular disease in radiotherapy
Radiation-induced heart disease in cancer survivors includes a wide spectrum of cardiac pathologies, including coronary artery disease, myocardial dysfunction, valvular disorders, and pericardial disease. These usually present 10–15+ years after exposure, although asymptomatic abnormalities may develop much earlier. The long delay between exposure and symptomatic expression of damage probably explains why the radiation sensitivity of the heart was underestimated previously. Only in the last few years has there been consolidation of the evidence on this topic. The evidence arises primarily from RT experience and epidemiological studies following nuclear and other radiation activities. From current evidence, the Commission now notes an acute dose threshold of approximately 0.5 Gy for both cardiovascular and cerebrovascular diseases (ICRP, 2012).
Cardiac effects have been investigated most extensively in long-term follow-up studies of patients treated with radiation for breast cancer and Hodgkin’s lymphoma. Survivors of Hodgkin’s lymphoma show a strongly elevated risk for cardiac death depending on the age of the patient (increased risks for irradiation at young age), the treatment method used, and the follow-up time (Boivin et al., 1992; Hancock et al., 1993; Adams et al., 2003; Aleman et al., 2003; Swerdlow et al., 2007). Radiation causes both increased mortality (mainly fatal myocardial infarction) and increased morbidity. Patients with Hodgkin’s lymphoma also have a significantly higher risk of requiring valve surgery or revascularisation procedures 15–20 years after treatment (Hull et al., 2003).
Increased cardiac morbidity and mortality has been widely reported after treatment for breast cancer, especially when older treatment techniques were used (Adams et al., 2003; Gaya and Ashford, 2005; Senkus-Konefka and Jassem, 2007). In addition, a preliminary analysis of updated data from the Early Breast Cancer Trialists’ Collaborative Group (>30,000 women followed for up to 20 years post treatment) demonstrated that risk of cardiac death was related to the estimated cardiac dose, and increased with mean total cardiac dose (Darby et al., 2010). A 2011 study reported the incidence of heart disease in 35,000 women treated with RT for breast cancer and followed up for 30 years (McGale et al., 2011). It concluded that breast cancer RT has, at least until recently, increased the risk of developing ischaemic heart disease, pericarditis, and valvular disease. Two other publications also demonstrated significant increases in cardiovascular morbidity and mortality in long-term survivors of childhood cancers (Mulrooney et al., 2009; Tukenova et al., 2010).
Darby et al. (2013) studied the risk of ischaemic heart disease in 2168 women after RT for breast cancer with mean heart doses ranging from 0.03 to 27.72 Gy, and an overall average of 4.9 Gy. The results showed that the rates of major coronary events increased linearly with mean dose to the heart at 7.4% Gy−1, with no apparent threshold below which there was no risk. The risk began to increase within the first 5 years after the exposure, and continued for at least 20 years. The percentage increase in risk per Gy was similar for women with and without cardiac risk factors at the time of radiotherapy. Similarly, significantly increased risks of stroke have been described in adult patients treated with RT for head and neck cancer (Dorresteijn et al., 2002; Haynes et al., 2002; Scott et al., 2009), and long-term survivors of childhood leukaemia and lymphoma (Bowers et al., 2005, 2006).
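The linear, no-threshold finding of Darby et al. (2013) can be written as a relative rate of major coronary events of 1 + 0.074 × D for mean heart dose D. A minimal illustrative sketch of that relationship (not a clinical tool):

```python
def coronary_event_relative_rate(mean_heart_dose_gy: float,
                                 err_per_gy: float = 0.074) -> float:
    """Relative rate of major coronary events under the linear
    no-threshold model of Darby et al. (2013):
    RR(D) = 1 + ERR * D, with ERR = 7.4% per Gy."""
    return 1.0 + err_per_gy * mean_heart_dose_gy

# at the cohort's overall average mean heart dose of 4.9 Gy
rr = coronary_event_relative_rate(4.9)  # about 1.36
```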
The studies evaluated by the Commission and the more recent evaluations appear to support the Commission’s previous low threshold values for cardiovascular disease. Care in planning, evaluating, and implementing RT treatments is appropriate. Radiation-related cardiotoxicity in cancer patients can also be influenced by additional treatment with systemic therapy, such as chemotherapy, that may also result in cardiac effects. Combined modality treatments, increasingly used in cancer therapy, should continue to be implemented with care. The risk of RT-related cardiovascular disease might also be increased through indirect effects of RT (e.g. hypertension following kidney irradiation) or through other pre-existing general risk factors for cardiovascular diseases.
IGRT is being utilised increasingly and can result in additional patient dose because of between-treatment image guidance and post-treatment follow-up (Kim et al., 2013). Such imaging could result in cumulative (long-term) cardiac doses >0.5 Gy (Hess et al., 2016). Vañó et al. (2015) noted that optimisation is particularly important for patients who undergo several CT coronary angiography examinations, because their heart doses may exceed 0.5 Gy. It is likely even more of a concern for young people with curable cancers such as Hodgkin’s lymphoma. Similar considerations for ‘gentle IGRT’ (Hess et al., 2016) should be made for IGRT imaging, which often utilises cone beam CT in addition to other imaging modalities.
Cardiovascular toxicity following RT and/or chemotherapy is expected to change in the future. On one hand, a decrease in toxicity is expected because of improved technical methods to reduce the dose to the heart and major blood vessels. For example, there is currently a major effort to use virtual simulation and CT planning techniques to estimate doses to various parts of the heart for breast cancer RT, and to evaluate the benefits of treatment position (prone vs supine), partial breast irradiation, and deep-inspiration breath hold in reducing the mean cardiac volume that receives a significant dose (Gaya and Ashford, 2005; Nieder et al., 2007; Taylor et al., 2007). Increased use of combined modality treatment has allowed reduction of both treatment dose and treated volume in thoracic lymphoma treatments (Portlock, 2015). On the other hand, improved treatment of lower-stage malignancies may improve long-term survival and lead to a greater number of survivors being at risk for late-effect radiation-induced heart disease. Quality treatment planning (e.g. partial breast irradiation with external-beam RT or brachytherapy, breath-hold techniques, proton therapy, or dose de-escalation for lymphoma) can help to minimise heart dose, and prevent or limit cardiac complications. However, the dose inside the target volume must be adequate for tumour control, and cardiac complications may not be avoidable. Once technical means for cardiac dose minimisation have been exhausted, the best solution may be patient education, with emphasis on the need for continued cardiac surveillance and encouragement to adopt a heart-healthy lifestyle.
3.8. Research needs: radiotherapy
An improved mechanistic understanding of the underlying radiobiology of tumour control and normal tissue complication epidemiology is clearly still required (Brenner et al., 2003; Joiner and van der Kogel, 2009; Dauer et al., 2010; Dorr, 2015). At the same time, an improved understanding of how treatment decisions affect outcomes is also needed. Despite a large number of dose–volume–outcome publications, progress in NTCP modelling to date has been modest and confusingly inconsistent. The QUANTEC reviews (Marks et al., 2010b) have been helpful, and additional organ-/tissue-specific reviews continue, but these have also demonstrated the limited accuracy of existing prediction models, in part due to differences in endpoint and even organ definitions, incompletely reported information, and low statistical power (especially where there is a low absolute incidence of recorded tissue toxicities) (Jackson et al., 2010). These problems are exacerbated for very low doses because such out-of-field doses are modelled inaccurately, if at all, by clinical treatment planning systems, patient-specific measurements of such doses are rare, and cancer patients in general receive unaccounted-for dose from frequent radiological imaging.
3.9. General radiological protection for radiotherapy patients
Modern RT employs complicated technologies, including advanced imaging for more accurate treatment planning, IMRT, volumetric arc therapy, tomotherapy, IGRT for improved treatment localisation, respiratory gating, robotic systems, radiosurgery, newer and more complex treatment planning systems, virtual simulation, and ‘all-inclusive’ electronic patient data management systems. As such, it is increasingly important that radiation oncology facilities incorporate quality assurance programmes, as well as follow agency and professional society guidance for implementation (Podgorsak, 2005; AAPM, 2013; ASTRO, 2013a,b; ESTRO, 2013). Radiological protection for RT patients relies heavily on the collaborative efforts of several professionals, whose coordinated team approach greatly influences the outcome of the treatment.
Radiological protection for RT patients should not only address curative (or palliative) radiation delivery while controlling tissue effects, it should also include understanding, and perhaps minimisation, of stochastic effects. Both processes require specific information on the doses received by tissues at risk. The comprehensive review by Xu et al. (2008) was the first to compile and summarise tremendous amounts of data on dosimetric studies related to radiation-induced cancers in patients after RT. NCRP followed with a comprehensive review (NCRP, 2011) of second primary cancers, which focused on the complex epidemiologic and dosimetric issues surrounding past, conventional, and modern RT modalities and techniques, including IMRT and proton beam therapy.
4. CONCLUSION
RT of cancer patients involves a trade-off between a sufficient tumour dose for a high probability of local control and dose to OARs that is low enough to lead to a clinically acceptable probability of toxicity. Often, ‘low enough’ is >10 Gy, depending on the complication and the treatment details. However, the recent
Acknowledgements
The authors wish to thank the members of ICRP Committee 3, Radiation Protection in Medicine. This study was supported, in part, by National Institutes of Health/National Cancer Institute Cancer Center Support Grant P30 CA008748.
