Abstract
The infectious pandemics and epidemics of the past 200 years have caused millions of deaths. However, these devastating events have also led to creative thinking, imaginative experimentation and the evolution of medical care. As a result, the history of critical care medicine is entwined with the story of these global disasters. This article will take case studies from recent pandemics and epidemics and examine their impact on the development of anaesthesia and intensive care medicine.
Infectious epidemics (the regional spread of disease) and pandemics (the worldwide spread of disease) have plagued humanity for millennia. Desperation born from widespread morbidity and mortality has led to creative experimentation and many developments in medicine; in particular, the specialties of anaesthesia and intensive care medicine have been widely influenced by infectious pandemics. 1
This article will review some of the major epidemics and pandemics of the past 200 years with a specific focus on developments in anaesthesia and critical care medicine.
Cholera
Cholera, a deadly diarrhoeal illness caused by the bacterium Vibrio cholerae, has been responsible for seven pandemics since the 19th century. Collectively, the pandemics have caused millions of deaths. 2 Intravenous fluid therapy, a cornerstone of management in anaesthesia and critical care medicine, was first used in the second cholera pandemic of 1826–1837.
Appalled by the scale of death and ineffectiveness of medical intervention, 22-year-old medical graduate William O’Shaughnessy used his background as an amateur chemist to attempt to treat the disease. Analysis of patients’ blood led him to consider the idea of intravenous saline, a treatment he first tried on dogs. 3 O’Shaughnessy’s principles were first used clinically in 1832 when Dr Thomas Latta reported ‘the need for early use of saline injection’. 4 Latta’s original publication detailed the administration of intravenous fluids to three patients, only one of whom survived. He and his colleagues continued using intravenous fluids throughout the year with variable success. Although others tried the therapy during the 1832 cholera epidemic, enthusiasm for intravenous fluids was short-lived as survival rates were low. It has been hypothesised that poor selection of recipients (moribund patients), too little fluid or lack of sterility contributed to the failure of intravenous fluids to be adopted at that time.5,6
In the following cholera epidemics, physicians largely reverted to more traditional oral and rectal hydration. Some 30 years later, there was a resurgence of interest in intravenous fluids when Goltz, in a complex physiology paper, suggested that death from haemorrhage could be prevented by simple fluid replacement. 7 His hypothesis was supported by the use of intravenous fluids in animal models by Kronecker and Sandler in 1879. 8 Subsequently Bischoff used intravenous fluids to treat antepartum haemorrhage in 1881. 9 The following year, Jennings administered 16 ounces of ‘saline alcoholic solution’ to a moribund obstetric patient and ‘signs of animation rapidly appeared’. 10 Intravenous fluid administration became increasingly popular from then on, with more widespread use in obstetric, operative and traumatic haemorrhage, and reports of increased safety compared to blood transfusions.6,11,12 There is further evidence that intravenous fluids were used in other infectious epidemics such as the typhoid fever epidemic in the early 20th century in the USA. 13
While the use of intravenous fluids in the cholera epidemic of 1832 did not directly lead to the ubiquitous use of intravenous fluids now, it was the devastating consequences of that disease that led to the first investigations of this important therapy.
Diphtheria
Diphtheria is a bacterial disease characterised by its ability to infect almost all mucosal surfaces in the body. 14 In 1890, Nobel Prize winner Emil von Behring described an antitoxin for the disease, followed by the first successful vaccine in 1913. 15 These discoveries slowly brought an end to a series of devastating epidemics. Between 1885 and 1894, diphtheria was responsible for over 12,000 child fatalities in New York alone, with more than 80% of the deaths in children younger than five years of age. Between 1889 and 1893, there were 700,000 reported deaths from diphtheria worldwide, a number that is likely to be an underestimate given reporting methods at the time. 16
Diphtheria caused membranous obstruction of the larynx. Once this occurred, the disease was usually fatal, and early attempts at tracheostomy in moribund patients were largely unsuccessful. French physician Eugene Bouchut attempted a series of laryngeal intubations in diphtheria patients in 1858. 17 Despite some early promising results, the tubes were anatomically unsuitable; the resultant complications included significant airway bleeds, and the procedure was abandoned.18,19
Tracheal intubation was further developed by Scottish surgeon, Sir William Macewan. He experimented with self-intubation finding he could breathe but ‘it was by no means a delectable sensation’. 20 He also reported success intubating a patient with epiglottic oedema secondary to having ‘a hot piece of potato… in the back part of the mouth… for some time’, 20 as well as patients with typhoid, and for airway surgery.21–24
The laryngeal tube was subsequently modified as a response to the late 19th century diphtheria outbreak in the USA. Joseph O’Dwyer, a New York otolaryngologist, shocked at the helplessness of the medical community in treating laryngeal diphtheria in children, was instrumental in modifying the equipment.24,25 His design allowed maintenance of laryngeal patency while the patient breathed spontaneously, and minimised the risk of airway trauma during insertion. The significance of the development was not lost on Chicago physician E Fletcher Ingals who, in his speech to the American Laryngeal Association in 1887, declared: ‘In 1880 began a series of experiments with intubation, which have resulted in giving to the profession one of the most useful operations of modern time.’ 26
As well as designing the appropriate laryngeal cannula, O’Dwyer refined the technique for endotracheal intubation, which was performed without anaesthetic: ‘The child is held upright in the arms of the nurse and a gag inserted in the left angle of the mouth… the operator inserts the index finger of the left hand to elevate the epiglottis and direct the tube into the larynx.’ 27 ‘Little if any pain is complained of by the patient.… Within five or ten minutes more the patient is usually breathing with perfect ease, and falls into a quiet sleep.’ 26
The endotracheal tube produced by O’Dwyer remained in use for decades and, in combination with the subsequent development of the diphtheria antitoxin, resulted in a drop in the mortality rate from laryngeal diphtheria from 100% to around 41%. 29 The mortality rate dropped to just 13% for children requiring intubation for less than 48 hours. 30 The death rates for intubated patients far exceeded those of patients who did not require intubation, so careful selection of patients was important. 29
In 1893, Karl Maydl added a Trendelenburg cone to an O’Dwyer tube to provide the first inhaled anaesthetic via endotracheal tube. 31 Significant advances have been made since, including the development of wide bore tubes, the addition of an inflatable cuff and improved materials such as the use of polyvinylchloride for modern endotracheal tubes.25,32–35
Spanish influenza
Occurring in the wake of the First World War, the ‘Spanish’ influenza pandemic of 1918–1919 remains the most significant pandemic, in terms of human mortality, in the past 200 years. The mass transit of soldiers globally, as well as overcrowding and poor sanitation, led to the rapid spread of the virus. 36 Up to a third of the world’s population at the time were infected with the novel H1N1 influenza strain (named after the proteins that appear on the virus surface). Of the 500 million infected, there were an estimated 50 million deaths worldwide. Unusually, the strain of influenza was particularly virulent in otherwise young healthy individuals between 20 and 40 years of age.36–38
Given their crowded nature, military camps served as perfect breeding grounds for the disease. In one case series from the American army Camp Grant, the virus was noted to have ‘unusual virulence’, with a quarter of the camp becoming infected, 26% of whom developed pneumonia with a mortality rate of 43%. 39
Similarly, the navy served as a common site of outbreaks, as related by an ex US navy nurse recalling her experiences serving in an Illinois hospital in 1918: ‘The morgues were packed almost to the ceiling with bodies stacked one on top of another. The morticians worked day and night. You could never turn around without seeing a big red truck being loaded with caskets for the train station so the bodies could be sent home.’ 40
By the early 20th century it was recognised that supplemental oxygen could have therapeutic benefit, but the administration of oxygen was often difficult and inefficient.43,44 In 1916, British scientist John Haldane designed an apparatus to deliver oxygen accurately—the Haldane mask. 45 Uptake of the device was accelerated during the First World War after its utility in treating complications of gas poisoning was recognised to be superior to simply using a tube and funnel. 46
Following the war it was recognised that ‘oxygen can do much good in the acute case of poisoning as… in influenzal bronchopneumonia’. 47 However, many general practitioners were initially reluctant to use oxygen therapy.
The optimal method of delivering oxygen was also debated, with oxygen chambers thought to be ‘the best’ but often impractical compared with the Haldane mask. 48 The recognition that influenza caused respiratory disease even led to the development of hyperbaric chambers to supply gas at greater than atmospheric pressure as a treatment for influenza.49,50 As the pandemic progressed, there was gradual recognition and acceptance that oxygen therapy should be used in cases of respiratory distress.51,52 This probably represents the first non-wartime mass adoption of oxygen therapy, a therapy which is now ubiquitous in anaesthesia and intensive care.
Intravenous fluids also became more popular in the wake of the First World War. It was recognised that ‘the most serious mischief done by haemorrhage… resides in the diminution in the volume of circulating blood’ 53 and, given the obvious frequency of haemorrhage in soldiers during the war, there was heightened interest in the administration of intravenous fluids. Experiments were performed with saline solutions, blood and gelatin solutions.53,54
Although the use of intravenous fluids for shock secondary to haemorrhage was well understood, their use for shock secondary to sepsis seems to have been less well appreciated. Despite this, there is some evidence of the use of intravenous fluids at the Canadian military camp, Camp Bramshott, during the influenza epidemic, where it was noted that ‘venesection relieved toxaemia especially if combined with saline or glucose and saline interstitially, intravenously or via the rectum’. 52
Other novel management therapies were attempted in the treatment of influenza. In highlighting the importance of sleep to the influenza patient, William Small, a British Army physician, recommended: ‘tepid sponging… but if insufficient, the patient should be given heroin hypodermically… until sleep is obtained’. 55
Small went on to describe treatment of pulmonary symptoms ‘where tremor is marked; where the pulse is unstable; and when the patient is obviously collapsed’: ‘Alcohol in the form of brandy, whisky or champagne gives the best results’ and ‘the compound tincture of chloroform and morphine is particularly efficient in soothing the often troublesome cough’. 55
Poliomyelitis
Poliomyelitis is a viral illness that can affect the central nervous system, causing weakness or paralysis in any part of the body. The virus has a propensity to infect young patients leading to significant childhood morbidity and mortality. 56 Infection of the cranial nerves with poliomyelitis (bulbar poliomyelitis) left patients unable to breathe. As a result, in 1928, Drinker and Shaw developed a negative pressure ventilator, the ‘iron lung’, which would allow prolonged artificial support of respiration.57,58 While a significant development in medicine at the time, the machines were not without their drawbacks, including limited supply.
In the early to mid 20th century poliomyelitis epidemics were an annual occurrence in Australia, many European countries, and North America. Hundreds of thousands of children were affected worldwide. 59 In 1952, Denmark experienced a particularly severe polio epidemic, with 3000 cases of polio in five months, a third of whom experienced paralysis. 60
The Blegdam Hospital, an infectious disease hospital in Copenhagen, experienced up to 50 new polio admissions a day. At times it had up to 70 patients requiring ventilation but only seven ventilators, all of them negative pressure machines. The mortality of patients requiring ventilation was 80%.
Dr Bjorn Ibsen, a Danish anaesthetist, was consulted by the hospital physicians to assist with the ‘desperate’ situation. Ibsen was able to introduce positive pressure ventilation successfully outside the operative setting for the first time, and in so doing, reduced mortality rates by over 50%. 60 ‘A tracheostomy was performed just below the larynx and a rubber cuff-tube was inserted into the trachea with manual positive pressure ventilation from a rubber bag.’ 60
Consulting an anaesthetist to help outside the operating room was a novel idea at the time. 62 The 1950s was an era when anaesthesia was still a relatively new specialty and anaesthetists were often regarded as ‘technicians who knew a few gimmicks’. 1 In Denmark and around the world, the demonstration of the usefulness of positive pressure ventilation assisted with the development of anaesthetic departments and cemented the role of the critical care doctor in the intensive care setting.62,63
Ibsen subsequently developed the ‘anaesthesiological observation unit’, providing 24-hour care to critically ill patients. Its mandate was to manage vital functions for postoperative patients, as well as patients with medical diseases and self-poisoning.1,64 These units paved the way for the development of the modern ICU. 1
Outside of the operating theatre, artificial ventilation services continued to develop worldwide for seriously ill patients. These units were often led or assisted by anaesthetists and showed great reduction in mortality across a wide range of conditions, highlighting once again how the dire circumstances resulting from a pandemic advanced critical care medicine. 63
Mid–20th century influenza pandemics
The middle of the 20th century saw the world experience two further influenza pandemics: the ‘Asian flu’ of 1957–1958 and the ‘Hong Kong flu’ of 1968–1970 are estimated to have been responsible for up to four million deaths globally. 65
The ‘Asian flu’ saw increased use of tracheal intubation and mechanical ventilation as a treatment for ‘fulminant pneumonia’. In a review article, British physician Neville Oswald recounted ongoing hesitation among physicians to intubate patients, despite intubation being a ‘life saving treatment’ which ‘almost certainly should have been performed earlier and more frequently’. 66
By the time of the ‘Hong Kong flu’, intubation no longer appeared to be controversial and was encouraged for those experiencing respiratory failure. The mainstays of critical care management included: ‘intermittent positive pressure ventilation and tracheobronchial lavage, together with the use of antibiotic and bronchodilator therapy’. 67 This represented a significant advance in management strategies since the ‘Spanish influenza’.
In 1967, David Ashbaugh and colleagues documented a series of patients in whom ventilation was difficult; all had characteristics more commonly associated with underdeveloped lungs in infancy. The syndrome became known as acute respiratory distress syndrome (ARDS). There were many underlying causes but up to a third were thought to be possibly secondary to viral pneumonia. 68 During the earlier ‘Asian flu’ pandemic, Robert Petersdorf had found pulmonary infiltrates post mortem in the lungs of those who had died from influenza. 69 He suggested that these lesions had been discovered as early as the influenza pandemic of 1918. Ashbaugh and colleagues subsequently noted the same deposits in patients who had died secondary to viral pneumonia and had been diagnosed with ARDS. 68 It was thought that these infiltrates might be the cause of the classic radiological findings associated with the syndrome.
During the Hong Kong flu epidemic of 1968–1970, many patients presented with ARDS, providing a new challenge for critical care experts. The description and recognition of ARDS has been important for critical care medicine. Management of the syndrome has changed significantly, yet it remains difficult to treat. Indeed, subsequent pandemics of respiratory infections have led to a high incidence of ARDS.
HIV/AIDS pandemic
The HIV/AIDS pandemic has been ongoing since the early 1980s. HIV is a lentivirus that attacks the body’s immune system. Untreated, it can lead to AIDS, which has been responsible for the deaths of 32 million people worldwide. 70
Early critical care concerns about HIV centred largely on inadvertent transmission from the infected patient to the healthcare worker. AIDS was defined in 1981, prior to the discovery of HIV, and while there was suspicion that the syndrome was caused by an infectious agent, it was not until 1983 that the virus was identified. 71
At the beginning of the outbreak, despite a long recognised association between handwashing and nosocomial infection, practices were variable, with just 41% of healthcare workers washing their hands after patient contact in one intensive care sample. 72
With this in mind, the Centers for Disease Control and Prevention (CDC) recommended precautions for patients suspected or at risk of having HIV. Suggestions included wearing gloves when handling materials soiled with patient bodily fluids, handwashing after patient encounters, and the use of disposable needles. 73 By 1985, the CDC had updated its recommendations, stating ‘precautions should be enforced routinely’, a recognition that disease transmission from patients to healthcare workers is an ever-present risk. 74 In 1987, the CDC coined the term ‘universal precautions’. 75 This is a series of measures, including barrier precautions, handwashing and prevention of injuries from sharp devices, which are now taught and expected as part of standard medical care.
A move towards plastic and single-use intravenous needles and giving sets preceded the HIV epidemic. Plastic giving sets were popularised in the late 1950s and early 1960s with the recognition that they caused a lower incidence of fevers, rigors and thrombophlebitis than traditional red rubber giving sets.76–78 In the 1960s, a number of alarming clusters of hepatitis in patients and healthcare workers who had been inadvertently inoculated with contaminated syringes79,80 generated a push towards improved needle safety and single-use equipment.
However, at the time of the HIV outbreak, recapping of used syringes remained commonplace. The practice was recognised to cause needlestick injuries, leading to a recommendation that it be stopped. 75 Surprisingly, the change in technique led to an increased incidence of needlestick injuries, with sharps often being discarded in non-rigid containers. 81 This led to the development of precursors to much of the technology commonly used when dealing with needles today. Rigid sharps containers and new recapping devices were thought to decrease the rates of needlestick injury and therefore the risk of inadvertent transmission of HIV to healthcare workers.82–84
Over time the intravenous cannula was further modified making way for the advent of needles with integrated safety devices in the early 1990s.85,86 The cannula continues to be improved with materials designed for easier sliding and threading, and notches for visualisation of flashback. 87 It is interesting to note that, despite all the safety devices, a 2014 Cochrane review found that there is little if any evidence that safety devices have decreased the rates of needlestick injuries. 88
The HIV pandemic also led to changes intended to stop the spread of infections between patients. In 1952, it was noted that endotracheal tubes, which were all reusable at the time, were potential vectors for transmitting infections from patient to patient, and sterilising practices were introduced.89,90 However, with fear of cross contamination in the setting of the HIV pandemic, reuse of endotracheal tubes became an unacceptable practice, leading to the recommendation for single use endotracheal tubes with viral filters.91–93
The HIV pandemic placed further reliance on the critical care physician. In the early years, patients would regularly require mechanical ventilation for Pneumocystis carinii pneumonia, which often proved difficult. 94 Advances in critical care medicine had made it possible to keep even the most unwell patients alive, but often at a significant cost to quality of life. The pandemic forced the specialty to re-evaluate treatment goals and raised ethical considerations pitting life-sustaining treatments against interventions to improve quality of life. There was an increased recognition that many tools of the critical care physician, while life-prolonging, could ‘greatly prolong suffering’. 95
The pandemic forced clinicians to encourage patients to develop advance care directives to guide their care should they become incapacitated.96,97 The decision to withdraw treatment, even when that treatment was medically futile, was and remains a difficult one, especially for HIV patients, who were often young and previously healthy. 96
Advances in antiretroviral therapy have been the main driver of the decreased mortality seen in patients with HIV. 98 In addition, the evolution of therapeutics and technology has seen mortality for patients with HIV in the ICU fall from about 70% at the beginning of the epidemic to 30% currently. 99 The ethical dilemmas regarding the allocation of treatment in ICUs remain an ongoing challenge.
Variant Creutzfeldt–Jakob disease
The variant Creutzfeldt–Jakob disease (CJD) epidemic of the mid-1990s in the UK brought a sharp focus back to the potential dangers of contaminated anaesthetic equipment. The prion disease, identified in 1996, was initially thought to be a zoonotic disease (bovine spongiform encephalopathy) transmitted to humans, 100 although this hypothesis has since been challenged. 101
The pathogen was identified in 129 cases in the UK between 1996 and 2002, causing much anxiety given the invariably fatal course of the disease. 102 Prions resist conventional sterilisation, so there is no reliable way to decontaminate anaesthetic or surgical equipment and prevent transmission to subsequent patients. As such, management of the CJD patient relied strongly on the use of disposable equipment in the operating theatre, including disposable airway equipment. 103
Fears about potential transmission from contaminated blood products also led to some blood banks, such as in Australia, excluding donors who resided in the UK during the epidemic from donating blood.104,105 Although CJD had a low incidence, its inevitable mortality and highly infective nature required further steps to mitigate the potential for the spread of the disease in the theatre and critical care environment.
21st century pandemics
Twenty-first century pandemics and epidemics, including the SARS-CoV-1 (SARS) pandemic of 2003, the Ebola epidemic of 2014–2016 and the SARS-CoV-2 pandemic of 2019–2020 (COVID-19), have once again impacted the work of anaesthetists and intensivists.
The Ebola epidemic in West Africa led to over 10,000 deaths. 106 The virus is noted for its extremely high mortality, up to 90%, and is highly transmissible through blood and body fluids. The 2014–2016 epidemic was marked by 869 infections of healthcare workers, 507 of whom died from the disease, the significance of which cannot be overstated given the dearth of healthcare workers in the affected developing nations. 107
The alarming rates of infection had a major impact on the development of protocols for personal protective equipment (PPE). As well as focusing on the type of equipment being used, emphasis was placed on comprehensive training, dedicated PPE donning and doffing areas and supervised removal of contaminated PPE. 108 It is thought that these measures were, at least in part, responsible for the decline in the cases of Ebola and the end of the 2014–2016 epidemic. 109 While the mode of transmission has changed, many of the methods developed for the control of Ebola have been translated into practices for the current COVID-19 pandemic.
The North American epicentre of the SARS pandemic was Canada. Caring for patients with high viral loads and involvement in aerosol-generating procedures such as intubation put anaesthetists and critical care doctors at particularly high risk of infection; 43% of all cases of SARS-CoV-1 in Canada and 41% in Singapore were healthcare workers, and globally 21% of cases were healthcare workers. 110 The importance of adequate PPE was once again reinforced. At the time, only 35% of UK anaesthetists regularly wore facemasks, compared with 75% of North American anaesthesiologists. 111
The high incidence of viral transmission to healthcare workers during the SARS pandemic led to the development of the intubating team, led by a senior anaesthetist. 112 The teams were governed by strict protocols designed to decrease exposure and therefore transmission of the virus.112,113 The COVID-19 pandemic has seen a common adoption of anaesthetist-led intubating teams for airway management of the infected patient.
With hospitals worldwide being overwhelmed by cases of COVID-19, anaesthetists and intensivists are again increasing their presence outside the theatre and ICU environment. As well as leading the intubation teams, they are also assisting with care of the COVID-19 patient, often in makeshift intensive care settings, and advocating for the protection of the healthcare worker in an environment where personal protective gear has become a scarce commodity.114–116
There has been further development of facemasks, with polypropylene microfibres being used to manufacture filtering facepiece masks and N95 masks. Both mask types are considered standard of care for looking after the infectious patient who is undergoing an aerosol-generating procedure such as intubation. 117
Worry about transmission to the healthcare worker has led to the development of novel technologies such as personal ventilation hoods. Large plastic domes are used to enclose ICU patients undergoing an aerosolising procedure. The concept is designed to minimise spread of the aerosolised particles without having to nurse patients in a private room. 118
Other devices developed for the protection of anaesthetists and intensivists during the COVID-19 pandemic include the ‘aerosol box’. The plastic box is designed to surround the patient’s head during intubation and minimise contaminated particles reaching the clinician. Paradoxically, these boxes have been found to increase the risk of infectious transmission, 119 a cautionary tale against hastily developing medical equipment without rigorous assessment of its utility and potential risk.
As well as advances in PPE, existing intensive care management strategies have been refined for the COVID-19 patient. These include trials of ‘awake’ prone positioning in patients with severe hypoxaemic respiratory failure prior to intubation. Preliminary findings suggest that the technique may help to improve oxygenation in this cohort. 120
Other pandemic responses of relevance to critical care medicine
One of the oldest management strategies for controlling infectious diseases is quarantine. The measure can be traced back to the 14th century in Croatia where it was used in an attempt to limit spread of the plague in a pandemic thought to be responsible for up to 50 million deaths.121,122
Quarantine has been used for almost all epidemics and pandemics, but it is worth noting that epidemics of tuberculosis in the 19th century led to the development of the sanatorium.123,124 Many claims were made for their apparent success. In describing the effectiveness of patient isolation and barrier nursing performed at the sanatoriums, the editors of the British Journal of Tuberculosis remarked: ‘There can be no doubt but that the maintenance of a strictly hygienic course of life offers the best means known to modern medical science for dealing effectually with tuberculosis.’ 125
On war and critical care medicine
Infectious diseases have not been the sole contributors to many of the developments discussed above. War has been another important driver of change, and there are striking parallels between military conflict and human battles against infectious disease; both create desperate environments that foster experimentation and demand advances in medical care.
Anaesthesia matured as a result of experimentation in the American Civil and Boer Wars, with increased use and understanding of anaesthetic agents, especially chloroform. 126 Anaesthetic techniques were subsequently refined and adapted due to the demands of both World Wars, leading to increasing specialisation and the formation of anaesthetic colleges and societies.127,128 It must be acknowledged that many of the therapies mentioned in this review, such as effective isolation strategies, intravenous fluids and advances in resuscitation methods, were also developed and driven by the horrors of war.
In many ways, the two problems are entwined. Over the centuries, wars have provided the perfect milieu for the spread of disease, with many notable epidemics occurring during war times. From an estimated 300,000 deaths from plague during the 480 BCE Persian invasion of Greece, to 20–40 million deaths from ‘Spanish’ influenza during and after World War One, infectious pandemics have often taken a more devastating toll than the conflicts themselves. 129
Conclusion
Pandemics have challenged human existence for millennia and will likely continue to do so. The disastrous conditions they create call for innovation and provide an opportunity for specialties such as anaesthesia and intensive care to develop and advance. From the refinement of endotracheal tubes with diphtheria to the formation of specialised intubating teams during the SARS pandemic, critical care specialists and anaesthetists have assisted in improving patient outcomes during pandemics and epidemics. Similarly, these devastating infectious events have increased the scope and skills of the critical care practitioner.
Footnotes
Declaration of conflicting interests
The author(s) have no conflicts of interest to declare with respect to the research, authorship and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship and/or publication of this article.
