Abstract
Leaders who promote cybersecurity education focused on the human factors of cyberattack build a resilient workforce that complements technical protections, reducing organizational risk. Cybersecurity is a priority for information technology teams, which have relied primarily on technology to protect systems. As technical protections mature, vulnerability shifts to human factors, and education must address the risk presented by humans rather than machines. A human factors-centred education program trains staff to react to threats within the unique healthcare environment. Leaders may look to industries that have experienced similar technical advancement, such as aviation, for education practices grounded in human factors. This article outlines a cybersecurity education program developed for healthcare, applying strategies adapted from commercial aviation. Four core pillars of training are defined: (1) dynamic education delivery options, (2) social engineering focused simulations, (3) high-risk positions and role-based training, and (4) stakeholder and leadership engagement. The first phase of implementation has been analyzed and offers lessons for health leaders.
Introduction
Cybersecurity is a term that has become well known to Canadian health leaders. Rapid adoption of technology to enable safe patient care, to communicate, learn, collaborate, work remotely, and even maintain the physical healthcare environment has created a critical reliance on the availability of connected technologies. As such, the protection of that technology has made cybersecurity a priority for health leaders, information technology teams, governments, and insurance providers. In response to the threat cybercriminals present to the healthcare industry, investments have been made in maturing technical protections in an attempt to mitigate risk. As technical protections mature, areas of vulnerability shift toward the staff who operate the technology: the human factor. Health leaders who develop and promote cybersecurity education programs focused on the human factors of cyberattack build a resilient workforce that complements technical security protections to reduce organizational risk. This article outlines our hospital’s experience designing and implementing the first phase of a cybersecurity education program developed for a healthcare environment. The program applies human factors principles proven by commercial aviation within a large academic teaching hospital and incorporates smaller rural partner hospitals in program delivery through a regional security operations centre. Four core pillars of training are defined: (1) dynamic education delivery options, (2) social engineering focused simulation training campaigns, (3) high-risk positions and role-based training, and (4) stakeholder and leadership engagement. Initial results and lessons learned from the first phase of implementation are discussed, and opportunities for continued development are identified.
Background: From aviation to healthcare
Human factors have been applied in healthcare environments in the past with a goal of decreasing medical errors and improving patient safety. The focus has been on the interaction between healthcare professionals and the clinical environment (e.g., placement of sinks to improve hand hygiene compliance). 1 Rapid adoption of technology in the healthcare environment has brought about the emergence of an additional category of patient risk, that of cybersecurity. The rapid adoption of technology, and the risks associated with it, is not an experience unique to healthcare. The commercial aviation industry experienced a similar phenomenon with the rapid availability and adoption of computer-based control and navigation systems and the increased reliability of that equipment. It was found that in approximately 80% of aviation accidents, pilots contribute more to the problem than major malfunctions of aircraft systems. The determination was that the only way to further reduce the overall rate of accidents in the industry was not through adoption of additional technology but rather to concentrate on improving the performance of people. Introducing the concept of human factors–based training provided active steps to address the human elements that cause incidents to occur. 2
Human factors is the study of how people interact with their environments. In the context of aviation, the study of human factors focuses on “how pilot performance is influenced by such issues as the design of cockpits, temperature and altitude, the functioning of the organs of the body, the effects of emotions, and interaction and communication with other participants in the aviation community.” 2 This concept can be applied to healthcare if we consider healthcare staff in the role of the pilot. The interaction between humans and technology in the healthcare environment directly impacts the safety of patients just as the interaction between pilot and technology within the cockpit impacts the safety of the passengers onboard an aircraft. In both cases, errors can be fatal. Healthcare staff are influenced by the design of their environment (where and when interaction with technology takes place), the effects of emotion on logical thought and action, and the way they expect to interact and communicate with other participants in the healthcare community. These human factors influence the way in which healthcare staff interface with technology, and thus, the safety of the patients entrusted to their care.
From compliance to human factors: Designing wave 1
As is the case in many healthcare organizations, staff at our hospital have been assigned a mandatory cybersecurity training module annually for a number of years. The module outlined the requirement for complex passwords, defined “phishing,” identified common red flags, and informed staff on how to report suspicious e-mails and activities. The training was offered in the standard, self-paced lecture format that the majority of corporate e-learning is based on. In addition, the hospital purchased a phishing simulation platform, preloaded with common, generic phishing scenarios that would be sent to staff periodically throughout the year. Compliance with the education requirements was regularly achieved, yet phishing remained the number one cause of security incidents in the organization. Our cybersecurity team was able to significantly reduce the incidence of successful cyberattacks through the implementation of technology but could not implement technical controls that would prevent staff from falling victim to social engineering, defined by the Canadian Centre for Cybersecurity as “the practice of obtaining confidential information by manipulation of legitimate users.” 3 As in commercial aviation, the determination was made that the only way to further reduce risk was to concentrate on improving the way our staff interact with the technology they use every day, considering the human factors inherent in their work environment. Our team began to re-envision the training program to address human factors. Wave 1 included four pillars.
Dynamic education delivery options
The first step in designing our cybersecurity education was to identify the behaviour change that the training program should bring about; in this case, empowering our staff to react to a situation, like a socially engineered phishing attack, and effectively problem solve. The traditional self-paced e-learning module relied primarily on passive learning, that is, rote memorization of facts. While passive learning is useful in some situations, it is generally not the most effective model when the expected outcome is situational problem solving. 2 The first design decision for a human factors-based education platform, therefore, was to pivot to an active learning model.
Active learning is also a component of human factors education in aviation, which describes active mental involvement as “working with received new information, processing it, using it to solve problems, working out how it fits with what you already know, and figuring out why the information is necessary.” 2
In the context of healthcare staff, we assessed that the training needed to provide an opportunity for staff to engage and interact with materials to reinforce knowledge transfer and improve retention and recall. We endeavoured to build training that would match the daily pace of work healthcare staff are accustomed to, while leveraging familiar work environments. We investigated options to integrate educational material within existing work processes and established the following three goals for the formal education program, to be tested with a focus group: (1) Creation of an active learning model: gamified training presented users with an interactive scenario, paired with the knowledge transfer portion of the module, allowing staff to process the new information and apply it within the assigned lesson. (2) Shorter duration, increased frequency: breaking down the content, typically delivered annually, into its component parts and offering each in short “micro modules” aligned delivery with the pace healthcare workers experience in their standard workday. Reminders of cybersecurity are presented to staff more frequently, without increasing the overall annual time spent on cybersecurity training. (3) On-site, in-unit delivery: ensuring that staff had the tools to access training directly in their work environments created environmental associations with existing processes.
Topics addressed through the education module remained unchanged from the traditional e-learning cybersecurity training, but delivery was modified to accommodate the unique conditions of the healthcare environment and the needs of healthcare workers.
Social engineering focused phishing simulations
Simulation training has been a standard in cybersecurity awareness programs for some time, and the benefit of simulation training is well documented in the commercial aviation training industry, with some research showing that “for beginning pilots, an hour in a simulator early in training can be equivalent to as much as four to six hours in the airplane.” 2 Simulation is also a well-known training tool in healthcare, with extensive use in clinical skills labs and for surgical teams as prime examples. The goal of simulation training in this context was to provide a safe environment for staff, where, when presented with a potential phishing attack, they could test their ability to actively use the knowledge gained during formal training to problem solve.
Phishing simulations involve the creation of a fake phishing e-mail that is sent to staff to test their ability to correctly identify suspicious e-mail. Data is gathered in the background to provide metrics on staff performance. Staff who click the fake phishing link within the e-mail can be directed to a feedback page providing additional training material, often identifying the “red flags” that should have indicated the e-mail was suspicious. Pre-built scenarios are available for healthcare organizations to send to their staff, but most are generic and do not accurately represent a highly targeted phishing attack, in which an attacker spends time learning about an organization to create an e-mail that is far more convincing and less likely to be detected by staff. Guiding principles for the optimization of our phishing simulation program were developed to emphasize the social engineering aspect of phishing attacks and to ensure an impactful experience for staff. The goal of our cybersecurity training program was to optimize the phishing campaigns being sent to staff using human factors concepts, testing staff reactions and interactions with the material rather than using the results as a static metric. (1) Perform robust data gathering from the perspective of an outside entity (all internal information was off limits for the exercise) to develop highly targeted phishing scenarios directly relevant to staff. (2) Incorporate content designed to elicit an emotional response from staff (fear, anger, panic, and confusion), using high-value topics to create a feeling of urgency to respond. This mimics a common tactic employed by cybercriminals, where stress and urgency cause fixation or tunnel vision, diminishing the ability to see the broad picture and making staff prone to miss or ignore useful information that would normally be obvious. 2
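The performance data gathered in the background of a simulation reduces to a few simple campaign metrics. As a purely illustrative sketch (the field names and the metrics chosen here are hypothetical, not those of the platform described above), per-campaign click and report rates might be computed as:

```python
from dataclasses import dataclass

@dataclass
class SimulationResult:
    """Outcome for one staff member in one phishing simulation."""
    opened: bool    # opened the simulated e-mail
    clicked: bool   # clicked the fake phishing link
    reported: bool  # reported the e-mail as suspicious

def campaign_metrics(results):
    """Return click and report rates as percentages of all recipients."""
    total = len(results)
    if total == 0:
        return {"click_rate": 0.0, "report_rate": 0.0}
    clicks = sum(r.clicked for r in results)
    reports = sum(r.reported for r in results)
    return {
        "click_rate": round(100 * clicks / total, 1),
        "report_rate": round(100 * reports / total, 1),
    }

# Hypothetical results for a four-person campaign
results = [
    SimulationResult(opened=True, clicked=True, reported=False),
    SimulationResult(opened=True, clicked=False, reported=True),
    SimulationResult(opened=False, clicked=False, reported=False),
    SimulationResult(opened=True, clicked=False, reported=True),
]
print(campaign_metrics(results))  # {'click_rate': 25.0, 'report_rate': 50.0}
```

Tracking the report rate alongside the click rate matters: a staff member who reports the e-mail demonstrates exactly the problem-solving behaviour the training is meant to produce.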
High-risk positions and role-based training
Certain staff within a healthcare organization present a more attractive target to cybercriminals as a result of the role they hold within the corporate structure. Risks relate to the responsibilities these staff hold (leadership and financial) or to their access to systems and information that could be used to facilitate an attack. The identification of all high-risk positions within an organization requires a full risk assessment in partnership with human resources, but some high-risk roles are easy to identify. Executive leaders, for example, often have access to privileged information, and compromising the credentials of one of these individuals can be highly valuable to an attacker.
In the first phase of our role-based training initiative, we selected executive leadership and their direct support staff as a test group and designed a highly targeted phishing simulation. The intent was to create a baseline human risk score for the group and identify their specific training needs. Once the simulation was launched and the results evaluated, two observations were made: (1) Communication processes must be evaluated in conjunction with role-based risks to identify staff who have roles with high levels of access to the targeted individuals (access to schedules, e-mail, and privileges to send correspondence on behalf of the target individuals). These individuals should be included in the same risk group as those they support. (2) Experiencing a highly targeted simulation increases the awareness of risk within the target group and primes the environment for meaningful active learning. The simulation brought the subject to the top of mind, allowing more time to be spent creating solutions and less time outlining the problem.
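A baseline human risk score of the kind described can be as simple as a weighted combination of simulation performance and level of system access. The sketch below is purely hypothetical (the weights and the 1 to 5 access scale are assumptions for illustration, not the scoring used in the program):

```python
def human_risk_score(click_rate, report_rate, access_level):
    """Hypothetical baseline risk score on a 0-100 scale.

    Higher simulation click rates and broader system access raise the
    score; reporting suspicious e-mail lowers it. Weights are
    illustrative only.
    """
    # access_level: 1 (standard user) to 5 (privileged/executive)
    score = 0.6 * click_rate + 0.3 * (access_level * 20) - 0.2 * report_rate
    return round(max(0.0, min(100.0, score)), 1)

# An executive group with a 25% click rate and 50% report rate
print(human_risk_score(click_rate=25.0, report_rate=50.0, access_level=5))
```

Scoring support staff with the same `access_level` as the executives they support reflects the first observation above: access to schedules, e-mail, and send-on-behalf privileges places them in the same risk group.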
Stakeholder and leadership engagement
To reduce risk and strengthen information security, leaders must go beyond sending staff stand-alone courses and phishing simulations. Instead, leadership must create a security-aware culture, with best practices in mind across all business units. By embedding cybersecurity into an organization’s culture, it is much easier, in the long run, to reach behaviour change objectives. It is not enough for staff to go through the motions of cybersecurity training as a compliance requirement; they need to put their knowledge into action, staying alert to new threats. 4
In wave 1 of our human factors–based education program design, our Corporate Communications office was engaged to help build a cybersecurity culture by leveraging existing communications processes, channels, and stakeholder messaging. Our communications experts championed security culture by: (1) Integrating security messaging within existing corporate communication vehicles. Corporate communications often support regular communications intended for wide audiences within an organization (newsletters and e-casts). Incorporating regular reinforcement of security awareness messages offers the opportunity for repetition and reinforcement of secure behaviours. (2) Focusing communications on security trends in the media and leveraging people’s natural curiosity to deliver verified information about current cybersecurity events. Cybersecurity coverage in the media helps capture public interest and allows us to naturally draw parallels to risks within the healthcare environment. Messaging can reinforce behaviours that safeguard corporate data and can be implemented by staff to protect their personal assets.
Lessons learned and maturing the program
The first wave of implementation of an education program based on the principles of human factors focused on high-level modifications to existing programs to address human risk. While the changes were intentionally subtle, the results offer critical lessons that provide a baseline on which to mature the program.
Cultural change is a process, no matter the magnitude
Introducing a change as subtle as modifying the training cadence for a small test group can be met with resistance if not managed correctly. Change must be viewed as a process, not a project. The expectation that once a change is communicated, staff will immediately adapt is unrealistic. Several studies indicate that it can take two months to change behaviour, so breaking down the change into steps using a recognized change management framework and giving staff the time they need to adopt a change is vital to sustaining any desired behaviour change. 4
The organizational impact of phishing simulations may not align with individual staff simulation results
The implementation of the guiding principles outlined for the phishing simulation program generated an immediate, obvious increase in the impact of the simulations on the organization. Two simulations were performed in the first year of the program, with organizational responses (help desk, support, and communications) showing improvement by the second simulation. Individual results, however, demonstrated no material change (<3%). Future programs will require increased exposure to phishing simulations at the individual staff level in an effort to improve performance.
In the next wave of the program, simulations that test organizational response (includes all staff) will be conducted twice annually. “Cycled simulations” will introduce a bank of phishing simulations generated using the same principles of social engineering that will cycle through the organization, being sent to smaller groups of staff throughout the year. Goals have been adjusted to increase the frequency of individual staff exposure to phishing simulations to four annually and will be reassessed after one annual cycle.
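The rotation logic behind “cycled simulations” can be sketched simply. This is an illustrative assumption about scheduling, not the scheduling tool used in the program; the scenario names and group assignments are hypothetical:

```python
import itertools

def schedule_cycled_simulations(staff, scenario_bank, sims_per_person=4):
    """Assign each staff member a fixed number of simulations per year,
    cycling through the scenario bank so exposure is spread across the
    organization rather than sent to everyone at once."""
    scenarios = itertools.cycle(scenario_bank)
    return {
        person: [next(scenarios) for _ in range(sims_per_person)]
        for person in staff
    }

staff = ["nurse_a", "clerk_b", "exec_c"]
bank = ["payroll_update", "parking_notice", "it_password_reset"]
plan = schedule_cycled_simulations(staff, bank)

# Every staff member is scheduled for exactly four simulations this year,
# and the shared cycle staggers which scenario each person receives.
assert all(len(assigned) == 4 for assigned in plan.values())
print(plan["nurse_a"])
```

Because the cycle is shared across staff, no two adjacent groups receive the same scenario at the same time, which keeps the organization-wide simulations (run twice annually) distinct from the individual-level exposure target of four per year.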
Role-based training should be expanded in partnership with other corporate departments to ensure that all high-risk roles are captured
Results from the baseline phishing simulation of the executive leadership test group confirmed the value of targeted training for high-risk roles within the organization. The identification of all high-risk positions within an organization requires a full risk assessment that cannot be performed by information technology independently, due to the inherent bias a technology specialty may introduce. Partnership with departments such as corporate risk, human resources, privacy, and finance to perform a role-based risk assessment will provide a more complete baseline on which to focus role-based training in wave 2.
Designing the future
Human risk is a significant point of vulnerability that has not seen the same level of investment and maturity as technical security controls in Canadian healthcare organizations. Incorporating the human risk factor in a standard cybersecurity defence strategy by introducing cybersecurity education programs focused on the human factors of cyberattack fortifies the organization’s defences against an information security breach.
Insight into human risk offered by the Canadian aviation industry presents a strong starting point for healthcare organizations. Once successfully adapted to the healthcare environment, human risk management will continue to evolve in this unique environment.
Acknowledgements
I would like to thank the Information Technology teams at London Health Sciences Centre (LHSC), St. Joseph’s Healthcare London, St. Thomas Elgin General Hospital, Woodstock Hospital, Middlesex Hospital Alliance, Listowel Wingham Hospitals Alliance, Tillsonburg District Memorial Hospital, Alexandra Hospital, and South Huron Hospital Association for supporting and contributing to the evolution of our regional cybersecurity training program. I would also like to thank LHSC CIO Andrew Mes, and LHSC CISO Keith Lawson for their mentorship and support throughout this process.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Ethical approval
Institutional Review Board approval was not required.
