Abstract
Qualitative evaluations of courses prove difficult due to low response rates. Online courses may permit the analysis of qualitative feedback provided by health care providers (HCPs) during and after the course is completed. This study describes the use of qualitative methods for an online continuing medical education (CME) course through the analysis of HCP feedback for the purpose of quality improvement. We used formative and summative feedback from HCPs about their self-reported experiences of completing an online expert-facilitated course on tobacco dependence treatment (the Training Enhancement in Applied Cessation Counselling and Health [TEACH] Project). Phenomenological, inductive, and deductive approaches were applied to develop themes. QSR NVivo 11 was used to analyze the themes derived from free-text comments and responses to open-ended questions. A total of 277 of 287 participants (96.5%) completed the course evaluations and provided 690 comments focused on how to improve the program. Five themes emerged from the formative evaluations: overall quality, content, delivery method, evaluation, and timing. The largest proportion of comments (22.6%) in the formative evaluation expressed satisfaction with overall course quality. Suggestions for improvement were mostly for course content and delivery method (20.4% and 17.8%, respectively). Five themes emerged from the summative evaluation: feedback related to learning objectives, interprofessional collaboration, future topics of relevance, overall modifications, and overall satisfaction. Comments on course content, website function, timing, and support were the identified areas for improvement. This study provides a model to evaluate the effectiveness of online educational interventions. Significantly, this constructive approach to evaluation allows CME providers to take rapid corrective action.
Introduction
Online learning was introduced with the advent of the Internet in the early 1990s and has been increasingly used in medical and health care education. 1 As the number of online courses continues to grow, the need for quality and accountability in course evaluations is also increasing. 2 A meta-analysis in health professions education concluded that online learning is associated with a large positive effect in outcomes such as satisfaction, knowledge, skills, behaviors, and impact on patients compared with no intervention, and its effectiveness is similar to that of classroom-based learning methods. 1 While the results of online and in-person training are not significantly different, online evaluation response rates typically fall below those of classroom-based courses.3,4 Evaluating online learning remains a challenge, and there is no general consensus on the best way to evaluate online learning and use the evaluation data to support quality improvement.2,5
Online courses may be an acceptable alternative for training a diverse workforce spread across great distances and time zones. 6 It is usually time intensive and cost prohibitive to offer intensive in-person training courses to health care providers (HCPs), 1 particularly in areas with a large “know-do” gap, such as tobacco dependence treatment. Moreover, quality improvement of course offerings requires collecting and analyzing feedback from participants and using this feedback to modify the experience and content for future learners.
The Training Enhancement in Applied Cessation Counselling and Health (TEACH) Project is the first university-accredited continuing medical education certificate program in Canada focused on tobacco dependence treatment. This project has been highly successful in developing an interprofessional, collaborative, and experiential education program, training more than 5000 HCPs to date. The TEACH curriculum consists of three courses: (1) an online introductory course on tobacco control pillars (10.5 hours), (2) a classroom-based or online Core Course on the fundamentals of tobacco dependence treatment (19.5 hours), and (3) a classroom-based or online Specialty Course focused on a variety of populations or topics where tobacco dependence treatment requires adaptation (13.5 hours). Course evaluation is a critical component to quality assurance and improvement. The TEACH Project’s comprehensive evaluation framework 7 includes course evaluations completed by learners following each course module (i.e., formative evaluation) in addition to a summative evaluation immediately administered post training. Learners also complete a follow-up evaluation at 3 and 6 months post training.
To create a learner-centered program that supports rapid experiential learning 8 and expands reach by building capacity among HCPs unable to travel for classroom-based training, TEACH courses have been adapted for the online environment, utilizing best practice principles in e-Learning. Although comprehensive quantitative surveys were administered, a qualitative evaluation was also designed to inform quality improvement of the TEACH online Core Course. 7 Data obtained from written questionnaires with predefined answers may be less contextually rich than data obtained through qualitative approaches 9 (i.e., free-text or open-ended comments). With qualitative response options, participants can respond to the questions as they would like to answer them, and the researcher can investigate the meaning of these responses. However, we recognized that the qualitative feedback may not be representative of all HCPs who participated in the online Core Course, and therefore, quantitative data from an online survey were also obtained.
The goals of this article are to (1) introduce curriculum designers to an innovative use of qualitative methods in both formative and summative course evaluations and (2) describe the qualitative findings elicited from TEACH-trained HCPs following their participation in the TEACH online Core Course for program evaluation.
Methods
This article reports thematic analysis of HCPs’ opinions expressed in feedback data from four cohorts of the TEACH online Core Course. These data were collected in January, May, July, and October 2015 during and after the course was completed. HCPs were asked to provide free-text comments after each of 10 modules (i.e., formative) and respond to a series of open-ended questions following the end of the course (i.e., summative), in addition to quantitative questions. Demographic information, professional discipline, and number of years providing tobacco cessation interventions were also collected. At the time this study was designed, the Centre for Addiction and Mental Health Research Ethics Board deemed that formal review and approval was not required for our study.
In this phenomenological study, we used a hybrid process of inductive and deductive thematic analysis to interpret the free-text comments from the formative evaluations and open-ended questions from the summative evaluations. Inductive codes were based on prominent themes that emerged from HCPs’ free-text comments. Our deductive codes were based on concepts that we explicitly asked HCPs about through open-ended questions (e.g., “What overall modifications to the course do you suggest?”), in which we searched for attributes of Henri’s model. 10 This strategy allowed us to examine our study questions (i.e., through our questions in the summative phase and through deductive codes) while capturing other key themes that emerged from the data (i.e., free-text comments through inductive codes that allowed the data, rather than theory, to drive coding). 11 We used the Braun and Clarke model 12 for our thematic analysis as it is a flexible approach that can be used across a range of research questions applying the following six phases: (1) Free-text comments and responses to open-ended questions were read and reread to familiarize the researchers with the data; (2) a range of initial codes were generated to index common features across comments; (3) these initial codes were examined to determine prevalent themes; (4) themes were reviewed for internal homogeneity and external heterogeneity, to combine similar themes into overarching themes and draw coherent links or distinctions between them; (5) the overarching themes were firmly defined, organized in relation to their collated data extracts, and then analyzed for subthemes; and (6) the report was produced.
Several steps were taken to ensure methodological and analytical rigor in this investigation. Two independent raters (A.E.A. and M.B.) coded the transcripts. Before the analysis of the data, the raters met to discuss any discrepant codes until consensus was reached on the final codes. A quarter of the data were double-coded to check for consistency, for which there was >90% agreement between researchers (A.E.A. and M.B.). To minimize the risk of bias, the prominence of main themes and their respective subthemes was determined through discussion between two authors. These authors read the free-text comments and answers to open-ended questions as well as reviewed all coded quotations for each theme. Unanimity between the two coders was required. When unanimity was not achieved, comments were discussed until consensus was reached. Quotes are presented in the results to illustrate the themes and subthemes identified. To validate the findings, firsthand quotes are presented verbatim within the results in Table 2. 13 Also, further evidence of validity was obtained by comparing emergent findings with the literature. The qualitative software QSR NVivo 11 was used to perform content analysis of the themes derived from the comments.
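The double-coding check described above can be illustrated with a minimal sketch. This is not the study’s actual data or procedure; the theme labels and coded items below are hypothetical, and the function simply computes simple percent agreement between two raters over the same set of comments.

```python
# Illustrative sketch (hypothetical data): percent agreement between
# two raters who each assigned one theme code per comment.

def percent_agreement(rater_a, rater_b):
    """Share (as a percentage) of items on which two raters agree."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must code the same set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

# Hypothetical codes for a double-coded subset of comments
rater_a = ["content", "quality", "timing", "delivery", "content",
           "support", "quality", "content", "timing", "quality"]
rater_b = ["content", "quality", "timing", "delivery", "content",
           "support", "quality", "evaluation", "timing", "quality"]

print(f"{percent_agreement(rater_a, rater_b):.0f}% agreement")  # 9/10 -> 90%
```

A threshold such as the study’s >90% agreement would then be checked against this figure before proceeding to consensus discussion of the discrepant items.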
Results
In total, 96.5% of participants completed the course evaluations. A breakdown of the sample for each cohort appears in Table 1. Participants were asked to select their professional discipline. Overall, 44% (n = 124) of the participants were nurses.
Description of the Study Group.
Respondents (n = 277 out of 287 participants) provided 690 comments through free-text comments and answers to open-ended questions within the study questionnaires, describing their satisfaction with their learning experience and suggestions for improvement of the TEACH online Core Course.
Formative Evaluation
The five central themes identified from 349 comments within the data are presented with their subthemes alongside example quotations. Table 2 lists these themes in two categories: positive feedback and suggestions for improvement. The overall focus of the study was on “suggestions for improvement,” and a high proportion of comments (61%) provided feedback to improve the following course areas: content of the course (20.4% of comments), delivery method (17.8%), evaluation of the course (11.7%), timing of the course (8.3%), and overall quality (2.6%).
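Theme percentages like those above can be tabulated directly from the coded comments once codes are exported from analysis software. The counts below are hypothetical, chosen only to illustrate the computation; they are not the study’s raw data.

```python
# Illustrative sketch (hypothetical counts): tabulating theme
# frequencies from a flat list of coded formative comments.
from collections import Counter

coded_comments = (
    ["content"] * 71 + ["delivery method"] * 62 +
    ["evaluation"] * 41 + ["timing"] * 29 + ["overall quality"] * 9
)

total = 349  # all formative comments, including positive ones
counts = Counter(coded_comments)
for theme, n in counts.most_common():
    print(f"{theme}: {n} comments ({100 * n / total:.1f}%)")
```

Dividing by the full comment count (rather than only the improvement-focused subset) keeps the percentages comparable with the positive-feedback figures reported in the next section.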
Themes and Samples of 349 Comments Reported in Free-Text Comments in the TEACH Formative Evaluation.
Note. Cohort 1: January 2015. Cohort 2: May 2015. Cohort 3: July 2015. Cohort 4: October 2015. TEACH = Training Enhancement in Applied Cessation Counselling and Health.
More general positive comments emerging from the data (39% of comments) typically consisted of one-word responses, such as “good” or “excellent.” Positive responses were grouped into the following themes: overall quality (22.6% of comments), content of the course (14%), delivery method (1%), timing of the course (0.6%), and evaluation of the course (0.3%). Most positive comments made by HCPs in the formative evaluation expressed satisfaction with overall course quality. Comments on course content and the delivery method were identified as key areas for improvement. HCPs also wanted the course designers to address problems with content flow, repetition, and relevance of concepts.
Summative Evaluation
Five themes emerged from 341 of the participants’ responses to open-ended questions in the summative evaluation administered at the end of each course. The themes, subthemes, and corresponding examples are shown in Table 3.
Themes and Samples of 341 Comments Reported in Open-Ended Questions in the TEACH Summative Evaluation.
Note. TEACH = Training Enhancement in Applied Cessation Counselling and Health.
Theme 1: Learning objectives—23.5% of comments
Participants were asked to comment on one knowledge or skill they learned the most about and were planning to use in their clinical practice. Qualitative responses included the following knowledge or skill areas: motivational interviewing, pharmacological interventions, screening and assessing tobacco use, developing treatment plans, harm reduction, the global impact of tobacco use, advocacy and system-level change, specific populations with high tobacco prevalence, implementation, and facilitating smoking cessation groups. The most commonly reported knowledge or skill areas were motivational interviewing, pharmacological interventions, and assessing tobacco use (7.8%, 4.2%, and 2.6% of comments, respectively). These are aligned with the learning objectives of the course.
Theme 2: Interprofessional collaboration and networking—12.1% of comments
Participants generally had positive comments in addition to suggestions for improvement related to opportunities for interprofessional collaboration and networking during the course. Overall, only 5.4% of the comments under this theme indicated participants were pleased to have opportunities to collaborate with fellow learners.
Theme 3: Future topics of relevance—8.2% of comments
When we asked participants about future cessation topics of relevance for professional development, they reported 11 content areas: (1) mental illness and substance use disorders; (2) tobacco dependence treatment; (3) tobacco cessation for women across the lifespan; (4) motivational interviewing; (5) implementation; (6) harm reduction; (7) advocacy and system-level change; (8) pharmacological interventions; (9) tobacco interventions for youth and young adults; (10) tobacco interventions for First Nations, Inuit, and Métis populations; and (11) chronic disease prevention. Also, other approaches to cessation related to working with specific populations, such as immigrant populations and older adults, were reported.
Theme 4: Overall modification—37.8% of comments
The participants’ perceptions regarding overall modifications to the course were categorized into four subthemes: course content, website function, course timing, and course support. A prominent subtheme was course content (17.2% of comments). Suggestions under this subtheme included less repetitive post-module evaluation surveys, longer knowledge-based quizzes, more videos, the incorporation of live webinars to connect participants and faculty, and more case studies and interactive exercises to enhance skill development.
For the website function subtheme (11.6% of comments), participants reported some difficulties with loading the videos and navigating through the course content. Comments regarding course timing and scheduling (6% of comments) described the importance of revising the module timing to make it flexible for HCPs with busy schedules (e.g., having all modules open at the same time). Participants also requested continuous access to the course materials after the completion of the course so that they could refer back to content and use it as a reference and resource in the future. The final subtheme emerging under suggestions for improvement was course support (3% of comments). Some participants also commented that they were dissatisfied with the course registration process.
Theme 5: Overall satisfaction—18.4% of comments
When we asked participants about general comments regarding the training provided, they typically responded positively with 1-word responses, such as “good” or “excellent.” The central subthemes emerging from this section included course usefulness, course support, web usability, and timing.
Many participants described the design and delivery of the course content as useful, including both static content (e.g., resources and readings) and interactive content (e.g., videos, webinars, interprofessional collaboration, and quizzes). Course support also garnered positive feedback, highlighting participants’ satisfaction with course administrators. Web usability, including course design, ease of navigation, and loading of webpages and videos, accounted for 1.7% of overall satisfaction comments. A very small number of participants (0.2% of comments) stated that they would have preferred taking this course in person. The fewest comments for overall satisfaction fell under the timing subtheme (0.3% of comments).
TEACH online Core Course content is iteratively updated to incorporate the latest advances and evidence-based approaches in tobacco cessation, in addition to HCPs’ feedback. However, upon reviewing the results across the four cohorts in 2015, there were a number of changes that needed to be made to enhance the learner experience and align the course with best practice approaches in adult learning. The TEACH online Core Course has recently undergone a major revision incorporating knowledge-based and case-based quizzes, clinical video demonstrations, self-reflections, opportunities to practice clinical skills through case studies, and collaborative projects that require participants to work together on activities in the online environment. Course faculty are now required to provide individual and customized feedback on course assessments so that participants can increase their awareness of knowledge and skill areas for further improvement. Website design and functionality have been addressed by providing further instructions regarding course navigation, streamlining course content to reduce complex navigation requirements, and ensuring that the course content is functional across a number of web browsers. This revised course was launched in July 2016, and evaluation results are forthcoming.
Discussion
This study was initiated to address two goals and provides several important and novel findings. First, this phenomenological study, through thematic analysis of 690 comments from which 10 main themes emerged, highlights the value of employing a hybrid process of inductive and deductive thematic analysis for formative and summative qualitative evaluations, as well as the importance of collecting qualitative feedback (in addition to quantitative feedback) for program evaluation. This article provides a comprehensive understanding of the TEACH Core Course through qualitative analysis and evaluation. We found that new learning needs emerged during the course implementation phase from the participants’ perspective. A strong qualitative evaluation of the TEACH online Core Course allows us to advance learning objectives focused on tobacco dependence treatment and to conduct research on education outcomes and the features that improve them, as has also been demonstrated in a meta-analysis of the effectiveness of online learning. 1
Problem identification and needs assessment for the target audience should be administered prior to curriculum development; however, as learners might not fully know their needs until exposed to the subject matter, and their needs may evolve due to learner characteristics (e.g., discipline and years of experience), recurrent needs assessments should be built into course evaluations.14-17 These important steps have been considered through our formative and summative course evaluations. Skeff et al 18 have also noted that individuals often do not fully appreciate their professional development needs until they have been exposed to the subject area.
The current qualitative study could be considered a model for determining the success of educational programs and used as a learner needs assessment to guide iterative revisions to curriculum, among the most important steps in the curriculum development and evaluation process. 17 Cronbach, one of the pioneers of quantitative methods, stated that a well-designed qualitative evaluation is important and emphasized its significance for determining the success of educational programs. 19
Second, the main themes emerging from this study are based on the feedback provided by HCPs across four course cohorts. Given the large sample size included in this qualitative research coupled with a high response rate (96.5%), a wide range of participant views were captured. This high response rate demonstrates that online training and evaluation can be deployed for HCPs without compromising learner overall satisfaction. However, many of the HCPs who responded to the qualitative evaluations may have had definite views regarding the course they completed (e.g., either a positive or negative learning experience), and it is possible that many of the participants who did not respond with qualitative comments either did not have suggestions for improvement or did not feel comfortable sharing negative feedback. This result also is consistent with past literature regarding evaluating online training for HCPs. 20
A total of 349 comments came from the formative evaluations, and five themes emerged through an inductive approach: (1) overall quality, (2) content, (3) evaluation, (4) timing, and (5) delivery method. A total of 341 comments came from the summative evaluations and were related to the following five categories derived from a deductive approach: (1) learning objectives, (2) interprofessional collaboration and networking, (3) future topics of relevance, (4) overall modification, and (5) overall satisfaction.
Nearly all of the positive comments elicited from the formative evaluation were focused on the overall quality and content of the modules, and more than 75% of the overall satisfaction in the summative evaluation was related to the course content. Our findings are consistent with earlier research which indicated that learners give more feedback on the overall quality and content of the curriculum than on other course areas. 14
Those participants who provided comments related to suggestions for improvement in the formative evaluations were looking for more knowledge-based quizzes and fewer repetitive module evaluations at the end of each module. However, completion of the module evaluation was not mandatory. Our results indicate that participants would like more videos and live webinars to increase interactivity, a technical solution to video-loading issues, more case studies and scenario-based quizzes to check knowledge retention and application, more opportunities for collaboration with fellow learners, and feedback from facilitators regarding knowledge and skills. Our results are consistent with past literature indicating that a technology-mediated course matched to learners’ needs can facilitate understanding of the subject.20-22 Several participants reported the usefulness of course administrative support, believing that this was helpful in addressing any technical or logistical issues in a respectful and timely manner. In addition, many participants felt that course discussion boards should be actively monitored by facilitators to encourage proactive conversation and provide further insight on the topics discussed by learners. This supports Boettcher’s best practices 23 for teaching online, which emphasize the importance of creating a supportive online course community. In most online courses, the dialogue between faculty and participants is supported through weekly coaching and reminders. To enhance dialogue between participants in our online course, we have since added weekly live webinar sessions for enhanced interaction and problem solving and have added additional discussion boards for reflective practice and coaching.
Participants also suggested that the course administrators should open all the modules at once, remove time constraints across the course (i.e., assignment deadlines), and keep the course accessible after it has ended so that they can refer back to content and resources at a later date. Providing training to HCPs that will enhance practice skills can often be difficult given their competing clinical demands and priorities. 24 E-Learning can help address some of these challenges by offering more flexible timelines, eliminating travel, and offering a diverse learning environment without compromising learner satisfaction. 20 A very small number of HCPs stated that they would have preferred taking this course in a classroom-based setting.
There are several limitations in this study. Participants were not followed up with to elicit more information about their comments. The analysis of free-text comments and open-ended responses in this context did not allow for the opportunity to probe respondents for further detail or explanation. 25 Our study did not collect demographic characteristics such as gender and ethnicity; therefore, we cannot be certain that our sample is reflective of the general Ontarian HCP population. Despite its limitations, there are some valuable conclusions to be drawn from this study. HCPs find TEACH to be a valuable program for tobacco dependence treatment training. This article adds a new perspective to the current literature regarding online courses in tobacco dependence treatment for HCPs, presents a model to evaluate effectiveness of online educational interventions, and provides tailored suggestions to improve online courses which can be applied in other contexts. 26 To complete the cycle of quality improvement, these findings will inform future course iterations’ development and revisions. Tracking evaluation outcomes and learner feedback by developing a strong evaluation component to online curricula allows educators to advance the science of education and increase evaluative research on education outcomes and the features that improve them. 1
Conclusion
Using a combined technique of inductive and deductive thematic analysis, this formative and summative qualitative evaluation has highlighted an approach that demonstrates rigor within a quality improvement research study. This process made it possible to describe participants’ satisfaction with their learning experience. Areas of strength and needed improvements of the online course have been examined from the perspective of HCPs through qualitative feedback collected across four course cohorts. Curriculum designers should incorporate opportunities for monitoring and feedback by expert facilitators to foster a supportive online learning environment. CME providers should consider the importance of collecting qualitative feedback from their participants to elicit concrete and relevant suggestions to improve the experience for future learners. This qualitative evaluation approach can be replicated by other evaluation scholars and can assist them in evaluating the effectiveness of online educational interventions, in the context of CME and beyond.
Acknowledgements
The authors acknowledge the members of the Nicotine Dependence Service for their assistance and administration related to this study and the following funding sources: Ministry of Health and Long-Term Care, Ontario for TEACH and STOP Programs, Centre for Addiction and Mental Health, and Department of Family and Community Medicine, University of Toronto. They gratefully acknowledge the generous assistance and valuable editing provided to them by Vanessa Ballarino from the Nicotine Dependence Service.
Authors’ Note
AEA conceived and designed the study, coded the transcripts, performed data analysis, and wrote the manuscript. MB assisted in designing the study, coded the transcripts, and contributed to drafting and revision of the paper. MF assisted in drafting and revision of the paper. RD participated in design of the study and drafting and revision of many subsequent drafts of the paper. PS authored the first and many subsequent drafts of the paper, contributed to its design and revision for form and intellectual content, and also approved the final manuscript for submission.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
