Open access | Research article | First published online 2021
A Qualitative Examination Detailing Medical Student Experiences of a Novel Competency-Based Neuroanatomy eLearning Intervention Designed to Bridge a Gap Within an Integrated Medical Curriculum
Background:
A 1-year time-gap between the first- and second-year neuroanatomy courses was created at our institution as a result of restructuring the curriculum from a systems-based to an integrated format. Additionally, neuroanatomy hours decreased significantly (48.8%) in the transition to an integrated curriculum, similar to other medical schools. Competency-based eLearning in medical education has shown promising results in decreasing overall learning time and improving accuracy. To date, competency-based eLearning has not been explored in neuroanatomy education.
Objective:
The purpose of this study is to develop and assess a novel competency-based neuroanatomy eLearning intervention for second-year medical students designed to bridge a 1-year time-gap, without adding significant instructional hours, in an integrated curriculum.
Methods:
A competency-based eLearning intervention encompassing the major tracts, brainstem anatomy, and an interactive case featuring a simulated patient experience was developed in the Articulate Storyline® platform. Student usage data, single-session course evaluations, and a focus group were used to evaluate the module’s effectiveness.
Results:
Student usage data showed an average completion time of M = 2:59:25, which fit within the scheduled 3-hour timeframe. Students rated the module’s overall effectiveness as M = 3.65 (out of 4) on a single-session evaluation. A focus group provided qualitative feedback suggesting improvements to the eLearning module in the domains of content, mechanics, and timing.
Conclusion:
A competency-based neuroanatomy eLearning intervention shows promising initial results to bridge a 1-year educational gap within an integrated curriculum. Overall, students described this educational tool as helpful and outlined ways in which to improve this resource.
One consequence of medical curriculum integration is the decrease in basic science instructional hours for first- and second-year (pre-clinical) medical students—particularly in neuroanatomy.1-4 Curriculum integration, or concurrently delivering basic sciences and their clinical applications in pre-clinical medical education, requires the addition of clinical instruction, resulting in the loss of basic science lecture hours. In the last decade, the average number of weeks of pre-clerkship basic science instruction has decreased 4.6%, from 73.7 weeks in 2010 to 2011 to 70.3 weeks in 2018 to 2019.5 Drake et al6 surveyed American medical schools in 2002 (n = 84) and again in 20091 (n = 31) to investigate the effect of curriculum integration on first-year medical school instructional hours in gross anatomy, neuroanatomy, microscopic anatomy, and embryology. Drake et al1,6 found that neuroanatomy suffered the largest drop of all the basic sciences—an 18% decrease in lecture and laboratory hours—when the curriculum converted from a systems-based (M = 96, SD = 37) to an integrated approach (M = 79, SD = 33).
The effect of decreased pre-clinical neuroanatomy lecture and laboratory hours, within an integrated curriculum, on neuroanatomy outcomes has been investigated. Mateen and D’Eon7 described the importance of exploring the impact of fewer neuroanatomy instructional hours on student outcomes because, as they described, neuroanatomy education is becoming “increasingly marginalized in medical school curricula” (p. 538). In their study, Mateen and D’Eon7 found that most (95%) recent medical school graduates did not retain enough neuroanatomy to pass a first-year neuroanatomy examination. In contrast, a recent study by Arantes et al8 found no significant difference in overall medical student neuroanatomy examination scores between a traditional (systems-based) curriculum and an integrated curriculum, t(80) = 1.12, P = .265. However, Arantes et al8 found that integrated students fared better in the more complex neuroanatomy subject areas, outperforming traditional students on the cerebellum, subthalamus, tracts, and vision.8 With conflicting evidence, more studies are needed to investigate the effects of curriculum integration, and the loss of pre-clinical instructional hours, on neuroanatomy education.
Like other schools,9,10 the University of Louisville School of Medicine (ULSOM) has transitioned from a systems-based approach toward an integrated curriculum. The integrated first-semester, first-year medical student (M1) course—Clinical Anatomy, Development, and Examination (CADE)—was developed and implemented in 2014. CADE encompasses anatomy, neuroanatomy, embryology, and relevant clinical examination competencies. As a result of this integration, neuroanatomy curricular hours were reduced and information delivery timing was changed. The neuroanatomy curriculum was reduced 48.8%—from 82 hours (within the 4-week systems-based course) to 42 hours (within CADE)—to accommodate other first-year requirements within the new course. Additionally, this moved neuroanatomy from the Spring (second semester) first-year course to the Fall (first semester) course. As a result, first-year medical students transitioning to their second-year (M2) Neuropathology course now have a time-gap of almost 1 year between completing the Fall M1 Neuroanatomy course and entering the Spring M2 Neuropathology course. This presented a novel educational challenge: how to bridge the time-gap between the first- and second-year neuroanatomy courses without significantly adding instructional hours. The purpose of this study is to investigate the development and assessment of a novel educational intervention designed to help students bridge this 1-year time-gap within the integrated curriculum. This article unfolds with a review of eLearning interventions in medical education, followed by module development, implementation, and assessment.
Neuroanatomy eLearning
A literature review of neuroanatomy eLearning found that web-based approaches are perceived positively by students11-13 and contribute to the depth of medical student learning.14 A qualitative study by Foreman et al11 investigated the utility of neuroanatomy eLearning in health professions students (n = 43) and found most (88%) students described eLearning as beneficial. When compared to traditional learning (using neuroanatomy atlases), most (95%) students rated eLearning as better (M = 1.56, SD = 0.67) on a 5-point Likert scale (1 = much better and 5 = much worse).11 In another study, Gould et al12 explored the utility of eLearning for neuroanatomy among medical and allied health schools (n = 35 students, n = 27 faculty from 11 different states). Most participants agreed (89.5%-97.5%) eLearning would be beneficial for supplementing neuroanatomy instruction.12 Additionally, Svirko and Mellanby14 evaluated motivation and the depth of neuroanatomy learning in second-year medical students (n = 869). Svirko and Mellanby14 describe deep learning as motivated by interest in the subject matter and surface learning as motivated by fear of failure. They found time spent using eLearning was positively correlated with deep learning; concurrently, a positive correlation between deep learning and academic performance was reported (r = 0.12, P < .001). In summary, students perceived neuroanatomy eLearning as useful and as promoting deep learning, which is correlated with improved academic performance.
Limited studies suggest neuroanatomy eLearning contributes to improved student outcomes—especially eLearning resources containing 3-dimensional (3D) features. Estevez et al15 found overall quiz scores were higher for medical students receiving a 3D eLearning resource (t[85] = 2.02, P < .05) compared to traditional 2D brain cross-sections. A greater improvement was observed in the experimental group when the 3D-specific questions were isolated, F(1, 85) = 5.48, P = .02.15 Additionally, Allen et al16 showed improved neuroanatomy quiz scores after students accessed a 3D neuroanatomy eLearning resource. Respondents also rated the 3D eLearning resource as better enabling them to understand neuroanatomy compared to traditional educational resources (M = 4.29, SD = 0.79, range: 1-5).16 These results suggest eLearning may improve student outcomes and promote deeper understanding of neuroanatomy; however, more studies are needed to investigate student outcomes with neuroanatomy eLearning.
A specific type of eLearning—competency-based eLearning—has shown promising results in medical education with decreased overall learning time and improved accuracy.17-19 Competency-based eLearning requires the user to demonstrate knowledge, or skill proficiency, through an assessment prior to advancement. Hu et al17 compared a competency-based eLearning simulation and traditional practice of surgical skills with first- and second-year medical students (n = 48). The competency group spent 19% fewer hours than the traditional group (7.12 vs 8.75 hours, P < .001) to obtain proficiency at 3 different skills over 7 weeks.17 Another study suggests competency-based eLearning improves accuracy in visually identifying patterns in cytopathology. Samulski et al19 compared competency-based eLearning to traditional methods (textbooks and microscopic slides) in a population of medical residents and fellows (n = 36); the competency-based eLearning group demonstrated 68% accuracy compared to 55% accuracy for traditional methods. In conclusion, these initial studies suggest competency-based eLearning may reduce learning time while improving student outcomes.
To date, no published studies have explored competency-based eLearning in neuroanatomy education; this study aims to fill that gap in the literature. Competency-based eLearning may provide the additional benefit of decreased learning time within the time constraints of an integrated curriculum. The development, administration, and assessment of a competency-based neuroanatomy eLearning intervention designed to bridge a 1-year time-gap within a restructured integrated curriculum are described below.
Methods
The methods are described in 3 sections: first, a needs assessment outlines the initial determination for a supplemental neuroanatomy learning intervention at our institution; second, module development explains the selection of content and unique design features of the module; and, third, the module assessment section explains evaluating the module’s effectiveness and efficiency. This study was approved by the University of Louisville Institutional Review Board (IRB#: 17.1104).
Needs assessment
A convenience sample of second- through fourth-year (M2-M4) medical students at the University of Louisville School of Medicine was invited to participate in a neuroanatomy survey administered in Qualtrics®. Two open-ended questions were used to identify the need for an additional neuroanatomy learning intervention: (1) Please share how you felt about learning neuroanatomy at the University of Louisville; and (2) How can neuroanatomy be improved at the University of Louisville? The director of the integrated first-semester medical school course sent an email invitation to M2 to M4 medical students to participate in an online survey during the Fall 2018 semester. Completed surveys were exported from Qualtrics® into a CSV file in Microsoft Excel®. IP addresses were deleted from the files to protect anonymity. Files were imported into IBM SPSS® Version 26, where missing or incomplete surveys were deleted prior to analysis. Systematic review of the qualitative survey responses yielded categorical themes. As outlined in the results section, many (61.5%) of our respondents described neuroanatomy as difficult, with 9.1% of respondents wanting instructional assistance in preparation for their second-year neuropathology course. The eLearning intervention, developed in the Fall 2018 semester as explained below, was administered at the start of the M2 Neuropathology course in the Spring 2019 semester.
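The cleaning steps above (export from Qualtrics, delete IP addresses, drop incomplete surveys) can be sketched in plain Python. This is an illustration only: the actual work was done in Excel and SPSS, and the column names (`IPAddress`, `Q1`, `Q2`) and miniature data here are hypothetical, not the real export schema or responses.

```python
import csv
import io

def clean_survey(csv_text):
    """Drop the IP address column (anonymity) and any incomplete rows,
    mirroring the survey-cleaning steps described above."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    cleaned = []
    for row in rows:
        row.pop("IPAddress", None)  # delete IP addresses to protect anonymity
        # keep only surveys where both open-ended questions were answered
        if row.get("Q1", "").strip() and row.get("Q2", "").strip():
            cleaned.append(row)
    return cleaned

# hypothetical miniature export: one complete and one incomplete response
raw = (
    "IPAddress,Q1,Q2\n"
    "10.0.0.1,Neuroanatomy was difficult,More review sessions\n"
    "10.0.0.2,,\n"
)
surveys = clean_survey(raw)
```

Stripping identifiers before the file ever reaches the analysis package is the step that protects anonymity; dropping incomplete rows afterward keeps the denominator consistent for the response-rate calculation.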
Module development
Content selection
Content selection was based on neuroanatomy education research15,20 and neuropathology course director recommendations. The 3 evidence-based20 content themes selected were tracts, brainstem anatomy, and case-based clinical lesion problems. Additionally, the neuropathology course director suggested including cerebral blood flow and eye movements in the module. Three-dimensional arterial models from the Cerefy Atlas of Cerebral Vasculature® were incorporated into the intervention because the literature shows improved medical student learning outcomes with 3D models.15
eLearning module structure
The module was designed in the Articulate Storyline® platform with 3 sections: Tracts, Review Quiz, and Case (see Figure 1 for the module design layout). The optional Tracts portion of the module reviewed the 3 major tracts (Corticospinal, Dorsal Column Medial Lemniscus, and Spinothalamic) with corresponding brainstem anatomy. The Review Quiz is the competency-based portion of the module and must be completed in order to unlock the Case. A detailed explanation of how the competency-based features were created is provided below. The Case section walks students through a clinical case—allowing them to think through, interact with, and make diagnostic decisions as the case progresses—from the initial patient encounter through the diagnosis.
Module flow chart: the “Review Quiz” in the center is the competency-based portion of the module. The orange rectangle represents a pop-up box that appears in the “Review Quiz” menu when all 5 sections are complete. This results in unlocking the “Case” section of the module.
Competency-based features: Utility
The competency-based Review Quiz assesses neuroanatomy proficiency by anatomical location. After an initial basic sectional anatomy subsection, subsections cover the brainstem (midbrain, pons, and medulla) and the spinal cord. Each subsection of the competency-based Review Quiz represents a brainstem or spinal cord level associated with a stroke lesion and clinical vignette. Students must pass the following competencies in order to progress within the module: (1) Identify specific regional brainstem/spinal cord nuclei; (2) Describe and visually recognize arteries that supply that region of the brainstem/spinal cord; (3) Interpret neurological and/or physical patient examination videos to isolate which nuclei are lesioned; and (4) Discern between regional arteries, using their diagnostic examination interpretation, to isolate a stroke lesion. Each of the 5 Review Quiz subsections must be completed before the user can advance to the next subsection. Once all 5 subsections are complete, the interactive Case section is unlocked.
Each of the 5 subsections was designed with a standardized question-flow to maximize content delivery while maintaining clarity and brevity. For each brainstem level, a stroke lesion was chosen with standardized successive questions, following a consistent order, and diagnostic interpretations related to a clinical case. For example, following the Midbrain subsection of the Review Quiz in Figure 1, the first set of standardized competency-based questions involves basic neuroanatomy (ie, affected nuclei) related to the clinical case (Q6 in Figure 1). Next, a 3D vascular anatomy review (VIDEO in Figure 1) was presented featuring a 3D model from the Cerefy Brain Atlas® (eg, see Figure 2). Competency-based blood flow questions (BFQ in Figure 1) require students to show basic arterial supply of that area of the brainstem before advancing. Clinical vignette-videos requiring diagnostic interpretation (C-Vin in Figure 1; an example is provided in Figure 3) were followed by competency-based questions requiring students to identify potentially affected arteries (Art Q in Figure 1). Lastly, the lesion and affected brainstem neuroanatomy were presented on an MRI image (MRI in Figure 1; see Figure 4 for an example of demonstrating a lesion on an MRI). Once a student completes all of these steps, the subsection is considered complete and the user can advance to the next subsection. All subsections of the Review Quiz must be completed prior to unlocking the Case section of the module, as shown in Figure 1.
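Articulate Storyline enforces this flow with its own slide triggers; purely as an illustration, the standardized question-flow of one subsection can be modeled as an ordered checklist in which out-of-order attempts are ignored. The step names mirror the Figure 1 labels, but the code itself is a hypothetical sketch, not anything exported from the module.

```python
# Sketch of one subsection's standardized question-flow (per Figure 1).
# Steps must be completed in order; the user cannot skip ahead.
STEPS = ["nuclei questions", "3D vascular video", "blood-flow questions",
         "clinical vignette", "artery questions", "MRI review"]

def advance(progress, step):
    """Mark `step` complete only if it is the next step in the flow;
    returns the updated count of completed steps."""
    if progress < len(STEPS) and step == STEPS[progress]:
        return progress + 1  # forward-only advancement
    return progress          # out-of-order attempts are ignored

# a student works through the first two steps in order
progress = 0
for s in ["nuclei questions", "3D vascular video"]:
    progress = advance(progress, s)
```

The forward-only rule is what makes the flow competency-based: a learner cannot reach the MRI review without first passing the nuclei, vascular, and vignette steps for that level.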
A 3D vascular anatomy review: vascular supply of each brainstem section is reviewed with a video featuring 3D models from the Cerefy Brain Atlas®.
Clinical videos: students must watch clinical vignette videos (A) and subsequently interpret these videos for stroke lesion localization (B).
Imaging: diagnostic imaging of the case, along with lesion localization, is demonstrated to the user.
Competency-based features: Mechanics
The Articulate Storyline® platform was chosen for its ease of creating competency-based features and its web-publishing capabilities. Minimal knowledge of computer coding was needed to create the competency-based functionality. The program allows the creator to choose the desired competency level for quiz questions. At the medical school level, 100% competency was chosen since there were only a few questions (M = 8.4) per section, maintaining brevity. Publishing in the Learning Management System (LMS) format provides usage data (dates and total usage time) for each student. Optional Flash and HTML5 publishing capabilities support use on mobile devices. The mechanics of using Articulate Storyline® to create the competency-based features are described below.
Question-level competency was created using a multiple-choice quiz slide requiring successful completion before advancing to the next slide, with unlimited attempts selected (see Figure 5 arrow). Question slides could be advanced only in a forward direction by disabling alternative options for slide advancement (ie, removing the next button).
Question-level competency: Q1 is the first multiple-choice question slide in Articulate Storyline. The red arrow labeled FAIL represents unlimited attempts (if the user answers the question incorrectly). The green arrow labeled PASS denotes successful completion and advancement to the next slide.
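The PASS/FAIL gate in Figure 5 is built from Storyline's own trigger settings; as a hedged illustration of the same logic, a question handler with unlimited attempts that advances only on a correct answer might look like the following. The function name and return shape are invented for this sketch.

```python
def answer_question(correct_choice, attempts):
    """Sketch of the Figure 5 gate: unlimited attempts, advance only
    after a correct answer. `attempts` is an iterable of the learner's
    successive choices."""
    tries = 0
    for choice in attempts:
        tries += 1
        if choice == correct_choice:
            # PASS -> advance to the next slide
            return {"advanced": True, "tries": tries}
        # FAIL -> stay on the slide and try again (unlimited attempts)
    return {"advanced": False, "tries": tries}
```

Because the only exit is a correct answer, the learner demonstrates the competency before the module moves forward, which is what distinguishes this design from a conventional scored quiz.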
Section-level competency was created on the sectional menu using variables, if/then statements, and hidden layers. For example, in Figure 6, the gray sectional anatomy box is the menu item on the Review Quiz menu. A variable for this box was set, with an if/then statement, to show a hidden layer when the last slide in the section was marked as successfully completed. The revealed hidden layer overlays the gray box with a green box to let the user know the section is complete. The purple line in the figure indicates returning the user to the Review Quiz menu—now with that section marked as competency complete.
Section-level competency: the gray box in the Review Quiz menu (black arrow) changes to green when the user successfully completes all slides in this section.
Review Quiz-level competency was reached when all sections of the quiz were completed (see Figure 1). This competency level unlocked the Case section of the module on the Main Menu. The mechanics of unlocking the Case section also used variables, if/then statements, and hidden layers—with a secondary step. A hidden layer on the Review Quiz menu had a variable set to show that layer once all 5 sections of the Review Quiz were complete (see the orange box in Figure 1). This variable required 5 individual if/then statements, thereby requiring completion of all sections before revealing the hidden layer. When selected, this revealed layer triggered 2 actions: (1) bringing the user to the first slide in the Case section; and (2) setting the Case tab on the Main Menu to unlocked. The Case tab on the Main Menu had a variable, with an if/then statement, that shows a hidden layer called unlocked—allowing a returning user to navigate directly to the first slide within the Case section—once the Review Quiz is complete.
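Storyline expresses the logic above through true/false variables and triggers rather than code. As an illustration only, the same quiz-level gate (one boolean per section, Case unlocked only when all 5 are true) can be written out in Python; the section names are illustrative labels, not Storyline variable names.

```python
# Illustrative translation of the Storyline variable/trigger logic:
# one boolean per Review Quiz section; the Case tab unlocks only when
# the if/then check sees all 5 set to True.
section_complete = {
    "sectional_anatomy": False,
    "midbrain": False,
    "pons": False,
    "medulla": False,
    "spinal_cord": False,
}

def mark_complete(section):
    """Equivalent of a section's last slide setting its variable to True."""
    section_complete[section] = True

def case_unlocked():
    """Mirrors the 5 if/then statements that reveal the hidden layer
    and set the Case tab to unlocked."""
    return all(section_complete.values())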
Module assessment
Various methods were used to assess the module’s effectiveness and efficiency, including student usage data, single-session course evaluations, and a focus group. Student utilization rates and average usage data per student were recorded. The course evaluation and focus group provided qualitative data to assess the module.
Course evaluation
A single-session course evaluation for the administration of the eLearning module provided both quantitative and qualitative data on module effectiveness. Students were asked to rate the eLearning intervention on organization, scope of topic, curriculum integration, and effectiveness on a 4-point Likert scale (1 = strongly disagree to 4 = strongly agree). Additionally, space was provided for students to give qualitative feedback about the module’s strengths and how it could be improved.
Focus group
One moderated focus group was conducted to explore second-year medical student feedback on the eLearning intervention. The focus group was led by 1 faculty (PhD) moderator (EJN) in a single 1-hour session. The discussion was recorded, and students took turns responding into the microphone to maintain structure during the session.
A PowerPoint® presentation was used to facilitate the data collection process with the large discussion group. The participants were welcomed with a Welcome slide with the study purpose “To assess M2 student attitudes toward the M2 bridging module.” A Guidelines slide informed participants the session was being recorded with a confidentiality statement, a “please be respectful and please listen respectfully as others share their views” statement, and “please know that if we interrupt you it is only to keep everyone on task and ensure everyone who wants to participate can participate” statement. A discussion of neuroanatomy bridging module was guided from 5 predetermined main questions, and secondary probing questions. Each question allowed for 10 to 12 minutes of responses, and probing questions, before moving on to the next question.
Broad questions were asked before probing with more specific questions, and positive questions were asked before negative questions to reduce question bias. The moderator was instructed to maintain a neutral vocal tone, facial expression, and body posture to minimize moderator bias, and to refrain from offering personal opinions during the discussion. The invitation title of the discussion reflected a neutral tone (“Assessing Student Attitudes Toward the M2 Bridging Module”) to reduce sampling bias by not attracting only students with strong opinions, either for or against the module. This language was intended to welcome all students and be inclusive of all feedback about the module.
The following 5 standardized questions were asked for the purpose of investigating the eLearning module’s effectiveness and efficiency: (1) What was most helpful about each one of the sections?; (2) What did you feel was missing from the module?; (3) What do you think could be eliminated from the module?; (4) How do you feel this module prepared you for the neuropathology course?; and (5) What would you change about the module to either reduce your anxiety or increase your confidence in neuroanatomy? The probing questions, “Could you give me an example?” and “Could you explain further?” were used for each main question.
Data analysis, a systematic review of the transcribed focus group, was modeled after grounded theory methodology.21 The research team comprised 3 members with diverse backgrounds: 1 School of Medicine faculty member (PhD, JBC) and 2 graduate students (1 PhD candidate, JSB, and 1 medical student, EPH). Researchers were instructed to set aside preconceived ideas of what might be revealed and to allow patterns and themes to emerge from the data. The focus group was transcribed, and no names were recorded in the transcription to protect student anonymity. Each researcher (JBC, JSB, and EPH) observed, coded, and analyzed the transcription independently before collaborating with the other researchers, to minimize bias and improve the credibility of the observations.22 Researchers noted broad themes that emerged from the data by grouping main themes of responses together; each independently coded the responses into data segments and grouped them into categories to observe emerging patterns and themes. A second independent reading of all responses identified quotes and supporting themes, with codes created for each data segment. The research team then reviewed the independently coded data together in a second round of coding. Once repetitive themes were identified and no new themes surfaced from the data, the team reached a consensus regarding the dominant themes. Finally, subthemes within each dominant domain were identified, and the team agreed on the quotes best representing each domain and subtheme.
Results
Needs assessment
Survey participants and response rate
A total of 175 M2 to M4 medical students voluntarily participated in the needs assessment survey. Incomplete surveys (n = 64) were excluded, leaving 111 surveys for data analysis (23.9% response rate). The sample comprised 53.2% (n = 59) female and 46.8% (n = 52) male respondents. The study drew from a total student population of 480 medical students (55% male, 45% female), 7 of whom were undergraduate neuroscience majors (1.5%; see Table 1). Publicly available data from the University of Louisville School of Medicine23 show the average age of the 3 entering classes (2015-2017) was 23.6. The average undergraduate overall GPA was 3.69, and the average Biology, Chemistry, Physics, and Math (BCPM) GPA was 3.62. Students underrepresented in medicine comprised 10.6% of entering students.
Study population: class profile demographics for ULSOM entering classes 2015 to 2017.
Abbreviations: N, number of matriculated students; NM, number of neurobiology or neuroscience majors; GPA, overall grade point average (4.0 scale); BCPM GPA, overall Biology, Chemistry, Physics, and Mathematics grade point average (4.0 scale); MCAT, Medical College admissions test scores.
Entering class 2015 MCAT scores were graded on the old system and not included in the average (Verbal: 9.68; Physical Science: 9.53; Biological Science: 10.05). Gender = binary gender listed as male (M) and female (F); Age = average age of entering class; UIM = underrepresented in medicine, reflecting racial diversity of the entering class.
Qualitative student responses
Student feelings about the neuroanatomy curriculum at our institution were categorized into 3 major themes from Question 1 (see Table 2). Of respondents who provided qualitative feedback (n = 39), many (61.5%) felt neuroanatomy at our institution was difficult, with the themes of instruction and organization contributing to its difficulty. Some respondents (35.9%) described neuroanatomy as “not well taught,” leading to instructional challenges. Insight into these challenges was provided by a few (15.4%) respondents who described the neuroanatomy course as “poorly organized.” Overall, respondents perceived neuroanatomy as difficult and felt its instruction needed improvement at our institution.
Student feelings about learning neuroanatomy. Samples of qualitative student responses representative of each theme. (Student comments were anonymized in the transcript).
Q1. Please share how you felt about learning neuroanatomy at the University of Louisville
Theme
Response consensus n, % of total
Sample supporting quotes
Difficult
n = 24, 61.5%
Neuroanatomy was the most difficult portion of the CADE (Clinical Examination, Anatomy, and Development) course. . .you had to just “get through it” but it was so difficult it almost made it seem like it was a losing battle. . .
Instruction
n = 14, 35.9%
Neuroanatomy was not well taught. . .I had to teach myself which was incredibly concerning since I had no background in this. . .
Unorganized
n = 6, 15.4%
The neuroanatomy course was poorly organized. . .
Suggested improvements to neuroanatomy instruction were provided by responses to Question 2. Three consistent themes emerged from the data by respondents who provided qualitative feedback (n = 33) to this question (see Table 3). Neuropathology course preparation and pre-examination review sessions were suggested by a few (9.1%) respondents. Elimination of spinal tract review soft-chalks within the curriculum was suggested by a couple (6.1%) of respondents. In summary, respondents described wanting neuroanatomy curricular changes at our institution and provided suggested improvements. One of those suggested improvements was neuropathology course preparation assistance.
Suggested improvements to neuroanatomy instruction. Samples of qualitative student responses representative of each theme. (Student comments were anonymized in the transcript).
Q2. How can neuroanatomy be improved at the University of Louisville?
Theme
Response consensus n, % of total
Sample supporting quotes
Neuropathology preparation
n = 3, 9.1%
. . .it would be good to connect the Neuropathology course (2nd year) with the neuro section from first year. Not sure how to do this.
Review sessions
n = 3, 9.1%
Pre-exam review sessions would be helpful. . .
Eliminate soft-chalks
n = 2, 6.1%
Two years later and I still shudder whenever I think about the spinal tracts soft-chalk. . .
eLearning intervention student usage data
The neuroanatomy eLearning intervention was administered to the second-year medical students (Class 2017, n = 155) in the Spring 2019 semester. Student usage data showed a 100% utilization rate (n = 155) and an average completion time of approximately 3 hours (M = 2:59:25, SD = 1:30:48). Student utilization times ranged from 50:07 to 8:03:49. The median utilization time was 2:48:17; utilization times were positively skewed, and most students (n = 139) completed the module in under 3 hours, fitting within the scheduled 3-hour timeframe.
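Summary statistics of this kind can be reproduced from per-student H:MM:SS durations with the Python standard library. The sketch below uses invented example durations, not the actual student usage data; the mean-versus-median comparison is the quick check for the positive skew reported above.

```python
import statistics
from datetime import timedelta

def parse_hms(s):
    """Convert an 'H:MM:SS' (or 'MM:SS') string to total seconds."""
    parts = [int(p) for p in s.split(":")]
    while len(parts) < 3:
        parts.insert(0, 0)  # pad missing hours (eg, a '50:07' usage time)
    h, m, sec = parts
    return h * 3600 + m * 60 + sec

def summarize(durations):
    """Mean, median, and a simple skew check for a list of durations."""
    secs = [parse_hms(d) for d in durations]
    return {
        "mean": str(timedelta(seconds=round(statistics.mean(secs)))),
        "median": str(timedelta(seconds=round(statistics.median(secs)))),
        # mean exceeding the median is consistent with positive skew
        "positively_skewed": statistics.mean(secs) > statistics.median(secs),
    }

# invented example durations, not the study data
sample = ["0:50:07", "2:48:17", "2:59:25", "8:03:49"]
```

A handful of long sessions pulls the mean above the median, exactly the pattern reported for the module's usage times.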
Focus group and course evaluations
Qualitative student feedback about the eLearning intervention was gathered from both a single-session course evaluation and a focus group. The single-session course evaluation recorded responses from 13 second-year medical students, an 8.1% response rate. Students rated the overall effectiveness of the eLearning intervention as 3.65 (SD = 0.76, SE = 0.07) on a 4-point Likert scale (1 = strongly disagree to 4 = strongly agree). Because the response rate from the optional single-session course evaluation was lower than anticipated, its qualitative responses were combined with the focus group responses for thematic analysis rather than analyzed independently. Additionally, a 1-hour focus group with 5 second-year medical student participants provided 84 total responses across the 5 main questions and 23 probing questions. Analyses of the combined (course evaluation and focus group) qualitative responses identified 3 themes of student feedback about the eLearning intervention: Content (Table 4), Mechanics (Table 5), and Timing (Table 6). Subthemes within each of these domains emerged from the data. Tables include selected supporting quotes as representations of student perceptions of the intervention within each domain and subtheme. The 3 resultant themes are reported individually below.
Table 4. Content.
Subtheme
Supporting key quotes from focus group
Supporting key quotes from course evaluations
Step by step approach (n = 5)
Student_5: I would agree that just walking through step-by-step, I think that was really helpful and seeing where everything was just to remember it because I pretty much forgot everything from neuro too.
Engaging (n = 2)
Student_1: I found the case to be very, like, engaging . . . I thought that grabbed my interest a little bit and so that was helpful.
Student_2: I think what was helpful. . . being able to see it in different ways, not just in text format or animations but in life
Clinically relevant (n = 3)
Student_4: . . .then linking that to like a clinical presentation would be helpful I think.
Student_3: Would be helpful to have the quiz part of the module review the ways that exam findings indicate a cranial nerve deficit rather than just testing us on brainstem slides since they were not covered in any quiz or test questions for our actual course.
Retrieval practice (n = 3)
Student_3: . . .it was just helpful because it was easy to click through and be like, oh these words look familiar when you were going back through the tract but then when you are actually quizzing and do I actually know this information, that was really helpful.
Student_2: . . .So having a quiz there was a nice way to reinforce ok this is what you actually need to know if you didn’t know this. . .
Too detailed (n = 2)
Student_1: . . .that only because the quiz was so focused on neuroanatomy versus pathology and the slices and the pictures versus integrated case questions. . .
Student_4: Many of these we only had to identify on the module brainstem slices but were then asked specific question on quizzes or exams.
Narrow focus (n = 16)
Student_1: . . .quiz portion was just a little bit too specific. . .
Student_5: The resource provided did not help me learn cranial nerve palsies, which was especially frustrating because it was highly tested on the exam.
Student_10: I mean, we do some localization things within the neuro course but it would be nice and helpful to have, like, the middle cerebral gets this side of the brain, PICA is down here and AICA’s here, just those little things would be nice to go into the actual neuro lessons, so ok, I already know that this feeds this so you don’t have to rehash it again
Student_4: It did cover the basics of the 3 main tracts, but what about the role of the other tracts and nuclei that were not discussed.
Student_2: Also wish there was more in the initial learning module about blood supply.
Student_3: It would have been helpful to review cranial nerves in the module. . .
Table 5. Mechanics.
Subtheme
Supporting key quotes from focus group
Supporting key quotes from course evaluations
Module progression (n = 7)
Student_2: . . .not being able to skip through the quiz if you were getting questions wrong was like incredibly frustrating. . .
Student_6: I like how they changed it so you could go on to the next part without having to get everything right. That was the only frustrating thing!
Student_3: . . .so frustrating. . . the case was locked behind the quiz. . .
Student_1: I wish you could move forward in the initial module without getting correct answers.
Lack of immediate feedback (n = 3)
Student_2: I think the lack of feedback and the lack of like, ok—you tried this a couple of times now you’re not getting it like let’s maybe like this is what is supposed to be, here, you want to look for this that is why it is in this part of the brain this is why it is in that part of the brain.
Cumbersome clicking (n = 5)
Student_1: . . .for example, if you were in the tract module and you are clicking through and a box pops up then a blurb pops up and then you have to close out of it and then you have to close out of it, and then the next one the box pops up, and then a blurb pops, and then it took a long time to be able to get out of everything and close things and you couldn’t speed things up.
Question formatting (n = 9)
Student_1: . . .having the quiz questions being more representative of board style questions or exams I think that definitely would enhance things further. . .
Student_7: could add more test like practice questions
Student_1: . . .it was not sufficient to answer the level of questions asked on the exam.
Student_4: I would agree that having a little bit more QAS even like board style questions for a few of them, I don’t know necessarily if that is the purpose of it but I think it would help prepare us because I have questions on reviews in question banks that have been like here is a brain slice or here is a brainstem and be able to figure out where everything is. . .
Table 6. Timing.
Subtheme
Supporting key quotes from focus group
Supporting key quotes from course evaluations
Time consuming (n = 7)
Student_4: . . .because I had spent so much time on the quiz. . .
Student_4: While it would have taken much longer for students to complete this task I would have rather had a more thorough review.
Student_5: . . .first time I did the module I didn’t even finish the quiz because it was just, like, taking so much time. . .
Content
Student feedback about the instructional material disseminated in the intervention separated into 6 content subthemes: step-by-step approach, engaging, clinically relevant, retrieval practice, too detailed, and narrow focus (Table 4). The first 4 subthemes describe beneficial features of the intervention’s content; the last 2 subthemes describe suggested improvements to the intervention’s content.
One beneficial feature of the intervention is that it outlines each step in a pathway, for example in the spinothalamic tract, from a pain receptor in the leg to the primary somatosensory cortex. Each step focused on the important concepts of the location of cell bodies, synapses, axons, and decussations. Students perceived “walking through” each pathway “step-by-step” as a helpful way to learn spinal pathways. Next, the intervention was designed with animations, interactive pop-ups, and clinical cases, which students described as engaging. Clinically relevant features of the intervention applied neuroanatomy content to clinical practice by asking students to correlate areas of sensory loss and positive clinical tests with lesion localization. Lastly, students described retrieval practice, which reinforced “what you actually need to know” and identified student weaknesses, as helpful in learning neuroanatomy.
Reducing the level of detail in the intervention’s content is 1 way in which student feedback supports its improvement. Students described the intervention’s quiz questions as too detailed; for example, students were asked to differentiate between 2 nearby nuclei on a brainstem cross-section. Students proposed providing, alongside the brainstem cross-sectional image, additional functional information that could be tested in a clinical setting, as in an “integrated case” within the intervention. Additionally, students described the intervention as narrowly focused. Students suggested adding other neuroanatomy concepts, for example, stroke lesions, a more comprehensive cerebrovascular review, and clinical deficits related to cranial nerve lesions. Broadening the content scope may be 1 way to further develop the intervention.
Mechanics
Module progression, lack of immediate feedback, cumbersome clicking, and question formatting were the main mechanical subthemes that emerged from the data (Table 5). These 4 subthemes capture student feedback supporting improvements to the module’s design and usability. Competency-based module progression, or slide advancement only after mastering the content on the current page, was not received favorably by students. Additionally, this design highlighted the lack of immediate feedback on quiz questions; students described being “frustrated” by the absence of explanations for incorrect answers and by being unable to progress to the next slide. Another subtheme for mechanical improvement, cumbersome clicking, highlights the need to reduce the number of pop-up boxes that must be closed prior to slide advancement. Lastly, students suggested that the quiz question formatting within the intervention more closely reflect that of national board examinations.
Timing
Time consuming was the major subtheme that emerged from the data regarding the timing of the intervention’s administration (Table 6). The intervention was scheduled for a 3-hour time block on a Friday afternoon following a quiz. Students explained they were tired and wished the module took less time to complete. However, student usage data showed an average completion time of approximately 3 hours (M = 2:59:25, n = 155), which fit appropriately within the scheduled timeframe. Additionally, the intervention is an online resource to which students were provided unlimited access.
Discussion
A competency-based neuroanatomy eLearning intervention, aimed at bridging a 1-year time-gap in an integrated curriculum, was designed. Student feedback obtained through module utilization data, course evaluations, and a focus group guided the intervention’s development. Qualitative feedback suggested module improvements in the domains of content, mechanics, and timing. Significant module improvements, described below, were completed and implemented in the subsequent year’s curriculum. Overall, the modifications to the eLearning module’s content and mechanics improved students’ perceptions of the module; single-session course evaluations from the subsequent year did not describe the eLearning module as time consuming or suggest shortening it.
Our students described the content of the competency-based questions as too discerning, with too narrow a focus. Module improvements to the content were made as a direct result of this feedback; for example, clinical applications were added to select nuclei-focused brainstem neuroanatomy questions, and questions deemed too discerning were removed. The addition of content in the form of a more comprehensive cerebrovascular review, a basal ganglia pathways review, and a review of spinal cord syndromes addressed the narrow focus of the module. However, the supplemental content resulted in 1 large, cumbersome eLearning module. The competency-based eLearning review and each additional content area were therefore split into independent modules to streamline the content. Students are required to complete the competency-based eLearning module first, before the subsequent content area modules are disseminated.
Additionally, our data suggested several improvements to module mechanics. Modifications were made to question formatting, for example, reducing the number of answer choices and converting some questions to match the national board examination format. Another significant improvement was made to the mechanics of the competency-based questions: each question was converted from allowing unlimited attempts to allowing a maximum of 2 incorrect responses. In addition, feedback boxes were added to each incorrect response choice to guide students toward the correct response. These modifications to question mechanics were also important to student perceptions of module completion time, detailed below.
Lastly, the third domain in which students suggested the eLearning intervention could be improved was its completion time. Students may have perceived the module as time consuming if they had difficulty with the competency-based questions, which they were required to answer repeatedly, until correct, to demonstrate competency before advancing to the next section. Combined with a large number of answer choices, very discerning questions, and no feedback for incorrect responses, this requirement made students feel the competency-based section was time consuming because their advancement through the module was hindered. Student usage data showed most students (n = 139) completed the module within the allotted 3-hour timeframe. However, the perception of the module as time consuming may have stemmed from delayed progression when a user repeatedly obtained incorrect answers on the competency-based questions.
One limitation in addressing student feedback on module mechanics, cumbersome clicking, for example, stemmed from limitations of the Articulate Storyline® eLearning platform. As described in the methods, creating the pop-up boxes and showing hidden layers were necessary to create the competency-based functionality. Another limitation of Articulate Storyline® is that video playback is constrained to 1 speed; student comments attributed part of the time-consuming nature of the module to the inability to watch embedded videos at a faster speed, functionality that was not available in the platform. Overall, the benefits and flexibility of using Articulate Storyline® to create an interactive, competency-based eLearning module outweighed these limitations.
Potential study limitations are the low survey response rate, sampling bias, and lack of generalizability. A larger response rate, and therefore a larger sample size encompassing more of the student population, would reduce the potential for non-response bias in both the survey and the course evaluation. Additionally, convenience sampling may have impacted the results, as it is not known whether only students with strong opinions, either for or against the intervention, participated in the course evaluation and focus group. Lastly, data derived from a single university may not generalize to all medical institutions because of varying implementations of the first-year curriculum.
This competency-based neuroanatomy eLearning tool shows promising initial results to bridge an educational gap within an integrated curriculum. With neuroanatomy suffering the largest (18%) decrease of all the basic science lecture hours when converting pre-clinical medical education to an integrated curriculum,1,6 medical educators are seeking time-efficient educational solutions to deliver neuroanatomy content in a compressed timeframe. Neuroanatomy eLearning could be a potential solution to this challenge. Specifically, competency-based eLearning in previous literature has shown promising results with decreased learning time and improved accuracy.17-19 Future directions of this study include further qualitative study to evaluate student perceptions of the module’s improvements and potentially investigating the intervention’s effectiveness on neuropathology student outcomes and course competencies. Overall, students reported this was a helpful educational tool and described improvements to this resource as a direct result of this study.
Acknowledgements
The authors would like to acknowledge Emily Noonan, PhD for contributing to this project. The authors also wish to thank the University of Louisville medical students who volunteered to participate in this study.
Funding:
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The University of Louisville’s Acland Endowment for Anatomy Education provided financial support for the research, authorship, and publishing of this article.
Declaration of Conflicting Interests:
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Author Contributions
JSB conceptualized the research study and developed the eLearning intervention. EPH, JSB, and JBC contributed to data analysis. EPH and JSB drafted the manuscript. JBC provided substantial improvements to the manuscript. All authors read and approved the final version of the manuscript.
ORCID iD
Jessica S Bergden
References
1. Drake RL, McBride JM, Lachman N, Pawlina W. Medical education in the anatomical sciences: the winds of change continue to blow. Anat Sci Educ. 2009;2(6):253-259.
2. McBride JM, Drake RL. National survey on anatomical sciences in medical education. Anat Sci Educ. 2018;11:7-14.
3. Choi-Lundberg DL, Al-Aubaidy HA, Burgess JR, et al. Minimal effects of reduced teaching hours on undergraduate medical student learning outcomes and course evaluations. Med Teach. 2020;42:58-65.
4. Selvarajah L, Aojula N. Decreasing teaching hours for undergraduate medical students: a step in the right direction? Med Teach. 2020;42:838.
Drake RL, Lowrie D, Prewitt CM. Survey of gross anatomy, microscopic anatomy, neuroscience, and embryology courses in medical school curricula in the United States. Anat Rec. 2002;269:118-122.
7. Mateen FJ, D’Eon MF. Neuroanatomy: a single institution study of knowledge loss. Med Teach. 2008;30:537-539.
8. Arantes M, Andrade JP, Barbosa J, Ferreira MA. Curricular changes: the impact on medical students’ knowledge of neuroanatomy. BMC Med Educ. 2020;20:1-20.
9. Brauer DG, Ferguson KG. The integrated curriculum in medical education: AMEE guide no. 96. Med Teach. 2015;37:312-322.
10. Vidic B, Weitlauf HM. Horizontal and vertical integration of academic disciplines in the medical school curriculum. Clin Anat. 2002;15:233-235.
11. Foreman KB, Morton DA, Musolino GM, Albertine KH. Design and utility of a web-based computer-assisted instructional tool for neuroanatomy self-study and review for physical and occupational therapy graduate students. Anat Rec B New Anat. 2005;285:26-31.
12. Gould DJ, Terrell MA, Fleming J. A usability study of users’ perceptions toward a multimedia computer-assisted learning tool for neuroanatomy. Anat Sci Educ. 2008;1:175-183.
13. Svirko E, Mellanby J. Attitudes to e-learning, learning style and achievement in learning neuroanatomy by medical students. Med Teach. 2008;30:e219-e227.
14. Svirko E, Mellanby J. Teaching neuroanatomy using computer-aided learning: what makes for successful outcomes? Anat Sci Educ. 2017;10:560-569.
15. Estevez ME, Lindgren KA, Bergethon PR. A novel three-dimensional tool for teaching human neuroanatomy. Anat Sci Educ. 2010;3:309-317.
16. Allen LK, Eagleson R, de Ribaupierre S. Evaluation of an online three-dimensional interactive resource for undergraduate neuroanatomy education. Anat Sci Educ. 2016;9:431-439.
17. Hu Y, Brooks KD, Kim H, et al. Adaptive simulation training using cumulative sum: a randomized prospective trial. Am J Surg. 2016;211:377-383.
18. Samulski TD, La T, Wu RI. Adaptive eLearning modules for cytopathology education: a review and approach. Diagn Cytopathol. 2016;44:944-951.
19. Samulski TD, Taylor LA, La T, Mehr CR, McGrath CM, Wu RI. The utility of adaptive eLearning in cervical cytopathology education. Cancer Cytopathol. 2018;126:129-135.
20. Javaid MA, Chakraborty S, Cryan JF, Schellekens H, Toulouse A. Understanding neurophobia: reasons behind impaired understanding and learning of neuroanatomy in cross-disciplinary healthcare students. Anat Sci Educ. 2018;11:81-93.
21. Strauss A, Corbin J. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 2nd ed. Sage Publications; 1998.