Abstract
OBJECTIVES
This study investigates differences between the in-person and virtual formats of an advanced communication skills OSCE through thematic analysis of post-OSCE debrief transcripts.
METHODS
Two cohorts of senior medical students participated in either a 2019 in-person or a 2021 virtual advanced communication skills OSCE. Students were grouped in triads and rotated through three of five possible cases. Afterwards, students participated in a faculty-led debrief (in-person in 2019, virtual in 2021). Inductive thematic analysis was used to identify themes, and the ratios of comments related to each theme were compared between the virtual and in-person OSCEs.
RESULTS
Thematic analyses for both in-person and virtual OSCEs identified the same four major themes (Case Review, Emotional Response, Feedback, and Reflection) and 11 subthemes. However, the ratio of comments related to Case Review was lower in the virtual OSCE compared to in-person (P < .0001). Analysis of subthemes within Case Review revealed the percentage of comments was higher for Content and lower for Challenges in the virtual OSCE compared to in-person (both P < .0001). There were no differences in the ratios of comments related to Emotional Response, Feedback, and Reflection, or their subthemes.
CONCLUSION
A virtual advanced communication skills OSCE for medical students showed qualitative themes identical to those from a prior in-person OSCE. However, students in the virtual OSCE focused more on matter-of-fact discussions about case content and less on the challenges they experienced. The findings suggest that some medical students may struggle with experiential learning in the virtual format and may have difficulty accessing or practicing their reflective observation skills, based on Kolb's learning theory. Differences may be attributable to the additional cognitive load in the virtual setting, inadequate structural safeguards, and/or other limitations of virtual communication.
Introduction
The Objective Structured Clinical Examination (OSCE) has been widely used in medical education since its development in the 1970s.1 Characterized by separate stations, specific tasks, and standardized assessments, OSCEs are effective for both summative2 and formative assessments.3,4 In recent decades, they have been increasingly used for communication skills training with various levels of healthcare providers.5,6 For senior medical students, OSCEs have become a common tool to prepare them for their first year of residency. Virtual OSCEs, which are OSCEs performed on an online videoconference platform,7 have become more prevalent in the wake of the COVID-19 pandemic.3,8–10
Quantitative studies have shown virtual OSCEs are effective and well accepted by residents11,12 and fellows,13 while showing comparable results to in-person OSCEs for medical students.14,15 Qualitative studies have also shown favorable results for virtual OSCEs,16–18 with both medical students17 and faculty19–21 highlighting the saved time and resources as an advantage of the virtual format. Nonetheless, two recent systematic reviews question the use of the virtual OSCE as a replacement for in-person OSCEs, citing the need for more studies with stronger evidence of the virtual format's effectiveness.22,23 This concern is justified, since virtual iterations of other teaching formats have been shown to increase cognitive load.24,25
Debriefing refers to facilitated or guided reflection on an experienced event and the process of making sense of it.26 It is considered an integral part of learning from simulation.27–29 No studies have assessed the content of post-OSCE debriefs in the virtual setting or directly compared them to in-person post-OSCE debriefs. Such a comparison could provide unique insight, based on medical students' perspectives, into the effectiveness of the virtual OSCE and debrief format.
We previously described an in-person formative OSCE on advanced communication skills for senior medical students,4 which we reconfigured for a virtual set-up in 2021. We performed a quantitative analysis comparing the in-person 2017 and 2019 OSCEs with the 2021 virtual OSCE. The results showed that students in the virtual OSCE had similar scores in interview process (eg, avoiding jargon and demonstrating empathy) but lower scores in interview content (eg, achieving case-specific tasks on a checklist) compared to students in the in-person OSCE.15 Kolb's experiential learning theory emphasizes concrete experience as an important stage in learning.30 In this previous work, we proposed that the virtual OSCE format is an obstacle to concrete experience for some students, which led to their decreased ability to complete elements on an OSCE checklist.15
As such, we embarked on the current qualitative study with two objectives. First, through thematic analysis of faculty-led debrief transcripts, we aimed to explore explanations for the quantitative results, which had revealed lower student scores in interview content on a virtual versus in-person OSCE. Second, we sought to assess the content of post-OSCE debriefs, exploring differences between debriefs conducted in-person and virtually. To our knowledge, this is the first study to directly compare qualitative data between an in-person and virtual OSCE for medical students.
Methods
Place, period and nature of study
The study was conducted at Yale School of Medicine in New Haven, Connecticut, USA. The MD program at Yale School of Medicine is a four-year curriculum consisting of an 18-month pre-clerkship period covering basic science education, a 12-month clerkship period during which all students rotate through core clinical disciplines, and a 17-month post-clerkship period. During the post-clerkship period, students are required to complete a four-week sub-internship, at least 28 weeks of clinical electives and research, and a four-week Capstone course. The remaining time is spent studying for licensure exams, applying to residency programs, doing additional electives and research, and taking vacation.
The advanced communication skills OSCE is part of the Capstone course which is scheduled two months before graduation. During the Capstone course, students participate in core classes/workshops and individual experiences based on their desired specialty choice. The Capstone course is formative, requiring only attendance for a passing grade.
Participant selection and virtual OSCE format
Students were recruited as a convenience sample as they underwent the advanced communication skills OSCE. The OSCE is formative, with attendance determining pass/fail. It had been conducted in-person prior to 2020, was canceled in 2020, and was converted to a virtual format in 2021 due to COVID-19.
The week before the OSCE, students received relevant didactic sessions on different advanced communication skills. These sessions were provided to all students, though the specific scenarios used during the OSCE were not disclosed. In 2019, the didactics were given in-person or as recordings. In 2021, the content was the same but the didactic sessions were given synchronously online or as recordings from previous years. During the OSCE, students were grouped in triads and took turns as the examinee as they rotated through three of five possible cases: difficult communication with an angry patient, goals of care for a patient with serious illness, medical error disclosure, palliative care assessment, and telephone death notification (Table 1). After each station, the examinee received feedback guided by objective self- and peer-assessment instruments.4 Faculty and standardized patients did not provide feedback due to prior work demonstrating similarity in scoring among faculty, standardized patient, and peer observers, plus student preference for a peer-assisted learning model for the in-person OSCE.31 In 2021, the virtual OSCE was conducted on Zoom (San Jose, CA, USA), with smaller debriefing groups in an attempt to reduce cognitive load in the virtual environment. The telephone death notification case was conducted with the standardized patient's camera turned off, allowing interaction with the examinee solely through audio. Otherwise, the set-up was identical to that used during the in-person iteration of the OSCE in 2019. We have previously published additional details about the OSCE format and participants.15
Table 1. Summary of the five cases in the advanced communication skills OSCE.
Post-OSCE debrief
After completion of all three cases, multiple student triads were combined into larger groups for a faculty-led debrief to review case content, emotional responses, and students’ questions. These post-OSCE debriefs were held in separate meeting rooms for the in-person OSCE and in Zoom rooms for the virtual OSCE. For both formats, the debrief session length was 30 min and students were scheduled to participate in one debrief only.
All faculty facilitators for the debriefs were physicians (MDs) involved in medical education across the four-year Yale School of Medicine curriculum, specifically in communication skills training. Two of the investigators (TM, JT) had regular interactions with the students throughout the curriculum, while the other two investigators (AC, LM) facilitated educational sessions with some students. The debrief faculty facilitators in 2019 consisted of two women and two men. The same facilitators, with the addition of one male faculty member, led the debriefs in 2021. No other participants were present during the debriefs besides the students and the faculty facilitator, and notes were not taken during the debriefs. The debrief faculty guide included a summary of the five cases in the OSCE and discussion prompts to facilitate the debrief. The same debrief guide was used for both in-person and virtual OSCEs. Debriefs for both years were audio-recorded and later transcribed (Rev; Austin, TX, USA).
Analysis
Both sets of transcripts, from 2019 and 2021, underwent inductive thematic analysis, in which themes and meanings are derived from the data without preconceptions.32 A deductive approach was not possible, as the investigators did not have a framework for comparing virtual with in-person OSCEs during the pre-COVID iterations. The 2019 and 2021 transcripts were analyzed together after the 2021 virtual OSCE to strengthen the inductive approach and limit bias.
The thematic analysis was based on Braun and Clarke's six-step approach: familiarization with the data, production of preliminary codes, drafting a set of themes, reviewing themes and subthemes, naming themes and subthemes, and generating a report.33 Comments made by facilitators were excluded. Two investigators (AC, JT) independently coded four randomly selected transcripts (two from 2019, two from 2021). The two investigators then went through an iterative process with regular meetings until code saturation was achieved. Afterwards, the investigators coded the remaining transcripts independently before meeting to compare results. Any discrepancies in coding were resolved by a third investigator (LM).
Once all themes/subthemes were identified, the number of comments for each theme/subtheme within the transcripts was tallied for the in-person and virtual OSCE formats. Content analysis was conducted by comparing the proportion of comments related to each theme/subtheme between the in-person and virtual OSCE formats.34
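The proportion comparison described above can be sketched as a two-proportion z-test. The paper states only that SPSS was used, so the specific test (and any corrections) applied is an assumption here, and the statistic below need not reproduce the published P values; the counts plugged in are the Case Review totals reported in the Results (128/300 virtual vs 155/306 in-person).

```python
import math

def two_proportion_z_test(k1, n1, k2, n2):
    """Two-sided z-test comparing the proportion of comments coded to a
    theme (k of n total comments) between two OSCE formats."""
    p1, p2 = k1 / n1, k2 / n2
    # Pool the two samples to estimate the standard error under the null
    # hypothesis that both formats share the same underlying proportion.
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Case Review comment counts from the Results section
z, p = two_proportion_z_test(128, 300, 155, 306)
print(f"virtual {128/300:.0%} vs in-person {155/306:.0%}: z = {z:.2f}, p = {p:.3f}")
```

The pooled standard error is the textbook choice when testing whether two proportions are equal; an equivalent analysis could be run as a chi-square test on the 2×2 table of theme versus non-theme comment counts.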
Thematic analysis was conducted with NVivo (Lumivero; Denver, CO, USA). Statistical analysis was conducted with IBM SPSS Statistics for Windows, version 25 (IBM Corp.; Armonk, NY, USA). Participants were not asked to give feedback on either the transcripts or the study results. The study was granted ethical clearance after being deemed exempt from review by the Yale University Institutional Review Board (IRB Protocol ID 2000020576).
Reflexivity statement
All of the research investigators potentially had direct contact with many of the students through the Yale School of Medicine curriculum. However, the identity of individual students was unknown to the researchers during thematic analysis. The research team conducted regular meetings throughout data analysis to identify biases and engage in reflexive discussions, and the team sought feedback from a qualitative research colleague not involved in the project to provide an external perspective about the process.
In addition, the investigators adhered to the COREQ (Consolidated Criteria for Reporting Qualitative Research) guidelines to ensure rigorous and transparent reporting. A completed COREQ checklist is provided as Supplemental material.
Results
Participants
The 2021 class participating in the virtual OSCE consisted of 53 men and 39 women with an average age of 29.0 years. Compared to the 2019 class who participated in the in-person OSCE (44 men, 44 women, average age 28.6 years), there were no differences in age or gender (P values .41 and .46, respectively). The number of students participating in each debrief ranged from 8 to 12 for the virtual OSCE and 16 to 18 for the in-person OSCE.
Debrief transcripts and major themes
There were 8 transcripts from virtual OSCEs and 4 from in-person OSCEs. A total of 300 comments (18,376 words) were coded from the virtual OSCE debrief transcripts, compared to 306 comments (15,118 words) from the in-person OSCE debrief transcripts. Inductive thematic analysis identified the same four major themes in both OSCE formats: Case Review, Emotional Response, Feedback, and Reflection. Both formats had the same trend in number of comments for the major themes, with Case Review being the highest, followed by Feedback, Reflection, and Emotional Response in decreasing order (Figure 1). Representative comments for each theme/subtheme are presented below, and the frequency of comments per theme/subtheme is summarized in Table 2.

Figure 1. Comparison of percentages from thematic analysis between the in-person (2019) and virtual (2021) OSCEs; *indicates statistical significance (P < .05).
Table 2. Frequency of comments and comparison of comment percentages from the virtual and in-person OSCEs.
Case review and subthemes
Case Review comments were defined as comments related to the OSCE cases. Overall, the ratio of comments related to Case Review was lower in the virtual format (128/300, 43%) compared to in-person (155/306, 51%; P < .0001). In terms of subthemes, students shared comments on the Content of the OSCE station, which were defined as matter-of-fact discussion without mention of challenges or barriers (51/128, 40% virtual vs 35/155, 23% in-person). Student #28: “I did that case. There's a car accident and this teenager and her friend was also in a car accident. Both of them died. The prompt was to call the parent and then deliver the bad news and arrange for logistics.”
Students also shared Challenges, which were comments on areas of confusion, uncertainty, and lack of knowledge (47/128, 37% vs 90/155, 58%). Student #10: “I had difficulty for that, like [the] second where I had to say the word dead. I was trying to say passed away, but I remember on the checklist we’re supposed to say, the proper way to do it. So it's, it almost felt like I had to own it and I didn’t want to own it. I didn’t want to own that her daughter had died.”
They also shared Lessons Learned based on communication skills or specific approaches highlighted during the workshop (30/128, 23% vs 30/155, 19%). Student #69: “There was one thing that I realized was the power of sitting with silence and just sitting with emotion.”
The transcripts from the virtual OSCE debriefs showed a higher percentage of comments for Content and lower percentage for Challenges (both P < .0001).
Feedback and subthemes
Feedback comments were defined as comments providing suggestions related to the OSCE. There was no difference in the overall ratio of comments related to Feedback between the virtual (102/300, 34%) and in-person (69/306, 23%) OSCEs. In terms of subthemes, students shared General Impressions, which were nonspecific comments related to the OSCE experience (38/102, 37% vs 28/69, 41%). Student #2: “Because I’ve now had this positive experience where it was actually, the response was very reasonable and understanding.”
They also commented on Future Ideas (6/102, 6% vs 11/69, 16%), which were suggestions for future OSCE iterations. Student #75: “I would have loved to hear [feedback] from a standardized patient.”
Students also commented on Session Format (38/102, 37% vs 19/69, 27%); these were statements on the structural elements of the OSCE they went through. Student #49: “I definitely think having the checklist guided the observation, as opposed to just the general thought of like, ‘Oh, this is very uncomfortable, I’m so very glad that I’m not the one doing this right now.’”
Last, students made specific comments regarding Remote Format, which were comments related to the telephone case for the in-person OSCE and related to the telephone case and/or virtual format during the virtual OSCE (20/102, 20% vs 11/69, 16%). Student #55: “And now that it was over Zoom, the cameras were off, microphones were off and it really felt like you were one-on-one on a telehealth visit.”
There was no difference in the ratio of comments related to the subthemes of Feedback between the virtual and in-person OSCEs.
Reflection and subthemes
Reflection comments were defined as comments related to the OSCE content but not related to the specific OSCE cases. There was no difference in the overall ratio of comments related to Reflection between the virtual (54/300, 18%) and in-person (66/306, 22%) OSCEs. In terms of subthemes, the most common reflections were those based on Clinical Experiences and related to the OSCE (38/54, 70% vs 47/66, 71%). Student #26: “But I think I’ve seen enough scenarios where patients were upset, the residents tried their best to accommodate them, and maybe the attending had to step in.”
Students also shared reflections based on Personal Experiences that were related to the OSCE content (1/54, 2% vs 5/66, 8%). Student #1: “I definitely have family members who’ve gotten news of family dying, and driven home and thought, ‘I could have easily gotten in a horrible accident with someone else.’”
While debriefing, students also shared comments that were based on Other Experiences not fitting into the categories above (eg, classroom experiences) (15/54, 28% vs 14/66, 21%). Student #8: “In our lecture last week, […], one of the PICU fellows, said her rule of thumb is, if they’re an hour out, if they can be there within an hour, she waits til [they] get there.”
There was no difference in the ratio of comments related to the subthemes of Reflection between the virtual and in-person OSCEs.
Emotional response
Comments were coded as Emotional Response when a specific emotion was expressed. There was no difference in overall ratio of comments related to Emotional Response between the virtual (16/300, 5%) and in-person (16/306, 5%) OSCEs. There were no subthemes identified. Examples of emotions expressed by students from the in-person OSCE include feeling “intense,” “surprised” how the patient reacted, and “uncomfortable” with the telephone death notification scenario. Students from the virtual OSCE felt “intense” and “taxing,” as well as sentiments leading to “tearing up.” Student #72: “Yeah. I said it in my group, but [classmate's name] had the 17-year-old death notice and I’m glad it wasn’t me. But I did feel choked up when the father did. And I thought she handled it very, very well. And I don’t know how I would’ve handled it, given how I was feeling and not even being on the phone and being part of that conversation.”
Discussion
Qualitative analysis of debrief transcripts following a virtual advanced communication skills OSCE for senior medical students revealed some differences compared to an in-person OSCE. While the overall themes that emerged were identical, the debriefs with faculty in the virtual format focused more on matter-of-fact discussions about case scenarios and less about challenges students experienced.
Our previous quantitative analysis showed students in the virtual OSCE had lower scores in interview content (ie, achieving case-specific tasks on a checklist) compared to students in the in-person OSCE.15 The educational significance of that difference in performance becomes clearer when considered alongside the findings of the present qualitative study. Despite having lower scores in interview content, students in the virtual OSCE commented less on the cases in general and, when they did, they were less inclined to discuss challenges than the in-person cohort. This difference is educationally significant given that debriefing is a key element of simulation learning.27–29 In the virtual format, students struggled more to master case content and may therefore have been unable to advance to skill building and assessing challenges in their learning. The fact that there was no difference in the Feedback theme and its subthemes suggests that students may not be aware of these subtle but significant obstacles in the virtual setting.
We suggest three possible explanations for the lower ratio of Case Review comments, with a higher percentage on Content and a lower percentage on Challenges: one relates to the OSCE itself, and two relate to the debrief.
First, during the OSCE, the additional cognitive load from the virtual setting may have been an obstacle to achieving optimal experiential learning. Previous studies show learners expressing difficulty engaging with others during online discussions35,36 due to technological difficulties and distractions that impede virtual learning.37 A recent neuroscience study shows decreased neural signaling related to social interactions in the virtual setting compared to in-person.38 As a result, students may not have been able to achieve sufficient experiential learning, which is critical before they can attain reflective observation per Kolb's experiential learning theory.30
Regarding the debriefs, additional structural safeguards may be required in the virtual environment. When transitioning to the virtual format, we attempted to optimize the virtual setup7 by decreasing the size of debrief groups, since smaller group size can increase engagement in the virtual setting,37 which led to the higher number of transcripts from the virtual OSCE. We also discouraged use of the “chat” function to improve engagement with the spoken conversation. Otherwise, we implemented safeguards similar to those for the in-person debriefs, such as ensuring facilitator presence, setting clear expectations before the debrief, and communicating rules of engagement.39 However, additional opportunities may have been overlooked. For instance, our virtual debriefs were scheduled for 30 min, the same time allocated for the in-person debriefs in 2019. More time might have been needed, as virtual discussions can require more time and cognitive effort than in-person discussions.40
Last, inherent limitations of virtual communication compared to in-person communication may have impacted the quality of debriefing. For example, the virtual setting impedes cross-talk and fast-paced dialogue among participants, who cannot easily read each other's social cues, problems that persist despite using the “gallery view” in larger group discussions.41 Transitions between speakers may require more active work from facilitators,41 which could result in fewer and longer comments in the virtual setting. This is reflected in the results for the virtual OSCE: despite having fewer participants in each debrief, there were more words per comment overall than in the in-person format. Further studies are needed on how inherent properties of virtual communication affect the quality of virtual OSCE debriefs.
At the same time, students were able to share explicit emotions during the virtual OSCE debrief. This could be explained by students’ increasing level of comfort with the virtual environment, as supported by comments specific to Remote Format from the virtual OSCE. Student #37: “I can at least speak for myself that having done interviews for the past year [on Zoom], it's easier enough to engage and do standardized patients from home too.”
Interestingly, comments on the Remote Format from the in-person OSCE, which could only come from the telephone death notification case, pointed out the lack of nonverbal communication leading to challenges with empathizing with the family member. Student #6: “Especially with things like silence, where you’re not really getting that same kind of feedback from the person's feedback in front of you. If you’re thinking about silence, in person, you’re able to be attentive. They know that you’re paying attention, they know that you’re there.”
The decreased ability to empathize in remote communications was a major concern for telemedicine during the early stages of the COVID-19 pandemic.42 In comparison, comments from the virtual OSCE focused more on students' increasing comfort with telemedicine and its advantages and disadvantages. Our results are in line with more recent literature showing students can effectively express empathy in the virtual environment.19
There are limitations to our study. First, it was conducted at a single institution with two different cohorts. We have previously addressed this by finding the two groups similar in terms of demographic factors and prior clinical performance.15 Second, two of the research investigators had previous relationships with the medical students involved in the study, which could carry potential reflexive effects. However, there is no reason to think that these effects would have applied differently to the 2019 and 2021 cohorts, since all transcripts were analyzed simultaneously and student identities were removed from the transcripts. Additionally, we tried to maintain the validity and reliability of our study by adhering to the COREQ framework (Supplemental Material). Last, the students' behavior may have been influenced by the fact that they were being observed/recorded. However, observation and recording are standard in our curriculum during encounters with standardized patients, so it is unlikely that recording in the current context would have stood out to the students as unusual.
Conclusions
Overall, the themes that emerged through qualitative analysis of post-OSCE debrief transcripts were identical between the virtual and in-person formats. However, there were subtle but significant qualitative differences in debrief content that may suggest less optimal content review in the virtual format. It is possible these differences existed within the virtual OSCE itself, serving as an obstacle for some students to advance their learning. Simultaneously, these differences may have impacted the post-OSCE debriefs, affecting students’ abilities to reflect and assess challenges at hand. While shifts to virtual formats hold promise even in the aftermath of the COVID-19 pandemic, further studies of optimal educational safeguards are needed before virtual debriefing of challenging OSCE content is adopted as a standard alternative to in-person debriefing.
Supplemental Material
Supplemental material for “Comparison of a Virtual and in-Person OSCE on Advanced Communication Skills: Qualitative Insights from Medical Student Debrief Transcripts” by Alex Choi, Tanya D. Murtha, Laura J. Morrison and Jaideep S. Talwalkar, Journal of Medical Education and Curricular Development.
Acknowledgements
We would like to thank Dr John Encandela for his expertise in qualitative research which contributed to our methodology.
FUNDING:
The authors received no financial support for the research, authorship, and/or publication of this article.
DECLARATION OF CONFLICTING INTERESTS:
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Authors' contributions
J.T. devised the project. A.C., T.M., L.M., and J.T. were involved in data collection. A.C. processed the data and conducted statistical analyses. A.C. wrote the manuscript with input from J.T., T.M., and L.M.
Ethics and consent
Written consent was obtained from all participating students before the OSCE. The students were made aware of the investigators involved in the study through the consent process. Identification numbers were assigned to ensure dependability of transcription. Two students withdrew consent after the debrief in 2021, and corresponding comments were deleted from the transcripts by an administrator not involved in the study. The study was granted exemption from review by the Yale University IRB (Protocol ID 2000020576).
References
