Abstract
Qualitative researchers are increasingly using online data collection methods, especially during the COVID-19 pandemic. I compared the data quality (i.e., interview duration, average number of themes and sub-themes, and inaudible words) of 34 interviews (29 conducted by Zoom (16 with camera on, 13 camera off) and 5 conducted by phone) drawn from a study focusing on youth's coping experiences during the pandemic. Findings showed that phone interviews were longer in duration than Zoom interviews. However, phone interviews had a similar average word count to Zoom interviews (with the camera on). Zoom interviews conducted with the camera off were shorter in duration than interviews with the camera on. The number of themes was similar across the different interview formats, but there were fewer sub-themes for Zoom interviews with the camera off. The findings suggest that Zoom interviews conducted with the camera off could affect data quality. This research also emphasizes the importance of giving participants choice in the format of their interview to allow for optimal sharing of experiences while enhancing the equity, diversity, and inclusion of participants.
Introduction
New communication technologies in recent years such as Skype, Google Hangouts, FaceTime, and Zoom have expanded the options for online qualitative data collection (e.g., email, instant messaging, and video calling; Jenner & Myers, 2019). During the coronavirus disease (COVID-19) pandemic, most researchers encountered restrictions on face-to-face data collection in an effort to help contain the spread of the virus and had to look for alternatives. As such, online methods through phone or internet (i.e., Skype, Zoom) were used to address the physical distancing restrictions. Some researchers argue that the pandemic has added new context to conducting online qualitative research, further validating it as a legitimate research strategy (Andrejuk, 2020).
Despite the increased use of online qualitative data collection, in-person interviews are still often viewed as the gold standard while online methods are often assumed to have poorer quality, even though there is a lack of evidence to support this claim (Krouwel et al., 2019). Other interview formats such as telephone and online (e.g., audio or video) may be appropriate for a variety of reasons such as cost, time, and privacy (Krouwel et al., 2019). Although recent literature highlights the many benefits and some disadvantages of online interview methods, it is mainly descriptive or anecdotal and based largely on researchers' reflections and observations (Gray et al., 2020; Krouwel et al., 2019; Upadhyay & Lipkovich, 2020). Little research has directly compared the data quality of the various interview formats, especially online video interviews.
Background
I first outline some common disadvantages and benefits of online qualitative data collection and then highlight some studies that have compared different interview modes. Common disadvantages of online qualitative data collection include technical difficulties and difficulty developing rapport with participants (Gray et al., 2020). For example, technical challenges such as internet delays and poor audio or video could potentially impact the quality of an interview (Deakin & Wakefield, 2014). Given the different types of technology and platforms available, participants need appropriate access and skills to use them (Archibald et al., 2019; Davies et al., 2020). Online qualitative data collection could potentially exclude some people who lack sufficient internet access. Additionally, some suggest that Skype is less well suited to personal or sensitive topics and that there is a need for further comparison of online versus other interview formats (Seitz, 2016). Despite these potential disadvantages, advances in technology have led to an increase in conducting online interviews, especially during the COVID-19 pandemic (Andrejuk, 2020; Lobe et al., 2020; Vindrola-Padros et al., 2020).
Qualitative online data collection has many reported advantages. For example, several studies show that online interviews can save time and cost and allow for greater flexibility compared to other interview formats (Archibald et al., 2019; Davies et al., 2020). Other studies report that online interviews could help to increase the geographical reach and inclusion of participants (Archibald et al., 2019; Davies et al., 2020; Deakin & Wakefield, 2014; Jenner & Myers, 2019; Upadhyay & Lipkovich, 2020). For example, online methods could help to enable more low-income, less mobile, and rural individuals to participate in research (Sedgwick & Spiers, 2009). As such, online qualitative data collection appears to yield higher response rates (Russell et al., 2010).
The flexibility of online qualitative methods could be advantageous for marginalized or vulnerable groups, such as people with disabilities, who may have difficulty travelling to an in-person interview (Gray et al., 2020; Neville et al., 2016; Ryan, 2013). For example, Bowker and Tuffin (2004) argue that online interviews are effective and appropriate for people with disabilities because of their flexibility, allowing participants to engage in research at their own pace. Further, online qualitative interviews can also allow for more reflective responses and can be a useful approach for asking sensitive questions (Madge & O’Connor, 2004; Seitz, 2016; Sipes et al., 2019). Online interviews can offer participants a safe and comfortable environment (i.e., within their own home or other preferred location) with potential for greater anonymity and perceived privacy (Deakin & Wakefield, 2014; Irani, 2019; Reisner et al., 2018). Some research shows that online qualitative methods could facilitate greater disclosure of personal information (Bowker & Tuffin, 2004). For example, Weller (2017) argues that physical separation between researchers and participants has the potential to increase rapport and may enable participants to share more than they otherwise would in an in-person interview. Similarly, Upadhyay and Lipkovich (2020) found that most participants in their study reported that meeting by video gave them a sense of safety and security, which helped them to open up to the interviewer. Given the numerous potential benefits of online qualitative data collection, for both researchers and participants, it is important to understand the quality of the data collected to better guide its application and interpretation.
In this study I draw on interviews focusing on youth and young adults, with and without disabilities. Exploring this area is important because adolescents have embraced online communication technologies more than any other age group (Mason & Ide, 2014) and online qualitative data collection is increasing in this age group (Mason & Ide, 2014; Shapka et al., 2016). Some research shows that adolescents prefer online methods for interviews because these are seen as their natural communication environment (Hunter et al., 2013; Mason & Ide, 2014). Online qualitative data collection could arguably help to reduce power differences between youth and adults, making rapport easier to build, while also increasing participants' comfort in sharing information (Horsfall et al., 2010; Mason & Ide, 2014; Shapka et al., 2016). For example, Mason and Ide's (2014) study on adolescents' experience of interviews conducted over email found a strong preference for online data collection, especially because it was more convenient in terms of scheduling and allowed more time to reflect on responses.
Comparing Online Interviews to Other Traditional Formats
Studies comparing online interviews to other traditional formats (mostly face-to-face) show mixed results. For example, some researchers report no or little difference in outcomes between the various formats (Deakin & Wakefield, 2014; Jenner & Myers, 2019; Johnson et al., 2021). Meanwhile, some studies show more benefits in the data quality for in-person methods (Johnson et al., 2021; Krouwel et al., 2019) and others highlight enhanced quality using online methods versus in-person (Nehls et al., 2015; Woodyatt et al., 2016).
Studies indicating no difference between interview formats highlight that both methods bring similar outcomes and do not differ in terms of interview duration or breadth of topics covered (Jenner & Myers, 2019; Johnson et al., 2021; Krouwel et al., 2019); however, there is a lack of consensus on data quality (Andrejuk, 2020). For instance, Jenner and Myers (2019) compared interview settings (i.e., private in-person, public in-person, and private via Skype) and found that interviewing via Skype produced neither a reduction in nor inappropriate excesses of rapport. They also noted that Skype interviews were a popular choice among participants and did not result in shorter interview durations. Other researchers who compared face-to-face versus online video interviews reported that the quality of the interviews did not differ (Deakin & Wakefield, 2014); however, it is important to note that these results were based on researchers' reflections rather than detailed empirical assessments of the data quality.
Studies comparing different interview formats that favored in-person interviews showed that offline (traditional) interviews had greater word density and that more statements were produced to support a similar range of topics (Johnson et al., 2021; Krouwel et al., 2019). For instance, Krouwel et al. (2019) compared Skype video calling and in-person qualitative interview modes in a study of people with irritable bowel syndrome and found that both interview methods produced a similar number of words and topics (i.e., codes); however, the number of statements upon which the variety of topics was based was notably larger for the in-person interviews (Krouwel et al., 2019). Davies et al. (2020) conducted a scoping review of face-to-face versus online qualitative data and found that online interviews were often shorter and more to the point and often had lower levels of rapport development.
Studies comparing different interview formats that favored online methods have reported on the richness of the data from online compared to offline interviews (Nehls et al., 2015). For example, Woodyatt et al. (2016) compared in-person versus online focus group discussions and found that the online focus groups had a larger word count but were shorter in time than the in-person focus groups. They found a high overlap in the themes across the interview formats but the online groups had one additional theme regarding a sensitive topic. The in-person groups had less sharing of in-depth stories, whereas sensitive topics were discussed more candidly in the online groups (Woodyatt et al., 2016).
Although qualitative researchers are increasingly using online interview methods, there is a lack of evidence on their data quality compared to more traditional methods (e.g., telephone). Most research focuses on the benefits and disadvantages of online qualitative methods, lessons learned, and other practical issues of how to conduct research (e.g., Adams-Hutcheson & Longhurst, 2017; Andrejuk, 2020; Fox et al., 2007) rather than comparisons of data quality (Krouwel et al., 2019). Studies that have conducted a quantitative comparison of data quality often focus on other (non-interview) methods such as focus groups or instant messaging (e.g., Shapka et al., 2016; Woodyatt et al., 2016).
Most studies to date exploring online interviews used Skype, and few comparative studies have been conducted on the data quality of online interviews, especially those using Zoom. Given the predominance of Zoom over the past year during the COVID-19 pandemic (i.e., currently over 300 million users) and the increasing reliance on it for online qualitative data collection, it is worthwhile to explore its impact on interview data quality (Burstynsky, 2020). Further, there is limited research focusing on qualitative online interviews from a health care, and specifically a patient's, perspective (Krouwel et al., 2019). Here, I assess the data quality of online Zoom interviews compared to phone interviews in a sample of youth.
Methods
The purpose of this study was to compare any differences in the quantity and quality of the interview data obtained through Zoom versus phone. This article is a secondary analysis drawing on a study focusing on barriers and facilitators for coping during the COVID-19 pandemic among youth with and without disabilities (Lindsay et al., 2021). In the original study we used a qualitative design to conduct in-depth, semi-structured interviews (see Supplemental File for interview guide) to explore school and employment status, career plans before the pandemic and how COVID-19 affected them, including psychosocial issues (Lindsay et al., 2021). A research ethics board at a pediatric hospital approved this study (#0129, approved June 29, 2020).
Data Collection
For the original study, we used a purposive sampling strategy to recruit participants through flyers, advertisements, and social media that explained the goals of the study. Those interested in taking part received an information package from a researcher. Participants needed to meet the following inclusion criteria: aged between 15 and 29, with or without a disability (i.e., comparison group; Lindsay, 2019), and currently employed or with recent work experience. After assessing participants' capacity to consent, we obtained their written consent, while following the guidelines and protocols outlined by our institutional research ethics board.
Our interview guide was developed based on recent literature on school-work transitions, employment, volunteering, and school during pandemics (Lindsay et al., 2021). Questions asked about school and employment status, career plans before the pandemic, how COVID-19 impacted them, and coping strategies (see Supplemental File). Two researchers (with backgrounds in public health) conducted the semi-structured interviews (split evenly between them). They were trained by the principal investigator of the study, who has extensive experience in qualitative research and interviewing youth with disabilities. The interviewers also conducted several practice interviews, which were recorded and discussed with team members to identify areas for improvement.
The interviews were conducted online via Zoom (n = 29, 16 with video on, 13 without) or phone call (n = 5). Participants decided the interview format they wanted and were all asked the same questions, using a semi-structured interview guide (see Supplemental File; Lindsay et al., 2021). At the time of data collection for this study, face-to-face interviews were not permitted during the COVID-19 pandemic. The only online data collection option approved by our research ethics board was Zoom or telephone interviews. We felt it was important to give choice in the mode of interview to enhance the equity, diversity, and inclusion of participants. We also recognize that choice was needed in the interview format (and also choice of having the camera on/off) given the potential sensitive nature of the topic (i.e., coping with the pandemic). We recruited participants until our team agreed that thematic saturation was achieved within our study, which refers to no new codes being identified (Hennink et al., 2017). Interviews were conducted from July to November 2020 (during the COVID-19 pandemic, before vaccines were available) and lasted up to 53 minutes (mean 31 minutes).
Zoom is a videoconferencing service offering features such as online meetings, messaging services, and secure recording of sessions. This online platform allows researchers to communicate in real time with participants on a computer, tablet, or mobile device (Archibald et al., 2019). Zoom is relatively easy to use and has data management features and security options (Archibald et al., 2019), including real-time encryption of meetings. Zoom does not require participants to have an account or download a program (Gray et al., 2020). Zoom also offers password protection for recorded video and audio. With its audio and video functions, participants can decide what level of contact they would like.
Analysis
The interviews were audio recorded and transcribed verbatim (including all utterances and instances of inaudible words) by the researchers who conducted the interviews. In the original study we applied an interpretive descriptive methodology to guide the analysis (Creswell & Poth, 2017). Two researchers each read all of the interview transcripts together, then separately by group (e.g., youth with and without a disability; Lindsay, 2019). We applied a two-stage, inductive comparative analysis using an open coding approach to assess all possible codes (Lindsay, 2019). We each established a list of codes across all participants while noting key patterns (Lindsay et al., 2021). Then, we compared and discussed our codes and revised or merged them as needed. Any discrepancies in the codes were discussed amongst the research team until consensus was reached. After comparing our codes, they were grouped under higher-order headings to generate categories and sub-categories. Finally, a coding tree that described each of the codes was developed and then applied to all of the transcripts. We also compared and contrasted the codes within and between the groups of participants (youth with and without disabilities; Lindsay, 2019). Two researchers applied the coding scheme to the transcripts, after which we extracted quotes that were representative of each theme and sub-theme (Lindsay et al., 2021).
We draw on an established practice of using quantitative measures to assess differences between qualitative interview modes (Krouwel et al., 2019; Taylor et al., 2018). Here I focus on a comparative analysis of the interviews, which included assessing each transcript for quantitative features (e.g., word count, length of interview in minutes, proportion of words spoken by the interviewer, number of themes and sub-themes per participant, average number of inaudible words). All but one of the interviews from the original study were included in this analysis. One was excluded because the interviewer's voice did not record properly part way through the interview, making it difficult to assess the overall word count and percentage of interviewer responses. I also draw on our field notes and observations to understand any key differences between the interview modes, using a narrative, thematic analysis (Riessman, 2008).
Measures
Each transcript was coded according to the following dimensions that are indicative of interview quality. I followed de Leeuw’s (1992) dimension of interview duration, which included the number of minutes from the first statement from the interviewer directed toward the participant to the final statement. Utterances unrelated to the interview (such as testing the equipment) were excluded from the word count. Next, the volume of exchange (de Leeuw, 1992), also known as dominance (Kvale, 2006) was assessed by the total number of words spoken by the interviewer toward the participant (including probes, re-phrases, follow-up questions, and comments), which was calculated as a percentage of the overall interview word count.
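As an illustration, the volume-of-exchange (dominance) measure described above can be computed from a speaker-tagged transcript. The Python sketch below assumes a hypothetical transcript representation (a list of speaker/utterance pairs with made-up utterances); it is not the study's actual tooling, only a minimal example of the calculation.

```python
# Hypothetical transcript: a list of (speaker, utterance) turns.
# Utterances unrelated to the interview (e.g., equipment tests) would be
# removed before this step, per the duration/word-count definition.
transcript = [
    ("interviewer", "Can you tell me how the pandemic affected your work?"),
    ("participant", "It was hard at first because everything moved online."),
    ("interviewer", "Could you give an example?"),
    ("participant", "Sure, my whole internship switched to video calls overnight."),
]

def word_count(text: str) -> int:
    # Simple whitespace-based word count.
    return len(text.split())

# Total words spoken by the interviewer (probes, re-phrases, follow-ups).
interviewer_words = sum(word_count(t) for s, t in transcript if s == "interviewer")
# Total words in the interview.
total_words = sum(word_count(t) for _, t in transcript)

# Dominance: interviewer words as a percentage of the overall word count.
dominance_pct = 100 * interviewer_words / total_words  # ≈ 45.5 for this toy transcript
```

Interview duration would be measured analogously from timestamps (first interviewer statement to the final statement) rather than from word counts.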
Average number of themes and sub-themes per participant
I examined the total number of themes and sub-themes from the original analysis of the interviews (Lindsay et al., 2021), which helped to explore the depth and breadth of participant responses.
Average number of inaudible words
I calculated the total number of inaudible words across the interviews in each format and divided it by the number of interviews in that format.
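A minimal sketch of this calculation follows; the per-interview inaudible-word counts below are made-up illustrative values, not the study's data.

```python
# Hypothetical inaudible-word counts per interview, grouped by format.
inaudible_counts = {
    "phone": [3, 2, 4],
    "zoom_camera_on": [6, 5, 4, 7],
    "zoom_camera_off": [2, 3, 1],
}

# Average inaudible words per interview within each format:
# total inaudible words for the format / number of interviews in that format.
averages = {
    fmt: sum(counts) / len(counts)
    for fmt, counts in inaudible_counts.items()
}
```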
Additionally, given that the original study compared youth with and without a disability, I apply this comparison across the interview formats as well.
Sample Characteristics
The sample of interviews that I draw upon involved 34 youth and young adults, aged 16 to 29 (mean age 23.2; 17 with disabilities, 17 without; see Table 1). Of the youth who had a disability, 11 had a physical disability, 3 had a mental health condition (e.g., anxiety, depression), 2 had a learning disability, and 1 had both a physical disability and a mental health condition. Thirteen youth were working (6 with a disability, 7 without) and 18 were currently in school (5 with a disability, 13 without). Five youth chose to participate by phone, and 29 by Zoom (16 with the video on and 13 with the video off; see Table 1).
Overview of Participants.
Results
First, I describe the key characteristics of participants in each mode of interview, followed by differences in the data quality. Overall, our results show that online interviews conducted by Zoom, especially with the video on, have comparable quality to those conducted by phone. Phone interviews were 17% longer in duration than Zoom interviews with the camera on and 25% longer than interviews with the camera off (see Table 2). Meanwhile, interviews conducted by Zoom where participants had their video off were shorter in duration, had less depth in their sub-themes, and needed more interviewer prompting.
Descriptive Analysis of Interviews by Method.
Significant difference, p < .05.
Trends by Mode of Interview
Phone interviews
There was a significant difference in interview mode by disability status: the five participants who chose a phone interview all had a disability (see Table 2). It was interesting to note that the average age for this group was higher than that of participants interviewed through Zoom (26.2 vs. 22.5 years); however, this difference was not significant. Most participants (4/5) interviewed by phone were employed and may have had more experiences to draw upon for examples during the interview and for providing detailed answers. Participants appeared comfortable sharing sensitive experiences with disability disclosure in the workplace, specifically issues about stigma and discrimination. They were also comfortable disclosing sensitive topics such as experiencing anxiety and depression, boredom, and social isolation during the pandemic. For example, a participant shared,
“With my mental health I have panic disorder, anxiety disorder, depression that sort of thing, which impacts my kind of day-to-day living the majority of the time….we all didn’t know what was in store, how long this was going to last. How it was going to impact the business, our hours, etc. So it was, it was all a quite difficult transition. I was pretty, like, depressed at the beginning. Those two weeks were super challenging for me to like, be working from home, just because I didn’t really have, like, a like common area where I can kind of work from so it was a lot of working from bed and it’s just not healthy.” (#6)
The interviewers observed that although it was a bit more challenging at the beginning of the interview to develop rapport because they could not see the participant's face and body language, the flow of the interview was smoother overall, without the technical interruptions often encountered in Zoom (e.g., lagging internet).
Zoom interviews (with camera on)
Sixteen participants chose to have their interview with the Zoom camera on (7 with a disability and 9 without). Participants who had a Zoom interview with their camera on were younger, on average, than those interviewed by phone. Half of the participants with a disability who had their camera on for their interview were female, compared to 100% of the group without disabilities. The group with disabilities that was interviewed by Zoom (with the camera on) had a higher mean age (23.85 years) than the group without disabilities (20.8 years). Participants interviewed by Zoom with the camera on did not seem to encounter any issues with discussing sensitive topics (e.g., depression, anxiety, social isolation).
Zoom interviews with the camera on had an average of 5.1 inaudible words per interview, which was much higher than the other interview formats, although not statistically significant. This trend is likely due to the additional bandwidth often needed when the camera is on. The advantage, however, of having the camera on is that the interviewer can see when a participant's screen is frozen or their internet connection is delayed. For example,
I think it would fall under both…. Oh can you hear me? I think it froze for a second.
Yeah, I think it is a bit laggy (#14).
Sure, do you want to see if I can turn my mic a little?
Yes, please.
Okay. It is on the highest; So, I guess I will just have to talk a bit louder. (#14).
Another participant shared their experience with a Zoom interview:
“Sometimes if you’re going to do a Zoom interview for somebody that may be a barrier in itself. Right? Like, they may not sound confident because the internet might be lagging. Do you know what I mean? Not, like, I did an interview a couple of weeks ago actually. I felt somewhat comfortable, but it felt, it still felt very weird that I was talking.” (#1)
The interviewers observed that it was often easier to develop rapport with participants when their camera was on. However, sometimes the flow of the interview was interrupted if the internet connection was poor, if the participant did not have sufficient equipment (internet device or webcam), or if there were background noises, which the interviewer was unable to control. With the camera on, the interviewer could at least see that the participant's screen was frozen, whereas this was not the case for participants who had their camera off.
Zoom interviews (camera off)
Thirteen participants had a Zoom interview with their camera turned off, which involved five youth with a disability and eight without (see Table 2). There were slightly more females in the group without disabilities and a younger average age than the youth with disabilities. Six of 13 participants in this group (2 with a disability, 4 without) reported technical challenges and 6/13 also did not report any examples for one of the main themes (i.e., transitioning to online school and working from home during the pandemic) from our original study. This suggests the data quality and depth of the themes could be impacted when the camera is turned off, although differences in technical challenges by mode of interview were not significantly different.
The interviewers observed that it was often more challenging to develop rapport with participants who had their camera off because they could not see their facial expressions or other non-verbal cues. Additionally, it was sometimes difficult to tell whether participants were experiencing a technical lag in their internet connection or were simply not responding to the question. An advantage of having the camera off was that it could have helped participants to feel more comfortable by remaining relatively anonymous while sharing their experiences.
Interview duration
The interviews conducted by phone had a longer average duration (17% longer than Zoom with the video on and 25% longer than Zoom interviews with the video off), although not significantly different. Phone interviews had a similar average word count as the Zoom interviews with the camera on but had 12.4% more average words than Zoom interviews with the camera off (no significant difference).
In comparing the interview format by group of youth, we noticed that for youth with disabilities, phone interviews had the longest duration compared to Zoom interviews (see Table 3). Specifically, phone interviews among youth with disabilities were 17.2% longer in duration than Zoom interviews with the camera on and 29% longer than Zoom interviews with the camera off, although not significantly different. A similar pattern was noted for youth without disabilities, where the average interview duration was 6.3% longer for interviews with the video on (31.4 minutes) compared to video off (29.4 minutes). Comparing across groups, we see that both groups had a mean duration of 31.4 minutes for Zoom interviews with the camera on. Meanwhile, the average interview duration for youth with disabilities was 8.4% shorter for Zoom interviews (with the camera off) compared to youth without disabilities. There were no statistically significant differences in the interview mode by group of youth (with or without a disability).
Descriptive Analysis of Interviews by Participant Type.
Word count
Looking at the average word count for all participants, we note the similarity between phone interviews and Zoom (camera on; 4,614 vs. 4,653). The Zoom interviews (with the camera off) had a 13.1% lower word count. In comparing youth with and without disabilities, we see a similar word count for phone interviews (for youth with disabilities) and for Zoom interviews (camera on) in both groups. There was a 16.4% lower average word count for youth with disabilities compared to youth without a disability, although these differences were not statistically significant.
Interviewer dominance
In terms of the interviewer's share of the overall word count, phone interviews were 8.4% lower than Zoom (camera on) and 8.7% lower than Zoom (camera off), indicating that phone participants needed somewhat less prompting or fewer questions repeated. In comparing youth with and without disabilities, we see that phone interviews had a 17% lower interviewer word count compared to Zoom interviews for youth with disabilities; for this group, this held for Zoom interviews with the camera on or off. Meanwhile, youth without disabilities had an 18.3% lower interviewer word count for Zoom interviews with the camera on compared to Zoom interviews with the camera off. Differences in interviewer dominance were not statistically significant.
Themes and sub-themes
In comparing the mean number of themes and sub-themes across interview modes, we found that phone interviews had a 3.8% higher average number of themes (5.4 vs. 5.2) than both types of Zoom interviews (camera on/off). In comparing youth with and without disabilities, we notice that phone interviews yielded the highest average number of themes for youth with disabilities (5.4), followed by Zoom with the camera on (4.8) and slightly fewer with Zoom with the camera off (4.6). Youth without disabilities had 1.8% more average themes with the Zoom camera off compared to the camera on. Further, youth without disabilities had 12.8% more themes than youth with disabilities who interviewed with the Zoom camera on and 17.9% more with the camera off.
With regard to the average number of sub-themes for all participants, Zoom interviews with the camera on had 13.7% more than Zoom interviews with the camera off, and 5% more sub-themes than the phone interviews. This suggests that having the camera on and seeing participants' facial expressions might have helped them to feel comfortable and to develop rapport, which, in turn, could help participants provide more detailed responses and examples. A similar trend was noted in comparing youth with and without disabilities. Youth with disabilities had the highest average number of sub-themes in the Zoom interviews (camera on): 5% more than Zoom interviews with the camera off and 10% more than phone interviews. Youth without disabilities had a 10% lower average number of sub-themes for Zoom interviews with the camera off compared to camera on. Interestingly, although the pattern was the same for both groups, having the camera off seemed to have a more pronounced impact on the number of sub-themes for youth without disabilities; however, these differences were not significant.
Inaudible words
Zoom interviews with the camera on had a 37.3% higher average number of inaudible words than phone interviews and 51% more than Zoom interviews with the camera off. This pattern differed for youth with and without disabilities. Youth with disabilities had an average of 9.1 inaudible words in Zoom interviews with the camera on, followed by 3.2 for phone interviews and 2.6 for Zoom interviews with the camera off. Youth without disabilities had a slightly higher average number of inaudible words for Zoom interviews with the camera off compared to Zoom interviews with the camera on. Across groups, this indicates that youth with disabilities had 77% more average inaudible words for Zoom interviews with the camera on, while youth without disabilities had 14% more for Zoom interviews with the camera off, although these differences were not significant.
Discussion
This study compared the data quality of online Zoom interviews to phone interviews among a sample of youth with and without disabilities. Our findings showed that online interviews conducted by Zoom, especially when the video is on, have comparable quality to those conducted by phone. Zoom interviews with the camera on were slightly shorter than phone interviews but had a similar word count and number of themes, and more sub-themes than either phone interviews or Zoom interviews with the camera off. These trends suggest that having the camera on helps with developing rapport and seeing participants' non-verbal cues.
The findings contrast somewhat with Shapka et al.'s (2016) comparison of online versus in-person interviews with a sample of adolescents, which found that online interviews took more time but in-person interviews were longer in terms of word count. Their study found that interviewers engaged in different ways across the two formats, with in-person conversations having a higher percentage of interview questions asked and more probing questions (Shapka et al., 2016). They reported that online conversations had more rapport-building statements. There were no differences in the number of conversation exchanges or off-topic statements (Shapka et al., 2016). Other research shows that online and in-person modes of interviewing among adolescents are structurally different in terms of the number of words and the duration of the interviews; however, the level of self-disclosure and data quality were similar (Shapka et al., 2016).
Meanwhile, interviews conducted by Zoom where participants had their video turned off were shorter on average and had less depth in their sub-themes. This trend could result from the lack of face-to-face interaction and non-verbal cues, which seems somewhat more exacerbated in Zoom interviews (camera off) compared to phone because this mode had the additional issue of technical delays. It is recommended that, when offering participants choice in the mode of remote interviews, the interview be conducted either by phone or by Zoom with the camera on. Conducting a Zoom interview with the camera off appears to further reduce data quality.
Although Zoom interviews with the camera on had a higher average number of inaudible words compared to phone interviews or Zoom interviews with the video off, likely due to technical issues or internet delays, these did not appear to impact the quality of the data. This trend may have resulted from the interviewer being able to see that the participant's screen was frozen or delayed and then waiting for them to respond or repeating the question. Other recent research suggests that technical delays in online interviews should not necessarily be viewed as a negative; some researchers suggest they offer an opportunity to build rapport (Krouwel et al., 2019).
In comparing youth with and without disabilities across interview modes, our findings showed that youth with disabilities had better data quality (duration, word count, number of themes) with phone interviews and Zoom with the camera on. Meanwhile, Zoom interviews with the video off produced lower data quality (duration, word count, and number of themes), which seemed even more pronounced for youth with disabilities. This trend suggests that youth with disabilities may need more time to build rapport or that a different mode of interview may be more optimal. Further research is needed to explore this.
Our findings also suggested that interview mode may affect participants' comfort with sharing experiences of sensitive topics (stigma, discrimination, anxiety, and mental health). Zoom interviews with the camera on appeared to yield more depth in the themes and sub-themes compared to phone interviews or Zoom interviews with the camera off, although this trend differed somewhat for youth with and without disabilities. Youth with disabilities had more overall themes in the phone interviews, and fewer themes but more depth in the sub-themes in Zoom interviews (video on). These findings highlight the importance of allowing participants to choose their preferred mode, especially for sensitive topics. Our results are similar to other research showing that sensitive information was shared more candidly in online focus groups compared to in-person focus groups (Mann & Stewart, 2000). Other research has found that online platforms may create a safer space for participants to discuss personal experiences of sensitive issues in a more anonymous environment (Woodyatt et al., 2016).
Given the benefits associated with online qualitative interviews, combined with the fact that they are a method preferred by youth, they are a promising option for researchers to consider. Key lessons learned during this research involving online interviews include that researchers and participants should check their internet connection, find a room without distractions, speak slowly and clearly, and allow flexibility in repeating answers and questions while paying attention to facial expressions. It is recommended to clearly communicate the expectation to participants that, if they choose an online interview, it is optimal for them to have their video on; otherwise, a phone interview should be recommended as an alternative mode. It is important to allow participants choice in their interview format because it could help to increase the diversity and inclusion of participants, which is increasingly expected in research designs (Gill et al., 2018). It could also allow for more flexible scheduling while helping to provide a sense of safety and security amongst participants, which is particularly important for marginalized groups such as youth with disabilities.
Limitations and Future Research
There are several limitations of this research that should be considered. First, this research was conducted during the COVID-19 pandemic, when face-to-face interviews were not permitted; further research is needed to compare the data quality of Zoom and phone interviews with face-to-face interviews. Second, we did not account for the type of device (e.g., computer, tablet, or mobile device) that participants used, which could affect technical issues and data quality. Nor were we able to control for the surrounding environment in which the interviews were conducted, which is an important aspect to explore in future studies. Third, because we allowed participants to choose their mode of interview (i.e., they were not randomly assigned), there was an uneven number of phone versus Zoom interviews, warranting further research to confirm these findings. We felt that it was important to allow choice in the mode of interview so that participants could feel comfortable, potentially enhancing the equity, diversity, and inclusion of participants. Fourth, there was an unintentional over-representation of female participants; further work is needed to include more males and to examine whether any gender differences exist in data quality by interview mode. Additionally, this study was a secondary analysis of a qualitative study focusing on youth's school and work experiences during the pandemic, in which the aim was to obtain a purposive (small) sample, which may not be generalizable. Thus, caution should be used in interpreting the findings, which should be tested with larger and more varied samples (e.g., various ages, research topics, etc.). Fifth, the interviewers may have talked more with some participants if they needed a question repeated or needed additional prompting, which could have affected the total word count.
Finally, our study was limited in some aspects by our research ethics board because we were not permitted to record the video or use the chat function, features that could be important for further data analysis. Further research is needed with larger samples to better understand how interview mode influences data quality.
Conclusion
This study compared the data quality of online Zoom versus phone interviews. The findings showed that phone interviews resulted in a longer average duration than Zoom interviews but had a similar word count to Zoom interviews conducted with the camera on. Zoom interviews with the camera turned off had poorer data quality compared to other interview modes. This research also emphasizes the importance of giving participants choice in the format of their interview to allow for optimal sharing of experiences while enhancing the diversity and inclusion of the participants. Further research should consider exploring how online interviews compare to in-person interviews and how this varies for different types of participants.
Supplemental Material
sj-docx-1-sgo-10.1177_21582440221140098 – Supplemental material for A Comparative Analysis of Data Quality in Online Zoom Versus Phone Interviews
Supplemental material, sj-docx-1-sgo-10.1177_21582440221140098 for A Comparative Analysis of Data Quality in Online Zoom Versus Phone Interviews by Sally Lindsay in SAGE Open
Footnotes
Acknowledgements
We acknowledge the land, where this research was conducted on is the traditional territory of many nations including the Mississaugas of the Credit, the Anishnabeg, the Chippewa, the Haudenosaunee, and the Wendat peoples and is now home to many diverse First Nations, Inuit, and Métis peoples. We also acknowledge that Toronto is covered by Treaty 13 with the Mississaugas of the Credit. We also thank the participants involved in the study and the staff and trainees in the TRAIL lab who contributed to the larger project.
Declaration of Conflicting Interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
Ethical Approval
This study received ethical approval from a research ethics board at Holland Bloorview Kids Rehabilitation Hospital (20-0129).
References
