Abstract
Over the past five years, accelerated by the COVID-19 pandemic, qualitative research has evolved, with a shift to online settings and rapid analysis techniques, enhancing access and streamlining research processes. When recruiting health policy professionals and digital public health stakeholders with time constraints for qualitative studies, we encountered a challenge that remains underexplored in the literature: how to conduct interviews of under 30 min while ensuring high data quality and depth and maintaining research rigour. Drawing on over 100 short interviews undertaken across four studies in health policy and digital public health, we provide experience-based guidance on addressing this challenge. We outline key methodological adaptations required to (1) secure participation from time-constrained professionals and (2) obtain meaningful insights despite limited time. These range from strategies adopted prior to the interview - such as the sampling approach, preparation of materials, and interviewer training - to in-interview adaptations, including time management techniques and choices regarding recording and note-taking, all aimed at optimising the use of limited interview time. Our experience contributes to the advancement of qualitative research methods by reporting that, through extensive pilot testing and a carefully designed interview guide, 30-min short interviews can generate valuable data while accommodating professionals’ schedules and maintaining research rigour. This approach expands the reach of qualitative research, enabling data collection from individuals otherwise inaccessible due to time constraints.
Introduction - Interviewees With Time Constraints
Qualitative research explores the subjective experiences and understandings of individuals or specific groups and is widely used across disciplines, often complementing quantitative research as part of a mixed-methods approach (Plano Clark, 2017; Pope & Mays, 2020). Interviews are among the most common qualitative data collection methods, involving a set of questions designed to gather data that address the main research objective (Gill et al., 2008; Hinton & Ryan, 2020). Research interviews can be structured (using closed questions), semi-structured (following an interview guide while allowing scope for topic exploration), or in-depth (employing open-ended questions that provide interviewees with the space to elaborate on their thoughts and experiences). Interviews may be conducted one-to-one or in small groups, known as focus groups, to generate information on collective views (Gill et al., 2008; Hinton & Ryan, 2020).
The COVID-19 pandemic catalysed a shift towards remote data collection, accelerating the digitalisation of qualitative interview practices with a transition from in-person interviews to virtual formats. Traditionally, qualitative methodologists have regarded face-to-face contact as the gold standard for qualitative interviewing, raising concerns that remote meetings may hinder rapport-building between the interviewer and interviewee, thereby limiting the depth of interaction and the quality of data (Gillham, 2005; Irvine, 2011; Leeuw, 1992; Lofland & Lofland, 1994; McCoyd & Kerson, 2006). While online interviews initially emerged as an adaptive response to pandemic restrictions, they were soon recognised as an opportunity for long-term methodological advancement (Keen et al., 2022). Alongside the shift towards digital interview methods, the pandemic, characterised by urgent challenges, demanded that academia generate high-quality evidence in record time. As a result, the literature on rapid qualitative research has expanded significantly. Research teams worldwide have adopted rapid appraisal techniques and assessment methods to capture ‘snapshots’ of complex situations, facilitating the swift evaluation of innovations and interventions through rapid feedback and rapid evaluation cycles (Deom et al., 2023; Smith et al., 2023; Vindrola-Padros, 2021; Vindrola-Padros et al., 2020).
While digital interviews and rapid appraisals have improved access to interviewees and streamlined qualitative analysis processes, the duration of interviews for qualitative studies can extend beyond one hour (DiCicco-Bloom & Crabtree, 2006). A review of 227 qualitative studies reported an average interview duration of 64 min (Young et al., 2018). Securing this level of availability from interviewees with demanding schedules can be particularly challenging. In our research, our targeted interviewees - health policy professionals and digital public health stakeholders across multiple European countries - cited time constraints as a barrier to participating in the proposed 45- to 60-min interviews. We reframed this challenge as an opportunity: by limiting interview duration to 25-30 min, we successfully conducted over 100 interviews across four studies while maintaining methodological rigour and depth. These studies explored health policymakers’ perceptions of public trust in health data sharing in the European Union, Italy, France, and Switzerland (Zavattaro et al., 2025b); examined the views of digital public health stakeholders on the evolution of the sociopolitical discourse on health data sharing in Switzerland (Zavattaro et al., 2025a); and engaged health policymakers from the European region to refine a self-assessment tool aimed at integrating trust-building principles into policy work (Zavattaro et al., 2025). Additionally, we consulted international stakeholders on digital trust (Gille et al., 2024). Across the four studies, we conducted one-to-one online interviews with health policymakers and digital health stakeholders, recruited through purposive sampling and snowballing, with recruitment concluding once data saturation was reached (Green et al., 2025; Pope & Mays, 2020). All interviews were conducted with written informed consent, audio-recorded, and analysed using an inductive thematic approach.
Two researchers independently coded the transcripts and subsequently discussed the coding to reach consensus. An overview of the thematic coverage achieved across the studies is presented in Appendix 1. With appropriate methodological adaptations, short interviews proved to be an effective method for overcoming two key challenges we encountered in our studies: (1) ensuring the participation of individuals with limited availability of time; and (2) obtaining meaningful and in-depth insights despite time constraints.
When designing our studies, we conducted a preparatory review of the existing literature on best practices for conducting short interviews lasting 25-30 min. While publications are available on qualitative interview techniques, including those tailored to specific target groups such as elite informants, and on rapid data analysis methodologies, specific guidance on conducting short interviews of up to 30 min remains lacking (McGrath et al., 2019; Solarino & Aguinis, 2021; Vindrola-Padros, 2021). Drawing on our experience, we aim to address this gap by offering insights into the methodological adaptations necessary to support future qualitative researchers seeking to conduct short interviews. We do not suggest that interviews originally designed to last 90 min and cover multiple topics can simply be condensed into 30 min. On the contrary, such an approach would compromise research quality, which is not the objective of this paper. Instead, we advocate for a purpose-built design philosophy in which the brevity of the interview is a deliberate methodological choice, fully aligned with narrowly defined research aims and highly focused questioning. This approach not only supports feasibility and participation, but also ensures that the core principles of research rigour are preserved within a condensed format. We offer experience-based guidance on the methodological adaptations we would have found invaluable prior to commencing data collection.
Methodological Adaptations for Conducting Short 30-Min Interviews
Adaptations to Streamline the Interview Process and Fit the 30-min Format
1) Prior to the Interview
a) Sampling Strategy
It is essential not only to keep the interview brief but also to streamline the recruitment process by minimising unnecessary emails, maintaining conciseness, and avoiding excessive communication, which may discourage participation. To arrange the interview date, rather than proposing fixed time slots or using scheduling tools - which proved ineffective in our studies - we advise asking interviewees to suggest 2-3 preferred time slots based on their availability, starting from a proposed date. Additionally, specify in the initial email whether data will be anonymised, as concerns about confidentiality are common and can be a reason for non-participation. Addressing this early helps prevent follow-up queries and declined invitations. We advise sending the consent form 1-2 days before the interview. Sending it too far in advance may result in interviewees delaying or forgetting to sign it and requiring additional follow-ups, while sending it shortly beforehand sustains momentum and ensures the document is fresh in the participant’s mind.
b) Preparation of the Interview Material
Consent Form
To streamline the consent process and improve interviewee compliance, we used an online consent form. While .pdf digital signatures can be efficient for those familiar with them, some interviewees encountered difficulties, requiring them to print, sign, scan, and return the document, which proved cumbersome. In several instances, this approach introduced unnecessary friction and led to delays in scheduling interviews. The consent form should be concise, focusing on the information most relevant to interviewees and required by ethics bodies and the research institutions (University of Zurich, 2025). Given the limited time of interviewees, an overly lengthy consent document may lead to delays, missed signatures, or dropouts. In our experience, reducing legal jargon and presenting essential information in a question-and-answer format - for example, ‘How is my privacy protected?’, ‘Which personal data will you collect?’, ‘What is my benefit in participating in the project?’ - proved helpful in maintaining engagement and ensuring a smooth onboarding process.
Interview Material
For online video interviews, two common approaches are using Microsoft PowerPoint (or similar software) to display key visuals and interview questions, or relying on personal notes as a guide while maintaining a conversational flow. Screen sharing enhances clarity by allowing interviewees to read questions directly, reducing misunderstandings - especially when participants are not communicating in their first language - and keeping discussions on track. Presenting one question per slide helps maintain focus and minimises distractions from subsequent questions. This visual anchoring is particularly effective in maintaining a shared reference point, especially in remote interactions where attention can waver more easily. To maximise interview time, questions must be carefully defined. Broad questions may lead to lengthy, unfocused responses. Instead, ensure that interview questions are open-ended, maintaining the exploratory nature of qualitative research while remaining highly focused on the research topic. This balance between openness and specificity is key to eliciting meaningful responses within limited time. The question format can be effectively calibrated during pilot testing with pilot interviewees who share a similar background to the study population. Limiting the number of questions is crucial: based on our experience, five is the maximum manageable in a 30-min interview, with four being optimal to maintain a steady pace and avoid rushed responses.
c) Interviewer Preparation
Once the interview material is finalised, pilot test the interview. Pilot testing is recommended for all interviews as it familiarises the interviewer with the material, ensures a smooth and structured flow, and builds confidence in using essential technology such as screen sharing and audio recording (McGrath et al., 2019). This step is particularly crucial for short interviews, as it helps maintain the 30-min timeframe and ensures clear time allocation between the introduction and main discussion. It also allows for the fine-tuning of question phrasing, transitions, and tone, which are particularly important when time is limited and each question must yield meaningful data. Multiple rounds of pilot testing are crucial when preparing for short interviews, as the interviewer must refine their skills and ability to create a welcoming environment during the brief introduction, allowing the interviewee to feel comfortable expressing their thoughts. We recommend limiting the introduction to a maximum of 5-7 min. This brief opening should include a clear explanation of the purpose of the study, the interview structure, and ethical assurances. It is also advisable to prepare a set of phrases to politely intervene if the interviewee goes off-topic, preventing unnecessary loss of time. In short interviews, any delays caused by technical difficulties (e.g., audio issues, connectivity problems) can consume a significant portion of the available time, potentially compromising the effectiveness of the interview. Therefore, it is essential to conduct pre-interview technical checks and have contingency plans in place, such as backup devices, to mitigate potential disruptions.
2) During the Interview
d) Time Management
It is crucial to monitor the time carefully to ensure all questions are covered within the allocated duration. We advise allocating approximate time slots for each question or thematic block and remaining attentive to any deviations. Periodic time checks help maintain the overall pace, ensuring that all questions are addressed without rushing or omitting any of them.
e) Recording Options and Note-Taking
The decision to audio-record an interview largely depends on the study design, planned analytical methods, and available resources. Researchers can choose to take notes instead of recording, record audio only, or record audio and video. Given the time constraints of a 30-min interview, we recommend focusing exclusively on the conversation rather than taking notes. Note-taking can be distracting and may lead to the loss of valuable insights and time; in fast-paced interviews, even brief moments of diverted attention can result in missed nuances or incomplete data. Audio recording allows full engagement in the discussion, ensuring all responses are captured accurately. For our studies, video recording was not necessary, as body language was not a factor in the analysis.
Discussion
For years, qualitative research has been regarded as a time-consuming method requiring in-person interaction to obtain in-depth insights (Gillham, 2005; Vindrola-Padros & Johnson, 2020). Over the past five years, however, the field has undergone substantial transformation, accelerated by the COVID-19 pandemic, which prompted a shift to online settings and the development of rapid analysis techniques (Keen et al., 2022; Smith et al., 2023; Vindrola-Padros, 2021). Moreover, the growing use of Natural Language Processing for sentiment analysis and semantic coding, alongside the recent public release of artificial intelligence programs such as ChatGPT, has brought automated data analysis to the forefront of methodological advancements in qualitative research (Morgan, 2023). While these tools remain in early stages of integration, their potential to support thematic analysis and summarisation continues to attract attention from both experienced and early-career researchers. In parallel, the widespread adoption of digital platforms has lowered logistical barriers to participation, facilitating access to diverse and geographically dispersed populations. These changes reflect broader shifts in research practice, where flexibility, speed, and digital accessibility are increasingly prioritised without compromising rigour.
Challenges in Short Interviews and Strategies to Address Them
Limited Depth of Responses
One concern we had when considering 30-min interviews was that their shorter duration could compromise data quality by limiting in-depth discussion. To maintain high data quality, it is fundamental to: (1) Pilot test the interview extensively to ensure the introduction remains brief and the overall flow of the interview is smooth. (2) Carefully select the number of planned questions. Based on our experience, short 30-min interviews are most effective when research questions are well-defined and limited to a maximum of five. Concise, clear, and well-structured questions ensure that 30 min are sufficient to obtain meaningful insights from interviewees, maintaining a steady pace and avoiding rushed responses. (3) Prepare follow-up questions that narrow the discussion, focusing particularly on ‘how’ and ‘why’ questions, as an effective way to encourage deeper responses within a short timeframe. (4) Employ clarification probes to encourage additional detail and verify interpretations, thereby enhancing data richness (Robinson, 2023).
It is also important to note that our interviewees were highly qualified professionals responding to policy-related and technical questions, which required pragmatic answers. The situation may differ when interviewing different study populations, such as members of the public or patients discussing highly sensitive topics or personal experiences, as they may require longer sessions and additional rapport-building to achieve comparable depth.
Reduced Rapport and Interviewee Engagement
Thirty-minute interviews give the interviewer less time at the beginning of the interview to build rapport with the interviewee. This was not a limitation in our studies, which focused mainly on technical aspects rather than sensitive topics, and a few warm-up questions at the start were sufficient to create a relaxed atmosphere and encourage natural responses. However, this may pose a limitation in studies requiring empathy, particularly when exploring sensitive or deeply personal subjects, such as illness narratives, trauma, or family-related issues. In such contexts, participants may need more time to feel safe, understood, and respected before they are willing to share openly. It is therefore essential, in such studies, that interviewers allocate adequate time at the beginning of the session to establish trust and adjust their tone, language, and body language accordingly (Westland et al., 2024).
Risk of Missing Key Themes
Short interviews may miss key themes that could emerge in longer discussions. Given that all interviews have a set time limit, this risk must be acknowledged. Limiting the number of questions and pilot-testing the interview to ensure that interviewees have sufficient time to elaborate on their responses reduces this risk. Moreover, inviting interviewees to take additional time should they wish to reflect further or discuss particular issues in greater depth, and offering the option to follow up with written reflections after the interview, is a valuable way to mitigate this risk. Furthermore, achieving data saturation - where no new insights emerge from additional interviews - remains a key criterion in short interviews to uphold the quality of the study (Fusch & Ness, 2015; Glaser & Strauss, 2017; Saunders et al., 2018). Therefore, it is advisable to increase the number of interviewees if necessary until data saturation is achieved.
Increased Bias Risks
While biases such as interpretation bias, moderator bias, and selection bias apply to both short and longer interviews, confirmation bias may be more pronounced in short interviews, as interviewers might steer the conversation towards expected answers to save time (McSweeney, 2021). This can occur consciously or unconsciously, particularly when the interviewer feels pressure to cover all planned questions within the allotted timeframe. Adopting a semi-structured approach with standardised, pre-defined interview questions helps mitigate this bias while ensuring flexibility and comprehensive coverage of key topics. This format maintains consistency across interviews while allowing room for clarification and follow-up where necessary.
Short, 30-min interviews proved to be a viable method for collecting meaningful data from health policy professionals and digital public health stakeholders with time constraints. However, in some instances, interviewees reported that they could not dedicate 30 min to the interview. As a last-resort method for gathering valuable data from those with severe time limitations and ensuring their inclusion in the study, we offered interviewees the option to respond to open-ended interview questions in writing. To maximise participation, we recommend including a concise project introduction in the email with the interview questions, clarifying the study’s purpose and the value of their contributions. Simplifying the response process is also crucial - requesting direct email replies proved more effective than sending a Microsoft Word document, which was often too demanding for busy interviewees. While this method limits follow-up opportunities, it increases participation and allows researchers to gather insights from otherwise inaccessible individuals, thereby strengthening the overall robustness and representativeness of the dataset.
Beyond the duration of the interviews, the priority of qualitative researchers is to ensure rigour in their studies to demonstrate validity and reliability through verification techniques (Morse et al., 2002). These include methodological coherence - ensuring alignment between the research question and methodological components; sampling sufficiency - selecting interviewees who best represent or possess knowledge of the research topic, evidenced by saturation (Fusch & Ness, 2015); fostering a dynamic relationship between sampling, data collection, and analysis through iterative interaction; applying theoretical reasoning; and supporting theory development (Morse et al., 2002). Each of these elements contributes to the internal consistency and analytical transparency of the research process, regardless of interview length. Maintaining such standards is essential when employing short-format interviews, as brevity may otherwise be perceived as a compromise.
This paper proposed the short interview method as a viable approach within qualitative research to enable researchers to interview and collect data from individuals who would otherwise be inaccessible, while maintaining methodological rigour. Future research should assess the broader applicability of the proposed 30-min interview method in contexts beyond the highly qualified professionals interviewed across the four studies underpinning this paper, such as with patients or in community-based settings. This is particularly relevant in sensitive contexts where time plays a crucial role in establishing trust between interviewer and interviewee and in enabling the collection of high-quality data. A meta-analysis on interview length and its impact on study reliability and validity suggests that shorter interviews can be more resource-efficient without compromising data quality (Thorsteinson, 2018). This evidence, however, originates from occupational and organisational psychology; there therefore remains a clear need for future research comparing short and longer interview formats in different domains, including health policymaking - for example, by examining differences in data richness, thematic saturation, or participant experience. Such work would help empirically validate the 30-min interview method as a robust alternative to longer formats, making qualitative research suitable for time-constrained participants.
Conclusions
With appropriate methodological adaptations, short interviews of up to 30 min are an effective method for obtaining meaningful and in-depth insights from professionals with demanding schedules. Drawing on over 100 online 30-min interviews with health policy professionals and digital public health stakeholders conducted across four studies, this paper offers experience-based guidance for planning and conducting short interviews effectively. Extensive pilot testing and the use of a well-structured set of up to five interview questions form the foundation for ensuring high-quality data and research rigour. Alongside the adoption of online interviews and rapid analysis, the 30-min short interview format marks a significant advancement in qualitative research - traditionally considered a time-intensive method reliant on in-person interactions for in-depth insights - by expanding access to time-limited participants. As demands for faster, more agile research methods increase - particularly in health policy and digital public health contexts - short interviews present a scalable and rigorous solution. We encourage further reflection and empirical study on their use across diverse populations and research domains to refine and formalise this emerging methodological approach.
Supplemental Material
Supplemental Material - The 30-Minute Interview Methods Guide: Lessons From Over 100 Interviews in the Health Policy Domain, by Federica Zavattaro and Felix Gille, in International Journal of Qualitative Methods.
Footnotes
Acknowledgements
We would like to thank all interviewees for their valuable contributions, which enabled us to reflect on the interview process and write the present manuscript.
Ethical Considerations
All studies on which this manuscript is based complied with the ethical principles outlined in the Declaration of Helsinki. As none of the studies fell within the scope of the Swiss Human Research Act, they did not require formal ethical approval from cantonal ethics committees.
Consent to Participate
All participants in the original studies provided written informed consent to take part in the research.
Consent for Publication
All participants gave written consent for the publication of anonymised data derived from the studies.
Author Contributions
FZ and FG jointly conceptualised the manuscript. FZ prepared the initial draft, and FG reviewed and edited the text.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: FZ and FG receive funding from the Digital Society Initiative at the University of Zurich. FG is concurrently employed at the Federal Chancellery of Switzerland; however, the views expressed in this article are solely those of the author and do not represent the position of the Federal Chancellery.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Data Availability Statement
Data from each individual study are available via the respective publications in which the studies were originally reported.
Supplemental Material
Supplemental Material for this article is available online.
References
