Abstract
Member checking is a technique which aims to increase the trustworthiness or rigour of qualitative research by asking participants to comment on study findings. However, traditional methods of member checking (e.g., transcript reviews) face scrutiny for being ineffective or tokenistic ways of eliciting participant feedback. Emerging member checking approaches seek to evoke feedback in more meaningful ways. While these alternatives have merit, persistent challenges include eliciting critical feedback, time constraints, supporting an ongoing dialogue with participants, and setting future research directions. To address these challenges, we introduce a novel alternative to member checking, “Participatory Member Checking” (PMC). PMC draws from the principles of Patient Engagement (a participatory approach) and promotes the co-creation of qualitative research findings between participants and researchers across five steps: (1) Elicit Feedback, (2) Summarize Feedback, (3) Check for Understanding, (4) Implement Feedback, and (5) Demonstrate Accountability. PMC encourages critical feedback, is practical and efficient, promotes ongoing dialogue through both written and verbal feedback, and involves participants in setting future research directions. The present article presents PMC in the context of a qualitative study exploring patient partners’ experiences of being engaged in research projects supported by the Canadian Institutes of Health Research. We describe PMC in sufficient detail to facilitate uptake by other researchers, show how PMC meaningfully impacted our research findings, and demonstrate the acceptability of PMC among a group of 11 participants (Median age = 62, range = 25–82, 81.8% women). Considerations for adopting PMC in future research are discussed.
Keywords
Introduction
Member checking is a well-established technique within qualitative research whereby researchers invite participants to confirm, build upon, or amend (a) raw qualitative data, (b) the researchers’ analysis or interpretation of the data (e.g., code or theme development), and/or (c) the research findings (Creswell, 2008; Miles et al., 2014). This practice has also been referred to as “participant feedback,” “participant validation,” and “respondent validation” (Motulsky, 2021). During a member check, participants can re-examine their dialogue with the researcher (e.g., Stoner et al., 2005) or use their firsthand knowledge of the research context to identify instances where a researcher’s interpretations deviate from participants’ lived experiences (Hammersley & Atkinson, 1983). Through member checking, researchers aim to enhance the credibility (i.e., the results reflect participants’ lived experiences), dependability (i.e., the results are consistent with the data), and the overall trustworthiness of their research (Lincoln & Guba, 1985). Examples of member checking used in past research include returning transcripts to participants for feedback (e.g., Carlson, 2010), focus groups (e.g., Krueger & Casey, 2000), editing transcripts alongside participants (e.g., Doyle, 2007), interviews drawing from the analysis of a single participant’s data (e.g., Harvey, 2015), and eliciting participant feedback on a synthesized report of the study results (e.g., Koelsch, 2013).
Criticisms of Member Checking
Though member checking is proposed to benefit research by enhancing qualitative rigour, this method has faced scrutiny. Thomas (2017) reviewed 44 published articles that used member checking and concluded this practice did not substantially impact research quality in terms of theory generalization, participant representation, power-sharing, or social change. At the level of individual publications, member checking often lacks description or justification; researchers may not state how member checking was undertaken, for what purpose, or how it aligns with their ontology and epistemology (Birt et al., 2016). Member checking can also be tokenistic, as power dynamics may cause participants to avoid questioning or criticizing a researcher’s findings (Buchbinder, 2011). If participants do question the researcher or disagree with the study findings, researchers may be unwilling or unable to accommodate divergent viewpoints (Birt et al., 2016). Further, established methods of member checking often lack instructions for integrating the outcomes of a member check into a study’s findings (Creswell, 2008). As a result, member checking has been criticized as a process lacking accountability (Yin, 2014). From an ethical perspective, member checking can trigger emotional distress among participants, particularly within sensitive research areas (Motulsky, 2021). Member checking can also place an undue burden on participants, who may be asked to spend several hours reviewing transcripts without fully understanding how their input will be used (Motulsky, 2021). In these situations, member checking may do more harm than good. Approaches are needed that empower participants to engage with their data meaningfully as co-researchers while avoiding the pitfalls of member checking described above. 
Within health research, patient engagement is one approach that shifts the dynamic between participants and researchers by inviting people with lived experience (e.g., participants) to become co-researchers and influence study findings.
Patient Engagement in Research
As the knowledge user group with direct experience of health conditions and accessing health services, patients possess a knowledge base that is unique and complementary to clinical and scientific knowledge (Canadian Institutes of Health Research [CIHR], 2019). Patient engagement (also commonly known as patient and public involvement, consumer and community involvement, or stakeholder engagement) is a participatory research approach that capitalizes on this knowledge through the formation of meaningful and active collaborations between patients and care partners (i.e., patient partners) and researchers across the research cycle, wherein study decision-making is guided by patient partners’ experiences, values, and expertise (Harrington et al., 2020). The exact nature of these collaborations has been conceptualized as varying according to whether there is a two-way flow of information and who holds the decision-making power between the groups (Manafò et al., 2018). Unsurprisingly, there are many ways to engage patients in research (Chudyk et al., 2022). However, all engagement initiatives should follow a core set of underlying principles, such as inclusiveness, support, mutual respect, and co-building (CIHR, 2019), to ensure meaningful and active engagement in the research.
The principles of patient engagement are designed to guard against tokenistic or harmful ways of engaging people with lived experience as co-researchers (CIHR, 2019). Therefore, by drawing from these principles, researchers should be better positioned to conduct member checking meaningfully and achieve the desired outcomes of enhancing qualitative rigour and ensuring participants can see themselves in the research findings. Member checking can be a transformative process aligned with the goals of patient engagement when it “provides power to shape the research agenda to reflect [patients’] priority issues (Stringer, 2003) and/or to influence how they are represented in research reports (Torrance, 2012) through shared decision-making” (Brear, 2019, p. 945). Below, we describe four alternative approaches to member checking that meaningfully engage participants and align with the principles of inclusiveness, support, mutual respect, and co-building (CIHR, 2019). We then address potential shortcomings of these approaches and present our alternative method, which takes a patient engagement approach to member checking.
Member Checking Alternatives
To improve upon previously established member checking methods, Birt et al. (2016) developed an approach called “Synthesized Member Checking”. This method involves first screening participants to ensure they will not be distressed by member checking, distributing a written results summary for feedback, charting this feedback, and integrating results. Synthesized Member Checking prioritizes harm prevention, efficiency, and practicality (i.e., participants are not overburdened), and supports co-building by outlining how participants’ feedback should be integrated (Birt et al., 2016). Since the publication of Synthesized Member Checking (Birt et al., 2016), more researchers have developed member checking alternatives. For example, “Enhanced Member Checking” (Chase, 2017) involves researchers collaborating with participants to co-create their narratives across two review meetings. Within this method, member checking is used to center participants’ voices within the research and promote social justice and empowerment (Chase, 2017). Brear (2019) developed “Dialogic Member Checking,” a four-stage, iterative approach in which participants think independently about the data, share their thoughts, critique findings, and negotiate the presentation of results. This approach involves group discussions, sorting activities, researcher presentations, and agree/disagree activities. Dialogic Member Checking prompts the researcher to consider their power and position in relation to the participants and data (Brear, 2019). Similarly, Motulsky (2021) proposes “Reflexive Participant Collaboration” as an alternative to member checking. Reflexive Participant Collaboration does not capture one specific approach, but rather, is a way of reconceptualizing the member checking process that encourages researchers to follow the principles of reflexivity and participant empowerment (Motulsky, 2021). 
Finally, in their approach termed “Collaborative Reflection,” Urry and colleagues (2024) invited participants to comment on how published research findings could shape future research or clinical practice. Despite the differences between these member checking alternatives, they all contain common themes of researcher reflexivity and participant involvement or empowerment.
Challenges of Member Checking Alternatives
Motulsky (2021) suggests that researchers should consider the purpose of member checking, the potential for participant burden, ethical ramifications, the method of choice, and how to incorporate participants’ feedback. While each of the member checking alternatives presented above has strengths, these approaches also shed light on emerging challenges related to these areas. Chase (2017) noted that power dynamics between researchers and participants are not automatically resolved through member checking, and participants may still hesitate to question the “expert” researcher. Indeed, Birt et al. (2016), Brear (2019), Chase (2017), and Motulsky (2021) all described participants’ tendency to agree with researchers’ interpretations rather than critiquing their findings. New approaches to member checking should continue developing safe avenues for participants to question or criticize research findings. Another potential shortcoming is the time-consuming nature of member checking alternatives for both researchers and participants. While Chase (2017) succeeded in creating an emancipatory member check process, scheduling two revision sessions with each participant may not be feasible for larger studies. Similarly, Brear (2019) noted that the time required for participants to complete Dialogic Member Checking was a limiting factor to engagement. Therefore, efficiency and practicality remain important considerations when developing member checking alternatives that can be widely adopted.
Lack of follow-up or missed opportunities for participant input also present challenges for some member checking alternatives. For example, participants working with Birt et al. (2016) provided feedback on the study results but were not shown how their feedback was used. While Synthesized Member Checking aims to minimize participant burden (Birt et al., 2016), an additional touchpoint with participants could have affirmed the value of their contributions. In addition, the member checking approach developed by Urry et al. (2024) did not allow participants to influence study findings. Rather, participants commented on already-published data as a priority-setting exercise for future research (Urry et al., 2024). While Collaborative Reflection can be a launch point towards co-created research (Urry et al., 2024), participants may wish to provide input on study findings before their data is published. Therefore, emerging member checking approaches should allow participants to co-create the findings of studies they contributed to, clearly demonstrate how their feedback was used, and encourage them to suggest future research directions.
The Present Study
Considering the growing research on member checking alternatives and the challenges in this area, we presently introduce a new member checking alternative called “Participatory Member Checking” (PMC) that is applied to the data analysis stage of qualitative research. PMC was developed following the principles of patient engagement and builds upon other member checking alternatives (e.g., Birt et al., 2016; Urry et al., 2024) by creating a safe space for participants to provide critical feedback or question the researchers’ interpretations, attempting to minimize participant and researcher burden by prioritizing efficiency and practicality, embedding multiple touchpoints with participants so they can see how their feedback was used, and providing opportunities for participants to both co-create research findings and suggest future research directions.
We developed PMC as part of a larger qualitative study that explored patient partners’ experiences of being engaged in research projects supported by the Canadian Institutes of Health Research Strategy for Patient-Oriented Research (CIHR SPOR; Chudyk et al., 2023). The present article aims to (1) describe our method of PMC in enough detail to facilitate adoption by other researchers, (2) demonstrate how PMC can meaningfully impact study results, and (3) evaluate the acceptability of PMC among research participants who were involved in this process.
Methods
Research Context
Ethical approval for this research was obtained from the ethics board at the University of Manitoba (Protocol #E2019:082(HS23180)). Our reporting follows the COREQ checklist (Tong et al., 2007) wherever appropriate (Supplemental File 1). We acknowledge that some COREQ items do not apply to our article because our primary aim is to describe our method of member checking rather than report on qualitative research findings. We direct readers to a recently published article that describes the data collection methods and the initial findings of the larger qualitative study in which we developed PMC (Chudyk et al., 2023).
The larger qualitative study and PMC were both co-led by a patient partner and a patient engagement researcher (AMC) and supported by a larger team consisting of another patient partner, two senior scientists, a patient engagement specialist, and an implementation scientist. This research was conducted within a constructivist paradigm whereby members of the research team endorsed a relativist ontology and a subjectivist epistemology. That is, we believed that each participant and researcher who engaged in the qualitative study and PMC experienced different realities due to their personal histories and social locations. Therefore, we viewed knowledge as something that was co-created between people who all bring their unique perspectives to a research project (Crotty, 1998; Lincoln et al., 2011). These underlying beliefs justify the participatory nature of PMC, whereby member checking was used as a method of knowledge co-creation.
Data Collection
Data for the larger qualitative study was collected through semi-structured individual interviews that were hosted via online videoconferencing and co-led by one researcher (AMC) and one patient partner. Each interview was conducted in English and lasted between 60 and 90 minutes (Markula & Silk, 2011). The use of semi-structured interviews ensured that each participant answered similar questions about their experiences as a patient partner. This approach also grants research teams the flexibility to deviate from the interview guide and explore new topics in depth (Markula & Silk, 2011). More detailed information about data collection can be found in a recently published article (Chudyk et al., 2023). The semi-structured interviews were audio-recorded, transcribed verbatim, and subjected to a codebook thematic analysis (Braun & Clarke, 2022) which was conducted in collaboration with all members of the research team (Chudyk et al., 2023). Consistent with codebook thematic analysis, a more structured approach than reflexive thematic analysis, the themes were organized as topic summaries under each of the 18 questions asked in the interview guide (Braun & Clarke, 2022). The interviews conducted in this study aimed to address several research questions, with the goal of producing multiple qualitative articles examining different aspects of the data (Altay & Koçak, 2021). Consequently, a larger number of broad questions were posed to participants, allowing for the generation of more themes than typically observed in an analysis centred on a single research question (Braun & Clarke, 2022).
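As a purely illustrative aside, the codebook-style organization described above, in which themes are arranged as topic summaries under each interview question, can be sketched in a few lines of code. This is not the authors' workflow (the analysis was conducted collaboratively, supported by standard qualitative tools); the question numbers, codes, and excerpts below are invented examples.

```python
from collections import defaultdict

def group_by_question(coded_excerpts):
    """Group coded excerpts into topic summaries keyed by interview question.

    coded_excerpts: iterable of (question_id, code, excerpt) tuples.
    Returns {question_id: {code: [excerpts]}}.
    """
    summaries = defaultdict(lambda: defaultdict(list))
    for question_id, code, excerpt in coded_excerpts:
        summaries[question_id][code].append(excerpt)
    # Convert nested defaultdicts to plain dicts for safer downstream use
    return {q: dict(codes) for q, codes in summaries.items()}

# Invented example data (hypothetical question numbers, codes, and excerpts)
excerpts = [
    (3, "compensation", "An honorarium showed my time was valued."),
    (3, "compensation", "Reimbursement for parking mattered to me."),
    (7, "training", "I wish there had been an orientation session."),
]
summaries = group_by_question(excerpts)
```

The resulting structure mirrors a codebook analysis: each interview question holds its own set of codes, each backed by the excerpts that support it.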
Participants
Participant Demographic Information (N = 11).
Participatory Member Checking (PMC)
Five Steps of Participatory Member Checking
Step 1: Elicit Feedback
Qualitative Interview Questions.
Member Checking Questions
Step 2: Summarize Feedback
In Step 2 of PMC, participants’ feedback on each candidate theme and subtheme was compiled into a single copy of the fillable PDF originally used to gather the feedback. This feedback was categorized under the four prompts listed in Box 2. Other organizational methods that could be used to complete PMC Step 2 include digital spreadsheets or qualitative data management software like NVivo. After compiling the feedback, author AMC led the process of adjusting the analysis by creating new data “codes” (Braun & Clarke, 2022) and editing the themes and subthemes based on participants’ responses to the feedback document. This process was supported by all co-authors who contributed to the original qualitative study through collaborative discussions with AMC. Changes to the analysis were tracked using a table with two columns: Column 1 = “Participant Input” and Column 2 = “Change to Analysis.” More details about how participants’ individual feedback impacted the analysis are provided in the results section below. As author AMC incorporated participants’ comments, they identified themes that lacked feedback or had significant disagreement, which would need to be prioritized for group discussion in Step 3 of PMC.
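The two-column change-tracking table described above can also be mimicked in code for teams who prefer a scripted workflow. The sketch below is a minimal illustration under our own assumptions (the study itself used a fillable PDF and collaborative discussion); the theme names, participant IDs, and decision texts are hypothetical.

```python
def build_change_log(feedback, decisions):
    """Pair each piece of participant input with the change it prompted.

    feedback: list of (theme, participant_id, comment) tuples.
    decisions: dict mapping (theme, participant_id) to a description of the
        resulting change (e.g., "New code created", "Subtheme reworded").
    Items with no recorded decision are flagged for the Step 3 workshop.
    """
    log = []
    for theme, pid, comment in feedback:
        change = decisions.get(
            (theme, pid), "Flagged for discussion in Step 3 workshop")
        log.append({
            "Participant Input": f"[{theme}] Participant {pid}: {comment}",
            "Change to Analysis": change,
        })
    return log

# Hypothetical example rows
feedback = [
    ("Education and training", 4,
     "Researchers also need training, not just patient partners."),
    ("Compensation", 7, "Please define the types of compensation."),
]
decisions = {
    ("Education and training", 4):
        "Analysis expanded to cover researcher training",
}
log = build_change_log(feedback, decisions)
```

Themes without a recorded decision surface automatically as workshop items, which echoes how unresolved or contested feedback was carried into Step 3.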
Step 3: Check for Understanding
In Step 3 of PMC, all participants involved in member checking were invited to attend a three-hour member checking workshop one week after they had returned their written feedback to the research team. Two workshops were conducted on different days and times to accommodate all participants’ schedules. All workshop options were hosted online via videoconferencing. Before the workshop, participants were sent a document containing the agenda and instructions for using the virtual meeting platform. Each workshop was conducted in English and was facilitated by two researchers, one patient partner, and one student who assisted with note-taking and technical support (i.e., monitoring the online chat and meeting waiting room, helping participants use the online meeting functions). Due to a scheduling issue, one participant completed the workshop individually with the lead researcher (AMC). The workshops were audio-recorded to ensure all relevant statements from the group would be captured. While our approach to conducting member checking workshops was unique to PMC, similar methods have been used to facilitate participatory qualitative data analysis workshops (e.g., Blair et al., 2022).
The member checking workshops opened with a land acknowledgement; introductions; reminders about privacy, confidentiality, and respect; ground rules for virtual interactions; and an icebreaker activity. The icebreaker activity (Clotheslines and Kite Strings; Bernstein, n.d.) was chosen to foster open-mindedness by demonstrating how people may come to conclusions for different reasons. The icebreaker also reminded participants that the research findings were to reflect all of their collective experiences. The introductory activities lasted approximately 35 minutes and were required to help establish a safe space for open discussions (Chlup & Collins, 2010).
Questions for Debriefing After Member Checking Workshop
Step 4: Implement Feedback
In Step 4 of PMC, the feedback provided by participants in both Step 1 and Step 3 was used to revise the preliminary analysis into a final set of themes and subthemes that represented the perspectives of both the research team and participants. The themes that were prioritized as being most important by the participants (see PMC Step 2, Aim 2) were then used to inform the preparation of an academic journal article (Chudyk et al., 2023). Other themes of secondary importance were set aside for publication in future research articles.
Step 5: Demonstrate Accountability
In Step 5 of PMC, an individualized report was prepared for each participant who engaged in member checking. An example of this report can be found in Supplemental File 3. Each report outlined exactly how the feedback a participant provided in Step 1 of PMC led to changes in the analysis and synthesis of study findings. For example, if a suggestion made by a participant resulted in the creation of a new data code or revision of a subtheme, this was stated in the individualized report. General questions participants asked about the data were also addressed in this report. Finally, the report provided a summary of the collective feedback we received during the member checking workshops in PMC Step 3. By outlining exactly how participants’ feedback was used and why, Step 5 of PMC addressed the accountability gap that can pose an issue in other member checking methods as described by Birt et al. (2016).
Evaluation of Participatory Member Checking
Two methods of evaluation were embedded within our PMC approach to assess acceptability. First, participants provided written feedback on Step 1 of PMC using a short evaluation survey located at the end of the feedback document. Specifically, participants typed their responses to the following open-ended questions in a text box within the PDF: “(1) What, if anything, did you like about the design/interactivity of this document? (2) What would you improve, and (3) Having reviewed the document, what questions do you still have that you feel it could have helped answer?” Responses to these questions were returned to the research team at the end of PMC Step 1 when participants submitted the fillable PDF with their thoughts on the candidate themes/subthemes. The second evaluation occurred at the end of the member checking workshop. Participants were provided with a link to an anonymous online survey containing seven open-ended questions assessing: (1) their overall impressions of the workshop, (2) their ability to express themselves in both the workshop and the feedback document, (3) their perception that their perspective was heard, (4) whether the workshop met their expectations, (5) what worked well, (6) what did not work well, and (7) their suggestions for future similar workshops. Participants were asked to elaborate on their responses to each of these questions.
Results
Impact of Participatory Member Checking on the Research Findings
Summary of Changes to Analysis from PMC Step 1.
Examples of Changes to Analysis from PMC Step 1.
While we sought to integrate all written feedback into the analysis, we faced challenges in the context of two topics that elicited diverse opinions from participants: education and training, and the compensation of patient partners. Regarding education and training, there was notable disagreement about how researchers and patient partners should learn to engage. Some participants felt that prior engagement-specific education was essential for both groups, while others argued that hands-on experience should be paired with education. Regarding fair compensation for engagement, participants also voiced different views and expectations. Given the diversity of opinions, these two topics were prioritized for discussion in PMC Step 3 (Check for Understanding) to help the research team accurately reflect participants’ views and experiences in the analysis. All eleven participants completed PMC Step 3 and attended a group member checking workshop (n = 10) or a one-on-one workshop (n = 1). The conclusions drawn about these two topics are described below.
The research team’s initial interpretations of the data concerning education and training focused on training patient partners. This involved setting expectations, communicating clearly, providing introductions and orientations, using accessible language, and offering opportunities for peer mentorship and skill development. However, this interpretation was limited, as it solely addressed patient partner training, and did not consider the training needed for researchers to effectively engage with patients. While statements related to the need for researcher training were available in the dataset, they were not sufficiently represented in the analysis. Through workshop discussions, participants expanded upon their vision for representing the education and training of both researchers and patient partners in the analysis.
Participants described how patient partners and researchers should both learn together (to foster relationship-building) and learn separately (to create safe environments for sharing within their respective knowledge groups). They agreed that this learning process should be supported through structured training opportunities and practical engagement experiences. Participants felt researchers should be transparent about their level of engagement experience and collaborate with patient partners to define the nature of their engagement relationship. Moreover, when providing training for patient partners, researchers should establish clear and realistic expectations for training outcomes and ensure that patient partners can access the necessary baseline knowledge to meaningfully engage.
In terms of compensation, the research team initially interpreted the data by focusing on the various types of compensation received by patient partners (e.g., honoraria, hourly compensation, reimbursement for expenses) and emphasized the importance of compensation from the patient partner perspective. However, participants felt that a more nuanced discussion of compensation was necessary. They believed it was important to clearly define different types of compensation and outline situations in which disbursing each type of compensation was appropriate. The research team had also overlooked the discussion of compensation supplements/alternatives (e.g., authorship opportunities or training). In their written feedback, some participants proposed that compensation should be standardized across studies, while others argued that it should be evaluated on a “case-by-case” basis.
Furthermore, while participants agreed with the researchers’ interpretation that compensation was valued by most patient partners, they felt the researchers’ analysis did not address the value messages conveyed through compensation. Appropriately compensating patient partners sends a powerful message about the level of value researchers place upon their contributions, and this distinction was incorporated into the analysis during the workshop discussions. Other additions to the analysis included the need for researchers to be transparent about compensation, to plan adequately for it (such as incorporating it into grant budgets), and to engage in open dialogue to define appropriate compensation with patient partners.
The second aim of the workshop in PMC Step 3 was to identify key messages in the data that participants felt should be prioritized for dissemination in future manuscript(s). Briefly, these messages included the importance of recognizing that patient partners are a heterogeneous group of individuals who have various motivations for engaging in research. As such, academic researchers should provide a range of flexible engagement approaches that patient partners can choose from. Furthermore, patient partners bring more to a study than just their condition or experience with the health system; they have a wealth of life experiences and skills that can benefit a research project. Participants also emphasized that patient engagement is not a cookie-cutter process; it can happen at different research stages, take different forms, and be used for different purposes. Given this diversity, researchers must clarify roles and expectations for engagement at the outset of a project. Overall, participants felt that developing mutual respect and trusting relationships with patient partners was essential to creating quality engagement experiences and avoiding tokenism within patient engagement. Some of these qualitative themes were recently shared in a published manuscript (Chudyk et al., 2023).
Evaluation of PMC
Ten participants completed the evaluation questions at the end of the feedback document in PMC Step 1. The task of providing feedback through the fillable PDF was viewed favourably by 8 of the 10 participants. The benefits of the feedback exercise included the use of multiple prompts to help participants think critically about the themes, the clear layout and lay language used in the fillable PDF, the ease of use, and the ability for participants to read supporting excerpts alongside each theme/subtheme. These benefits were exemplified by Participant 9, “I liked being able to read about what other patient partners had to say, I especially liked reading their quotes,” and Participant 8, “I appreciated that you gave us different options to fill in. There isn’t really anything I would improve, and I don’t have any lingering questions.”
Participants also identified some criticisms and opportunities to improve PMC Step 1. Three participants commented that the length of the document and the number of feedback prompts resulted in fatigue or frustration. These participants felt the activity would have been more accessible if they could provide verbal responses or give feedback in multiple shorter stages, for example, “I found the document a bit long to read. I couldn’t focus well towards the end. I think sending it in two different chunks would have helped me provide better feedback/review” (Participant 2). They also recommended that the research team avoid repetition of thoughts or ideas in the feedback document and remind participants about the length of time required to complete the activity. While these suggestions should be addressed, they are also tempered by the more positive feedback about the activity length provided by other participants: “I hope I have met your objectives through my answers. It did require significant time and effort to complete this document, but the time of reflection was important. It is a necessity to have open-ended questions in order to achieve the clearest understanding of patient engagement so a strong foundation of enacting patient partnership in research can be laid” (Participant 12).
Summary of Evaluation Survey Responses.
Discussion
PMC is a new approach to member checking which draws from the principles and goals of patient engagement to involve participants in co-creating qualitative research findings through their input on qualitative data analysis. The present article described the five steps of PMC in detail to promote adoption by other researchers, identified how PMC meaningfully impacted the results of a qualitative study, and demonstrated the general acceptability of PMC among the participants engaged in this process. We provide a figure summarizing the steps of PMC in Supplemental File 4.
PMC builds upon existing member checking alternatives (e.g., Birt et al., 2016) by creating a safe space for participants to provide critical feedback. The use of written feedback in PMC Step 1 was intended to promote safety by allowing participants to ask questions or provide criticism indirectly (i.e., through the fillable PDF) rather than through face-to-face communication, which could be intimidating to some (Buchbinder, 2011). The feedback prompts also welcomed criticism/questioning by asking participants to identify ideas that were missing. Participants’ responses to the evaluation survey demonstrated they could freely express their points of view and felt their views were heard by the research team.
While we aimed to create a safe space for sharing criticisms, we recognize that power dynamics between researchers and participants may not have been fully addressed through this method. Participants may have felt hesitant to express their criticisms in writing to the research team. Future researchers using PMC should consider additional measures to foster a safe environment, such as appointing a trusted community partner to facilitate the member checking process or adding a preamble to the document that conveys the team’s openness to hearing all feedback, including criticisms. Furthermore, some participants noted that the length of the feedback document was a barrier, potentially hindering their ability to articulate criticisms, as doing so would require additional time to complete the document.
Revised Member Checking Questions for PMC Step 1
Of note, PMC occurs at the level of data analysis to inform theme and code development. Participants may indicate that a researcher has misinterpreted a theme if it does not accurately reflect their personal lived experiences. At this stage, it can be beneficial to remind participants that themes represent a synthesis of many participants’ statements and that others may have had different experiences. We reaffirmed this point on Page 2 of the written feedback document (Supplemental File 2). Participants’ feedback should be collaboratively discussed in PMC Step 3, where adjustments to themes and codes can be considered in light of individual criticisms while ensuring integrity to the collective experiences of the group.
The use of written feedback in PMC Step 1 was intended to prioritize efficiency and practicality, particularly for researchers who wish to conduct member checking with larger groups of participants. This objective was partially met; N = 11 participants provided between 4 and 46 comments on the candidate themes/subthemes within one week. However, some (n = 3) participants commented that the time required to provide written feedback in PMC Step 1 was a barrier. While we had originally anticipated that completing the feedback document would take participants upwards of an hour, we did not ask participants to record how long the activity took them and so cannot report these data. Concerns around the time required to provide written feedback might stem from the numerous themes presented to participants in the current study, which aimed to inform multiple research questions and manuscripts. Future researchers employing PMC should aim to gather participants’ feedback on a more focused set of themes, subthemes, and codes. Furthermore, researchers should pilot-test the written feedback document to provide participants with an estimated time needed for completion. The actual time required for participants to complete the written feedback document should also be reported.
We maintain that obtaining written feedback from participants is a practical strategy for conducting meaningful member checking, particularly with larger groups (e.g., those exceeding 10 participants), as the time required to conduct individual member checking interviews with each participant may not be feasible. Nevertheless, this process still demands significant time. This includes the time researchers need to create the feedback document, compile participants’ written responses, conduct member checking workshops, synthesize feedback to inform the analysis, and share follow-up reports with participants. Additionally, the time required for participants to provide their written feedback can be substantial. Researchers must be mindful of the time and resources involved in member checking and choose the most suitable method for their study. For instance, studies with smaller samples may opt for conversation-based member checking methods, which can occur within a timeframe similar to (or shorter than) that of PMC, especially if these conversational methods align better with the study population (e.g., Chase, 2017). Additionally, the participatory nature of PMC means that this approach takes more time than other member checking methods. For example, asking participants to approve their transcripts or a final version of the analysis is much faster than PMC but may only gather limited feedback. Therefore, when choosing a member checking approach, we encourage researchers to consider the available time and resources, the needs of their study population, the size of their participant sample, and the amount of feedback they wish to obtain.
PMC follows the recommendations of Birt et al. (2016) by embedding two touchpoints where participants can see how their feedback was implemented by the research team. The workshop in PMC Step 3, which was informed by all participants’ responses to the written feedback document (PMC Step 1), represented the first touchpoint. This use of written feedback allowed participants to see their ideas represented in the group discussions and ensured the discussion guide was tailored towards topics that would benefit most from group dialogue. PMC Step 5, “Demonstrating Accountability,” represented the second touchpoint. By sharing personalized documents outlining how participants’ individual feedback (PMC Step 1) and group feedback (PMC Step 3) impacted the analysis, they could fully understand how their efforts made a difference to the research outcomes. The success of these touchpoints was demonstrated through the evaluation questionnaire in which nine out of 10 respondents indicated they felt their perspectives were heard by the research team. Our evaluation of PMC also fulfills the suggestion of Birt et al. (2016), which called for more research examining how participants experience member checking to help identify the best approaches for eliciting participant feedback.
PMC combines the strengths of two existing member checking alternatives into one cohesive approach. Specifically, the member checking method described by Birt et al. (2016) encourages participants to co-create research findings but does not include opportunities for shaping future research directions. Conversely, the member checking method proposed by Urry et al. (2024) engages participants in setting future research directions but does not involve the co-creation of research findings. Drawing from both of these approaches, PMC encourages participants to shape their own research findings through the written feedback document and the member checking workshop. PMC then invites participants to set future research directions by identifying key messages for publication in resulting manuscript(s) and encouraging them to remain involved in the project moving forward. Through these steps, PMC presents a streamlined method of engaging participants in the co-creation of research findings and fostering future research partnership opportunities.
This article provides a detailed description of PMC to promote its adoption by other researchers. However, we do not intend for PMC to be an overly rigid approach. Researchers may choose to tailor PMC to best suit their context and participants. For example, although we engaged 11 participants in PMC, adaptations could be made to involve more or fewer participants. When conducting member checking with many participants, researchers should consider how they will facilitate workshops in PMC Step 3 (e.g., holding multiple workshops or only involving a subset of participants in a workshop) and how they will provide feedback to participants in PMC Step 5 (e.g., providing group summary feedback as opposed to individualized feedback). Furthermore, researchers may modify the questions asked of participants in the written feedback document and member checking workshop to generate information tailored to the needs of their studies. While PMC was developed in the context of a study examining patient engagement, this approach can be applied to co-create qualitative findings in other fields and is not limited to patient engagement research. Finally, we applied PMC within a constructivist paradigm because it aligns with our belief that reality is experienced subjectively and, therefore, knowledge is co-created between researchers and participants (Crotty, 1998; Lincoln et al., 2011). PMC can be applied within other paradigms, and we encourage researchers to reflect on (and explicitly state) how their philosophical stance influences what they hope to gain from member checking (Motulsky, 2021).
While we hope other researchers may consider adopting PMC, we do not suggest that it must be universally used across all qualitative research contexts. Our approach relies heavily on written feedback and was developed within a study involving a group of highly educated participants. While eliciting written feedback from participants is not unique to PMC (see Birt et al., 2016; Brear, 2019), it may be less suitable for participants who are not comfortable with extensive written communication (e.g., individuals from lower educational backgrounds). We encourage researchers to critically assess the needs of their study population if they decide to pursue member checking and consider the most effective methods for gathering feedback from those groups. Conversation-based approaches to member checking may be more fitting for certain populations. For example, Chase (2017) applied an approach called “Enhanced Member Checking,” which drew more heavily from verbal feedback over written feedback.
Additionally, we recognize that within some qualitative paradigms or methodologies, the researcher’s interpretations of data may take precedence, and co-creation may not be an important goal. We do not suggest that member checking must be conducted within all qualitative studies. Prior to initiating member checking, researchers are encouraged to reflect on what value this process might add to their work and whether it could enhance the rigour or trustworthiness of their findings. In some cases, these reflections might lead researchers to decide against implementing member checking. However, if researchers choose to use member checking, participants should be given the necessary tools and opportunities to contribute meaningfully within the scope of the study. For this reason, we hope that other researchers may adopt or modify PMC to meet the specific needs of their studies if it is appropriate to do so.
Conclusion
Member checking, when done well, can increase the trustworthiness of qualitative research (Lincoln & Guba, 1985) while empowering participants to shape their own research findings (e.g., Chase, 2017). We present PMC as a novel approach to member checking which draws from the principles of patient engagement to co-create qualitative findings alongside participants. PMC addresses gaps in other member checking alternatives, can meaningfully impact research findings, and is acceptable to participants. We recommend that future researchers use this article as a guide for conducting PMC and continue to share how this method can be adapted to suit other research contexts as well as how it impacts both participants and research findings.
Supplemental Material
Supplemental Material - Participatory Member Checking: A Novel Approach for Engaging Participants in Co-Creating Qualitative Findings
Supplemental Material for Participatory Member Checking: A Novel Approach for Engaging Participants in Co-Creating Qualitative Findings by Sasha M. Kullman, and Anna M. Chudyk in International Journal of Qualitative Methods
Acknowledgements
The authors acknowledge the instrumental contributions of the late Mr. Roger Stoddard, who co-led data collection and conceptualized participatory member checking alongside the senior author AMC. We also sincerely thank the participants who contributed their valuable perspectives to our qualitative study and engaged so thoughtfully in member checking. Lastly, this study is part of a larger project whose co-investigators included Dr. Annette Schultz, Dr. Todd Duhamel, Serena Hickes, Dr. Nicola McCleary, and Carolyn Shimmin.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported through funding from the University Collaborative Research Program (University of Manitoba, grant number 50664) and the Canadian Institutes of Health Research through author AMC’s Patient-Oriented Research Awards-Transition to Leadership Stream award (grant number 170670). We would like to express our gratitude for all funding support. The datasets used and/or analyzed during the current study are available from the corresponding author upon reasonable request.
