Introduction
Unpaid primary caregivers (caregivers) play a critical role within the healthcare system1,2 by: (1) voicing the needs and context of community-dwelling older adults living with multiple complex conditions who are unable to fully participate in care;3 (2) positively impacting adherence to medical care plans;4 and (3) decreasing healthcare utilization.5 This role should be regularly assessed, as caregiver experience is a crucial component of the equity-informed quadruple aim framework.6 Currently, there is no psychometrically sound, cost-free, short, clear-language, easy-to-complete experience survey specifically designed for caregivers of community-dwelling older adults needing outpatient specialized geriatric services (SGS). Therefore, a reliable and valid caregiver experience survey is needed.
The objective of this study was to co-design a psychometrically sound caregiver experience survey to optimize the degree to which caregivers experience meaningful engagement with SGS clinicians and are best supported in their caregiver role.
Methods
Using recognized survey research methods,7 this study was conducted in 3 phases, as outlined in Table 1. Note that the gap between Phase 2 and Phase 3 was due to the COVID-19 pandemic (2020-2023).
Table 1. Methods Used to Develop and Test the Caregiver Experience Survey.
Phase 1
Literature Review
A rapid review of the literature was conducted8 in 2018 to identify existing surveys, survey items, or dimensions of caregiver experience. Search criteria were determined, and a comprehensive search of MEDLINE, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, limited to English-language literature published between 2013 and 2018, was conducted. Literature was excluded if it exclusively focused on paid/formal caregivers, caregiver burden/stress, caregivers of an adult using a non-health-related service, or if survey items or dimensions were not provided.
Operationalization of Caregiver Experience
Literature from the rapid review was analyzed thematically to identify key domains of caregiver experience. A survey framework and discussion guide were developed to support a standardized participatory co-design approach,9 where caregiver advisors and geriatric experts collaboratively determined survey item wording.
Phase 2
Caregiver Cognitive Interviews and Refinement
The study was approved by the Queensway Carleton Hospital Research Ethics Board and the Health Sciences North Research Ethics Board in 2019. Written informed consent was obtained by researchers for semistructured cognitive interviews10 conducted at 2 SGS sites (Ottawa and Sudbury). Both sites are members of a provincial network (Provincial Geriatrics Leadership Ontario) that specializes in the provision of SGS; both offer outpatient services to individuals aged 65 and older who are encountering physical, cognitive, mental, and/or social health challenges. A convenience sample of caregivers included those who could read and speak English. Caregivers provided insights regarding elements of the paper version of the survey (eg, relevance and utility). A participatory co-design approach9 was utilized to develop an online version for Phase 3.
Phase 3
Pilot Testing
The Queensway Carleton Hospital Research Ethics Board approved implied consent for pilot testing the online version at their site (Ottawa) for 15 months in 2023 to 2024 by a convenience sample of caregivers who could read and write English. After an onsite discharge meeting, caregivers were invited to voluntarily complete a closed Microsoft Forms survey (7 pages, 1-5 required questions/page, “back button” review). Surveys were offered on an on-site tablet, or a direct survey link was sent via email. An information letter was provided that detailed the study purpose, investigator, time to complete, lack of incentives, use of Microsoft Forms, disabled IP tracking, anonymous responses stored in the United States for 10 years, and implied consent by clicking on the submit button. Assuming a 95% confidence interval width of ±0.15 around a moderate (0.4-0.6)11 Spearman rank correlation (rho), a sample size of 120 completed surveys was selected (if rho = 0.6, sample size 86; rho = 0.5, sample size 111; rho = 0.4, sample size 132).12
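The study relied on its cited source for these sample sizes. As a rough illustration of where such figures come from, a first-order Bonett-Wright-style calculation on the Fisher z scale reproduces the pattern (smaller assumed rho requires a larger sample), though it will not match the reported values exactly since the study's own method and rounding are not specified here; the function name is illustrative:

```python
import math

def spearman_ci_sample_size(rho, half_width=0.15):
    """Approximate n so that a 95% CI for Spearman's rho has roughly the
    requested half-width, via a first-order Bonett-Wright-style calculation
    on the Fisher z scale. Illustrative sketch only."""
    z = 1.96  # two-sided 95% normal quantile
    # Transfer the desired CI endpoints to the Fisher z scale
    width_z = math.atanh(rho + half_width) - math.atanh(rho - half_width)
    # (1 + rho^2/2) is the variance inflation specific to Spearman's rho
    n = (1 + rho**2 / 2) * (2 * z / width_z) ** 2 + 3
    return math.ceil(n)
```

For rho = 0.5 this first-pass approximation gives roughly 106, in the same neighborhood as the 111 reported; the key qualitative point is that planning for the weakest plausible correlation (rho = 0.4) drives the largest sample requirement.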
Analysis and Refinement
Frequency distributions and measures of central tendency were generated for categorical variables and continuous variables, respectively. Eleven survey items were scored from 1 (No, definitely not) to 5 (Yes, definitely), and for 2 items (see Table 2, #10, #11), respondents were also able to select “does not apply.” Cronbach's alpha was used to quantify the internal consistency of the 11-item Caregiver Experience Survey. Tool validity was then determined (construct convergent validity: Spearman rho correlation between an overall rating [“Overall, this experience has helped me in my role as a caregiver”; 0 = No, definitely not, 10 = Yes, definitely] and the summed core 11-item score; construct divergent validity: differences in item scores [Kruskal Wallis test] by age group, gender, and/or season). All quantitative analyses were done in Stata v.18.13
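Cronbach's alpha is computed directly from the item-score matrix as α = k/(k − 1) × (1 − Σ item variances / variance of summed scores), where k is the number of items. A minimal sketch (function name and scores are illustrative, not study data):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item,
    respondents in the same order across columns). Uses sample (n-1) variances."""
    k = len(items)
    n = len(items[0])
    # Summed score per respondent across all k items
    totals = [sum(col[i] for col in items) for i in range(n)]
    sum_item_var = sum(statistics.variance(col) for col in items)
    return k / (k - 1) * (1 - sum_item_var / statistics.variance(totals))

# Two perfectly parallel items -> alpha = 1.0
print(cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]))
```

Respondents who selected “does not apply” contribute no numeric score, which is why the study's alpha and summed-score analyses were restricted to those answering all 11 items.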
Inter-item and item-total Spearman correlations were examined to assess alignment with the proposed survey structure. Framework subdomains were used to code responses to 2 qualitative questions (“As a caregiver what did you find helpful?”; “As a caregiver what would have made your experience better?”). This was done to determine both the degree of alignment between quantitative ratings and qualitative comments and the extent to which provided comments could support quality improvement activities.
Refinement of the pilot version involved the review of study findings and collaborative decision making. The final version of the online survey was then determined.
Results
Phase 1
Literature Review
The rapid review identified 882 articles and gray literature sources. After duplicates were removed, 824 articles were screened by title and abstract. The review revealed that no reliable and valid caregiver experience survey existed for outpatient SGS clinics/programs, confirming the need to develop one.
Operationalization of Caregiver Experience
A total of 22 articles (domestic and international) were included in the information synthesis. A thematic analysis of these articles identified 3 caregiver experience domains and 11 subdomains.
Phase 2
Caregiver Cognitive Interviews
During the interviews, all 8 caregivers (Ottawa: 5; Sudbury: 3) indicated that the draft pilot survey items measured different key facets of their caregiver experience, supporting face and content validity of the survey.
Refinement
Feedback from caregivers led to additional specificity in the wording of 3 items (#2, #5, #6), the rewording of feedback items (#13, #14), and a demographic item (#16). Additionally, a “does not apply” response option was added for 2 items (#10, #11). These findings were utilized to create a survey with 11 items scored on a 5-point Likert scale, including 2 with a “does not apply” option, 1 overall item scored on an 11-point Likert scale, 2 qualitative questions, and 2 demographic questions.
Phase 3
Pilot Testing
Although 154 caregivers met the Phase 3 study inclusion criteria and were invited to complete the survey (via onsite tablet: 144; via email with direct survey link: 10), only 120 responded to all items (exclusive of qualitative and demographic items). This resulted in a completion rate of 78%.
Analysis
Twenty-four (20.0%) and 28 (23.3%) caregivers selected “Does not apply” for items #10 and #11, respectively, and 19 (15.8%) selected “Does not apply” for both items. Summed core 11-item scores for the 87 respondents who provided quantitative responses to all 11 items ranged from 18 to 55 (mean: 48.3 [SD: 8.4]; skewness: −1.88 [Shapiro-Wilk: P < .001]; median: 51 [interquartile range: 47, 55]). Twenty-two (25%) respondents had the maximum score of 55; no respondent had the minimum possible score of 11.
Cronbach's alpha was 0.94, and Pearson item-total correlations varied from 0.72 (#4) to 0.86 (#8, #9). Table 2 shows that more than 88% of respondents selected the highest score for 2 items (#3, #4). The Spearman rho correlation between the summed core 11-item score and the overall experience rating was 0.74 (P < .001), providing evidence of construct convergent validity. Despite relatively little dispersion, differences were detected by age group and season for 4 of the 11 core items (Kruskal-Wallis test, P < .06), providing some evidence of construct divergent validity.
Table 2. Pilot Survey Item-by-Item Analysis Findings.
Note: questions introduced with “Please read the statements and click on the number that best describes your experience as a caregiver with this clinic/program.”
a Item scored on a scale from 1 (No, definitely not) to 5 (Yes, definitely).
b Items scored on a scale from 1 (No, definitely not) to 5 (Yes, definitely) but also allowed to select “does not apply.”
c Significance: * P < .06; ^ P < .05, Kruskal-Wallis test.
d Difference by age (A): under 50, 50-64, 65-74, 75-84, 85-94 or by season (S): Dec 6, 2022-Mar 31, 2023, Apr-Jun 2023, Jul-Sep 2023, Oct-Dec 2023, Jan-Feb 2024.
e Item-total correlation = Pearson correlation between the item score and the summed framework-based items excluding that item.
f All Pearson correlations, P < .001.
g Spearman correlation between item and overall rating.
h All Spearman correlations, P < .001.
One hundred eighty-one qualitative comments were provided by 114 (95%) respondents. Fifty-nine (33%) comments provided enough detail to support quality improvement initiatives, and 158 comments (87%) aligned with quantitative item ratings.
Refinement
Qualitative data analysis revealed 23 misalignments between quantitative response ratings and qualitative comments (ie, scored 5 “Yes, definitely”, yet provided negative comments). Of those, 16 misalignments were for 2 items (#10, #11) with the “does not apply” response option. Once this response option was removed, alignment improved to 94%. Therefore, “does not apply” was removed from the final version of the survey.
Discussion
As there was no existing tool, a psychometrically sound caregiver experience survey was co-designed by caregivers and geriatric experts to measure clinician-caregiver engagement. Based on a literature-informed framework that identified 3 caregiver experience domains and 11 subdomains, the proposed survey demonstrated high internal consistency (Cronbach's alpha: 0.94), indicating the measurement of a common construct. It is unlikely that this high score was due to item redundancy, as each survey item aligns with a unique literature-based subdomain key to the global construct “caregiver experience.” Further, cognitive interviews with caregivers indicated that all survey items were meaningful, relevant, and needed. Face and content validity were established through caregiver cognitive interviews and the grounding of the survey in existing literature. Construct convergent and divergent validity were established in Phase 3.
Limitations
As pilot testing was conducted at only 1 (albeit typical) SGS site, future testing of survey generalizability in broader populations and geographies is warranted. Additional demographic data (eg, language, ethnicity, education) should be collected to enable disaggregation and analysis across diverse groups. The survey should also be translated into multiple languages using professional translators, cognitive testing, and back translation.
Additionally, self-reported data are subject to social desirability bias, possibly limiting respondents' use of all response options and decreasing Cronbach's alpha estimates. As well, estimates of overall satisfaction with SGS services may be inflated.
While a “does not apply” option allows survey respondents to skip questions they feel are not applicable to their situation, this survey was co-designed to focus on common issues that are applicable to all caregivers of those receiving outpatient SGS services, not a particular subset.
It is suggested that simple imputation not be used if item responses are missing, as additional examination of group, age, and sex differences is needed. Further, in lieu of summed 11-item scores, it is suggested that individual item scores, along with the qualitative responses, be used to identify potential quality improvement activities.
Conclusions
A psychometrically sound, evidence-informed, online caregiver experience survey was developed. Final wording of the core items can be found in Table 2, and the online survey is included as a Supplementary Document. The survey findings will provide clinicians with information that will inform quality improvement initiatives to support clinician-caregiver engagement, which may then increase patient adherence to care plans and decrease healthcare utilization by community-dwelling older adults who are unable to fully participate in care.
Supplemental Material
sj-docx-1-jpx-10.1177_23743735251385309 - Supplemental material for Clinician-Caregiver Engagement in Older Adult Care. Development of a Validated Caregiver Experience Survey to Inform the Optimization of the Caregiver Role by Ronaye T Gilsenan, MA, Rhonda E Schwartz, MA, and Iris A Gutmanis, PhD in Journal of Patient Experience
Acknowledgments
The following individuals collaboratively developed the survey framework and wording of survey items with the authors: Anne-Marie Yaraskavitch—Caregiver Advisor, Ontario; Phyllis Hymmen—Caregiver Advisor, Ontario; Samantha Jibb—North-East Specialized Geriatric Centre; Andrea Gatien—North-East Specialized Geriatric Centre; Lisa Brancaccio—Specialized Geriatric Services, South-East; Kelly Milne—Regional Geriatric Program of Eastern Ontario; David P. Ryan—Regional Geriatric Program of Toronto; Jane McKinnon-Wilson—Specialized Geriatric Services, Waterloo-Wellington; Kelly Kay—Provincial Geriatrics Leadership Ontario. The following individuals recruited caregivers for Phase 3 Pilot Testing: Kelly Potts—Queensway Carleton Hospital Geriatric Day Hospital; Louise MacDonald—Queensway Carleton Hospital Geriatric Day Hospital.
Author Contributions
The following authors approved the final version of the article for publication and agreed to be accountable for all aspects of the work: Ronaye T Gilsenan, MA—Project administration, conceptualization, methodology, supervision of data acquisition, analysis of qualitative data, interpretation of data, and manuscript writing (original draft preparation, review, editing); Rhonda E Schwartz, MA—Conceptualization, methodology, supervision of data acquisition, interpretation of data, and manuscript writing (original draft preparation, review, editing); Iris A Gutmanis, PhD—Analysis of quantitative data, interpretation of data, and manuscript writing (original draft preparation, review, editing).
Consent to Participate
Phase 2 cognitive interviews: Written informed consent was gathered for caregivers who participated in the cognitive interviews.
Consent for Publication
Not applicable. All study data are grouped. Survey data are anonymous.
Data Availability Statement
Not applicable. The authors understand that data sharing is not applicable to this journal.
Declaration of Conflicting Interests
The authors declare no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Ethical Approval
Ethical approval of this study was obtained from the Queensway Carleton Hospital Research Ethics Board located in Ottawa, Ontario, Canada (Approval no. 19-02) on November 5, 2019, and the Health Sciences North Research Ethics Board located in Sudbury, Ontario, Canada (Approval no. 19-037) on October 15, 2019.
Phase 3 pilot testing: The REB waived the requirement for informed consent for the anonymous online survey. Instead, implied consent was gathered using online survey instructions (clicking on the submit button implies consent to participate). In addition, an information letter was provided to participants (eg, study purpose, investigator, time to complete, lack of incentives, use of Microsoft Forms, disabled IP tracking, anonymous responses stored in the USA for 10 years, etc).
Funding
The authors received no financial support for the research, authorship, and/or publication of this article.
Writing Assistance and Third-Party Submissions
None.
References
