Abstract
At the outset of the COVID-19 pandemic, Alabama’s Title V Children and Youth with Special Health Care Needs (CYSHCN) team was forced to innovate in order to gather community input and to prioritize the findings of the 2020 Title V Maternal and Child Health Five-Year Comprehensive Needs Assessment. On a shortened timeline, the team pivoted from a full-day, in-person meeting of professionals and family representatives to an asynchronous, online “meeting” that included all planned and necessary content, allowed for comment by community members, and resulted in a prioritized list of needs. This needs assessment process showed that by using a platform like Qualtrics, an online survey tool, in an innovative way, programs can capture broader, more diverse perspectives without sacrificing quality of communication, content, or feedback. It shows the possibility of strengthening maternal and child health (MCH) systems and other systems of care through rich engagement. This model can be easily replicated in other survey tools, benefiting other states that face difficulties convening geographically dispersed professionals and communities.
Community engagement efforts often struggle to gain insight from populations and others directly impacted by programs and policies due to geography and time constraints. The COVID-19 pandemic forced innovation in how practitioners work with communities and organizations.
This case study highlights how creative application of technology and a reframing of what is considered “community engagement” can lead to broader input from individuals and organizations without losing depth and detail in feedback.
This work shows that robust community engagement can be achieved through non-traditional means: innovative uses of technology and thoughtful deployment of resources, with attention to cost, personnel time, and accessibility.
Background and Assessment of Need
All states and territories in the U.S. are required to complete a comprehensive needs assessment for their Title V Maternal and Child Health Block Grant every 5 years in order to set the priorities and areas of focus for maternal and child health services and programs in these jurisdictions. The Alabama process was approaching its final phases during March 2020 as organizations throughout the state began transitioning to remote working, reduced on-site capacity, and restrictions on travel and gatherings due to COVID-19. For every 5-year needs assessment cycle, Children’s Rehabilitation Service (CRS), the Block Grant recipient for Children and Youth with Special Health Care Needs (CYSHCN) allocations in Alabama, hosts a full-day, in-person meeting of professionals and family representatives to review the findings of the process and prioritize the CYSHCN-focused needs for the following 5 years. Five days before the 2020 meeting, CRS was compelled to cancel the event and develop a new engagement strategy. While other states and programs opted for live webinars or online meetings, the team decided that, in a time of uncertainty and change, and when partnering with those living with chronic and special health care needs, a live meeting would be less inclusive and accessible for the population of focus. In less than 1 month, CRS staff and members of the Applied Evaluation and Assessment Collaborative (AEAC) team at the University of Alabama at Birmingham School of Public Health designed and built an asynchronous, online “meeting” that included all planned and necessary content, allowed for comment by priority communities, and resulted in a prioritized list of needs.
Description of the Strategy or Innovation
With a short timeline and limited resources for a novel platform, AEAC staff used Qualtrics, traditionally an online survey tool, to build a module that led participants through the content and process of the asynchronous meeting. Built-in features including survey logic, flexibility around question types, embedded media, accessibility checks, and automatic email distributions allowed the AEAC to develop a flexible tool and respond to any support needs from participants in real time.
Participants were first presented with a series of short videos: a welcome from the staff, overviews of the data that had been collected and synthesized, and a walkthrough of the prioritization process that participants were tasked with completing. Each video was accompanied by downloadable links to relevant slides and websites and, most critically, space for participants to reflect, share feedback, and ask questions.
Next, participants were presented, one-by-one, with 15 need statements based on the collected data. They had the opportunity to share feedback on each need and then rate it according to 3 criteria: importance, alignment with other priorities, and whether effective interventions exist. Finally, participants were asked to choose their overall top 3 needs.
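The rate-and-rank step described above can be illustrated with a simple tally. This is a minimal sketch under stated assumptions: the need statements, rating values, and tiebreaking rule below are hypothetical placeholders for illustration, not the actual instrument or analysis used by the team.

```python
from collections import Counter

# Hypothetical responses: each participant rates every need statement on
# 3 criteria (importance, alignment, effective interventions; 1-5 scale)
# and separately names an overall top-3 list.
ratings = {
    "Access to specialty care": [(5, 4, 3), (4, 4, 5)],
    "Care coordination":        [(3, 5, 4), (5, 5, 4)],
    "Transition to adult care": [(4, 3, 3), (3, 4, 2)],
}
top3_votes = [
    ["Access to specialty care", "Care coordination", "Transition to adult care"],
    ["Care coordination", "Access to specialty care", "Transition to adult care"],
]

# Mean score per need across all criteria and participants.
mean_scores = {
    need: sum(sum(r) for r in rs) / (3 * len(rs))
    for need, rs in ratings.items()
}

# How often each need appears in a participant's top 3.
vote_counts = Counter(need for picks in top3_votes for need in picks)

# Rank needs by top-3 votes, breaking ties with the mean rating.
ranked = sorted(ratings, key=lambda n: (vote_counts[n], mean_scores[n]), reverse=True)
print(ranked)
```

With these toy inputs every need receives two top-3 votes, so the mean rating breaks the tie; a real analysis would choose its own weighting between the two signals.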
Intended Impact/Outcomes
The goal for the transition to the virtual meeting was to maintain the quality of responses in light of an uncertain and unfamiliar process. It was assumed that there would be fewer participants and more limited feedback, especially with regard to the qualitative data. At completion, however, the virtual meeting included feedback from 3 more participants than had registered for the in-person event (47 vs 44) and from a broader geographic area. Responses to the materials and needs were complete, including rich qualitative data, and all data were automatically captured. Reflections from the team noted that the qualitative comments were more actionable and focused than in previous in-person meetings, with the added benefit that individuals had control over how their comments were captured rather than relying on notetakers.
Evaluation Approach
As each participant submitted their data, an automatic trigger in Qualtrics sent them a brief satisfaction survey. The response rate was 55% (26 of 47 participants). Of these respondents, 96.2% “strongly agreed” or “somewhat agreed” both that CRS values their opinion and that they understood the purpose of the process. Furthermore, 92.3% of respondents “strongly agreed” or “somewhat agreed” that they had the information they needed to complete the process.
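The reported figures are consistent with whole-respondent counts out of 26. The arithmetic below is a quick check in which the agreement counts (25 and 24) are inferred from the rounded percentages, for illustration only:

```python
# Satisfaction survey arithmetic: 26 of 47 participants responded.
respondents, participants = 26, 47
response_rate = round(100 * respondents / participants)  # ~55%

# Agreement counts implied by the rounded percentages (inferred, not reported).
agree_value_and_purpose = 25  # "CRS values my opinion" / "understood the purpose"
agree_had_info = 24           # "had the information needed"

pct_value_and_purpose = round(100 * agree_value_and_purpose / respondents, 1)
pct_had_info = round(100 * agree_had_info / respondents, 1)
print(response_rate, pct_value_and_purpose, pct_had_info)
```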
Challenges and Successes
The process was not seamless. Notably, some participants experienced technological challenges while completing the virtual meeting. While the number of individuals affected was small (fewer than 5), an AEAC point of contact was identified to support them, and this staff member’s contact information was included in email distributions. Importantly, this individual was familiar with the technical aspects of the virtual meeting build and had a high level of subject matter expertise related to the content and goals of the meeting. This role could be split into 2 (technical support and content/process support) if necessary.
As noted above, the automation of the process and the asynchronous nature of the virtual meeting allowed for consistent content delivery, convenience for participants (particularly those in essential functions at the outset of the COVID-19 pandemic), and automated reporting for project staff. This streamlined process meant not only that data were reported on time, but also that data collection could continue in a way that was responsive to the needs of those in frontline pandemic response roles without adding significant burden.
While the qualitative data were useful and focused, as noted above, one tradeoff was that the networking opportunities and social aspects of a traditional meeting were lost. For this reason, the team would not recommend wholesale replacement of in-person meetings with these types of tools, but rather recommends using asynchronous, virtual strategies either as complements to in-person meetings or as alternatives when traditional gatherings are not possible.
Next Steps
The virtual meeting has inspired an expanded set of possibilities for partner and community engagement. Replications and modifications of this model have been used in a variety of scenarios, including to assess the acceptability and utility of public health education and promotional materials. As community and agency partners have become more comfortable with online delivery of material and with virtual meetings as complementary or alternative approaches to in-person gatherings, a widely available tool such as Qualtrics offers a relatively low-cost approach with a low barrier to entry.
Implications for Practice
This needs assessment process showed that Title V programs can capture broader, more diverse perspectives without sacrificing quality of communication, content, or feedback by using technology that can be adapted in innovative and flexible ways. Qualtrics is traditionally known as a survey platform, but it is flexible enough in its structure, and in the modifications possible when developing individual tools, that it can be deployed through thoughtful and creative means. This type of asynchronous, virtual meeting shows the possibility of strengthening maternal and child health (MCH) systems through rich engagement during times when in-person gatherings are not possible. This model can be easily replicated in other survey tools, and other states that face difficulties convening geographically dispersed community and professional representatives could benefit from it. While the team would not recommend wholesale replacement of in-person meetings with these types of tools, this case study has shown the utility of this type of meeting as an alternative or supplementary strategy.
Footnotes
Correction (January 2023):
Article updated to correct the authors’ order.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Ethics and Informed Consent
This work was completed as part of a technical assistance and evaluation contract. The IRB at the University of Alabama at Birmingham designated this work as not human subjects research and did not require informed consent from participants.
