Abstract
Background
Adherence to intervention training implementation strategies is at the foundation of fidelity; however, few studies have linked training adherence to trainee attitudes and leadership behaviors to identify what practically matters for the adoption and dissemination of evidence-based practices. In this hybrid type 3 effectiveness-implementation cluster randomized controlled trial, we collected Exploration, Preparation, Implementation, and Sustainment (EPIS) data and merged them with tailored motivational interviewing training adherence data to elucidate the relationship between provider attitudes toward evidence-based practices, leadership behaviors, and adherence to training implementation strategies (e.g., workshop attendance and participation in one-on-one coaching).
Method
Our sample included data from providers who completed baseline (pre-intervention) surveys that captured inner and outer contexts affecting implementation and participated in tailored motivational interviewing training, producing a dataset that included training implementation strategies adherence and barriers and facilitators to implementation (N = 77). Leadership was assessed by two scales: the director leadership scale and implementation leadership scale. Attitudes were measured with the evidence-based practice attitude scale (EBPAS-50). Adherence to training implementation strategies was modeled as a continuous outcome with a Gaussian distribution. Analyses were conducted in SPSS.
Results
Of the nine general attitudes toward evidence-based practice, openness was associated with training adherence (estimate [EST] = 0.096, p < .001; 95% CI = [0.040, 0.151]). Provider general (EST = 0.054, 95% CI = [0.007, 0.102]) and motivational interviewing-specific (EST = 0.044, 95% CI = [0.002, 0.086]) leadership behaviors were positively associated with training adherence (p < .05). Of the four motivational interviewing-specific leadership domains, knowledge and perseverant were associated with training adherence (p < .05). As the knowledge (EST = 0.042, 95% CI = [0.001, 0.083]) and perseverant (EST = 0.039, 95% CI = [0.004, 0.075]) leadership behaviors increased, so did provider adherence to training implementation strategies.
Conclusions
As implementation science places more emphasis on assessing readiness prior to delivering evidence-based practices by evaluating organizational climate, funding streams, and change culture, consideration should also be given to metrics of leadership. A potential mechanism to overcome resistance is via the implementation of training strategies focused on addressing leadership prior to conducting training for the evidence-based practice of interest.
Plain Language Summary
Researchers and practitioners who aim to improve the uptake of evidence-based practices continue to seek ways to improve provider participation in training implementation strategies. The persistent challenge in addressing provider disengagement, and in linking this disinterest to poor patient outcomes, has been ascertaining how to quantify relevant delivery considerations (e.g., provider attitudes and leadership behaviors that may influence commitment to learning or apathy toward behavior change) concurrently with training adherence. In this study, we collected both types of data: (1) provider attitudes and leadership behaviors and (2) training adherence outcomes. We found that provider openness, general leadership behaviors, and motivational interviewing-specific leadership behaviors were associated with adherence to training implementation strategies. As more emphasis is placed on assessing clinic readiness prior to adopting new evidence-based practices, a discussion on including metrics of provider attitudes toward evidence-based practice, innovation, and the specific intervention is warranted, alongside consideration of how implementation training strategies focused on addressing leadership can bolster change-supportive behaviors prior to the delivery of innovations.
Introduction
Adherence to training implementation strategies for evidence-based practice (EBP) is at the foundation of fidelity, the degree that an intervention is implemented as it was designed and intended (Mowbray et al., 2003). Assessing fidelity is necessary within intervention studies, as a mechanism to ensure the intervention or EBP is being implemented in a way that is likely to produce the intended results (Budhwani & Naar, 2022). When EBPs are not implemented with fidelity, they may not produce expected results, and therefore it becomes difficult to determine whether any observed changes are due to the intervention or to other influences (Carroll et al., 2007; Nagy et al., 2022). There are several well-known factors that can impact fidelity, including the training and expertise of those delivering the intervention, the quality of the materials and resources, and the level of support and supervision provided to those delivering the intervention (Budhwani & Naar, 2022; Carroll et al., 2007; Mowbray et al., 2003).
Considering the importance of evaluating fidelity, it is surprising that few full-scale trials measure adherence to training implementation strategies, a precursor to competence and fidelity (Schoenwald et al., 2013). Low adherence to training implementation strategies (e.g., completion of fidelity assessments for feedback, coaching session participation, workshop attendance) can render well-crafted theory-informed interventions as ineffective (Aarons, Horowitz, et al., 2012; Aarons et al., 2014). Thus, elucidating facilitators and barriers to adherence to training implementation strategies is critical to implementation efforts directed at reducing the science-practice gap. Implementation science that carefully examines contextual factors’ influence on EBP training adherence can inform the development of new and responsive strategies to enhance adherence to training implementation strategies, leading to more efficient delivery of EBPs and innovative interventions.
Motivational interviewing (MI) is a communication method of counseling that is designed to help individuals overcome ambivalence and make changes in their behavior; tailored motivational interviewing (TMI) is developmentally and culturally tailored to be delivered by HIV adolescent care providers to address the unique needs of youth with HIV, such as medication nonadherence, risk reduction, and problem substance use. TMI is MI personalized to a local context, taking into account clinics' organizational needs, goals, and practices (Budhwani et al., 2022; Budhwani & Naar, 2022; Naar et al., 2022). TMI training enlists specific strategies; namely, ongoing provider one-on-one coaching, delivery of workshop trainings, and engagement in customized roleplays (Budhwani & Naar, 2020, 2022). TMI is a gold standard intervention for adolescent HIV clinical settings, considering that TMI is one of the few interventions that addresses the high level of resistance to change and ambivalence that are indicative of adolescence and emerging adulthood developmental periods (Budhwani & Naar, 2022; Naar et al., 2022). Even though TMI is vital to the delivery of autonomy-supportive care to youth with HIV, TMI adoption remains low among clinical providers (Amodeo et al., 2011). Potential reasons for this gap include providers' lack of understanding about TMI benefits, limited training and support, time constraints, provider resistance to change, misperceptions about TMI, organizational barriers, clinical challenges, and provider attitudes toward EBPs and/or TMI (MacDonell et al., 2022; Nagy et al., 2022).
To ascertain the relationship between adherence to TMI training implementation strategies and contextual factors, we conducted a hybrid type 3 effectiveness-implementation randomized controlled trial applying the Exploration, Preparation, Implementation, and Sustainment (EPIS) framework (Curran et al., 2012; Moullin et al., 2019; Moullin et al., 2020). The EPIS framework includes a thorough exploration phase; the data collected in this phase can be used to inform the implementation of EBPs (Carcone et al., 2022; Moullin et al., 2019). During this phase, evaluators uncover readiness, strengths, and sentiments or attitudes toward a specific innovation or intervention, such as TMI, and more general EBPs. In the preparation stage, the key personnel prepare to deliver the intervention and craft or select acceptable training implementation strategies. During implementation, practitioners and researchers leveraging EPIS focus on the importance of implementing EBPs in ways that are feasible, acceptable, and sustainable in real-world settings. During the sustainment phase, EPIS users recognize that the adoption and integration of TMI into routine practice falls to the clinics, which includes providers and their leadership. By guiding the process of identifying acceptable training strategies, leveraging the EPIS framework may enable researchers and practitioners to improve the likelihood of sustainability (Moullin et al., 2020). Within EPIS, implementation scientists are guided to understand inner and outer context factors that affect EBP implementation and specify bridging factors such as provider attitudes toward EBPs, leadership behaviors, service environments, governmental policies unique to each state, and funding streams that can have direct and indirect effects on EBP processes (Aarons et al., 2011; Moullin et al., 2019). 
Regarding HIV care and treatment for youth with HIV, using EPIS enables us to theorize that provider attitudes—alongside leadership behaviors, implementation climate, and organizational culture—influence the adherence to training implementation strategies, fidelity, adoption, and thereafter future scale-up of EBPs such as TMI. Within EPIS, we also find delineation of factors likely to influence adoption and implementation in complex service systems such as those involved in implementing EBPs in prevention and treatment settings. We hypothesize that EPIS provider attitudes and leadership behaviors will explain differences in adherence to training implementation strategies (workshop participation, supervision with coaching, fidelity assessment and feedback, implementation team selection process, and implementation team meetings) for TMI (Powell et al., 2017).
Method
Parent Study
To understand the process, context, determinants, and mechanisms of TMI implementation, we conducted a hybrid type 3 effectiveness-implementation stepped wedge cluster randomized controlled trial under the auspices of the Adolescent Medicine Trials Network for HIV/AIDS Interventions (ATN, Protocol 146) (Idalski Carcone et al., 2019). The ATN is a research network dedicated to developing, testing, and implementing behavioral interventions for youth with HIV to improve health outcomes and well-being (Naar et al., 2019). In this study, we leveraged a convergent parallel mixed-method design to describe the inner and outer context factors affecting implementation at sites in Baltimore, MD; Birmingham, AL; New York City, NY; Detroit, MI; Los Angeles, CA; Memphis, TN; Miami, FL; New Orleans, LA; Philadelphia, PA; San Diego, CA; Tampa, FL; and Washington, DC. While the true beneficiaries of TMI are youth with HIV, study participants were clinical providers, such as medical doctors, nurses, psychologists, social workers, pharmacists, and other care providers with at least 4 h of weekly patient contact. Leaders self-identified and typically held the title of director (e.g., medical director and clinic director). To collect contextual data at baseline, prior to participating in any TMI training activities, providers completed a telephone interview and an online survey. This manuscript reports on findings from the baseline quantitative survey, as well as TMI training adherence outcomes.
Participants and Procedures
Study participation was voluntary. Potential participants received an introductory email and invitation to schedule an interview, based on their availability. Once the interview was completed, participants received an email including an online survey link. After completing both data collection components, participants received a $10 Amazon e-gift card. At the end of the first step, 140 individuals attempted the survey, but the sample for this study is limited to the subset of those who completed surveys that captured inner and outer contexts affecting implementation and TMI training (n = 77). Due to missing data patterns (e.g., three participants did not complete the MI-specific leadership scale), the sample size in each regression ranges from 74 to 77. This study was approved by multiple institutional review boards, and informed consent stressed the voluntary nature of study participation.
Measures
Primary Outcome: Adherence to TMI Training Implementation Strategies
Adherence to TMI training implementation strategies was measured by attendance across ten training opportunities following a clinic-wide workshop, including two 60-min coaching sessions, four standardized patient interactions, and four 45-min coaching sessions. The corresponding TMI training implementation strategies adherence score, which ranged from 0 to 1, reflected the proportion of the ten possible training activities attended. Psychometric properties of this training implementation strategies adherence tool and a description of measurement development are detailed elsewhere (Alfonso et al., 2023).
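As a concrete illustration, the adherence score described above is a simple proportion. The sketch below (in Python, for illustration only; the study's scoring procedure is detailed in Alfonso et al., 2023) assumes a boolean attendance record per provider across the ten post-workshop training activities; the variable names are hypothetical, not taken from the study instrument.

```python
# Compute a provider's TMI training adherence score as the proportion of
# the ten post-workshop training activities attended (range: 0 to 1).
# Attendance records here are illustrative placeholders.

def adherence_score(attended: list[bool]) -> float:
    """Proportion of the ten training activities attended (0.0-1.0)."""
    if len(attended) != 10:
        raise ValueError("Expected attendance records for all 10 activities")
    return sum(attended) / len(attended)

# Example: a provider who attended 6 of the 10 activities
record = [True, True, False, True, False, True, False, True, True, False]
print(adherence_score(record))  # 0.6
```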
Secondary Outcomes: Provider and Clinic Characteristics
Participant characteristics included age, race, ethnicity, gender identification, level of education, number of years in HIV service or clinical delivery, current job title, number of years in current position, and caseload (number of patients).
Secondary Outcomes: Provider Attitudes and Leadership Behaviors
The evidence-based practice attitude scale (EBPAS-50) was used to assess EBP adoption attitudes (Aarons, 2004; Aarons, Cafri, et al., 2012). EBPAS-50 includes 50 items allocated to 12 subscales: intuitive appeal of EBP (α = .80) (Idalski Carcone et al., 2019), likelihood of adopting EBP given requirements to do so (α = .90), perceived divergence from usual practice with research-based/academically developed interventions (α = .90), organizational support for learning an EBP (α = .85), the balance of skills and the role of science in treatment (α = .79), time and administrative burden associated with learning EBPs (α = .77), positive perceptions of receiving feedback related to service delivery (α = .82), EBP fit with the values and needs of the patient and clinician (α = .88), limitations and their inability to address patient needs (α = .92), negative perceptions of monitoring or supervision (α = .87), likelihood of increased job security or professional marketability provided by learning an EBP (α = .82), and openness to new practices (α = .78). All subscales aim to assess providers' attitudes toward EBPs as a general construct.
To gauge provider attitudes toward MI specifically, four subscales were minimally adapted to probe provider attitudes toward MI (rather than EBPs more generally). The adaptation consisted of rewording items (where necessary) to target MI. These subscales were burden, limitations, security, and support. For scoring all subscales, we computed a mean of each set of items loaded on a given subscale, with a higher mean score suggesting a more positive attitude for that domain.
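The subscale scoring described above (the mean of the items loading on each subscale, with higher means indicating more positive attitudes) can be sketched as follows. The item-to-subscale groupings and response values below are hypothetical, not the actual EBPAS-50 item assignments.

```python
# Score attitude subscales as the mean of their constituent items,
# as described in the text. Item groupings here are hypothetical.

def score_subscales(responses: dict[str, float],
                    subscales: dict[str, list[str]]) -> dict[str, float]:
    """Return the mean item response for each subscale."""
    return {
        name: sum(responses[item] for item in items) / len(items)
        for name, items in subscales.items()
    }

# Hypothetical Likert responses and item groupings
responses = {"q1": 3, "q2": 4, "q3": 2, "q4": 1}
subscales = {"openness": ["q1", "q2"], "burden": ["q3", "q4"]}
print(score_subscales(responses, subscales))  # {'openness': 3.5, 'burden': 1.5}
```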
Leadership behaviors were assessed using two scales. First, we employed a brief global assessment of organizational leadership, the director leadership scale (DLS, α = .90) (Broome et al., 2009). The DLS is a nine-item instrument that captures perceptions of leadership behaviors within an organization. The DLS specifically measures aspects of transactional and transformational leadership styles among providers and their clinic leaders. Transformational leadership inspires and motivates providers, while transactional leadership is based upon reinforcement and exchange of contributions. Second, we used the implementation leadership scale (ILS), an assessment of leadership behaviors in support of specific EBPs, in the case of this study, TMI (Aarons et al., 2014). The ILS comprises four subscales: proactive leadership (α = .95), knowledgeable leadership (α = .96), supportive leadership (α = .95), and perseverant leadership (α = .96). Proactive leaders have clear standards for EBP delivery, plan to facilitate implementation, and actively remove obstacles. Knowledgeable leaders understand the EBP and can answer questions about the innovation. Supportive leaders appreciate staff efforts to use the EBP and support learning about it. Perseverant leaders persevere through the challenges of EBP implementation and respond to critical issues related to EBP implementation, training, and sustainability.
Data Analysis
The dependent variable, provider adherence to TMI training implementation strategies, was modeled as a continuous outcome with a Gaussian distribution. Predictors of TMI training adherence included provider demographic variables and provider attitudes toward EBPs and leadership behaviors, acquired through a baseline survey from the protocol. Providers were nested within ten clinics, which were themselves nested in five random clusters. The timing of TMI training varied across clusters and therefore across providers. Regardless of cluster order, predictor variables were assessed at a single time point, at baseline. The relatively small number of clinics, clusters, and observations precluded multilevel modeling of these data. To account for the nonindependence of providers from the same cluster, all regressions controlled for the provider cluster, where cluster 1 was the reference group and clusters 2 through 5 were indicated by dummy codes. Single-level models were run in SPSS, where demographics, EBP attitudes, and leadership behaviors served as predictors of adherence to TMI training implementation strategies. For all predictors, tests of statistical significance were based on the Wald test, with 95% confidence intervals computed as indicators of the magnitude and precision of significant effects. Missing data were addressed with listwise deletion.
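The cluster adjustment described above amounts to four indicator (dummy) variables, with cluster 1 as the reference group. The study's regressions were run in SPSS; the minimal Python sketch below only illustrates the coding scheme, not the full model.

```python
# Dummy-code provider cluster membership (clusters 1-5) with cluster 1
# as the reference group, as described in the analysis plan.

def cluster_dummies(cluster_id: int) -> dict[str, int]:
    """Return indicator variables for clusters 2-5 (cluster 1 = reference)."""
    if cluster_id not in range(1, 6):
        raise ValueError("cluster_id must be 1-5")
    return {f"cluster_{k}": int(cluster_id == k) for k in range(2, 6)}

print(cluster_dummies(1))  # {'cluster_2': 0, 'cluster_3': 0, 'cluster_4': 0, 'cluster_5': 0}
print(cluster_dummies(3))  # {'cluster_2': 0, 'cluster_3': 1, 'cluster_4': 0, 'cluster_5': 0}
```

A provider in the reference cluster thus receives zeros on all four indicators, so each cluster coefficient is interpreted relative to cluster 1.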
Missing Data or Low Endorsement
Demographic data and provider work characteristics were available for all 77 providers. Providers reported some identities at low frequencies. These response rates led to certain coding decisions for regression models. Specifically, because no respondents reported gender identities other than female and male, gender was coded as female = 0 and male = 1. Education was coded as 0 = undergraduate degree or less and 1 = any postgraduate or professional education. Low frequencies of respondents identifying as Latinx or Hispanic precluded the examination of ethnicity as a predictor of TMI training adherence. Race was coded such that 0 = White (the highest frequency group), with dummy codes for Black, Asian, or other races. Of the 77 providers, only four endorsed a race other than Black or White. Thus, although all participants were included in the model, it was only possible to test for statistical differences between Black and White participants. Therefore, significance tests for the Asian or other race indicators are not reported.
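The coding decisions above can be expressed compactly; the sketch below mirrors the text's scheme (gender: female = 0, male = 1; education: 0 = undergraduate or less, 1 = postgraduate/professional; race: White as reference with dummy codes for Black, Asian, and other). Function and key names are illustrative.

```python
# Recode demographic variables following the coding decisions in the text:
# gender (female = 0, male = 1), education (0 = undergraduate or less,
# 1 = any postgraduate/professional), and race (White = reference group,
# with dummy codes for Black, Asian, and other races).

def recode_provider(gender: str, postgrad: bool, race: str) -> dict[str, int]:
    """Return regression-ready indicator codes for one provider."""
    return {
        "male": int(gender == "male"),
        "postgrad": int(postgrad),
        "race_black": int(race == "Black"),
        "race_asian": int(race == "Asian"),
        "race_other": int(race not in ("White", "Black", "Asian")),
    }

print(recode_provider("female", True, "White"))
# {'male': 0, 'postgrad': 1, 'race_black': 0, 'race_asian': 0, 'race_other': 0}
```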
For attitudes toward EBPs in general, 99% (n = 76) of providers completed balance, burden, feedback, and monitoring. For MI-specific attitudes, 99% (n = 76) of providers completed burden, limit, security, and support. For both EBPs in general and MI specifically, 96% (n = 74) of providers completed the five perceptions of leadership scales. All 77 providers completed the remaining instruments.
Results
Descriptive Statistics
The number of providers in each clinic ranged from n = 1 (clinic 9) to n = 14 (clinic 1). The number of providers in each cluster of randomized clinics ranged from n = 12 to 20. See Table 1 for provider sample sizes within each clinic and cluster. Twelve percent of providers identified as male, with the remaining providers identifying as female. Provider-reported age ranged from 24 to 63 years (mean [M] = 43.03, standard deviation [SD] = 10.87), and 11% identified as Latinx ethnicity. Only four providers identified as Asian or an unlisted race, compared to 44% of providers who identified as Black and 50% of providers who identified as White. More than two-thirds of the sample reported some postgraduate or professional education (n = 52, 68%). Among the remainder, 4% of providers reported having an undergraduate degree, 10% reported some college, 12% reported high school completion, and 4% had not completed high school. The mean score for TMI training adherence indicated that on average, providers completed approximately half of the TMI training implementation strategies (M = 0.56). There was notable variability across providers (SD = 0.21), and individual scores spanned the full range of the scale. Descriptive statistics for all demographic, attitude, and outcome variables are presented in Table 2.
Provider Frequencies Across Cluster and Clinic
Descriptive Statistics and Regression Estimates for Study Variables
EST = estimate; EBP = evidence-based practice; MI = motivational interviewing; TMI = tailored motivational interviewing.
aBecause of low frequencies (n = 2), statistical tests for the regression estimate corresponding to this variable are not stable and therefore not reported.
bBecause White (the highest frequency category) was the statistical reference group for this regression, there is no estimate to report.
Note. All regressions control for provider Cluster ID. N for each predictor was equivalent to N for the corresponding regression. General and MI-specific EBP attitudes are scored by domain (not overall). General leadership attitudes are a single score (no domains). MI-specific leadership attitudes are scored overall and by domain. For significant results, 95% confidence intervals are reported in the text. For other variables, 95% confidence intervals can be computed from this table with the following formula: EST ± [SE × 1.96].
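The table-note formula EST ± [SE × 1.96] is the standard normal-approximation 95% confidence interval; as a minimal sketch (illustrative values, not study estimates):

```python
# 95% confidence interval from a regression estimate and its standard
# error, using the normal-approximation formula EST +/- (SE x 1.96)
# given in the table note.

def ci_95(est: float, se: float) -> tuple[float, float]:
    """Return the (lower, upper) bounds of the 95% confidence interval."""
    margin = se * 1.96
    return (est - margin, est + margin)

# Illustrative values only
print(tuple(round(b, 3) for b in ci_95(0.5, 0.1)))  # (0.304, 0.696)
```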
Regression Models: Predictors of Adherence to TMI Training Implementation Strategies
Provider demographics, attitudes toward EBPs, and leadership behaviors were tested as predictors of adherence to TMI training implementation strategies. Regression estimates, standard errors, and significance tests for all models are presented in Table 2. Confidence intervals (CI) for significant results are presented in the text.
Demographic variables (gender, age, race, education) and provider work characteristics (years in HIV care, years in current position, and caseload size) were not associated with adherence to TMI training implementation strategies. Of the nine provider attitudes toward EBPs in general, openness was associated with adherence to training implementation strategies (estimate [EST] = 0.096, p < .001; 95% CI = [0.040, 0.151]). As provider openness toward EBPs increased, so did adherence to the TMI training implementation strategies. Of note, multiple leadership behaviors were significantly associated with adherence to TMI training implementation strategies; this held for both general leadership (EST = 0.054, 95% CI = [0.007, 0.102]) and MI-specific leadership. In addition to overall perceptions of MI-specific leadership (EST = 0.044, 95% CI = [0.002, 0.086]), of the four MI-specific leadership domains, two were significantly associated with training implementation strategies adherence. Specifically, as provider MI-specific leadership behavior domains of knowledge (EST = 0.042, 95% CI = [0.001, 0.083]) and perseverant (EST = 0.039, 95% CI = [0.004, 0.075]) scores increased, so did adherence to the TMI training implementation strategies.
Discussion
Based on previous literature, we hypothesized that provider attitudes, namely, attitudes specific to TMI and attitudes related to EBPs more generally, would explain differences in providers' adherence to training implementation strategies for TMI; we only found evidence that the general EBP attitude of openness was associated with adherence to training implementation strategies. We also hypothesized that leadership behaviors would be positively associated with TMI training implementation strategies adherence. We found support for this hypothesis: both general leadership and MI-specific leadership behaviors were significantly associated with the TMI training adherence score. This finding may indicate that when promoting EBP training to providers, more extensive onboarding is needed, with EBP-related education that highlights the EBP's impact and the body of evidence supporting implementation and delivery of, in this case, TMI.
Implementation strategies can place emphasis on assessing clinic and provider readiness prior to implementing a new EBP by evaluating outer context constraints (e.g., policies, funding) and inner context factors such as organizational leadership. In addition to measures of organizational readiness, a discussion on the value of including metrics of the organization's leadership behaviors generally and specific to the EBP of interest could inform the development of tailored implementation training implementation strategies focused on addressing leadership readiness and downstream adherence to EBP training (Powell et al., 2019; Shea et al., 2014).
Based on our finding that leadership behaviors are relevant to training adherence, a potentially relevant tool is the leadership and organizational change for implementation (LOCI; Aarons et al., 2015; Aarons et al., 2017; Skar et al., 2022). LOCI aims to align leadership and trainees to support an implementation climate that communicates to clinic staff that the use of an innovation or EBP, such as TMI, is expected, supported, and rewarded in the organization (Aarons & Sommerfeld, 2012; Ehrhart et al., 2014; Jacobs et al., 2014). While the present study assessed attitudinal and leadership dimensions, the field of implementation science would benefit from rigorous studies that test strategies such as LOCI aimed at changing organizational context to support providers in adhering to training implementation strategies for models such as TMI and EBPs such as MI (Aarons et al., 2014).
Future Directions
In addition to integrating and testing LOCI as a precursor to EBP training and adoption, implementation scientists may explore how changes in attitudes and leadership behaviors affect training adherence, acknowledging that both may change with the increased exposure that occurs during EBP training. Further, replicating this study within a tighter geographic area may produce different outcomes. The outer context construct of sociopolitical climate may be markedly different in the under-resourced southern United States as compared to more urban and prosperous cities.
Limitations
Findings should be interpreted in light of several limitations. The data are from a relatively small sample of sites, and clinic 9 included only one participant after merging datasets, limiting generalizability. The number of sites did not allow for a statistical analysis accounting for the nested data structure. Findings should be interpreted within the narrow context of HIV service provision for youth with HIV. The EBPAS-50 balance subscale (α = .63) had low reliability, so caution should be used when interpreting outcomes related to this subscale. Providers were mostly female, which, although not uncommon in HIV care and treatment settings, limits broader application. Further, adherence to training implementation strategies may be affected by inner-setting contextual factors not captured in our data, such as organizational staffing processes that may impede providers' ability to adhere to training implementation strategies. Despite these limitations, our findings offer unique insights into attitudinal and leadership factors affecting providers' TMI training protocol adherence.
Conclusions
Leadership behaviors and provider attitudes are particularly important when preparing organizations to embed an EBP into routine care; leadership behaviors and attitudes of openness to new EBPs were directly associated with adherence to TMI training implementation strategies, a precursor to intervention competence, which in turn supports TMI fidelity and positive behavior change among patients. In the case of youth with HIV, positive behavior change may include improved medication adherence, greater comfort in status disclosure, or transmission risk reduction, all high priorities for ending the HIV epidemic (Budhwani & Naar, 2022; Budhwani et al., 2021). Using the EPIS framework to inform this study on how multiple factors might affect adherence to TMI training implementation strategies helped uncover that leadership behaviors and attitudes of openness to new EBPs could be valuable additions to clinic readiness assessments conducted prior to EBP delivery and uptake.
Acknowledgements
HB is the lead author of this manuscript. SN is the senior author expert in TMI and is the protocol PI for this study. GA is a senior implementation science scholar and contributed to the utilization of the EPIS framework and attitude and leadership measures. AC and KC led the implementation science data collection. ZA and JC conducted all analyses reported herein. MPB contributed to editing. All authors contributed to the writing and editing.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Research reported in this publication was supported by the National Institute of Mental Health (NIMH) and Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) of the National Institutes of Health (NIH) under Award Numbers K01MH116737 (Budhwani) and U19HD089875 (Naar). The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies.
