Introduction
Well-capacitated national health research systems (NHRS) are essential to resolving national health challenges (COHRED, 1990); however, NHRS capacity is limited in many low- and middle-income countries (LMICs) (Kirigia et al., 2016; Ralaidovy et al., 2020), and knowledge of how best to support NHRS strengthening remains underdeveloped. Universities are a critical part of NHRS (Hanney et al., 2020) and have, over the years, played a catalytic role in health research and innovation (Bryant & Velji, 2011; van Niekerk et al., 2020). Traditionally, universities have performed the tripartite functions of teaching, research and community service (Bryant, 1991; Steele & Rickards, 2021). Capacity to perform these functions varies within and across universities, although research capacity is generally lower in LMIC contexts, potentially because LMIC universities have traditionally prioritised the teaching function (Cloete et al., 2018).
Health research capacity strengthening (HRCS) programmes targeting LMIC universities have gained traction in an effort to strengthen these institutions to carry out impactful health research. Research capacity strengthening (RCS) has broadly been defined as the continuous empowerment of individuals, institutions and societies to systematically define and prioritise their development challenges and subsequently develop evidence and solutions that effectively and sustainably address those challenges (Lansang & Dennis, 2004). In Africa, HRCS initiatives are designed to address health research challenges that have been comprehensively documented and which include, but are not limited to, underdeveloped or absent research infrastructure and equipment, a lack of research career pathways for researchers, inequitable research partnerships, a lack of political will and support, and inadequate funding (Lansang & Dennis, 2004; Pulford et al., 2020).
It is widely acknowledged that HRCS initiatives are complex, multifaceted, and dynamic in both design and implementation (Bates et al., 2014, 2015; Boyd et al., 2013; Vicente-Crespo et al., 2020). Despite this, studies aimed at assessing the effectiveness of HRCS have typically only sought to document the effects/outcomes of HRCS initiatives without examining the underlying assumptions or contextual constraints to understand ‘how’ and ‘why’ they work or not (Boyd et al., 2013). For studies to yield any meaningful learning, Marjanovic et al. (2017) argue, generative causation needs to be described to provide decision makers with evidence and insight into how and why an intervention works or not. As a consequence, there is a growing call for more complexity-aware evaluation approaches in HRCS (Bates et al., 2015; Boyd et al., 2013).
Cohn et al. (2013) defined complexity as the “dynamic and constantly emerging set of processes and objects that not only interact with each other but come to be defined by those interactions” (p. 42), and this describes interventions that are implemented in open social systems such as HRCS. HRCS in a university context is doubly complex: a complex intervention implemented in a complex setting. Universities are complex institutions characterised by their array of systems, policies, and practices (Cloete et al., 2018). Operationally, universities are heavily constrained by the politicised internal and external environments (Chankseliani et al., 2022; Sabzalieva, 2022) and socio-economic challenges (Fussy, 2019) that characterise the education and research systems. University-level HRCS initiatives not only comprise multiple components (multiple activities such as training, mentorship and building infrastructure) but also cut across different levels (i.e., individual, institutional, and societal), and they are normally implemented in contexts shaped by historical, political, environmental, and socio-economic forces (Pulford et al., 2021).
Realist Approach
Realist methodology seeks to develop programme theories that provide generative causal explanations of what it is about a programme that works, why, how, for whom and under what circumstances (Pawson et al., 2005). Realist approaches do not seek to provide a verdict as to whether a programme has succeeded or failed, but rather to examine how a programme interacts with specific local contexts to trigger the mechanisms necessary for generating intended or unintended outcomes (Pawson & Tilley, 1997). Anchored in the realist philosophy of science, realist approaches are well suited to, and have been used to, evaluate complex interventions and interventions delivered in complex settings (Pawson et al., 2005). In contrast to linear cause-effect approaches (such as experimental designs), which strive to control for or eliminate non-programme-related variables, realist approaches engage with the complexity of the programme by examining how a social intervention achieves (or fails to achieve) its outcomes through mechanisms that are triggered by context (Dalkin et al., 2015; Pawson, 2006). Realist work subsequently develops programme theory using the context–mechanism–outcome (CMO) framework as a heuristic. Jagosh (2019) defined context as the elements in the backdrop environment of a programme that have an impact on outcomes (e.g., culture, institutional policies, legislation, infrastructure), mechanism as the resources offered through a programme and the way beneficiaries respond to those resources (e.g., motivation, incentive, trust, confidence), and outcomes as the intended or unintended effects arising from context-mechanism interactions (e.g., improved skills, social connections, improved health) (p. 363).
Study Aim
To develop the initial programme theories for the university-level health research capacity strengthening component of the DELTAS Africa initiative and assess their plausibility in the light of the existing literature.
The DELTAS Africa Initiative
The DELTAS Africa Initiative is a science and innovation programme first launched in 2015. The initiative supports African-led research groups/consortia to carry out collaborative research, train young and emerging researchers to be scientific leaders, and strengthen African universities and research institutions. With support ($100 million) from Wellcome and the Foreign, Commonwealth and Development Office (FCDO), the first phase (2015–2021) of DELTAS Africa was implemented jointly by the African Academy of Sciences (AAS) and the African Union’s New Partnership for Africa’s Development (NEPAD), supporting 11 consortia with lead institutions from Sub-Saharan Africa and partner institutions from the Global North. Phase II of the programme (2023–2026) is currently being implemented by the Science for Africa Foundation (SFA) with a commitment of US$70 million from Wellcome and the FCDO. Phase II supports 14 consortia spread across four African regions (Eastern, Western, Southern and Northern Africa), based in 8 African countries, with partnerships across 35 countries (26 African) and 71 institutions (53 African) globally. This study will not evaluate the DELTAS Africa programme as a whole, but rather its specific HRCS component.
Methodology
Different approaches can be used to develop the initial programme theories (IPTs) (Shearn et al., 2017) that will subsequently be tested through a realist evaluation. For instance, some researchers have reviewed programme documents, conducted systematic reviews of scholarly articles reporting on similar interventions, and conducted interviews or focus groups with programme designers (Fick & Muhajarine, 2019; Flynn et al., 2020; Smeets et al., 2022). In this study, we will first conduct a realist review (described below) of the DELTAS Africa programme documents and relevant published papers that report on similar research capacity interventions to inform the drafting of the IPTs. Second, we will conduct interviews with the DELTAS Africa programme design team using a realist interviewing technique (described below) to further develop the CMOs. Lastly, we will synthesise the evidence from both the review and the interviews to arrive at the final IPTs (Figure 1).

Figure 1. Development of the IPTs.
Stage 1: Realist Synthesis
Realist synthesis is a form of systematic literature review that uses secondary data to understand complex interventions and to develop programme theory (Jagosh, 2019; Pawson, 2006). In this study, a realist synthesis will be conducted to identify evidence on the underpinning programme theory relating to a) the DELTAS Africa programme and b) other studies that have examined HRCS initiatives in African universities. The realist synthesis will seek to address three questions: (1) What are the key mechanisms that drive the outcomes of health research capacity strengthening for universities in Africa? (2) What are the contextual factors that enable the triggering of mechanisms necessary for generating health research capacity outcomes for African universities? (3) What is the causal connection between the key mechanisms, contexts, and the health research capacity outcomes?
The DELTAS Africa programme documents (e.g., funding proposals, programme description of action, the M&E framework, and the grantees’ reports) will be reviewed to gather insights about how the programme was designed and to identify the underlying assumptions. If the programme was not designed with a realist lens, it is likely that the programme documents will not explicitly articulate the linkage between context, mechanism, and outcomes. However, through the review process, we will potentially gain an understanding of the DELTAS Africa activities and what outcomes they were envisioned to generate. This information will provide us with initial ideas on ‘who’ is targeted by the programme, through ‘what’ activity, and to achieve what capacity outcome(s) across the different programme contexts. Using the CMO heuristic as the analytical tool, the information will be analysed to delineate the contexts, mechanisms, and outcomes (where possible) and to highlight gaps and how those gaps can be addressed through the subsequent review of papers and/or interviews with stakeholders.
Papers that describe HRCS interventions implemented across African universities will also be reviewed to explore tacit theories or alternative programme theories as postulated by researchers. The review process will follow guidance provided by Pawson et al. (2005) and the quality standards for realist reviews described by Wong et al. (2013), which include a) having a defined realist research question(s), b) application of the underpinning realist principles, c) focusing the review both in breadth and depth, d) construction and refinement of realist programme theories, e) development of a search strategy, f) selection and appraisal of materials, g) data extraction and, h) reporting. A four-step process (Figure 2) will be followed to conduct the review of published literature.

Figure 2. Realist review process – adapted from Molitor et al. (2023a).
Step 1: Search Strategy
Since the realist synthesis will only contribute towards the development of the IPTs (theory building) without testing them, the search strategy will involve one iterative phase of searching for relevant papers. This differs from the two-phased searches that are necessary when both building and testing theory (Pawson et al., 2005). The key search terms will be ((“research capacity strength*” AND “Africa”) OR (“research capacity build*” AND “Africa”) OR (“research capacity develop*” AND “Africa”)). The main reason for focusing broadly on ‘research capacity strengthening’ is that many aspects of RCS are not discipline specific, and limiting the search to the health discipline would likely yield a small number of relevant papers. The search strategy will include electronic database searches of Embase, Scopus, PubMed, Google Scholar, Web of Science and Global Health, which Bramer et al. (2017) deem the minimum threshold for systematic reviews. We will additionally hand search the journal Health Research Policy and Systems, whose aim and scope include HRCS among other health research-related topics. Input from a librarian/information specialist will be sought to develop, pilot, and refine the literature search strategy.
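As an illustration only (not part of the protocol), the boolean search string above can be assembled programmatically so that the same string is pasted verbatim into each database. The function name and structure here are hypothetical; the phrases and truncation markers mirror those listed above.

```python
# Illustrative sketch: building the boolean search string described in the
# protocol. TERMS and build_query are invented names for illustration.
TERMS = [
    '"research capacity strength*"',
    '"research capacity build*"',
    '"research capacity develop*"',
]

def build_query(terms, region='"Africa"'):
    """Pair each truncated phrase with the region term and OR the pairs."""
    pairs = [f"({t} AND {region})" for t in terms]
    return "(" + " OR ".join(pairs) + ")"

query = build_query(TERMS)
# query now holds a single string that can be reused across Embase, Scopus,
# PubMed, Google Scholar, Web of Science and Global Health (syntax permitting).
```

Note that real databases differ in wildcard and quoting syntax, so the string would still need per-database adjustment.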
Step 2: Study Selection Criteria and Procedures
Articles will be selected based on their relevance, that is, whether the data derived from them can inform the theory building. Papers will be considered eligible for inclusion if they 1) report research capacity strengthening initiatives, 2) describe programmes implemented in an African setting, 3) include a university as the main or part of the study setting, and 4) were published from 2000 to the present. Different types of documents, e.g., quantitative and qualitative studies, commentaries, process evaluations and project stakeholder reflections, will be included. Since we are interested in university (institutional) research capacity, papers that describe initiatives targeting individuals (individual-level capacity) without demonstrating how the research capacity is institutionalised will be excluded.
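The eligibility rules above can be encoded as a simple predicate, which is sometimes useful for double-checking a screening spreadsheet. This is a hypothetical sketch; the field names (`reports_rcs`, `year`, etc.) are assumptions for illustration, not an actual screening tool used in this study.

```python
# Hypothetical screening sketch: the four inclusion criteria plus the
# institutional-level exclusion, applied to candidate records.
def is_eligible(record):
    """Return True only if a record meets all inclusion criteria."""
    return all([
        record.get("reports_rcs", False),         # 1) reports an RCS initiative
        record.get("african_setting", False),     # 2) African setting
        record.get("university_setting", False),  # 3) university in study setting
        record.get("year", 0) >= 2000,            # 4) published 2000 to present
        record.get("institutional_level", False), # exclusion: must show
                                                  # institutionalisation of capacity
    ])

candidates = [
    {"reports_rcs": True, "african_setting": True, "university_setting": True,
     "year": 2015, "institutional_level": True},
    {"reports_rcs": True, "african_setting": True, "university_setting": True,
     "year": 2015, "institutional_level": False},  # individual-level only
]
included = [r for r in candidates if is_eligible(r)]  # keeps only the first record
```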
Retrieved sources will be screened for relevance (assessing whether the data can contribute to building the programme theory) and rigour (assessing whether the methods used to generate the relevant data are credible and trustworthy). Molitor et al. (2023a) argue that, contrary to quality appraisals conducted in other systematic literature reviews, quality appraisal in realist reviews does not hierarchise the evidence, and any study type can be included provided it is relevant to the programme theory development. Further, Dada et al. (2023) argue that appraisal of evidence in realist reviews is less about a study’s methodological quality and more about its contribution to the understanding of generative causation, which is uncovered through retroductive theorising. Notably, Pawson et al. (2005) acknowledge that appraisal checklists are not favourable for realist reviews given the critical importance of the reviewer’s judgement and discretion (p. 29). Nevertheless, this review will employ the realist review appraisal checklist by Molitor et al. (2023) to systematically assess the included papers, with the aim of reporting the nature and quality of the available evidence on research capacity strengthening of African universities. The checklist guides the assessment of the focus of the papers, the comprehensiveness of the information about C, M and O and, importantly, the relevance of the evidence to the programme theory.
Step 3: Data Extraction and Organising
The selected papers will be examined for CMOs, that is, for how the reported research capacity strengthening intervention was envisioned or reported to work; relevant passages will be highlighted, annotated, and labelled as context, mechanism, or outcome. Data summarising the characteristics of the included papers will be captured in an MS Excel template. A template defining the key characteristics of the included papers will be developed by the lead researcher and reviewed by two co-researchers (one of them an expert in realist methods). The lead researcher will independently extract and chart the data, continuously updating the form in an iterative process; this will regularly be reviewed by the two co-researchers, and any issues will be discussed, agreed, and documented.
Regarding the extraction of the CMO data, the appraisal form template by Molitor et al. (2023) will be used to sort and annotate the fragments of data extracted from each paper, since it already includes a section in which the connection between outcomes and process (C + M = O) can be described. This will reduce the number of templates needed throughout the review process. Data on actors, interventions, contexts, mechanisms, and outcomes will be extracted from the papers and charted by the lead researcher, and this will again be independently checked by the two co-researchers. Retroductive reasoning, as defined by Mukumbang et al. (2021), will be applied to extract the CMOs by identifying the research capacity outcomes and the mechanisms that are triggered across the reported university contexts, thus delineating the contextual elements.
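One way to picture the extraction record described above is as a small data structure: one row per annotated fragment, labelled as context, mechanism, or outcome, with a free-text note for the C + M = O connection. This is an assumed structure for illustration, not the Molitor et al. (2023) template itself, and all field names and example content are invented.

```python
# Illustrative sketch of an extraction record; names and contents are
# hypothetical examples, not data from the review.
from dataclasses import dataclass

@dataclass
class CMOFragment:
    paper_id: str
    element: str        # "context" | "mechanism" | "outcome"
    text: str
    cmo_link: str = ""  # narrative description of the C + M = O connection

fragments = [
    CMOFragment("P01", "context", "supportive institutional research policy"),
    CMOFragment("P01", "mechanism", "researchers' confidence to apply for grants"),
    CMOFragment("P01", "outcome", "increase in funded institutional projects",
                cmo_link="supportive policy + confidence -> more funded projects"),
]
```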
Step 4: Data Synthesis
Evidence from the review will provide further explanation and detail about the RCS initiatives and their programme theories, thus informing the refinement of the IPTs. The CMOs reported across the papers will be examined and compared to identify an overarching CMO configuration and to describe the areas of convergence and divergence between CMO configurations, thus demonstrating how specific mechanisms are triggered across different contexts to generate research capacity outcomes. To do this, a table will be created to summarise the CMO elements across all studies, highlighting overlaps (where Cs, Ms or Os have been reported by multiple papers). For any outcome related to the IPTs that is identified, data will be sought to infer the specific causal mechanism(s) that might be ‘fired’ and the context(s) in which the mechanism might be activated. This will ultimately help to manage any overabundance of unstructured programme theories that may be generated, particularly in situations where, as Shearn et al. (2017) highlight, the included studies have not employed a realist approach. The IPTs will be examined in the light of the existing evidence as part of a plausibility check (Marchal et al., 2010).
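The overlap table described above amounts to counting how often each context, mechanism, or outcome element recurs across papers. A minimal sketch, assuming the extracted elements have already been harmonised into shared labels (the labels below are invented examples, not review findings):

```python
# Hypothetical synthesis sketch: tallying recurring CMO elements across papers
# to surface overlaps for the summary table.
from collections import Counter

extracted = [
    ("P01", "mechanism", "mentorship builds confidence"),
    ("P02", "mechanism", "mentorship builds confidence"),
    ("P02", "context", "protected research time"),
    ("P03", "mechanism", "mentorship builds confidence"),
]

# Count how many rows report each (element, label) pair.
overlap = Counter((element, label) for _, element, label in extracted)

# Keep only elements reported by more than one paper (the overlaps).
shared = {pair: n for pair, n in overlap.items() if n > 1}
```

In practice this counting step depends on a prior qualitative judgement that two differently worded fragments describe the same element, which no script can automate.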
Stage 2: Interviews with Programme Designers
The draft IPTs will be refined further through interviews with the programme designers. We presume that the programme documents may not articulate every detail about the CMOs; the interviews are therefore anticipated to elicit deeper insights into the assumptions and thinking behind the DELTAS Africa HRCS component and its change/outcome pathways. The interview discussions will seek feedback on the CMOs and explore alternative explanations by employing a realist interviewing technique that focuses on the draft programme theories. The ‘realist evaluation – set of starter questions’ by Westhorp and Manzano (2017) will be useful in framing the interview questions. Relevant probing questions will be used, seeking to elicit deeper information about participants’ responses by asking them to provide more information about the contexts, mechanisms, or outcomes. For instance, questions may be framed as follows: What is it about the [activity] that makes it work to generate research capacity outcomes for the participating universities? What contextual elements are likely to affect/influence the achievement of the capacity outcomes, and why?
At the interview preparation stage, we will request each of the programme design team members to choose a convenient date/time for a 1-hour interview. A participant information sheet and consent form, detailing the aim, objectives and the type of information sought and for what reason, will be sent to each potential participant to read through and sign if they are willing to participate. This will ensure that their participation is voluntary. Once a participant has consented and provided a date/time for an interview, a Microsoft Teams meeting link will be sent. At the start of each interview, we will reiterate that participation is voluntary and that participants have the right to withdraw at any point without providing any explanation. Ethics approval for these interviews has been granted by both the Liverpool School of Tropical Medicine, UK, and Strathmore University, Kenya, Research Ethics Committees as part of a wider realist study.
Stage 3: Final Synthesis
The interviews will primarily yield qualitative data, which will be analysed thematically with the aid of NVivo software. CMOs that support the draft IPTs will be deductively identified, while new CMOs will be inductively coded from the data, consistent with guidance provided by Mukumbang et al. (2021). We will identify the reported outcomes and the mechanisms triggered in the different contextual conditions necessary for those outcomes. Insights from the realist synthesis will iteratively be incorporated to uncover the CMOs and inform the framing of the CMO configurations. The CMO configurations will be used as a heuristic to guide the data coding in NVivo, where each of the IPTs will be assigned a parent node and elements specific to C, M, and O will be assigned child nodes. Any new programme theory identified will be added as a parent node, thus expanding the initial code list. The IPTs will subsequently be tested and refined through a separate primary study.
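The coding frame described above (one parent node per IPT, with C/M/O child nodes, and new theories appended as new parents) can be pictured as a nested structure. This is a conceptual sketch only, using assumed names; it does not use NVivo's software or interface.

```python
# Minimal sketch of the coding frame: parent nodes per IPT, C/M/O child nodes.
# "IPT-1" etc. and the example code are hypothetical placeholders.
coding_frame = {
    "IPT-1": {"context": [], "mechanism": [], "outcome": []},
    "IPT-2": {"context": [], "mechanism": [], "outcome": []},
}

def add_theory(frame, name):
    """Register a newly identified programme theory as a new parent node."""
    frame[name] = {"context": [], "mechanism": [], "outcome": []}
    return frame

# A new theory emerging from the interviews expands the initial code list.
add_theory(coding_frame, "IPT-3")
coding_frame["IPT-3"]["mechanism"].append("trust in research partnerships")
```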
Conclusion
An understanding of how HRCS of universities in African settings works can inform decision making about how to design, implement and evaluate HRCS processes at the institutional level. Throughout the IPT development process, we will pragmatically and flexibly deal with the complexity inherent in realist studies (Rycroft-Malone et al., 2012) by assuming a level of abstraction that allows us to balance the variation in the evidence against the purpose of the review (p. 3). Realist syntheses, according to Pawson et al. (2005), cannot be strictly protocol-driven since “they are more about principles that guide than rules that regularize” (p. 32); on this basis, iterations and adaptations of the processes will be made as appropriate. Any changes to the IPT development process that diverge from this protocol, and the justification thereof, will be described in the publication of the results. The IPTs on health research capacity strengthening of African universities will provide useful information for researchers, policymakers, and funding and implementing partners in the African health research space. Importantly, this study will fill methodological and evidence gaps, given the dearth of evidence on how to apply realist methodologies in HRCS programmes, specifically in African university contexts.
Footnotes
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work has been funded by the LSTM-Lancaster Medical Research Council (MRC) Translational and Quantitative Skills Doctoral Training Partnership (DTP) Award.
