Abstract
Higher education institutions are pressured to use data for university rankings, accreditation, decision making, and technical support of students and faculty. However, these organizations also experience barriers and resistance to such data-driven aspirations. Rather than simply focusing on the whole organization, this research attends to variations in organizational subunits and asks what processes help and hinder data use. It pays attention to the puzzle of how organizational subunits can be data-driven even as the larger organization struggles to build an integrated data system.
Keywords
At the turn of the 21st century, higher education institutions (HEIs) began incorporating data in various ways to improve technical capacity, increase accountability, and address institutional pressures (Agasisti and Bowers, 2018; Bichsel, 2012; Custer et al., 2018; Gagliardi, 2018). Universities use various forms of student and administrative data to make organizational decisions, compete in university rankings, understand trends among student and faculty populations, and market themselves to stakeholders (Cox et al., 2017; Custer et al., 2018; Williamson et al., 2020; Wong, 2017). More recently, with the global COVID-19 pandemic, higher education organizations employed surveys and learning analytics tools to understand and address student concerns (Fonseca et al., 2020; Prinsloo et al., 2021). Data are essential not only for tracking student academic performance but also for monitoring students with physical disabilities and mental health issues (Chen, 2020; Cooper et al., 2016). Because of these demands, HEIs are working toward becoming data-driven organizations, which are often cited as more successful than those driven merely by intuition (Anderson, 2015; Davenport et al., 2010).
Despite the many potential benefits of using data, however, universities experience difficulties in optimizing the use of their data. For example, data oftentimes serve ceremonial rather than technical purposes, as in the case of institutional assessments that have no impact on student experience or outcomes (Cox et al., 2017). Higher education staff also resist the collection of institutional data or comply with it only minimally, without any consequences for their work (Arroway et al., 2016; Miller, 2019). Instructors, likewise, more often rely on their professional expertise than on data from learning analytics software (Arroway et al., 2016; Macfadyen and Dawson, 2012). These scenarios show that HEIs are not completely data-driven and suggest the need to understand data use in higher education more deeply.
This research asks what processes help or hinder the systematic and centralized use of data in a university. Although previous studies document various uses of data in higher education, they are limited in three ways. First, they treat higher education organizations as monolithic rather than acknowledging variations across organizational subunits. For example, variation in data storage, definition, structure, and use across subunits often undermines the data quality and accuracy necessary for decision making (Gagliardi, 2018). Second, previous studies often discuss the technical aspects of creating effective data systems while neglecting the political and interpersonal dynamics that prevent these systems from taking root (Macfadyen and Dawson, 2012; Miller, 2019). Third, most studies focus on higher education organizations in the Global North even as similar processes of data-driven management and pressure unfold in universities in the Global South.
To understand the processes that help or hinder data use, we leverage the case of a private university in the Philippines, which has experienced pressures to provide data to accrediting and ranking organizations and has attempted to improve its systems for data collection and analysis. We use this case as a springboard for understanding the complex dynamics of how universities—notorious for being decentralized and siloed—succeed or fail at systematizing data. This university is situated both in the national field of HEIs that require triannual accreditation and in the international field of global university rankings. Thus, we use this university to show the pressures universities experience globally to be legible to both national and international stakeholders. In the Philippines, HEIs are evaluated by the Commission on Higher Education (CHED), which collects, collates, stores, and disseminates data to inform policies. Since its inception in 1994, the commission has regularly collected data from all HEIs in the country, which it uses to update its higher education management information systems. Moreover, the university participates in international rankings such as the Times Higher Education and QS World University Rankings—showing the twin pressures from national accreditation and international ranking agencies.
Recognizing the value of evidence-based decision-making to promote accountability and compliance is not unique to the Philippines. This accountability and compliance movement reflects a global phenomenon in which HEIs from both the Global North and the Global South are compelled to provide data to external audiences as part of reporting standards and requirements, as well as an increasing orientation of accrediting bodies toward data-driven continuous improvement (Morest, 2009). In addition, Mandinach (2012) noted that the increasing pressure on faculty and HEI administrators is driven by policy-makers' desire to use data to inform practice. These pressures for accountability and compliance, which Scott (2003) describes as external to the institution, tend to exert influence over the internal operations of HEIs.
To preview the argument, we found that university subunits can be data-driven even as the larger institution struggles with creating effective data systems. Moreover, processes that were meant to help, such as centralization and the identification of data-driven subunits, can break down in the absence of integrative systems. We suggest that these ideas about data use variation and data fragmentation explain how a university can be effective in its parts yet remain less than data-driven as a whole.
We argue that this research has implications for the sociologies of higher education, organizations, and quantification. In terms of higher education, the research highlights the need to understand how the decentralized and often decoupled character of university departments and offices can lead to autonomy and nimbleness but prevent integrated and streamlined data systems (Macfadyen and Dawson, 2012). In terms of organizational studies, this research contributes to recent studies challenging the monolithic and unitary character attached to organizations like a university or a state (Arthars and Liu, 2020; McDonnell, 2017). Moreover, the present research contributes to studies of new institutionalism by suggesting the process for how technical changes happen on the ground through organizational subunits even as only ceremonial changes happen for the larger organization. In terms of studies of quantitative technologies, this research clarifies how processes that were supposed to help systematize data can backfire and have unintended consequences (Chun and Sauder, 2021; Williamson et al., 2020).
Literature review
Aids and barriers for a data-driven university
In the past 30 years, there has been increased interest in embedding data-driven practices in the education sector (Marsh et al., 2006). A data-driven culture is characterized as the continuous process of collecting data, using data analytics to derive insights, and making decisions based on these insights (Anderson, 2015; Berndtsson et al., 2018; Buitelaar, 2018; Halper and Stodder, 2017). Applied to the context of higher education, data-driven decision making (hereafter, DDDM) refers to the "systematic collection, analysis, examination, and interpretation of data to inform practice and policy" (Mandinach, 2012: 71).
According to Marsh and colleagues (2006), DDDM in the education sector is borrowed from the fields of quality management and organizational learning, which focus on making organizational improvement responsive to various types of data, such as input, process, outcome, and satisfaction data. These data are then used as evidence to inform the decisions of policymakers and executives. The growing movement to have a "culture of evidence" is due to greater demands for transparency and accountability (Middaugh, 2007) as well as external pressures to monitor student learning and institutional effectiveness (Morest, 2009). Many researchers highlight the importance of a data-driven culture for improving organizational performance (Berndtsson et al., 2020; Storm and Borgman, 2020). For example, Davenport et al. (2010) suggest that becoming data-driven is the way for organizations to "make better decisions and take the right actions" rather than rely on the "golden gut."
However, this desire for DDDM may meet resistance in the higher education environment. In their study of the implementation of learning analytics, Ferguson et al. (2014) argued that while complex systems like higher education institutions are stable, they are also often resistant to change. Hence, they suggest that successful adoption is attained not only through better analytic tools but also by beginning with a strategic vision, critically identifying barriers, and establishing relevant training and support systems (Ferguson et al., 2014). A top-down approach to addressing resistance to change creates central data systems in which subunits are interconnected and interoperable (Bichsel, 2012; Gagliardi and Turk, 2017). Having a central data system helps organizations collect, use, manage, and analyze data for decision-making (Agasisti and Bowers, 2018). While many studies of learning analytics have focused on a top-down approach, Arthars and Liu (2020) examined a bottom-up approach to the institutional implementation of learning analytics, that is, an initiative taking place at the individual or subunit level.
While these practices are helpful, other researchers have focused on at least three barriers that prevent quantitative technologies and DDDM from taking root in organizations. According to Gagliardi (2018), a major barrier to an organization's transformation is a fragmented data ecosystem, in which subunits store, define, and structure data differently, leading to inaccuracy and poor data quality.
A second barrier is the lack of systems and structures for organizing, sharing, and analyzing data across subunits, which leaves offices without clear workflows for requesting, receiving, or sharing information.
Finally, cultural resistance hinders data use when organizational members rely on intuition, tradition, and professional expertise rather than on data.
Quantification as ceremonial, technical, or both?
Data are not neutral; they are active components in changing individual and organizational behaviors, actions, and decisions. Data-driven technologies and evidence-based practices are merely some of the ways for data and numbers—in a word, quantification—to shape what organizations do and decide.
How data are used is largely dependent on the systems that sustain organizations. Organizations do not simply follow dense, tight systems to maintain operations; on the contrary, they can be better understood as "loosely coupled" systems. Weick (1976), in his seminal essay on loose coupling, proposes that subunits within organizations are simultaneously responsive and interconnected to other units as much as they are autonomous and logically separated. This suggests that organizations are held together by building blocks that are not highly dependent on each other but rather loosely connected (Orton and Weick, 1990).
In light of loose coupling, Meyer and Rowan (1977) highlighted the myth and ceremony that are often found in educational organizations. They argue that the "formal structures of many organizations in postindustrial society … dramatically reflect the myths of their institutional environments instead of the demands of their work activities" (Meyer and Rowan, 1977: 341). They go on to say that educational systems are loosely coupled in a way that strives to maintain legitimacy rather than address demands for technical efficiency. They give the example of the education sector, where inter-organizational conformity and ritualization take precedence over technical changes (Meyer and Rowan, 1977). What makes this acceptable is institutionalization, which legitimizes the organization in formalizing structures, refining boundaries, and establishing routines (Colyvas and Powell, 2006). Thus, in the case of data-driven technologies, their use may be aligned with performative goals rather than technical effectiveness—an example of how schools ceremonially use data in order to avoid closer evaluation (Meyer and Rowan, 1978).
Such ideas, however, can be challenged given more recent examples of how education “is introducing a new element of competition and forcing established institutions to become more market minded and entrepreneurial” (Dieter-Meyer and Rowan, 2006: 2). Researchers argue that three institutional changes have altered education: more educational options, tighter coupling, and a knowledge-dependent economy (Dieter-Meyer and Rowan, 2006). In particular, tight coupling happens when the technical work is tightly managed and coordinated, and not merely ceremonial. Using the example of law school rankings, Espeland and Sauder (2009) show that rankings function as an institutional technique to promote this tight coupling between what is projected to others and what actually happens within organizations. Using data for rankings helps organizations and their members internalize certain goals such that “abstract systems become embedded in organizations and embodied in members” (Sauder and Espeland, 2009: 79). Taken together, these suggest the possibility of quantification to have both ceremonial and technical functions.
Propositions
As this research asks what processes aid or hinder data use in higher education organizations, researchers may look into practices employed in the entire organization. However, organizations are made up of various organizational subunits that have differing functions, differential uses of data, and variable processes (Kanten et al., 2015). Given the variations between offices in organizations, the use of data may also be equally variable. Thus, we suggest this proposition:

Proposition 1: The use of data and its technical functionality vary across the organizational subunits of a higher education institution.
In this sense, we are suggesting that certain organizational subunits may have their own processes that aid or hinder the use of data in conjunction with, or in opposition to, the processes in the larger organization. Another implication of this proposition is that some organizational subunits may be more “effective” than others. For example, one office may have sophisticated technical protocols for handling data and making decisions out of them while another office (in the same organization) may only use data for performative purposes or have no data at all to use.
However, it is possible that when enough organizational subunits use data for rational means and decision-making, the larger organization may follow suit (Tiller, 2012). If, for example, a critical mass of offices has their own data systems, they can in principle turn the tide toward a more effective data system. Thus, we propose that:

Proposition 2: When a critical mass of organizational subunits uses data for technical ends, the larger organization becomes more data-driven.
To complement this initiative of having enough organizational subunits using data for technical ends, an organization may intentionally create systems and structures to centrally coordinate data (Peyman et al., 2011). Thus, we hypothesize that:

Proposition 3: Creating systems and structures that centrally coordinate data aids the systematic use of data in the larger organization.
However, Propositions 2 and 3 may not hold because the coordination of a large organization is not simply the sum of its individual parts being data-driven. While well-functioning organizational subunits may be a necessary condition for a data-driven larger organization, they may not be a sufficient one. Moreover, while a coordinative body may be an important component in theory, its practice may tell another story. Thus, we explore these propositions with an empirical case.
Data and methods
We conducted the study at a large private university in Manila, Philippines, herein called Hillside University (HU). With a population of more than 8000 undergraduate and 6000 graduate students, it is considered one of the top universities in the country, attracting students from different cities, hosting exchange students from other countries, and consistently appearing in international university rankings. We use the case of HU for a number of reasons. First, the university experienced pressures from international rankings, accrediting bodies, and changing government policies. Many universities around the world experience these phenomena, and while each university is unique, the present case study provides a site for understanding the grounded dynamics of how universities respond to these pressures. Second, the university's response is typical of many higher education organizations: to create offices and institute structures for managing data (see Chun and Sauder, 2021; Sauder and Espeland, 2009; Williamson, 2018). Although previous studies highlight the creation of structures, they do not focus on what happens after these structures are created. This is what this research documents. Third, the case of HU provides important documentation of universities in the Global South experiencing the same globalizing and neoliberalizing pressures as universities in North America, Europe, Australia, and East Asia. In this sense, it suggests how the pressures of university rankings, DDDM, and learning analytics are concerns not only for universities in developed countries but also for universities in developing countries.
To answer our research question, a senior member of our research team interviewed 36 senior and mid-level administrators and staff across 21 organizational sub-units at HU, including an academic vice president, deans, administrative directors, and assistant directors. These participants were selected because they led their units or managed their units' data. These student-interfacing sub-units were included because they regularly used data or were compelled to use them. The researchers opted not to interview instructors because instructors rarely handled data directly; data were usually coursed through the administrative offices. Lasting between 40 and 90 minutes, the semi-structured interviews provided a holistic view of the organization of data and information in the university and of how data were collected, used, managed, and analyzed in the different offices and within the university as a whole. Our questions included: (1) How would you compare the use of data when you started in the university and now? (2) What factors helped or hindered being more data-driven? (3) What internal and external pressures affected data use? Appendix 1 details the list of interview themes and questions, while Appendix 2 lists the sub-units interviewed. These interviews were then transcribed, coded, and discussed by the research team.
Our team, composed of four individuals, used thematic analysis, initially reading through the transcripts to identify themes and codes (Maguire and Delahunt, 2017). We settled on seven themes with three to five codes under each; the themes spanned areas such as data use, data culture, and data systems. After setting the themes, we coded the transcripts, being mindful of quotes and situations that supported or contradicted the codes we initially included (Braun and Clarke, 2006). To ensure reliability, at least two members of the research team examined every interview transcript, each person independently hand-coding it in a separate Word document. In coding the data, the researchers adopted an inductive approach to develop codes directly from the data and derive credible insights and interpretations from them (Linneberg and Korsgaard, 2019). Each researcher reviewed the transcripts and created an initial list of 90 codes based on these data. A second round of coding created higher-level categories from this initial list. Moreover, the group met consistently to refine the codes and discuss the key points of the study. From the seven initial themes, we collapsed related ideas to arrive at the three themes that confirm or contradict the propositions laid out in the previous section. Appendix 3 includes the themes and codes used for this research.
Findings
Our research finds support for the proposition that a university has different organizational subunits that may differ from each other in terms of the level of use and technical functionality of data. However, we do not find support for the proposition that having data-driven subunits and a coordinative office is sufficient to transform data use practices in a university. On the contrary, having strong independent data-driven organizational subunits may actually be a hindrance to an integrated university data system.
Data use variation in organizational subunits
Universities are highly decentralized organizations, and HU is no exception. Such decentralization can provide nimbleness and flexibility for decisions to be made quickly, but a decentralized system may also prevent information sharing, leading to redundancies and inefficiencies in the work. Many of the senior university officials highlighted this fragmented, incoherent, and redundant data ecosystem in the university, with one respondent highlighting that the key issue revolves around different offices and departments “operating in silos”—each one with its own way of collecting, storing, warehousing, and analyzing data.
While previous research may assume that the organization will be either data-driven or not, we found that many sub-organizational variations resulted from this decentralized system. For example, a number of offices had a clear protocol for data use and analysis. During an interview with the graduate programs’ office, two senior officials talked about how they used student application information and satisfaction surveys to monitor student progress, make decisions for scholarships, and provide information to different departments. They provided details about how data was collected through Google Forms, stored in their cloud drives, and disposed of after some time. The director of this office said, Well, usually, our main data, the bulk of it comes from our applicants for grad studies so we mainly get those from application forms that we filled up. Right now, we have an application portal of our grad programs. That one, we use for mainly, initially for evaluation … Then the department will use that for evaluating if the particular applicant will be able to get into their program or not. Afterwards, once they have been admitted, those data will be transferred to the Registrar's Office. That data will be used as the student's records already. That's the beginning of their record with the school.
In contrast, some offices were able to collect a lot of data but were unable to fully analyze them. For example, the library logged how many students entered its premises and different rooms. The library had automated systems for counting the number of entrants to the library's three buildings and manual systems of librarians documenting the number of students entering their specific areas. However, such data were never analyzed. Even senior library officials were unaware of why this had been the practice for so long since they created daily logs but never made any consequential decisions from them.
Moreover, certain inter-office relationships have clearer protocols for data sharing than others. For example, the student information from the Office of Admissions is automatically sent to three offices immediately after students are enrolled: the Offices of the Registrar, Student Health Services, and Guidance and Counseling. We found similar dynamics of systematic protocols for information shared between the offices of student activities and campus events. However, many other offices had unclear systems and workflows for requesting, receiving, or sharing data.
These different examples highlight the concept of "data use variation," whereby certain organizational subunits were more effective at using data for practices and processes that affect their technical work. This subsection presents a key insight regarding the importance of the organizational subunit and how the factors that aid data use have to be interrogated at the level of an organization's subcomponents. The loosely coupled structure of HEIs, and of HU in particular, provides the possibility for a subunit to be data-driven (or not) without being consequential for the rest of the organization.
Data fragmentation or how the organizational whole is less than the sum of its parts
While we initially wanted to assess whether the whole university was "data-driven," we found that it was better to make distinctions between the organizational subunits and the university. When we made this distinction, we found an interesting puzzle. On the one hand, many administrators expressed that their respective offices collected and used data for decision making, evaluation, systems management, and strategic planning—with many subunits asserting that they were "data-driven." On the other hand, many of the same administrators critiqued the university culture for not being more data-driven—suggesting that the whole is less than the sum of its parts.
One dean had been intentional about data collection during the 2020 COVID-19 pandemic, even going so far as to create the division's own survey of classroom experiences. However, when asked to reflect on the university as a whole, he mentioned: [Decision-making] still [uses] emotional reasoning; we rely on impressionistic reasoning … We came from a culture that "you trust the man."
Similar critiques came from another dean who regularly analyzed student performance and faculty statistics but who critiqued how the university used data after the fact. His example was the university's decision to create a new laboratory facility before doing any market analysis for it. A student affairs director had similar experiences: his programs were regularly evaluated and changed based on these data, but he also lamented how the university "collects data for the sake of collecting data." Examples include the university's decision to shift to a quarterly system during the COVID-19 pandemic with little data or survey evidence to justify the shift, and the school's triannual college student survey that few people had analyzed beyond the descriptive statistics presented at school forums.
These different examples highlight the concept of “data fragmentation,” where individual component parts were effective in their function of collecting and analyzing data, but these components were never harmonized with each other. In a recent staff survey on data governance, many offices in the university had acceptable scores for data management and information security (averaging 3.5 on a 5-point Likert scale). However, a common refrain from the administrators was the inability to have a system and structure to organize and analyze the data, suggesting that the component parts may be effective whereas the whole university is not.
While we expected that simply having enough offices use data for technical ends would tip the scales toward a data-driven culture, we found that having discrete data-driven subunits was not enough to transform the university. In fact, even if all subunits in HU had been data-driven, this was no assurance that the whole university would be data-driven. This stems from the very character and structure of the university, which tends to be siloed, autonomous, and at times even "territorial"—a factor we explore in the next section. Here we show that tight coupling and loose coupling occur simultaneously, though to the detriment of the data culture of HU. While systems tightly facilitated data gathering within organizational subunits, the very fragmentation of data makes HU as a whole less data-driven than its subunits combined.
Centralization/coordination versus structural reality
Because of the fragmentation of data and the pressures for data from external organizations, many universities create offices that try to coordinate efforts at collecting and analyzing data (Garfolo and L’Huillier, 2015). In HU, the institutional focus on data came around the time when data were needed by international ranking organizations such as Quacquarelli Symonds and Times Higher Education, and accrediting bodies like the ASEAN University Network and Philippine Accrediting Association of Schools, Colleges and Universities. To respond to the need to systematically collect information necessary for these external organizations, the university created two offices around 2018: one quality assurance office (herein referred to as QAO) for the university's academic divisions, and another strategic management office (SMO) for the whole university, inclusive of the K-12 and professional divisions. Often working alongside each other, these offices have been tasked to liaise with these ranking and accrediting bodies, and collect data from the subunits.
While the creation of the offices was the first step to data coordination, this did not pan out as smoothly. The director of one of the offices mentioned: It was very bad. I think there are some offices within our university that have come to realize how important the data would be, not only to submit to our office, but to improve their operations. But then there are some offices [where they say] “we don't need to document.” So, it varies, and the levels of resistance are very different. There's some that are cooperative. There's some that will shut us out.
A number of factors contribute to this resistance from organizational subunits to coordinating and centralizing data. First was the territorial character of subunits: many offices treated the data they collected as their own and were reluctant to share them with a coordinating office.
A second factor that hindered data use practices was the opposite problem: rather than no one wanting to collect data, there were multiple offices collecting and claiming ownership of the same data, creating unclear, inconsistent, and overlapping functions.
If the first two factors were about the structural elements of the university, the third factor hindering data use is a cultural one. A respondent noted that “If it becomes a culture, then it is part of the task that you do.” When data practices are institutionalized over the years, it is much more difficult to adapt to a demand for change. For example, SMO has difficulty proposing changes to the existing data management system utilized by the different offices because there are traditions embedded in the culture that resisted these changes. One director mentioned that “a lot of things are shut down immediately and we proceed [with] our traditional way of doing things.” Some of these traditions include reliance on intuition and informal handling of data, which manifests as cultural resistance that hinders organizations from becoming more data driven.
In all these examples, we find that the barriers to data use are not so much specific barriers experienced by individual organizational subunits. Rather, these are barriers unique to the larger organization and, we would argue, barriers intimately tied to the very structure and culture of many universities. The problems of unclear, inconsistent, and overlapping functions are endemic to organizations whose subunits work autonomously, and so the goal is not simply to create better-centralized data systems but to address the structural and cultural conditions that keep subunits from sharing and integrating their data.
Discussion and conclusion
This study contributes to our understanding of data processes in a decentralized system and the factors that promote or inhibit a data-driven culture. Higher education institutions aim to inculcate data-driven cultures, and the need for data in the education sector has gained prominence given the desire to improve organizational processes and make evidence-based decisions (Berndtsson et al., 2020; Espeland and Stevens, 2008; Storm and Borgman, 2020). However, many universities are prevented from optimizing their data systems by structural and cultural barriers (Bichsel, 2012). Our research details how different processes help and hinder data use in higher education, and we provide three crucial insights.
First, in decentralized organizations like universities, data use varies across organizational subunits: some offices have clear protocols for collecting, analyzing, and acting on data, while others collect data without analyzing them or use them only ceremonially. The processes that aid or hinder data use must therefore be interrogated at the level of the subunit rather than the organization as a whole.
Second, contrary to our expectations, having a critical mass of data-driven subunits is not enough to tip the scales toward a more data-driven larger organization. In fact, strong sub-organizational cultures may prevent centralized efforts when the organizational subunit is unable or unwilling to share information. Such dynamics of data fragmentation show how the organizational whole can be less than the sum of its parts.
Third, while efforts at centralizing data collection and analyses are important, the
These insights have implications for the study of higher education, organizations, and quantification. For higher education, this study contributes a richly descriptive account of the complex dynamics and politics in establishing educational initiatives like data systems (Macfadyen and Dawson, 2012; Miller, 2019). Given the importance of the concept of DDDM, the present study provides an in-depth case of the potentials and pitfalls of data-driven initiatives. For organizational sociology, this study suggests the concept of variable organizational subunits and shows how organizational studies must "break down" the study of organizations into its many component parts (Anderson, 2015; Arthars and Liu, 2020; Hora et al., 2017; Storm and Borgman, 2020). For new institutionalism, the study highlights the possibility that loose and tight coupling appear simultaneously, producing effective subunits with genuine technical changes alongside ineffective systems with merely ceremonial changes. For the sociology of quantification, this study provides the insight that individual organizational subunits, rather than the whole organization, may respond differently to pressures from accrediting and ranking bodies. This challenges the assumption of Sauder and Espeland (2009) and Hazelkorn (2015) that ranking pressures result in tight coupling: although some subunits show tighter coupling, there is no assurance that all parts of the organization will do so.
Notwithstanding these important contributions to different fields in sociology, we must acknowledge the limitations of the present study. One important limitation is the focus on a single higher education organization, which may limit generalizability (Carminati, 2018; Gheondea-Eladi, 2014; Morse, 2008). What we gained from this focus on a single organization, however, is depth in understanding the grounded experiences of actors trying to institutionalize data use in a school. We suggest that researchers view this case as a springboard for concepts that may apply, to some extent, to other universities pressured by ranking firms, accrediting bodies, or data-driven stakeholders. Another limitation is the potential peculiarity of a case from a developing country trying to participate in global university rankings. However, many universities in the Global South attempt to join these rankings and experience pressures from accrediting bodies (Van Vught, 2008; Vidal and Ferreira, 2020), and so more research should take the efforts of these universities seriously. Third, the propositions we suggest may be extended with quantitative information, and we look forward to future research that tests them empirically with survey and administrative records. Finally, we propose further research on the factors behind the variations among organizational subunits, to determine whether such variations can be explained by an office's mandate or a leader's data-driven ethos.
Despite these limitations, we argue for the rigor of our analysis of the case and for the connection between our theoretical concepts and our empirical data. We offer suggestions regarding the variation among organizational subunits in universities as they try to establish data-driven systems, and regarding the barriers presented by the structural constraints of siloed and decentralized systems. One practical implication of this research is the importance of supporting organizational subunits while clarifying their functions and the connections among them. Moreover, improvements in data systems will often fail if the structural foundations of the university are not reckoned with.
Footnotes
Authors’ note
All authors were researchers or research associates at the Ateneo de Manila Institute for the Science and Art of Learning & Teaching.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
