Abstract
Power BI is a Microsoft data visualization and business intelligence platform currently used for higher education governance at a range of institutions. This study contributes to critical scholarship on the normative dimensions and politics of data visualizations, with particular attention to how dashboards co-shape user subjectivities and are intertwined with educational policies that use time as a means of enacting higher education governance. Drawing on 11 interviews with administrative advisors and faculty connected to humanities programs that are particularly vulnerable to de-funding and enrollment challenges at a large public research university in Denmark, this paper investigates how the design, use, and marketing of Power BI are entangled with top-down higher education policy agendas. It emphasizes how students are constructed as discursive objects whose demographic information, academic progress, and workforce participation can be sorted, aggregated, and tracked in “real-time” for meeting performance metrics as well as regulatory compliance. However, this article also demonstrates how Power BI is a site where unanticipated frictions emerge. While the software is designed to render program administrators and faculty increasingly responsible for hedging against the vagaries of the post-graduation job market, in practice, participants report ambivalence and disengagement, navigate issues of transparency and accuracy, and attempt to leverage the platform to creatively resist threats of program defunding.
Introduction
In recent decades, Danish universities have undergone a process of neoliberalization, including the transformation of higher education governance according to the principles and practices of New Public Management (NPM). These changes have included the consolidation of decision-making power into the hands of university boards, the creation of performance contracts between the Danish ministry and the university that impact funding (Wright et al., 2020), and the extensive use of bureaucratic surveillance in order to monitor this performance (Moutsios, 2022). This shift toward performance contracts is part of a wider and explicit policy in Denmark to vocationalize university programs, resulting in significant reductions to the humanities (Moutsios, 2022).
Following the work of Warren (2023), I understand neoliberalism as a set of organizational logics that have become increasingly dominant in Denmark since the 1973 economic crisis and the ensuing decline of Keynesianism, where “the nation/state largely functions to create the legislative and legal conditions for maximizing market behavior” (30). In the last two decades, there has been a significant shift toward more economistic and market-oriented decision-making in Danish higher education that focuses on “competition, measurement, assessment, and employability” (Wulf-Andersen and Larsen, 2020: 303). This shift coincides with wider global governance transformations, including the increased power of international organizations and policy networks like the Organization for Economic Cooperation and Development, trends toward marketization and high-stakes accountability policies, and anticipatory governance premised on the analysis of large-scale digital data (Hartong and Decuypere, 2023; Ratner and Gad, 2019; Robertson, 2022). Across Europe, with the launch of the 1999 Bologna reforms, there has been a move from collegial governance toward managerial approaches and the implementation of NPM broadly in the public sector (Brøgger et al., 2023).
The idea that both public and private sectors can be rendered more efficient and effective through datafication is part of the dominant political imaginary in Nordic welfare states. Datafication refers to “a sociotechnical process characterized by the ever-growing utilization of advanced methods to analyze and recirculate data” (Broomfield and Reutter, 2022: 2). The Nordic welfare state model has long relied on the creation of databases and the surveillance of citizens to manage resources and sort populations, producing categories of “deserving” and “undeserving” citizens (Offe, 1984). However, intensifying forms of mass data collection, automation, and artificial intelligence (AI) have coincided with the increasing dismantling of the welfare state, including the privatization of public services (Dencik and Kaun, 2020).
In the context of higher education, science and technology studies (STS) scholarship on datafication has stressed how shifts toward “data-driven” institutional decision-making are often informed by technocratic logics that privilege institutional metrics of success over the needs of faculty and students (see Prinsloo, 2020; Weinberg, 2024; Williamson, 2017). The implications of digital data practices in higher education have also included increased monitoring and control of educational performance and success with unprecedented flexibility and granularity (Jarke and Breiter, 2019) and an influx of private providers for data-related services (Sellar, 2014). Drawing from both STS and critical policy studies, this article's key contribution is to evaluate how Power BI, a Microsoft data visualization dashboard and business intelligence platform currently being used for higher education governance at a range of institutions, is entangled with the infusion of NPM-inspired reforms in Danish higher education. Building on STS scholarship regarding how data visualization software is a highly situated, value-laden, and contingent arrangement that facilitates various techno-organizational dynamics (Ratner et al., 2018; Ratner and Gad, 2019; Vanermen et al., 2024a), I trace how Power BI both enables and restricts users in ways that are shaped by socio-material and political forces in Denmark. Following the work of Wright et al. (2020), I also understand universities as spaces that are enacted, both through the influence of top-down reforms as well as through the bottom-up daily activities of those who live and work in universities.
According to promotional literature on Power BI, the software promises to accelerate higher education's “digital transformation” (3Cloud, 2021), enable “data democratization” (Myers et al., 2023), and support more “data-driven” institutional decision-making (Microsoft, 2024). STS literature on higher education dashboards has emphasized that despite marketing promises of greater transparency and control for users, in practice, dashboards often do not provide insight into the underlying algorithms and data selection processes that support the specific user interactions the dashboard enables for higher education decision-makers (Williamson, 2015). Dashboards can be experienced as unwelcomed surveillance, offering little transparency about data collection, access, and decision-making procedures (Komljenovic et al., 2025), and encouraging new forms of work for keeping pace with dashboard-related metrics (Williamson and Kizilcec, 2022).
At the same time, dashboards are also sites where unanticipated frictions and varied (dis)organizing effects can disrupt hierarchical power relations in the academy (Ratner and Gad, 2019). This study contributes to critical scholarship on the normative dimensions and politics of data visualizations, with particular attention to how dashboards co-shape user subjectivities (Vanermen et al., 2024b), and are intertwined with educational policies that use time as a means of enacting higher education governance. In this sense, time is not natural, but rather, constructed, shaped, and ossified through technologies that influence, and are influenced by, social life (Tierens et al., 2024). Temporal research on educational policy is an emerging literature in need of more empirical work, particularly on the role of digital infrastructures (Tierens et al., 2024).
In what follows, drawing on results from a year-long research project in the Danish university system, I first provide an overview of recent policy shifts in Danish higher education shaping efforts to algorithmically manage students and faculty. Then, I present findings from a reflexive thematic analysis (Braun and Clarke, 2012, 2013, 2019) of 11 interviews with 8 administrative advisors and 3 faculty connected to humanities programs that are particularly vulnerable to de-funding and enrollment challenges at a large public research university in Denmark. Microsoft (2024) Power BI and Microsoft partner 3Cloud's (2021) websites, Power BI guidance documentation hosted on Microsoft Learn (Myers et al., 2023), and the Power BI interface's design and functions were also scrutinized to further situate interview findings in relation to dominant discourses concerning the software's utility and its affordances. Ultimately, I demonstrate how the use of Power BI constructs students as discursive objects whose demographic information, academic progress, and workforce participation can be sorted, aggregated, and tracked for meeting key performance metrics as well as regulatory compliance in “real-time.” I examine how Power BI's demographic classification system intersects with broader debates over who belongs in Danish higher education, with disparate consequences for marginalized students. I then address how Power BI works to render faculty and program administrators increasingly responsible for hedging against the vagaries of the post-graduation job market. However, in practice, participants express ambivalence and disengagement, navigate issues of transparency and accuracy, and in some cases, attempt to leverage the platform to counter threats of program defunding. I conclude by identifying existing forms of resistance against Power BI, and envisioning ways that students and faculty can collectively struggle for a more just higher education system.
New public management in Denmark
Since the 1970s, there has been a move toward “data-driven” evaluations and decision-making in OECD nations (Sellar and Gulson, 2019). Denmark began privatizing state-level data management functions in the 1990s as advancements in computing power made digitized data management feasible through contracting with private information technology providers (Bækkeskov, 2012). The past two decades of major Danish higher education reforms have contributed significantly to the entrenchment of “data-driven” NPM strategies, including the University Act of 2003, which mandated that a majority of seats on all university boards be held by external representatives from private companies and organizations; these boards hold the power to appoint the rector, who in turn appoints the head of each department (Cone, 2017). Prior to 2003, faculty, students, and administrators elected representatives for major decision-making positions. The Act closed down collegiate governing bodies, including the University Senate and Faculty Councils. In 2007, accreditation became a legal requirement in Denmark following the European Bologna process. That same year, a series of university mergers consolidated 25 Danish institutions into 8, which increased the size and scale of major research universities. Management subsequently led efforts to centralize control through guidelines and performance indicators (Moutsios, 2022). With the passage of the 2013 Accreditation Act, universities were made responsible for establishing their own quality assurance processes in compliance with nationally defined criteria from the European Standards and Guidelines (Brøgger et al., 2023).
After the 2014 introduction of the “Study Progress Reform” and “Dimensioning Plan,” a range of university programs underwent downsizing and shutdowns due to labor market considerations, with significant consequences for the humanities and social sciences (Brøgger et al., 2023). These reforms also imposed strict limits on how much time students are granted to complete their degrees and intensified staffing cuts (Risager and Thorup, 2016). Current student finance reforms have also increased pressure on students to graduate during a prescribed time frame or otherwise take on loans to finance their degrees (Yasar, 2023). These shifting forms of temporality include the transformation of 10% of two-year master's programs into one and a quarter-year programs, and another 10% into professional master's programs where students work and receive an education concurrently.
Power BI was initially adopted at this institution in the wake of these reforms. It is used to keep detailed track of student progress to degree and to generate an internally reviewed annual action plan. Typically, the department head, often in consultation with faculty and student representatives, is responsible for creating the annual action plan and describing how the department intends to address the software's indications of poor performance. The university requires these annual reviews in anticipation of the external quality assurance process. This quality assurance process takes place every five years and is necessary for the university to remain accredited. Representatives from the labor market are then invited to provide input on the direction and priorities of university programs based, in part, on the results of the quality assurance process.
The platform provides various metrics that the university determines and standardizes across programs, including drop-out rates, exam performance and completion rates, teaching hours, post-graduation (un)employment rates, and self-reported information about the amount of time students spend studying and their stress levels. Drawing on data from students, the university, the national data warehouse, and national surveys, the performance of departments is measured according to these metrics and subsequently assigned either a green, yellow, or red risk category per metric (visualized as either a green circle, a yellow triangle, or a red square, Figure 1) based on program performance. These thresholds are primarily determined by a committee that includes the pro-rector, vice-deans across the university, and the director of student administration, although some thresholds are also informed by what constitutes full-time work (in the case of measuring study intensity) and graduate unemployment levels sourced from Statistics Denmark.

Figure 1. Example of Power BI visualization of color-coded risk assessments per performance indicator and arrows conveying whether indicator values are rising or falling.
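To make the traffic-light mechanism described above concrete, the per-metric risk assignment can be sketched as a simple thresholding rule. The metric names, threshold values, and example figures below are illustrative assumptions for demonstration only, not the university's actual cut-offs:

```python
# Illustrative sketch of a green/yellow/red risk classification per metric,
# as described in the text. All metric names, thresholds, and values here
# are hypothetical, not the institution's real parameters.

def classify(value: float, yellow: float, red: float,
             higher_is_worse: bool = True) -> str:
    """Map a metric value to a risk category given two thresholds."""
    if not higher_is_worse:
        # Flip the scale so the same comparison logic applies.
        value, yellow, red = -value, -yellow, -red
    if value >= red:
        return "red"
    if value >= yellow:
        return "yellow"
    return "green"

# Hypothetical metrics: (yellow threshold, red threshold, higher_is_worse).
metrics = {
    "dropout_rate":          (0.15, 0.25, True),
    "graduate_unemployment": (0.10, 0.20, True),
    "exam_completion":       (0.85, 0.70, False),  # higher is better
}

# Hypothetical values for one program.
program = {"dropout_rate": 0.18, "graduate_unemployment": 0.08,
           "exam_completion": 0.65}

report = {m: classify(program[m], y, r, worse)
          for m, (y, r, worse) in metrics.items()}
print(report)
# → {'dropout_rate': 'yellow', 'graduate_unemployment': 'green',
#    'exam_completion': 'red'}
```

The resulting category per metric would then be rendered as the colored shapes shown in Figure 1; in practice, as the text notes, the actual thresholds are set by a university committee and informed by national statistics.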
Using the software, data can be visualized in a variety of forms, including tables, bar graphs, line graphs, and pie charts (Figure 2) that can be hovered over with a cursor to provide information about what filters have been selected for generating the visual. These results are used for making comparisons across years, across departments/programs, and for reports for communicating with the government.

Figure 2. Example of Power BI bar graph based on synthetically generated numbers of passed and failed exams for a selected program over time.
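The filter-then-aggregate step behind a chart like the one in Figure 2 can be sketched as follows. The program names and all exam records below are synthetic, invented purely for illustration of how a dashboard filter narrows records before counting them per year:

```python
# Illustrative sketch: filter synthetic exam records to one selected
# program, then count passed/failed exams per year, mirroring the kind
# of aggregation behind the bar graph in Figure 2. All data is invented.
import random

random.seed(0)
programs = ["Philosophy", "Linguistics"]  # hypothetical program names
records = [
    {"program": random.choice(programs),
     "year": random.choice([2021, 2022, 2023]),
     "passed": random.random() > 0.2}
    for _ in range(500)
]

selected = "Philosophy"  # the filter a dashboard user would set

counts: dict[int, tuple[int, int]] = {}
for r in records:
    if r["program"] != selected:
        continue  # the filter excludes other programs' records
    passed, failed = counts.get(r["year"], (0, 0))
    counts[r["year"]] = (passed + r["passed"], failed + (not r["passed"]))

for year in sorted(counts):
    passed, failed = counts[year]
    print(year, "passed:", passed, "failed:", failed)
```

Each (year, passed, failed) triple would correspond to one group of bars; hovering in the real interface, as described above, surfaces which filters (here, the selected program) produced the counts.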
This software also operates within a political context where the inclusion of international students and the value of critical humanities scholarship are highly contested. Increased restrictions on international students and on courses taught in English in 2021 have gone hand in hand with rising general societal hostility to migrants, refugees, and asylum seekers in Denmark (Warren, 2023). That same year, the Danish Parliament also adopted a position “on excessive activism in certain research environments” (Dahl et al., 2021), with major parties voting in favor across the political spectrum, and which targeted critical research and teaching centering issues of race, gender, migration, and postcolonial studies. While study places for English-language taught master's programs were increased in 2024 to attract international students in order to address emerging labor market needs and declining birthrates, shorter program completion times mean that international students will face an even harder time gaining employment and establishing community in Denmark, where internationals and people from non-Western backgrounds, especially, face discrimination (Bjerre, 2022; Dahl and Krog, 2018). As the analysis below will demonstrate, Power BI helps to produce digital data according to the temporal rhythms and demands of austerity measures in Denmark in ways that have implications for the humanities and for the degree of (in)action on the educational outcomes for marginalized students.
Interview methodology
Semi-structured interviews were conducted in 2024 with 3 full-time humanities faculty holding either departmental leadership (2) or educational committee (1) roles, and with 8 administrative advisors working at the departmental (3), school (3), and university (2) levels at the same public research university in Denmark. These administrative advisors have responsibilities that range from planning the scheduling of courses and exams, to counseling university administrators based on trends in post-graduation employment and faculty research productivity, to assembling reports for quality assurance processes. Participants engage with Power BI at frequencies ranging from once a month to weekly, depending on their responsibilities. Interviews were held in person for 60 minutes, and participants were recruited based on their publicly listed positions at the university and through snowball sampling.
Questions were designed to investigate the following: (a) what is the relationship between Power BI and neoliberal transformations in Danish higher education, and (b) how do institutional actors make sense of the categories Power BI uses to measure and monitor student behavior? The interview protocol focused on how institutional actors understand what Power BI is and how they use it, what they see as its benefits (if any) and limitations (if any), and how they understand its relationship to higher education policy and decision-making procedures. Participants had the option to open the Microsoft Power BI software during the interview, but only two participants did so. Interviews were initially transcribed in NVivo. After reviewing the transcripts against the recordings for typos and inaccuracies, the transcripts were then pseudonymized, masking participant names and the names of people they mentioned, places, and university details.
To analyze the qualitative data from the interviews, I conducted reflexive thematic analysis based on Braun and Clarke's (2012, 2013, 2019) six-phase approach. The reflexive approach to thematic analysis emphasizes how the researcher is an active participant in knowledge production, and that codes represent the researcher's interpretations of patterns of meaning (Byrne, 2022: 1393). Rather than using predetermined codes or themes and applying them to the dataset, the researcher first familiarizes themselves with the data, develops initial codes, generates initial themes based on aggregate meaning across the dataset for answering the research questions, reviews the quality of the themes, finalizes the themes for coherency and consistency with the data, and finally, produces the report. An effort was made to code for both institutional actors’ own subjective accounts of their attitudes and experiences as well as for the researcher's own reflexive influence using a combination of open-codes and respondent/data-based meanings. Codes used to examine what a respondent directly said included how they characterized Power BI, such as “reliable,” “trustworthy,” “efficient,” “labor-saving,” “neutral,” and “scalable,” whereas latent codes for identifying underlying assumptions, ideas, and ideologies included “permission differences,” “real-time responsiveness,” “interpretation differences,” and “lack of training.” Here, I was interested in capturing the performance and experience of institutional actors’ ideologies and tensions within those experiences. Ultimately, four themes were identified: technical affordances; data democratization; Danish higher education policy's impact; and metrification's limits, described and discussed in detail below.
Given that higher education reform and the technologies that co-constitute it exist within a field of contestation, it is necessary to attend to organizational concerns and practices with empirical nuance (Ratner and Gad, 2019). Hence, this study investigates the design and marketed promises of Power BI, as well as the lived experiences of academic workers, to account for its situated organizational effects. Data and data visualization software like Power BI shape and are shaped by academic workers’ subjectivity, how they interact with both students and colleagues, and their experiences of space and time. Dashboards, as Tkacz (2022) explains, are “layered with perceptual orientations, cultural significations and epistemological qualities that format not only the data and elements that pass through it, but also the bodies and organizations that make use of it” (17). The findings below capture how the software works to structure social practices within Danish higher education in ways that participants actively navigate, negotiate, and sometimes resist.
Technical affordances
Part of the interview process was designed to get a sense of how each participant understood what Power BI is. Sociotechnical imaginaries, or “collectively held, institutionally stabilized, and publicly performed visions of desirable futures, animated by shared understandings of forms of social life and social order attainable through, and supportive of advances in science and technology” (Jasanoff, 2015: 4), actively shape the everyday sense-making of institutional actors and their understanding of the tools they encounter and engage with. At the same time, sociotechnical imaginaries do not overdetermine how individuals approach technology, as the relationship between technology and society is highly co-constitutive and contextual. As Sartori and Bocca (2023) explain, “in dealing with technicalities, different stakeholders shape technology while solving their conflicts over resources, affordances, and power” (444). In particular, affordances were crucial to how participants articulated what Power BI is, providing insight into how these participants perceive the software and negotiate how it mediates their agency within the workplace.
Technical affordances generally refer to “the potential for behaviors associated with achieving an immediate concrete outcome and arising from the relation between an artifact and a goal-oriented actor or actors” (Strong et al., 2014: 12). According to Nagy and Neff (2015), affordances are not reducible to conscious or rational expectations. Rather, affordances are both material and perceptual, in that users’ perceptions of what actions are available to them are shaped by their lived experiences, attitudes, and emotions, as well as the features of a given technology and the possibilities they engender (Nagy and Neff, 2015).
One of Power BI's key features is to provide users with what the promotional literature calls “data-driven” insights (Myers et al., 2023). Norton's (2021) critique of “data-driven” policies is instructive here, demonstrating how the idea of data-driven decision making is inherently misleading in that what data is included, excluded, and valued can never be neutral. Three administrative advisors used the language of data-driven decision making when explaining that Power BI helps with department and university-level planning processes, including using resources effectively, revising programs of study by looking at patterns in student behaviors such as exam performance, and benchmarking by comparing different program performance metrics to one another. For example, according to Erik, a university-level administrative advisor who routinely handles institution-wide educational data for the quality assurance process, the software provides users with “better abilities to, to take data-driven decisions and be enlightened about what's actually going on when you decide.” Erik felt that the platform improves his ability to take responsibility over his work and to make decisions based on the possibilities represented in the data.
Another affordance of Power BI that several administrative advisors emphasized was its role in making programs’ annual review processes more efficient by streamlining the quality assurance process and program governance more generally. However, Erik shared that “I think asking more people to look into BI, Power BI, and to take data seriously, it also requires at the faculty levels more employees that are good at looking at data.” Similarly, Astrid, an administrative advisor who does student data management work at the university level, felt that Power BI “requires a lot of people … to like interpret the data … [to] find the right data.” She noted the extensive data personnel required for the platform's maintenance and updating. The platform also facilitates data access and reporting in ways that require labor, including, these participants shared, from faculty who were previously more removed from processes of university data collection and management.
This type of reconfiguration of faculty labor has been described by Narkunas (2020) as a form of “trickle-down managerialism” that entices faculty to use the actuarial techniques of management in ways that undermine narrative modes of faculty expertise and judgment, which do not lend themselves as easily to quantification and standardization. Similarly, according to Moutsios (2022), academics become obliged under NPM “to speak the discourse of the bureaucracy, a discourse of meticulous productivity controls and hierarchical submission, which precludes any other form of communication” (285). Yet, in my conversations with participants, many of their relationships to Power BI were complex and ambivalent. While at times, our conversations were constrained by the protocols, imperatives, and conceptual frameworks baked into the software and the promotional rhetoric surrounding it, at other times, participants troubled these constraints.
For instance, while Anna, an administrative advisor for two departments, framed Power BI as essentially a data repository for searching and storing reports, Nora, a different department-level advisor, explained how she struggles with whether to conceptualize Power BI as neutral. She shared, “I tend to think of it as more neutral…or at least in the level that I use it in. I would like to think of it as neutral, but I'm pretty sure that it's not. I think sometimes the way the data is presented in the Power BI makes me aware of some things and maybe not others.” Erik also characterized the tool itself as neutral, while at the same time describing how it changed the logic and the culture of the university to be more in line with the idea of data-driven decision making. While the university previously used Excel files for intake and admissions, Power BI allows for significantly more data to be combined and analyzed in ways that are more standardized and scalable.
Data democratization
A second theme that emerged during the interviews was about data democratization. In the promotional literature for Power BI, data democratization “refers to putting data into the hands of more users who are responsible for solving business problems. It's about enabling more users to make better data-driven decisions” (Myers et al., 2023). Some participants, such as Jens, a school-level administrative advisor serving multiple departments with former experience as a Power BI developer for the university, explicitly used the language of democratization: “it [Power BI] has made it much more democratized to see educational data. When I started … if you wanted some specific educational data, then on each faculty, there were maybe two to five who could find the data … but right now, every employee can go and see the data.” Here, the notion of data democratization is tied to the idea of wider employee access.
While Microsoft (2024) Power BI's website presents the tool as empowering more users to better understand and contribute to a company or organization's management, at the same time, the most desirable state of data democratization according to the guidance documentation is one where there are “automated, monitored processes … anyone with the need or interest to use data can follow these processes to perform analytics” (Myers et al., 2023). This emphasis on automation is tied to the idea that tools like Power BI allow for faster, more objective and strategic decision-making while saving costs (Arcot Group, n.d.).
Data democratization within Power BI's marketing literature is closely connected to the notion of a data culture, meaning a set of behaviors and norms in an organization that are allowed, rewarded, and encouraged in order to promote a culture of “data-driven” decision-making that is “based on analytics, not opinion,” and that reduces reliance on “undocumented tribal knowledge” and on “hunches and gut decisions” (Myers et al., 2023). While data democratization promises to empower more people within the university, the university administration and the Danish ministry determine what data gets prioritized, how it gets sorted and classified, and what outcomes are considered desirable through the quality assurance process and funding-related policies.
It also became clear during the interviews that different participants had access to varying levels of data granularity based on their role within the university. Even participants like Aksel, whose role as an advisor to a dean requires engaging with Power BI frequently, shared that “it's, it's very easy to, at least for me, to think this is all the data there is, and the only way it can be represented … but because I’m not sort of a back end user of Power BI, I don’t really know how, actually, how you set it up.” Aksel's insights raise the question of the extent to which Power BI can be considered democratizing, revealing tensions between being able to see the data, on the one hand, and not necessarily feeling that one has decision-making authority over what data is visualizable and why, on the other. Furthermore, according to the promotional literature, the vision of an organization's healthy data culture should originate not bottom-up from workers, but from the executive level, to be enforced through praise, recognition, and reward (Myers et al., 2023).
At the same time, according to some participants, Power BI helps them influence others within the institution, particularly those that have more formal decision-making power within the university structure. For example, Anna described how part of her role is to help leaders make decisions, including by explaining what the data shows and what might be the tradeoffs of different choices. Others explained that they have seen Power BI used to correct what they feel are misperceptions of a given department, such as a dean demonstrating that graduates from a given humanities department are indeed placed successfully in internships with private industry during their studies. In contrast, a school-level administrative advisor felt her job was best described as presenting the numbers for others to deliberate about, rather than influencing decision-makers.
Power BI mediates discussions at multiple levels of university reporting and decision-making procedures, including between the university and the ministry, between students, faculty, and labor market representatives during advisory board meetings, between faculty members in a given department, and between faculty and students serving on program-level educational committees. In Nora's experience, faculty in her department frequently questioned the data in terms of whether it's measured in the right way, where it comes from, and whether the interpretation of the data strikes an adequate balance between the relevancy of education for the labor market and sticking to “the basics of what's the education about.” Similarly, a former department head felt that faculty were frequently debating the usefulness and significance of Power BI data. However, according to several other participants, there are many faculty who do not participate in these deliberations, either because they are “turned off by data,” “not very excited about data,” “overwhelmed by the huge amount of data,” or not trained for or familiar with the Power BI system. Furthermore, according to Anna, many faculty do not know that such a system exists, especially those with very few administrative responsibilities.
My conversations with participants also reveal how Power BI-related deliberations, reflections, and decision-making are sites of larger ideological, political, and epistemic struggles over how best to understand the purpose of higher education and how to manage it effectively. Those who hold primarily administrative roles were more likely to understand resistance to Power BI in epistemic rather than political terms. Notably, Jens framed faculty researcher resistance to Power BI as a matter of failing to understand the system and its corresponding priorities. Jens explained, “Our researchers especially can’t understand because they’re more focused on the quality and the professionalism at each study … they don’t understand why the most important indicator is unemployment rates.” More pointedly, Erik described faculty resistance as grounded, for some, in what he called “ideological suspicion.” He shared, “there are academics who say that this data-driven logic and this way of talking about educational quality and performance, it's out of line with what we’re meant to do as a university…. Why, why should we even monitor all these things? And why should I, as an academic, use my time looking at data and graphs instead of teaching and talking to students?” However, Erik also felt that while faculty used to be more oppositional, this is increasingly less the case.
Furthermore, Ida, a faculty member with significant educational committee duties, noted how the type of data Power BI collects is privileged over other methods of arriving at answers to questions, and how this privileging is part of a larger emphasis in Danish society on data-driven decision-making in public organizations. She explained, “I think the overall paradigm, like societal, ideological paradigm is this, that it's sort of suspicious if something is out of reach data-wise … sometimes we don’t need more data, would be my response. Maybe we need to go ask people.” For Ida, it is precisely the slower-to-acquire, non-standardized, highly contextual, and directly interpersonal information that Power BI metrics fail to account for, given how the data is sourced and represented. This insight is consistent with much of the existing literature on data-driven decision-making and the datafication of higher education (see, for example, Jarke and Breiter, 2019; Ratner and Gad, 2019; Taylor, 2020). However, some university management processes at this institution, such as faculty hiring, do rely on qualitative comments. Aksel shared that comments regarding the likelihood that a given candidate will “integrate” into the department, the university, and, more broadly, Danish society are documented via Excel. This type of information gathering can contribute to discrimination against candidates from marginalized backgrounds and open up space for managerial decision-making based on personal connections and network ties (Nielsen, 2016).
The impact of Danish higher education policy
Through conversations with the participants, it also became evident that concerns about employability, funding, accreditation, progress to degree, and study intensity played a significant role in how Power BI metrics were constructed and interpreted. All of the participants shared that employability metrics, sourced from Denmark's national data warehouse, are weighted significantly in funding deliberations, with disparate consequences for the humanities given that students majoring in the humanities take, on average, more years to find full-time work.
Furthermore, as Camille, a director of studies for one of the university schools, explained, the strategic contract between the university and the ministry of higher education specifies a required number of collaborations with external firms in order to fast-track students to the workforce through internships. Such policies and practices have resulted in reductions to the permitted intake of students who want to major in the humanities, and in fines for universities that exceed their permitted intake allotments. Camille shared that “the unemployment amounts, especially in the humanities programs, have been used as the argument for very severe cuts in our programs … even though we can have all kinds of anecdotes … stories about individual students and the career path afterwards, we just have to also know something about these figures.” In this sense, the direct, personal knowledge that academic personnel have access to due to their proximity to students is undervalued, while quantitative unemployment data, largely abstracted from the complexities of social reality, is privileged as the basis for punitive financial measures.
Such punitive financial measures include direct funding cuts as well as restrictions on student intake allowances based on how long it takes students to graduate and find a job, as well as reported drop-out and student satisfaction rates. Camille described feeling torn because on the one hand, she shared, “I don’t want to educate people to be unemployed. But in another way, it also pressures [us] to, to look at our programs … in a certain way…. I think we’re also running the risk of looking at … students … as numbers and the statistics.” Camille's comments were about a potentially dehumanizing administrative gaze, where students become objects of institutional knowledge for monitoring, nudging, and assessing their performance, as well as an overreliance on employment data for making decisions about programs. Leonard D. Taylor (2020) argues that the overreliance on quantitative data for institutional efficiency is often “indicative of economic logics” (1084), which prioritize cost-savings, efficiency, and labor-market outcomes.
This emphasis on fast-tracking employment coincides with discourses and practices concerning study intensity, meaning how many hours students report spending on their studies. For example, Malthe, a former department head, explained that “the government wants to make sure we offer full time education…. There's a moral thing about, the students have to do full time work and full time workload and not sit at home and play computer games … it's also because of them giving subsidies and so on.” Several participants were quite critical of the study intensity metric, in part because it is self-reported by students who may not log or experience their effort in the same way. Relatedly, the questionnaire that collects study intensity data might be sent out right before an exam for some students, and for others, during a lull period, depending on their program. Nonetheless, and especially for humanities departments, study intensity attracts the attention of politicians who argue that the humanities fail to provide adequate preparation for the job market and are not rigorous or sufficiently scientific (Frydenlund, 2021; Myklebust, 2019).
Anna also noted that when study intensity metrics are classified into the red, yellow, or green categories (Figure 1), the red category applies when students self-report below the desired intensity of 30 hours per week, but not if they are working excessive hours. Anna remarked, “it's only a problem if it's too low. It's not a problem if it's too much. I think then that's saying something about the pressure … and I also think … as a student, do I know how much time I’m using on my studies? I don’t think so. I don’t think I did.” Similarly, Jens shared that “there is a contrast [between student wellbeing and progress to degree related metrics] in the way the government wants our students to be faster and faster.” Despite an increasing emphasis on student wellbeing among administrators and the ministry in the wake of COVID-19, Anna's point demonstrates how the Danish social welfare state seeks to ensure that students are working “hard enough” in exchange for access to financial resources from the state while simultaneously interpellating them into roles as informants against their programs and instructors through surveys. This idea of moral worthiness within the Nordic welfare state shapes debates about access to the student grant system, including arguments that foreign students are a drain on the educational resources available to Danish students despite evidence to the contrary (Wright, 2022: 104). However, Ida describes using Power BI to help reduce student stress by ensuring students are progressing as smoothly as possible through their studies and modifying programs if needed to better streamline examinations and project deadlines. Student wellness is also bound up with instrumental imperatives for the university, including student retention and the taking of examinations, as departments receive funding based on successful student examinations.
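The asymmetry Anna describes can be made concrete in a short sketch. The cutoffs below, including the yellow band, are illustrative assumptions on my part, not the actual configuration of the university's Power BI dashboard; only the 30-hour target and the one-sided flagging are drawn from the interviews.

```python
# Hypothetical sketch of the one-sided traffic-light rule participants
# describe: hours below the 30-hour target are flagged, but excessive
# hours never are. Thresholds are illustrative assumptions.

TARGET_HOURS = 30  # desired weekly study intensity

def intensity_flag(reported_hours: float) -> str:
    """Classify self-reported weekly study hours into a traffic-light category."""
    if reported_hours < TARGET_HOURS * 0.8:  # well under target -> red
        return "red"
    if reported_hours < TARGET_HOURS:        # somewhat under target -> yellow
        return "yellow"
    return "green"  # at or above target: overwork raises no alert

print(intensity_flag(20))  # red
print(intensity_flag(27))  # yellow
print(intensity_flag(55))  # green: a 55-hour week is invisible to the metric
```

The point of the sketch is the missing upper bound: a student reporting 55 hours per week is classified identically to one reporting exactly 30, which is what Anna identifies as "saying something about the pressure."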
The tracking of student progress to degree is also part of how Power BI represents and reproduces the temporal structure and rhythms of Danish higher education governance. As data visualization software, Power BI modulates the knowledge and choices of academic workers in ways that eliminate the perception of any delay in the processing of information and the presentation of options. This imperceptibility creates the sense of “realtimeness” that users of Power BI experience. According to Weltevrede et al. (2014), “realtimeness refers to an understanding of time that is embedded in and immanent to platforms, engines and their cultures. Following the idea of such immanent and device-specific time further, realtimeness brings to attention how the specificity of time cannot be accounted for from the outside, applying extraneous measures, but only from the inside, tracing the increasing or decreasing intensity of pace in each device and its internal variation” (143). In this sense, Power BI helps to structure the subjective experience of time within the university through the sensation of realtimeness that academic workers describe when using the interface, and through the ways the software is keyed to policies regarding student progress to degree.
In light of the spate of market-oriented higher education reforms beginning in 2003 and continuing in the present, students are expected to finish their studies faster than ever before and get jobs as quickly as possible, and this requires monitoring student progress, retention, and placement frequently and with unprecedented granularity. According to Nora, Power BI allows for “some kind of feeling of being able to monitor and control what we’re dealing with.” Similarly, Camille emphasized, “the great thing is that it is dynamic. Instead of receiving these updates from the university, you know, two times a year about how many students we have, I can actually go in and see this morning we have this number of students.” The notion of monitoring students in the here and now speaks to the ways Power BI helps to entrench a notion of real-time responsiveness and reactivity to the vagaries of the labor market and student performance. The software is a vehicle for carrying out reforms that aim to “squeeze study time, to optimize learning in fewer time units and to discourage what is now termed ‘detours’” (Risager and Thorup, 2016: 13), an approach that understands the purpose of education primarily in economic and vocational terms.
However, both the labor market and student performance are subject to forces outside the control of both individual students and the university. The impact of social structures like gender, race, and class on labor market opportunities and outcomes is well documented (Johansen et al., 2017: 266). Furthermore, Anna explains that “there is a danger that because you can go into Power BI all the time and you can see all these things, and also from above we need to look into this all every year … I think there is pressure to, to find things to put in there.” For Anna, a risk is that Power BI pressures academic workers to identify, measure, and solve problems based on how the system is designed to frame and interpret them.
Metrification's limits
The final theme that emerged from the interviews concerned the limits of Power BI's metrics. At times, participants described this quite explicitly, from incongruence between Power BI's numbers for post-graduation employment rates and the ministry's, to its classification of students according to the gender binary, subsequently erasing the identities and experiences of non-binary students, to the crudeness of categories like “other reason” or “personal reasons” when it comes to self-reported data about why students drop-out, to the absence of attention in Power BI to how recruitment decisions might be shaping students’ post-admission experiences.
Many of the participants also spoke about the risk of “misinterpreting” statistics and reports in Power BI, especially in the case of programs with small numbers of students. For instance, Ida shared that “unless you have a lot of students, and I don’t … stuff can happen, right? So, all of a sudden, five people drop out, and it looks like, oh, something bad happened, but it's just five people…. It's so easy to get caught up in stuff that…just happened.” Power BI's data visualizations, including the color-coded risk schema and arrows representing trends, suggest that the user can and should be reactive (Figure 1), meaning that a problem exists and can be addressed through the future decision-making of academic workers like Ida. Yet, as Ida and several others expressed, not all knowledge is actionable. Additionally, in departments with smaller numbers of students, a small change can be represented in Power BI as a significant one in need of immediate attention (red coded). This intensifies the vulnerability of smaller programs to intake cutbacks and closures. It also requires academic workers in the humanities and other smaller programs to explain during the quality assurance process how the Power BI metrics fail to do justice to the department's circumstances and what is actually in their control. More broadly, given their emphasis on hedging against uncertainty, dashboards tend to prioritize speed and decision-value over the production of “facts” (Tkacz, 2022: 180).
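The small-numbers problem Ida raises is simple arithmetic, which a brief sketch can illustrate. The cohort sizes below are invented for illustration and are not drawn from the study's data.

```python
# Illustrative arithmetic (figures are hypothetical): the same five
# drop-outs register very differently depending on cohort size, which is
# why small humanities programs can swing into a red-coded category on
# ordinary year-to-year fluctuation.

def dropout_rate(dropouts: int, cohort: int) -> float:
    """Drop-outs as a fraction of the cohort."""
    return dropouts / cohort

small = dropout_rate(5, 25)   # 5 of 25 students
large = dropout_rate(5, 500)  # 5 of 500 students

print(f"small program: {small:.0%}")  # 20%
print(f"large program: {large:.0%}")  # 1%
```

A dashboard that color-codes percentages without surfacing the underlying counts treats these two situations as categorically different, even though, as Ida puts it, "it's just five people."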
Yet, it is also important to note that there are always multiple ways for Power BI data to be combined and interpreted. For example, basing an assessment on a given program's performance over the course of one year versus four years could provide a very different picture of that program. Additionally, a 2019 analysis from the Danish Ministry of Education showed that since 2007, an additional 17,000 humanities graduates had found employment while unemployment had risen by only 1600 people, and that humanities graduates were no more likely to be unemployed than graduates of several other fields (Ejlertsen, 2021). Nonetheless, the University of Copenhagen's management proceeded with cuts to student intake, even though a dramatic round of reductions in 2015 had yet to be fully implemented and assessed.
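The point about assessment windows can also be sketched numerically. The intake figures below are invented for illustration; they are not the study's data and do not correspond to any actual program.

```python
# A toy illustration (invented figures) of how the choice of assessment
# window shapes the picture a dashboard presents: the same program looks
# alarming over one year and nearly stable over four.

intake = {2019: 50, 2020: 52, 2021: 60, 2022: 48}  # hypothetical yearly intake

one_year_change = (intake[2022] - intake[2021]) / intake[2021]   # vs last year
four_year_change = (intake[2022] - intake[2019]) / intake[2019]  # vs 4 years ago

print(f"one-year change: {one_year_change:.1%}")    # a sharp 20% drop
print(f"four-year change: {four_year_change:.1%}")  # a modest 4% decline
```

Which window a report defaults to is therefore not a neutral technical setting but an interpretive choice with material consequences for how a program's "performance" is read.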
In my conversations with participants, it also became evident that Power BI is not consistently used, nor designed, to meaningfully monitor or measure possible discrimination or inequity. Reflecting on drop-out rates, Erik remarked that: We've been reluctant to act upon the knowledge that we have in the management, because it requires us to act different than Danish culture usually does. We are not very fond of saying, ‘oh, you belong to a specific group of students. You need extra care or extra….’ It's not a very Danish thing. We keep thinking we are all equal and the same. So we shouldn't do sort of specific activities directed against, for instance, people with different ethnic groups…. If you have a certain kind of exam from your high school, and if you have a certain background, we can see there are some patterns. But we don't act upon it. And we don't monitor it sort of systematically.
Erik is describing the ways that management actively sidesteps knowledge about the impact of structural inequality on student outcomes. There are significant Nordic data gaps regarding access and opportunity for students from ethnic and racial minority backgrounds, despite research showing that ethnic background and race, among other imposed markers of social difference, significantly shape educational outcomes (Isopahkala-Bouret et al., 2018). Furthermore, there have been no Danish higher education policies introduced with the explicit goal of widening under-represented groups’ participation (Thomsen, 2021: 160). Without a radical reimagining of higher education policies in Denmark, increased data collection alone about marginalized students could intensify as opposed to ameliorate structural inequities. For instance, the Danish nation-state's extensive tracking of “Western” versus “non-Western” immigrants and their descendants has been used to reinforce racialized forms of exclusion (Bjerre, 2022; Vertelyté and Li, 2021). Additionally, according to Astrid, there is a coming push to incorporate predictive analytics for student success into Power BI at their institution. In the United States, such efforts have often reproduced and intensified historically entrenched inequalities in educational access and intervention (Weinberg, 2024; Whitman, 2020). Recent advancements in Microsoft's Generative AI tool, Copilot, promise to help Power BI users analyze the data and automatically create reports and visuals (VS and Verneen, 2025), raising further questions about the extent to which ambiguities, uncertainties, and inequities will be represented.
Conclusion
This article has argued that Power BI is a tool for constructing students as objects of institutional knowledge for meeting performance metrics generated in response to top-down higher education policy in Denmark, with significant consequences for the humanities, for the labor demands placed on academic workers, and for the rhythms and temporalities of student monitoring at this university. Its findings contribute to critical scholarship on the relationship between dashboards and issues of surveillance, transparency (Komljenovic et al., 2025), and labor intensification (Williamson and Kizilcec, 2022), as well as on educational policy's temporal dimensions (Tierens et al., 2024). Furthermore, building on STS scholarship concerning how data visualization software contains embedded values and contributes to complex techno-organizational practices (Ratner et al., 2018; Ratner and Gad, 2019; Vanermen et al., 2024a), it has also demonstrated how academic workers negotiate and navigate this data visualization software in complex and highly ambivalent ways. Further studies are needed to assess the applicability of these findings within a range of departmental settings, including in STEM fields that are significantly less vulnerable to enrollment challenges and de-funding, and across Danish universities.
While outside the scope of this article to discuss in detail, it is important to note that larger shifts under NPM have also made academic workers more precarious in the face of cutbacks, program restructuring, and the rise of temporary contracts (Rogler, 2019). Much like students are increasingly datafied for tracking their progress to degree and theoretically streamlining their entry into the labor market, faculty research productivity, schedules, grant efforts, and cost to the institution are increasingly monitored. Notably, an administrative advisor who works closely with upper management shared that there is a Power BI for tracking faculty, but that few faculty know about it. This lack of transparency was justified on the grounds of the General Data Protection Regulation (GDPR), even though student data is also subject to GDPR requirements. This use of Power BI undermines claims that the software provides an empowering form of data democratization for all stakeholders across the university. Further studies are needed that shed light on and examine the relationship between Power BI and faculty performance monitoring.
Moutsios (2022) warns us that “bureaucratic organisation establishes powerlessness amongst its functionaries, because it hinders deliberation, reflection, and decision-making” (389). What this study of Power BI reveals is that the ideologies and practices of NPM at this institution thoroughly pervade the design of Power BI. Yet, neither academic workers nor students are devoid of agency. Several modes of resistance to Power BI came up during interviews, including prepping students on what to say on surveys in order to strategically impact Power BI metrics, incorporating more exams into plans of study in order to generate more funding, and more generally, faculty disengagement with the tool itself. On the one hand, such tactics can be conceptualized as infrapolitical forms of resistance (Scott, 1990) in that they are critical but less visible modes of opposition and protest to the dominant order of institutional governance. However, these modes of resistance can also be limited. For instance, seeking to game Power BI's metrics can nonetheless shore up the metrics' institutional legitimacy. Furthermore, disengaging from Power BI is, at present, largely a tactic of individual rather than collective refusal. Given that data-intensive tools like Power BI turn subjects into data that can be aggregated, tracked, sorted, and combined, we might draw from Poster's (1995) invitation to “understand the forms of agency appropriate to a dispersed, multiple subject and to generate strategies of resistance appropriate to that identity formation” (93). Perhaps, then, these infrapolitical forms of resistance need to be channeled into a renewed, explicit, and collective demand from faculty and students for greater involvement in the governance and reform of Danish higher education.
There is historical precedent for organizing against more illiberal and anti-democratic strains of higher education governance in Nordic countries, including the student protests of 1968 in response to technocratic master plans and restrictive curricular reforms, which ultimately led to greater student participation in university governance and a transformation of the national political scene in several countries. Arguably, the rise of NPM is in reaction to the successes of these earlier periods of democratization. More recently, the “Different University” movement at Danish universities in 2014–2015 included a series of occupations of administrative spaces and other actions in response to the Study Progress Reform and Dimensioning Plan. This movement argued that the university should be run by students and faculty rather than an undemocratic managerial leadership structure and emphasized the pursuit of knowledge for the creation of a more just society rather than labor-market relevance. Both of these movements emerged within the context of global struggles happening on campuses around the world (Risager and Thorup, 2016).
The humanities are necessary in order for society to know itself, including how history, power, and social structures shape our complex and highly unequal world. Certainly, a drive to know the world is not always virtuous: university researchers have historically and presently produced knowledge used for systems of social stratification and oppression (Boggs and Mitchell, 2018; Weinberg, 2024). At the same time, the humanities and humanistic social sciences provide crucial ways of examining issues of race, gender, migration, and inequity, including Denmark's imperial and colonial history, which significantly shapes Danish society and education in the present (Bjerre, 2022; Jensen, 2016). Without the ability to train students and produce knowledge in fields that challenge the dominance of economic logics over the mission of the university, free and independent inquiry is imperiled. We must imagine, as Brown (2018) argues, “a university oriented by worldly cries, perils and needs, and to imagine the research and education – basic and specialized, humanistic and technical, big picture and local – that would be responsive to these cries, perils and needs” (54). Such visions of the university are necessary for cultivating democracy and avoiding despotism and anti-intellectualism. It is not too late to demand the just and democratic governance of universities so that all students and faculty can contribute not simply to labor force development, but to the transformation of their society.
Acknowledgments
This project benefited from feedback and discussions with Maja Hojer Bruun, Lone Koefoed Hansen, Cecilie Eriksen, Ciara Kierans, Christian Ulrik Andersen, Lene Aarøe, Nicholas Haas, Peter Danholt, Andrew Latham, Jes Bak Sørensen, and Andreas Roepstorff as well as feedback received through the Aarhus University Centre for Science Studies colloquium. The author is very grateful to the research participants who volunteered their time and effort on this project, and to the Aarhus Institute of Advanced Studies and the Shaping Digital Citizenship Research Center for the fellowship that made this project possible. The author also thanks the editors and anonymous reviewers of Big Data & Society for their detailed and constructive feedback, which greatly improved the manuscript.
Ethical approval
This study received approval from Aarhus University's (AU) Research Ethics Committee, Arts (2023–028).
Consent to participate
Participants consented to participate in writing after reading and signing an information sheet explaining the study, time commitment, risks, and participant rights.
Consent for publication
Participants consented in writing to have excerpts from their interviews quoted anonymously by the researcher.
Funding
The author disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This project was supported by the Aarhus Institute of Advanced Studies and the Shaping Digital Citizenship Research Center at Aarhus University.
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Data availability statement
Participants did not consent to have their data included in a public data repository.
