Abstract
The past few decades have witnessed the global enforcement of ‘governance by data’ in education policy, including a significant increase in assessments and quantified evaluation. Within this context, this article focuses particularly on the intensifying development of new (digital) information technologies and ‘mediated’ infrastructures of data flows. The premise is that such technologies and actors of mediation reveal a crucial potential for implementing a new mode of digitalized governmentality in education, which, as ‘governance by big data’, reaches far beyond policy, into educational administration, school practice and individual learning activities.
The strategic mediation of (big) data in education (such as that generated through assessments) involves actors, structures and technologies that operate between policy, politics, administration, schools and individuals, as well as between data production, data consumption and the data itself, for example by applying practices of data visualization or by providing technical data services around software and databases. However, as this article seeks to demonstrate, such mediators comprise various types of actors who operate very differently within the diverse sectors of education policy, indicating a highly ambiguous yet powerful composition of digitalized governmentality.
Introduction
During the last 25 years, novel structural transformations have fundamentally changed the modes of public educational governance (Ball et al., 2013). These transformations manifest themselves in a rising globalization or transnationalization of education and education policy, which has at the same time increased the involvement of new state/non-state, often multi-level or cross-sector actor constellations, eventually shifting governmental power away from state-centred modes of control (Ball, 2009; Hornberg and Parreira do Amaral, 2012). Both trends are closely related to an increasing fabrication of big educational data through new, often digitalized information technologies, designed to improve and accelerate the application of educational knowledge through numbers and statistics (see also Lingard et al., 2012).
Considerable efforts have been made in recent years to study and discuss the complex mechanisms of educational transnationalization (e.g. Barnett and Finnemore, 1999; Lingard et al., 2013; Martens et al., 2013; Robertson and Dale, 2015), which, inter alia through a steady expansion of international performance assessments, has shifted governmental power towards international organizations such as the OECD. There is little doubt that this increasing transnational influence on education policy has established ‘new centres of gravity’ and caused a formal convergence of political reform strategies towards intensified standardization, ‘datafication’ and accountability. However, there has also been a growing controversy about how the new modes of global governmental influence have really affected education beyond politics (Bieber et al., 2016; Teltemann and Klieme, 2016). In fact, various dynamics and repercussions have been documented, which include ambivalent changes in classroom practices, processes of hybridization (e.g. between standardization and increased school autonomy) or path-dependent resistance (Beech, 2011; Gogolin et al., 2011; Robertson and Dale, 2015; Sassen, 2003; Steiner-Khamsi, 2006). Following Teltemann and Klieme (2016), an essential reason for these diverse findings lies in the complex, often country-specific interplays between cultural traditions, stakeholder relations, political opportunity structures and discursive constellations, which, in combination, operate either as driving forces for, or as resisting forces against, global reform diffusion and policy convergence.
Building on these works, this article argues that the increasing mobilization of new (digital) data technologies can be identified both as an effect of international assessments and as an ongoing driving force, which has the power to infuse various sectors and levels of education with new governmental constellations. Surrounded by digital data and new information technologies, such constellations operate beyond formal political decision-making and opportunity structures by systematically implementing and authorizing new types of actors and regulating structures, which are rarely actively involved in political arenas of reform negotiation, but rather operate as ‘invisible’ data and technology mediators. Following Lawn (2013: 9), ‘th[ese] new managers of the virtual landscape are hidden elsewhere’ and (so far) appear rather difficult to observe. For instance, through applying practices of data visualization or offering data services around the production and consumption of assessments, software or databases, such mediators are located somewhere between data producers, data consumers and the data itself. As new technical policy experts,
they are involved in linking the (digital) technologies of software engineering and data processing with various educational assessment structures. For example, it is not the OECD alone that is managing the PISA
assessment production; rather, this depends on complex partnerships with different global and national data service contractors (Bloem, 2016; Flitner, 2006), which have rarely been the subject of in-depth analysis in discussions of PISA’s impact. Consequently, […] researchers need to examine the actors involved in producing [data] visualizations, ask what data they are using, how those data have been formed, as well as interrogating ‘what software is used in the analysis, what code or algorithms shape the data and the visualization’, in order to ‘treat these visuals seriously as they come to envision the social world’. (Beer, 2013: 118f, cited in Williamson, 2015a: 9)
While ‘mediated’ data and assessment processing comprises a growing number of international organizations, it also involves global scientific or industrial networks, consulting or research institutions, philanthropic investors and private or for-profit actors, who altogether act as ‘partners’ within educational reform projects by providing evidence-based expertise, support or funding (Fabricant and Fine, 2013). However, while considerable efforts have been made to analyse such constellations within the promotion and enforcement of large-scale educational reform, there has, at least so far, been little scholarly study that specifically focuses on the mediators of digital technologies and data.
As this article seeks to demonstrate, data mediators have furthered the legitimation of non-state policy actors by equipping them with innovative technology and data-based ‘evidence’. Particularly with the rise of digital reform solutions, many intermediary actors, such as think tanks, philanthropies or edu-businesses, have created in-house ‘policy innovation labs’ (Williamson, 2015c) that focus on the successful implementation of digital data technology in education. For example, the Alliance for Excellent Education (www.all4ed.org), a highly influential US think tank and key player in the American standards-based reform movement (Hartong, 2015), promotes digitalization through its Center for Digital Learning and Policy. The Center […] advances an ambitious agenda for national, federal, state and district policy reform efforts relating to the effective use of technology in K-12 public schools. The Center is a strong proponent of supporting teachers, school leaders, districts and states as they navigate the shift to more robust, digital learning environments. The work of the Center includes the development of publications, policy briefs, and case studies highlighting promising practice. The Center seeks to connect the use of digital learning in support of the Alliance’s major policy areas […]. (http://all4ed.org/about/center-for-digital-learning-and-policy/)
Given this wide spectrum of activities, the Center for Digital Learning and Policy mediates not only between the Alliance for Excellent Education and various national, federal, state, district or local educational actors, but also between digital spheres of technology and (big) educational data on the one hand, and ‘real-world’ spheres of policy, administration and school practice on the other hand. It is argued here that such new constellations of data mediation provoke a new type of governmentality, which ultimately operates ‘across pre-existing territories and scales’ (Lewis and Lingard, 2015: 623), while shifting ‘governance by numbers’ (Grek, 2009; Ozga, 2009) towards governance by big data.
In accordance with this idea, the remaining parts of this article are structured as follows. The second section is devoted to a further clarification regarding the concepts of big data and digitalized governmentality. The third section then provides a deeper observation of new data mediating actors who exercise governmental power by (often) operating (invisibly) within an increasing number of educational contexts. The fourth section focuses on educational subjects, which in a world of digital-era governance increasingly appear as self-disciplinary prosumers. The paper closes (in the fifth section) by discussing the main findings and sketching implications for future research.
Big data, digitalized governmentality and the ‘recomposition’ of governing spaces
The central objective of this article is to demonstrate how new technologies, as well as mediating actors of (big) data flows between the global and the local, have been playing a crucial role in the emergence of a new mode of digitalized governmentality in education, which, by exercising disciplinary power (Michel Foucault), has the potential to reach far beyond policy, into educational administration, school practice and individual learning activities. Digital governmentality thus supports the idea of a gradual dissolution and reconstruction of governing spaces, shifting the regulation of education towards new ‘policy scapes’ (Carney, 2009). In other words, through its various ways of being integrated into knowledge production and consumption, as well as into infrastructures of accountability, the strategic mediation of (big) data operates as a powerful instrument, fabricating performativity in the direction of globalization while simultaneously transforming the ‘governed self’.
Firstly, however, it seems important to clarify that not all large-scale data, such as that produced in the context of PISA, automatically qualifies as big data. As recently defined by Gartner (2015), big data is ‘high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision-making, and process automation’.
Following this definition, the unique feature of big data results from its size, increasing rate and range of formats, which together require digital and automated computer processing. In other words, big data production does not refer to static, sporadic calculations, but instead relies on databases that are simultaneously standardized (hence constantly processible) and dynamic in the sense of being interoperable: users constantly feed the database with as much data as possible, which enables adjustments at maximum speed, that is, technically mediated real-time regulation.
In this respect, individualized, automated learning technologies might in fact fall into the category of big data production more readily than, for example, the PISA assessment, which still collects highly generalized and aggregated data at three-year intervals only. Nonetheless, PISA also depends heavily on complex interplays between statistical programs, technical expertise and automated data processing, for example when different national data sets are synthesized into one global database, or when OECD subcontractors transfer data between each other. Furthermore, the OECD is constantly developing new or extended assessment instruments, such as PISA for Schools or Education GPS (see the fourth section), which seek to collect more detailed data through background questionnaires or interoperable platforms. Consequently, the OECD has developed an extensive database, built on the constantly growing amount of PISA data, which is increasingly reused by national data institutions, data service providers, state departments or research organizations. OECD data is thereby combined with various other and new data sources to produce instruments such as (national or state-related) monitoring products or devices for local capacity building (including individual learning monitoring).
In summary, even though PISA, at least so far, does not constitute big data in the strict sense of thick, individualized and high-speed data, PISA and the OECD have still triggered a growing orientation towards large-scale, small-scale and big data, ultimately pushing for more digital and smart data processing solutions.
This trend towards big data production as high-volume, high-velocity and high-variety information in education is closely related to a more general shift in social policy from governing by ‘new public management’ to ‘digital-era governance’ (Margetts and Dunleavy, 2013), conceptualized here as digitalized governmentality.
While the new public management approach widely stressed fragmentation, competition and incentivization, digitalized governmentality focuses on the reintegration of (holistic) services for citizens and implements thoroughgoing digital changes in administration. This digital administration of citizens mainly relies on […] discover[ing] patterns and reveal[ing] things about who we are as individuals and populations based on patterns in transactions, activities and conduct recorded in different databases. (Ruppert, 2012: 117)
Through consulting such databases, politics and administration are then able to generate up-to-date evidence for more efficient, more effective and more rational decision-making. Ruppert (2012: 117) consequently refers to the increasing relevance of ‘modulating controls’, which are linked to public (but also non-public) structures of administration (Williamson, 2015b: 1) to allow for direct regulative access. From the perspective of software engineering, such interoperating databases work largely through the application of flexible user interfaces.
Observed from a governmental perspective, software engineering as the programming of code and software always carries a performative dimension, because fixed sets of handling instructions reduce options for both perception and interaction (user with software or user with user), all of this mediated through (possible) programming commands (Edwards, 2015a: 265; Williamson, 2015a: 85). Consequently, by ‘performing actions upon data’ as well as by ‘holding data within its boundaries’ (Berry, 2011: 33), code and software create performativity and reactivity on the part of the software engineer as well as the user. The users’ perception of digitally and software-mediated, contextualized knowledge is thereby widely limited to what the software or program allows them to see (e.g. searching within the defined categories of a database). Simultaneously, the context of production is shifted towards an invisible background, which needs to be taken for granted in order to work effectively with the software or the database.
In this regard, a crucial mechanism for provoking performativity and reactivity lies in practices of visualization as constructive performances, which are intended to improve the readability, visibility and usability of complex data and software. Specific visualization formats hereby allow for certain user readings, while simultaneously restricting or even excluding alternatives. In particular, as increasingly demonstrated by sociologists of quantification (e.g. Espeland and Stevens, 2008), numerically ordered visualizations, such as graphs, rankings or distribution curves, exert a tremendous power of performative functionality: Numerical measures [and visualizations] produce a world knowable without the detailed particulars of context and history. The constituent units can be compared and ranked according to some criteria. […] The interpretations lurk behind the numbers but are rarely presented explicitly. (Merry, 2011: 84; see also Heintz, 2010; Lawn, 2013: 7)
With the rise of digital technologies, the expansion of data processing and the overall shift towards data-based decision-making, the demand for such functional (and particularly numerical) visualizations has constantly increased. Visualization has therefore turned into an attractive and lucrative service offered by data mediators.
At the same time, an intelligently visualized and mediated application of digital technologies still leads users to believe they are working with ‘raw data’ or ‘raw software’, which can be manipulated individually, autonomously (Manovich, 2013: 32) and creatively. As envisioned by many promoters of digital learning, using data and digital technology could then even turn into a fun activity, inciting students and teachers to improve their learning and teaching.
Despite this performative logic of technical code processing, the introduction of digital technologies into a growing sector of education has also increased the impact of different cultural, social or market-driven influences on the interaction with code and software. Ultimately, this leads to a […] mix of information, expertise, passion, suspicion, […] [negotiation] and pragmatism, in addition to the application of complex mathematics (Gorur, 2014: 67)
which together co-produce the selection, contextualization and presentation of (big) data.
In fact, recent studies have brought new attention to such ambiguous influences on software engineering and (big) data processing in education. For example, Bloem (2016) provides an exceptional insider perspective on the PISA production process. She points out the substantial pressure on OECD data analysts and researchers that emerges from the obligation to constantly meet very strict timelines as well as extensive quality requirements. Simultaneously, she shows how the political interests of the OECD Secretariat push data processing towards a continuous fabrication of ‘interesting’ results, which the OECD can then use to publish attractive policy material.
Another example is Gobby (2015), who analyses the newly implemented platform MySchool in Australia, originally designed to serve as a ‘calculative device’ for parents to choose high-performing schools for their children. Gobby shows that calculations such as those produced in the context of MySchool invisibly oscillate between requirements of algorithmic formulation (how the database is designed to be used) and influences of intuition or judgement (how the database is ultimately (not) used by parents).
In summary, while the growing orientation towards data-based evidence and digital technologies in education is usually celebrated as an elimination of human bias, irrational or human factors still crucially influence the use and processing of data, software and technology. Such influences are multiplied repeatedly as soon as software-transmitted data is further processed into political, school-practical or market settings. Within the context of producing monitoring reports, recommendations, evaluations or data-related products and services, mediatory practices are usually exercised by attaching extra knowledge, interpretations or services to the reallocation of data (visualizations), thus constantly producing new ‘topological relationships’ (Lewis and Lingard, 2015: 623–624). These relationships are simultaneously tailored to the needs of different addressees (Werron, 2012: 342), such as policy actors, researchers, administrators, think tanks or school practitioners.
In this way, objects (data and technology) and subjects (technicians, administrators, school actors or intermediary organizations) become ‘assembled together’ (Koyama, 2011: 706) in new governmental constellations that are constituted by (digital) data flows. In other words, the integration of (big) data into various cycles of educational (re)production and policy transfer (e.g. between administration and schools or between politics and science) involves both social and technical performances, which, by way of ‘respatialization’ (Ozga et al., 2011: 87), create new linkages and spaces between the global and the local (also Sassen, 2002: 365). Such spaces no longer refer exclusively to territorial boundaries, but rather to ‘new relational spaces of globalization that operate across pre-existing territories and scales’ (Lewis and Lingard, 2015: 623). Consequently, governance by numbers needs to be approached from a more ‘topological understanding of globalization’ (Lewis and Lingard, 2015: 624; see also Ball, 2016), which conceives governing power not: […] as emanating ‘outwards’ or ‘downwards’ from a central origin to pervasively ‘fill’ the territory of the nation state, nor to be coextensive within fixed national boundaries. Rather, it is forged through dynamic processes of connection and negotiation between people and places, in which the folding of topological spaces enables individuals and organizations to be physically absent and yet still present in terms of their policy reach and influence. (Lewis and Lingard, 2015: 625)
Building on these ideas, the growing flow of big data in education operates as a powerful driving force in the direction of topological governance, while shifting governmental arrangements into the digital sphere. This is most visible in the growing relevance of so-called cloud solutions for educational data management, as promoted for example by Microsoft (see the third section).
Consequently, data mediators increasingly become the new experts of transferring data as knowledge between this digital sphere and all other territories and governmental spaces in education, which increasingly rely on the functionality of data mediation. In the end, the new constellations between data production, data consumption and data processing transform educational actors into what Beer (2013) and Williamson (2015a) conceptualize as ‘prosumers’ (see the fourth section), hybrid identities who are consuming and at the same time constantly producing governing knowledge as data and software modularizations.
(Big) Data mediators in education
Realized mainly via ‘soft’ policy coordination and consensual and convictional decision-making mechanisms (Garsten and Jacobsson, 2012: 1), the ‘project’ of educational globalization (Robertson and Dale, 2015) has successfully produced a globally legitimized role model of education policy, which has simultaneously shifted structures and actors around new centres of gravity. These processes of deregulation (Hornberg and Parreira do Amaral, 2012) and the simultaneous re-regulation of power structures have especially provided non-state or intermediary actors and organizations with governmental influence, which is increasingly exercised within heterarchical constellations, that is, somewhere between state, market and network contexts, or between hierarchies, competition and consensual decision-making (Ball, 2009; Williamson, 2015b). As argued so far, data mediators have come to play an increasingly relevant role within such constellations by rearranging educational subjects and objects around new digital technologies and by providing non-state actors with functional instruments for the fabrication of ‘evidence’.
Even though scholarly attention to new infrastructures around the production, processing and usage of (big) educational data has been growing, it has so far often emphasized questions of (more) efficient and secure data organization, flow and exchange. The existence or production of data then often appears as a precondition that has to be taken for granted (Manovich, 2013), rather than becoming a genuine object of analysis (Lewis and Lingard, 2015: 633). In fact, the constantly growing complexity of heterarchical actor constellations between levels and sectors of education policy results in an intensifying need for indirect, ‘soft’ coordination, ultimately increasing the options for successfully mobilizing data technologies for the production of governing knowledge (Williamson, 2015a: 84).
As a result, services and markets around standardized, digitally transferred educational data have expanded tremendously (Burch, 2009; Williamson, 2015a). This has not only included a growing number of new actors, but also prompted established actors to extend their portfolios towards data services. One example is (again) the OECD, which, in the initial stage of PISA, intentionally excluded any policy-relevant interpretations from its data production activities. Instead, member states were expected to individually draw conclusions from the study results. Between 2000 and 2015, however, the OECD turned into one of the most influential global policy consultants, using and constantly expanding its PISA database to offer multiple data services for governments, administrators or even local actors (Bloem, 2016; Lewis and Lingard, 2015; Sellar and Lingard, 2013).
Other organizations implemented in-house departments or centres, which offer expertise in data and technology innovation, such as the newly founded Technology Based Assessment (TBA) department within the German Institute for International Educational Research (DIPF). The DIPF is one of the largest and most important educational research institutions in Germany and has inter alia served as a national project manager within the PISA survey (Hartong, 2015). The TBA is a transverse department, which focuses on the integration of information technology with competency measurement approaches, on the improvement of (digitalized) learning and school technologies, and on the data-based support of educational research (e.g. within the process of test item development).
Often, mediatory data service suppliers offer multi-level statistical analyses to further process (international) assessment results. One example is the Data Processing Center (DPC), part of the International Association for the Evaluation of Educational Achievement (IEA), which provides international as well as national-scale data services (Lawn, 2013: 21). Mediatory data services further include online coaching systems (webinars), the digitalized collection and administration of data within sub-divisions of educational departments, or the conversion of teaching and learning material into intelligent software.
Finally, a growing number of (private) consultants for districts or schools offer support in organizing data production and data reporting, as well as in the effective use of learning platforms or technologies (Williamson, 2015a). The providers regularly sell leasing contracts for software access (Burch, 2009: 99–104), which districts or schools can then use (for a certain period of time) to collect and store data on external servers and to analyse this data for purposes ranging from school support to accountability mandates.
One example is Microsoft, which, often in concert with the Gates Foundation, has become widely engaged in the implementation of digitalized governmentality. Microsoft has, inter alia, implemented several global networks and digital platforms to mobilize and facilitate the implementation of innovative digital data solutions in education (see https://mepn.com or https://education.microsoft.com). It hereby addresses policy and school leaders, teachers and students, as well as market actors, inviting them to become ‘Microsoft partners’. The portfolio Microsoft offers ranges from training and certifications and online platforms for exchanging best practices to numerous software and hardware products sold to nations, districts or schools. Like the OECD, Microsoft exercises soft governance by setting agendas (e.g. digitalizing learning and school leadership), by initiating and coordinating mutual learning activities and by presenting best practices from various countries. Selected best practices are then promoted as ‘success stories’ and published in white papers.
One example is the Department of Education in Tennessee, which successfully implemented a Microsoft cloud system to improve its data management and use. Microsoft promotes the system, which includes both data storage and analytics tools, as cost and time saving, while student performance is ‘boosted at a competitive cost’ (McCarthy, 2015: 1). The performance increase is hereby understood as a ‘return on investment’, which needs to be weighed against the enormous costs of either implementing hardware or updating software licenses. With the new system, which is externally hosted by a so-called ‘system integrator’, the state department is now able to compare the following:
‘Tools needed to plan for the individual needs of a student – against curriculum, against other students, and against other districts
Disciplinary issues
Attendance statistics
Multiple other data sets’. (McCarthy, 2015: 4)
Accordingly, the white paper outlines numerous options to link the cloud data to national accountability measures or to guide ‘interventions’: […] if the data indicates a student may be at risk, based on performance, attendance, health issues, and other factors. (McCarthy, 2015: 7)
The example of Microsoft demonstrates how data mediators successfully operate between the global and the local as well as between sectors of education (such as administration or classroom practice), while gradually shifting education towards digitalized governmentality.
Transforming subjects into prosumers
In the perfect world of digital-era governance, state-organized educational institutions (such as schools) are gradually replaced by intelligent education networks, which operate as interactive online learning cultures, while schools and teachers are expected to secure IT-handling skills. Hence, the futuristic paradigm of digitalized education includes the image of incited users (not only students, but also teachers or administrators) who influence platforms or software themselves, ultimately transforming them into prosumers. The term ‘prosumer’ conceptualizes educational subjects not only as non-stop consumers of software or data (visualizations), but also as (re-)producers of data who interact with data ‘co-creatively’ (Beer, 2013; Williamson, 2015b: 3).
When focusing on the governmental power of data mediation in education policy, the enhancement of interactivity, non-stop synchronization and the construction of prosumers significantly accelerate the transformative potential of educational databases and software. Digitalization and ‘self-entrepreneurialism’ (Bröckling, 2007) hereby become increasingly integrated, while the paradigm of lifelong learning is shifted towards a technologized and numerically organized format. Through applying interoperable user platforms, smart databases construct and simultaneously discipline subjects as interactive and technically skilled self-entrepreneurs. For example, learning analytics platforms: […] enable individual students to be tracked through their digital data traces in real-time and to provide automated predictions of future progress. […] Such ‘big data’ practices are distinct from the large-scale data-sets used in contemporary techniques of governance [such as international assessment]. The point is that big data are positioned to short-circuit existing educational data practices, enabling data and feedback to flow synchronously and recursively within the pedagogic apparatus of the classroom itself. Thus, while large-scale statistical data systems acting ‘at a distance’ continue to influence national systems of education governance at temporal intervals, new digital data analytics complement them by providing automated feedback intended to govern ‘up close’ through recursive interaction with the individual student in real time. (Williamson, 2015a: 3)
This is especially the case when interactivity is linked to social online networks, which integrate the user into digital groups of peers while at the same time presenting those peers as (indirect) competitors for performance acknowledgement. This acknowledgement can, in turn, be provided by the peers or by the software itself.
To illustrate this, the learning software provider scoyo promotes its interactive, ‘game-based’ learning solutions as follows: An important driving force of game-based learning lies in scoyo’s score system. Each learning game involves the achievement of scores, which provide exact feedback on the learning performance. By combining these scores with achievable learning levels and a high-score ranking, children are encouraged to stay motivated by anonymously comparing their scores with those of others. The learning portal includes a personal learning profile, which integrates all of your child’s activities and offers deeper insights into subject-related scores, learning strengths and weaknesses, as well as achieved scoyo levels. (www-de.scoyo.com/Informieren/lernerfolg.html, 21 September 2015, translation by SH)
However, such platforms are established not only for improving individual learning activities, but also in the fields of network-based policy exchange, knowledge (data) transfer or reform cooperation (Hargreaves, 2003). Again, users participate, collaborate and learn at the same time, while presenting their performances to, and interacting with, both a technical device and a virtual group of learning peers/competitors (see also Beer, 2009: 986). In this way, performative power is produced generatively (Beer, 2009: 994), initiating a twofold process of socialization along technical and social routines, practices and rules of the game. By operating as automated, ‘connected ecosystems of learning’ (Williamson, 2015a: 92), such platforms co-produce programming and visualization together with the user, ultimately dispensing with teachers (or, more generally, with human coordinators) while shifting more and more attention towards efficient software engineering.
By closely observing the new OECD platform Education GPS (http://gpseducation.oecd.org) as well as the Learning Curve, which has been developed and globally promoted by Pearson Education (http://thelearningcurve.pearson.com), Williamson (2015b) provides recent examples of how smart educational data platforms successfully exercise digitalized governmentality in education. Both platforms use the mode of interactive data production, which offers users a flexible, individual and automated combination and visualization of data, addressing them as prosumers. The data is generated from large global databases such as PISA, which users can process, for example, into individual or comparative national reports (Williamson, 2015b: 8f). As a special feature, the software automatically visualizes maps, graphs or rankings, promising an easy way to evaluate education policy at the largest possible scale. As the platforms advertise, time-consuming scientific analyses can hence be substituted with real-time, flexible evaluation, which can be carried out by anybody.
Similar findings are reported by Burch (2009: 102) regarding a district benchmark assessment system offered by CTB/McGraw-Hill in the USA: Educators and administrators purportedly will have the ability to ‘analyze complex data with a few simple keystrokes – Get statewide or individual student achievement information with just a few clicks on your keyboard. The system gives you instant reporting capabilities on any and all national, state, and local test results right at your desktop computer’.
In summary, it is not only students and teachers who are increasingly envisioned to organize their learning and teaching via digital technologies and interactive learning/teaching devices. Rather, all participants in education policy who work with and around data are gradually addressed as prosumers, ultimately co-creating digitalized governmentality.
Concluding remarks and implications for future research
International assessments appear as one of the most salient features of globalization in education. Large-scale studies such as PISA are conducted on a regular basis and produce a constantly growing body of data. Even though PISA, at least so far, only partially constitutes big data in the strict sense of high-volume, high-velocity and high-variety information, PISA and the OECD have nonetheless triggered a growing orientation towards digital information technologies and smart data production.
This increasing importance of functional numbers and data as evidence for educational decision-making has created a new demand for digital information technologies, data mediators and data service suppliers. The premise of this article was that these digital information technologies and the newly empowered mediatory actors ultimately have the potential to implement a new mode of digitalized governmentality in education, which reaches far beyond policy, into educational administration, school practice and individual learning activities.
In this way, digitalized governmentality recomposes objects and subjects around new regulative constellations between the global and the local, which are constituted by (digital) data flows. These constellations no longer refer exclusively to territorial boundaries, but rather to new topological spaces of governmentality. In other words, new governmental arrangements are created which link the digital sphere of code, software and programming with various settings in politics, administration or school practice. Data mediators design or facilitate these linkages, for example by developing and providing functional technologies, by supporting effective data use or by processing complex (assessment) data into simplified visualizations.
Ultimately, digitalized governmentality carries tremendous performative power, which is co-produced by software, software engineers and users, as well as by a growing number of prosumers who flexibly interact with smart data platforms or learning technologies. This trend is not limited to the field of education, but affects all parts of society that are shifting towards ‘digital-era governance’. Nonetheless, since digital technology depends on the skills and capacities of human actors to enact it, educational institutions in particular are expected to teach young people how to become successful prosumers. In digitalized governmentality, this ‘learning to code’ (Williamson, 2015c: 265) marks a central requirement for generating governable digital citizens.
Admittedly, at least in most countries and educational sectors, digitalized governmentality has not yet become as smart and powerful as some of the described examples may suggest. In particular, legal restrictions on data access have, at least so far, often limited the options for governmentalizing digital technologies. However, given the significant acceleration of data production and mediation in all parts of the world, it seems all the more important to conduct corresponding research. A particular focus should then be placed on conflicts and ambivalences resulting, for example, from diverse actor interests as well as from cultural or legal influences.
This also includes a closer observation of (asymmetric) power relations within digitalized governmentality constellations. Such a critical perspective would question the assumption that these constellations solely open up education and society towards worldwide and democratic access to functional data. Instead, building on Edwards (2015b: 252): …while there is a surface normative attractiveness in such notions, they do little to engage with the critical research on such issues as access to the necessary hardware, software and bandwidth to be able to access such opportunities, the work of the digital on the forms of data, information and knowledge opened, […], issues of expertise and authority in knowledge production, and the value to participants in relation to their own goals and aspirations.
Consequently, further research is needed that observes more critically the different digitalized arrangements of governmentality in terms of closedness and asymmetric power distribution. The latter becomes obvious when considering, for example, that data service suppliers often sell their products via leasing contracts granting time-restricted access to particular software. As Burch (2009: 104) notes in her observations of benchmark assessment providers in the USA: […] [Users] lease the shell or software program that allows them to manipulate the test score data. If they terminate the contract, they no longer have access to the shell and may not have access to the data.
Hence, as I have tried to outline in this article, observing current developments in data-based education policy requires taking into account the particular, yet highly ambiguous, role that data and technology mediators play in the promotion and diffusion of (but also in the resistance against) digitalized governmentality as governance by big data, which is both topological and heterarchical. Such a perspective refuses to treat educational data technologies as natural or as black boxes, and instead strives for a stronger visibility of their dynamics, potentials and ambivalences (Edwards, 2015a). Here, future research needs to further explore if, how and by whom digitalized governmentality might be implemented as a new global, political and digital economy of education (Crossley, 2014).
Declaration of conflicting interest
The author declares that there is no conflict of interest.
Funding
This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.
