Abstract
Educational governance is commonly predicated upon the generation, collation and processing of data through digital technologies. Drawing upon an empirical study of two Australian secondary schools, this paper explores the different forms of data-based governance that are being enacted by school leaders, managers, administrators and teachers. These findings illustrate a range of routinized ways in which digital data is being used within schools as a means of accountability. Alongside the data regimes associated with the ‘governing by numbers’ enforced by state and federal governments are smaller-scale accountability procedures and practices initiated ‘in-house’ by school managers and/or teaching staff. While digital technologies are clearly reinforcing wider trends in educational managerialism, the paper also considers the subtle ways that local enactments of such governance are shaped by schools’ relatively unsophisticated data processing technologies and techniques.
Introduction
Educational governance has altered substantially over the past 20 years or so, with increasing emphasis placed on principles of ‘performance’, ‘effectiveness’ and ‘accountability’. Schools in many countries are now expected to engage in processes of strategic planning, risk management and ‘evidence based’ interventions. Schools are also subject to disciplinary regimes of inspection, evaluation, target-setting and comparison. What was once the (relatively) direct government of schools by central and local government actors has devolved into a ‘new governance turn’ characterized by dispersed and disaggregated forms of accountability, ‘answerability’ and self-evaluation between a range of actors and agencies.
These shifts have prompted unease amongst many academic commentators. Schools have been described as operating within a ‘climate of hyper-accountability’ (Keddie, 2014), stymied by audit cultures predicated upon imperatives of ‘efficiency’ and ‘improvement’. These conditions have been criticized roundly in terms of failing to reflect educational outcomes beyond high stakes examination results and associated measures of student ‘through-put’. Concerns have also been expressed over the increased responsibilization of individual schools, teachers and students for their relative educational ‘successes’. Rather than being an institutional responsibility, it now falls routinely to ‘individual practitioners to organize themselves as a response to targets, indicators and evaluations’ (Ball, 2003: 196).
The changing nature of school governance is manifest in a variety of ways. The present paper focuses on the part that data (specifically data processed through digital technologies) plays in enabling and/or extending current forms of school governance. Indeed, most of the processes and practices just described are underpinned by measurements, indicators, numericisation, metrics and models. This paper, therefore, seeks to add to recent accounts of the ‘datafication’ of education (Lingard et al., 2014) and ‘new modes of data-driven rationality’ (Sellar, 2015: 138) that now confront ‘data-based’ schools (Finn, 2016). To date, academic commentary has tended to frame this data-turn in passive and somewhat disempowering terms. Thus we are warned of ‘schools and districts becom[ing] data farms, providing an unending supply of harvestable data’ (Dean, 2014: 19). Similarly, education is seen to be rendered ‘machine-readable’ by the ‘proliferating database-related technologies of governance’ (Williamson, 2015: 2). In these terms, data-based governance has understandably begun to attract close critical scrutiny.
Of course, such uses of data
These issues are informing the ways in which the data-based governance of schools is currently being discussed and debated. Yet, to date, these discussions have tended to lack empirical substance. More attention therefore needs to be focused on actual (rather than assumed) experiences of digital data within schools. In particular, the arguments and assertions outlined above need to be tested through empirical studies of ‘the everyday use of data and analytics from a social perspective’ (Couldry and Powell, 2014: 2). As such, this paper investigates the realities of data-based school governance. In particular it explores the following research questions:
(i) What conditions and logics of governance are at work within schools?
(ii) What forms of data generation and data work underpin this governance?
(iii) What are the consequences of different forms of data-based governance?
Methods
These questions are addressed through analysis of research data collected over a twelve-month period (December 2013–November 2014) as part of a study of digital data use within two secondary school settings in the city of Melbourne, Australia. 1 These were government-run public schools under the governance of the state of Victoria’s ‘Department of Education & Training’:
Northside High School – inner-city suburban location, roll of 1500 students;
Westside High School – outer-city suburban location, roll of 800 students.
In particular, the study explored the ways in which digital data and data work was being experienced and utilized by senior leaders, managers, administrators and teachers within these schools. The primary focus of enquiry was on data work relating to matters of teaching and learning (as distinct from finance, human resources and so on).
In order to identify the nature and form of the data sources, systems and other technologies that were being used, the researchers conducted initial site visits and ‘data audits’ in conjunction with administrative and technical staff within each school. Interviews were also conducted with staff in the data systems division of the state Department of Education & Training. A series of subsequent in-depth individual interviews were then conducted with staff in each school who were identified as leading data actors – i.e. school principals and assistant principals, data managers, systems managers, heads of school, teachers and administrators. Ten in-depth interviews were conducted with these individuals as well as two end-of-project workshops with eight school leaders, managers and senior teachers. 2 Thematic analysis of the data corpus arising from these research activities (based on the project’s a priori interests in macro, meso and micro levels of governance) resulted in the identification of the following distinct forms of data-based governance:
(i) The use of data for ‘system-wide’ accountability;
(ii) The use of data for ‘within-school’ accountability;
(iii) The use of data for ‘within-class’ accountability.
In addition, a fourth theme emerged from the data: (iv) the perceived limitations of these forms of data generation and data work. These forms of governance, and the various forms of data work they entailed within the schools, are now explored in more detail below.
Findings
The use of data for ‘system-wide’ accountability
Digital data was a key part of the formal governance structures enforced in both schools by state and federal governments. A range of indicators of institutional ‘performance’ was being produced each year to inform comparisons between schools. Thus digital technologies were being used to generate, accumulate and process sets of standardized data relating to student results on ‘National Assessment Program – Literacy and Numeracy’ (NAPLAN) tests and end-of-school ‘Victorian Certificate of Education’ (VCE) examinations. Other measures of school ‘output’ were derived from standardized online surveys administered annually by the state government (a ‘Student Attitudes to School Survey’, ‘Staff Opinion Survey’ and ‘Parent Opinion Survey’). These surveys were described by interviewees as producing data relating to ‘school climate’ [1] and ‘organisational health’ [2], and were core elements of the school-specific data released to the public through the nationwide comparative ‘My School’ website:
Every year, all kids are surveyed, all staff are surveyed and a random selection of parents are surveyed about school connection and whether their kids are being supported in teaching and learning and whether they feel safe at school and whether they feel inspired by their teachers. There’s a whole range of questions [3].
This data was all processed through the ‘Computerized Administrative Systems Environment for Schools’ (CASES). This data system was produced by a British company and supplied to every school in the state along customized lines specified by the state government. CASES was used in both schools to process all the externally scrutinized student administration data (e.g. records of attendance, test performance scores and examination grades), thereby acting as a direct conduit between school administrators and state education agencies. One director in the Department of Education & Training referred to CASES as ‘the source of truth for all the core student data’ [4].
Alongside CASES, each school ran a different version of a locally produced management information system – ‘Oracle School Manager’. This system was used on a frequent basis by teachers, administrators, students and parents to record and share data relating to students’ personal information, teacher assessments of academic performance, student target setting, records of attendance and behaviour. Most of the school staff interviewed characterized Oracle as a central point of reference, with one Assistant Principal also framing it as the ‘one source of truth’ [3] within his school.
Regardless of their relative truthfulness, CASES and Oracle were both sites for protracted amounts of data work. Administrators, teachers, students and parents were required to enter data into Oracle for a range of purposes – from logging test results to signing-up for school trips. Similarly, the uploading of data initially into each school’s CASES and then onto central government data systems demanded intense bouts of administrative labour. At the time of our research, the state government was using an aging data system that required the SQL replication of each school’s data onto a central ‘mirrored’ database. In Northside and Westside, processing this data was reckoned to involve hundreds of hours of computer work, with school administrators manually entering data into one system and then copying it across to the other. As Westside’s business manager described:
Anything that’s put into CASES is put in manually, so we manually enter that. You can’t actually upload. Everything has to be manual… So when we get 200 new students, the enrolment is 200 pages long and that all needs to be manually entered 200 times [5].
The subsequent analysis of this data by government agencies also involved an extended sequence of data processing and circulation. A number of different central agencies were involved in analysing the data uploaded from the CASES systems of Northside, Westside and the 218 other government secondary schools across the state. Agencies such as the state government’s ‘Performance & Evaluation Team’ and ‘Curriculum & Assessment Authority’ produced summary reports for each school through a central ‘business intelligence system’. These reports were distributed back to school principals through the state government’s ‘School Information Portal’. One departmental official characterized these reports as ‘summary level data… It’s not individual child data… but Year Ten, 20% of kids, what was the average rate of absence for March? That sort of level’ [4]. As this interviewee conceded, such data reports were provided to schools for primarily ‘superficial’ reasons:
Up until now, really, the major point of data has been so [school leaders] can do a report to the school community that says, this year we had a 4% absence rate, last year it was 5%. Oh we’re getting better. We must be doing something right. And that’s about the superficial level. I was on school council for 25 years. I saw all that from that side of the fence as well [4].
Beyond fulfilling ‘administrative compliance requirements… from which our funding is derived’ [3], school leaders and managers also described this data as being of little benefit. Interviewees in both schools saw the external data work as a one-way upward event. As one interviewee characterized the student and staff survey data: ‘they fill in the forms, off it goes and we never see it again’ [3]. At best, this was data that was ‘sucked up’ by central authorities and much later ‘spat back in report form’ [3].
This limited accessibility was reflected in the configuration of CASES and Oracle. For example, schools were unable to run queries or extract data from CASES. As one manager described her relationship with CASES: ‘I will tell [CASES] but it doesn’t tell me anything. There’s very little that goes back… It’s very isolated. The Department is very protective with their system and they don’t allow very much’ [5]. Similarly, only limited forms of processed data summaries could be extracted from each school’s ‘bespoke’ configuration of the Oracle School Manager System. While these cloud-based systems were set up to produce reports and summary tables and graphs, schools had no ready means of accessing basic data sets. Interestingly, school staff were relatively forgiving of this particular retention of data:
It’s not accessible and [Oracle] make no bones about it. They say that’s just how it is. I mean, they don’t integrate the system with any other products either so they don’t give their data out to other products. They say if there’s another product out there that’s going to do X then we’ll build it ourselves.
Yeah I’m sure they
Well what would we ask for? There’s so much data [6].
As this quotation infers, the inconvenience of such restrictions was tempered by a sense that much of this data was of limited practical use to the schools. In particular, it was contended throughout our interviews that this data contained little information that was relevant to matters of teaching and learning. As one state government official conceded, ‘our big problem with CASES and the data that we have is we have no data about student achievement really. Other than this A, B, C every year, there’s nothing in any common system around learning’ [4]. School leaders in Northside and Westside also conveyed an ambivalence towards the externally processed data:
I’ve got to be diplomatic here as to whether some of the data you get [back] from the Department’s actually useful… ‘Useful’ is a difficult thing because sometimes you get hit with data that doesn’t tell you anything because its system-wide data… but it really doesn’t tell you anything… One of the things we’ve realised here is that the data that’s possibly collected by the government is not necessarily relevant [2].
The use of data for within-school accountability
The data generated for external scrutiny was not the only form of data-based governance in evidence at Northside and Westside. Both schools were also engaged in extensive ‘in-house’ data work relating to forms of within-school accountability initiated by the school leadership. This involved the generation of data relating to areas of teaching and learning that were considered as able to be ‘improved’ through monitoring and intervention. Areas of interest across both schools included teacher and student ‘performance’, the ‘quality’ of teaching provision, indicators of student ‘engagement’, ‘satisfaction’ and ‘well-being’. This data was generated through recurrent cycles of target setting, subsequent self-reporting and then ‘reflection’ on ‘feedback’ from others.
School leaders and managers tended to justify these data processes as compensating for the less insightful ‘system-wide’ data described above. In contrast, these in-house processes were characterized as ‘powerful’ uses of data that could ‘make a difference’ [2]. Parallels were drawn, for example, with the emphasis placed on ‘evidence’ within the Australian education system:
[Evidence-based practice] is massive… we’ve had the message from the Department in Victoria for some time that it’s got to be evidence-based interventions and improvement. It can’t be just subjective and ‘we’ve got a hunch’. You’ve actually got to have the evidence and so our accountability process is built around that [1].
In Westside, this evidence-based imperative was being enacted through regular cycles of internal data collection and reporting for all classes. Every four weeks teachers were required to input indicative grades for student progress into Oracle while also administering online surveys to all classes. This data was then reviewed and summarized by school leaders in what was described as a process of ‘real-time reporting’ [7]. School leaders described these activities as producing ‘useful’ data ‘that gives us an overall global picture which can be useful from a holistic [perspective]’ [2]. Other in-house sources were also being drawn upon:

We’ve brought in this area where each kid’s got their own Google site where they put in their semester goals… which in itself is a very rich source of information. And one of the things [the school technology manager] does is rip the text out of their learner portfolio and put it into a spreadsheet for us [2].
Similar activities were taking place in Northside. Here, school management was using the school’s ‘Learning & Teaching Model’ to compel regular cycles of teacher ‘reflection’. In practical terms ‘reflection’ had come to be enacted through the (initially voluntary but then compulsory) use of class surveys administered through the Google Forms application. Each semester teachers were therefore expected to design and administer surveys to their classes. As one teacher described:
We’re supposed to identify areas for, I won’t say ‘improvement’ [laughs]… for ‘demonstration’ through the year. Usually it’s something you want to improve. And so at the end of each semester we give the class a survey. We started off with a default survey, but most people have modified it since then. And it includes things like ‘my teacher cares about me’, ‘I can access my teacher for help’, ‘homework is regular’. Standard things, but a lot of ‘my teacher respects me in the classroom’, ‘I behave well in this class’… so it’s a bit of everything on how the class runs.
Teachers then reflect on… could they improve anything there? So if my students don’t feel that I am sympathetic maybe that’s an area I have to work on. Or maybe my students don’t see that their homework is regular. If I’m sure I’m giving homework maybe it’s just I need to phrase it differently so that [students] realize it is to be done [8].
While all teachers and students were involved in generating this data, acts of data processing, analysis and reporting were conducted by a small number of managers, leaders and senior teaching staff. In Northside, processing these forms of ‘in-house’ data was left to two Assistant Principals and a Technology Manager – all relatively young male staff who were ‘quite interested’ and had ‘a rough feel’ [3] for numbers and statistics. The mediation of data in Westside was led by the school’s Principal and Assistant Principal, supported by a senior teacher who had been assigned part-time ‘responsibility points’ for data management. These staff valued this data as being under their school’s control and ownership:
This kind of stuff where we’re collecting either through Google Forms and from spreadsheets, that kind of stuff is easy for us to get and easy for us to analyse, easy for us to use and has no real implications on data protection. Anything that we gather in spreadsheets is within our domain so it’s all on our servers. It is important, but sometimes there’s red tape around what you can get and what you can’t get [9].
This ‘ease’ of collection and use corresponded with relatively crude forms of data processing and analysis. The analysis of this ‘in-house’ data was characterized as ‘summary data’ [8], ‘reporting that works’ [10], and ‘present[ing] data in a really digestible way’ [11]. A recurring description was one of ‘simplicity’:
Simple, it’s really simple. The actual technology, it’s not a huge system, it’s not a huge database. It is a way of grabbing data really quickly, really simply and using it, I think, quite powerfully… using quite simple technology [9].
In both schools this ‘simple technology’ took the form of ‘off the shelf’ software packages – most notably the Microsoft Excel spreadsheet and Google Forms. The majority of data analysis involved the rudimentary use of these packages’ visual representation features. Often this took the form of default graphing options (e.g. the bar, line and pie chart formats offered in Excel’s ‘Chart’ menu). Perhaps the most prevalent form of data presentation in both schools was the ‘conditional formatting’ of spreadsheet cells along variations of a ‘traffic light’ scheme – i.e. red, yellow, green and blue (in Westside)/purple (in Northside). As one school manager explained, this formatting was generally felt to render data comprehensible:
I add conditional formatting to the spreadsheet and the conditional formatting just reads the number and changes the colour of the cell. It’s just really helpful visually for us to analyse it quickly. When that’s just 3’s and 2’s and 1’s and 4’s, it doesn’t mean anything and it’s really hard to understand. So we use that conditional formatting there to allow us to quickly analyse that data. Otherwise it’s just raw, that’s just raw numerical spreadsheet data which comes from [the surveys] – so we export that to CSV so it works [9].
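The workflow described in this extract — exporting survey responses to CSV and colour-coding numeric cells along a ‘traffic light’ scheme — can be sketched in outline as follows. This is an illustrative reconstruction only, not the schools’ actual scripts (they worked within Excel itself); the rating thresholds and column names are assumptions:

```python
import csv
import io

# Map a 1-4 survey rating onto the 'traffic light' colour scheme the
# schools described. The thresholds here are illustrative assumptions,
# not the schools' actual conditional-formatting rules.
def traffic_light(score):
    if score <= 1:
        return "red"      # flagged for follow-up
    elif score == 2:
        return "yellow"
    elif score == 3:
        return "green"
    else:
        return "blue"     # Westside used blue; Northside used purple

# A hypothetical CSV export from a class survey (1 = low, 4 = high).
raw = "student,confidence\nA,1\nB,3\nC,4\n"

rows = list(csv.DictReader(io.StringIO(raw)))
coded = {r["student"]: traffic_light(int(r["confidence"])) for r in rows}
print(coded)  # → {'A': 'red', 'B': 'green', 'C': 'blue'}
```

The point of the sketch is how little processing is involved: a single lookup per cell turns ‘raw numerical spreadsheet data’ into the colour-coded summaries that staff found ‘really helpful visually’.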
The use of data for ‘within-class’ accountability
Alongside the processes and practices described so far, there was a further layer of data work relating to teaching and learning that was being enacted in both schools. This was more ad hoc in nature, initiated ostensibly by small groups of staff keen to use data to ‘inform’ their own teaching practices. These data practices were characterized by teachers as personally-driven and person-centred, contrasting with the mandated data regimes described above. As one teacher in Westside reasoned:
[While] the mechanisms are there to get that data, I just don’t think that… as teachers we do depend on our gut feeling and knowing our kids really well… So there is a kind of danger in using data that doesn’t take into account any of that fine-knowing your kids. [Otherwise] it’s pure data – this is your postcode, this is how you did on this stage doing that and that in your tests [12].
Despite such aspirations, these teacher-initiated data activities and data practices were often focused on quantified indicators of ‘performance’, ‘improvement’ and ‘effective’ teaching and learning. Some teachers tended to talk of such data in strategic terms – for example, as ‘intell’ [6] and offering a ‘heads-up’ [1] of upcoming situations. Elsewhere descriptions ranged from generating data to provide a ‘quick glimpse’ and a ‘day-to-day snapshot’ [10] to using data predictively to look ahead to ‘possible bumps in the road’ [7]. All told, the underlying sense was that of generating data that could be used in a ‘just in time’ manner to inform and adjust teachers’ in-class work. As one Westside teacher concluded, ‘the more information you’ve got to try and draw out what you need to do when you’ve got ten minutes [with a student] makes life easier’ [12].
In this manner, some teachers were attempting to regularly measure and gauge different aspects of their classroom work. This often involved the use of in-class student quizzes and self-report exercises as indicators of progress and possible adjustment. As one maths and IT teacher described her surveying of students using Google Forms:
I’m trying to do a bit of work collecting data at the start or during a lesson which informs what I do next. So one of the simple things that I had a go at is using a Google Form with confidence ratings. So, ‘I’m really confident when we talk about emerging technologies’ or, ‘I’m really confident using variables’. It ranges from ‘I’m really confident…’ or ‘I’m not so…’. So from 1 to 5, rate the confidence level at these different skills that you’re developing in this lesson or you’ve looked at previously. And then that comes into a spreadsheet which is pre-conditionally formatted with a traffic light system. So if you look at the variables column of all these kids who have said ‘I’m red or amber on variables’ it gives me information that I haven’t covered that well or they haven’t picked it up. So I know I need to revisit this. So it’s in that learning analytics that I really find the immediacy of it [9].
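The ‘learning analytics’ this teacher describes — scanning a spreadsheet column for a skill and flagging it for revisiting when many students rate themselves red or amber — amounts to a simple column-wise aggregation. A minimal sketch, assuming hypothetical skill names and an illustrative 50% threshold (neither is from the study):

```python
# Hypothetical export of 1-5 confidence ratings, one row per student,
# one column per skill (names are illustrative only).
responses = [
    {"emerging technologies": 4, "variables": 1},
    {"emerging technologies": 5, "variables": 2},
    {"emerging technologies": 3, "variables": 4},
]

def skills_to_revisit(rows, threshold=0.5):
    """Flag any skill where the share of students in the
    'red/amber' band (ratings 1-2) meets the threshold."""
    flagged = []
    for skill in rows[0].keys():
        low = sum(1 for r in rows if r[skill] <= 2)
        if low / len(rows) >= threshold:
            flagged.append(skill)
    return flagged

print(skills_to_revisit(responses))  # → ['variables']
```

As with the conditional formatting described earlier, the technique is deliberately crude: a threshold count per column, readable at a glance, rather than any statistical modelling.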
As this account suggests, such individually-initiated data practices often echoed the ‘in-house’ data practices of reflective surveys, colour coded data sheets and so on. That said, a few efforts were decidedly more ad hoc in nature. In Westside, the maths department was using in-class digital photographs to provide immediate indicators of understanding and engagement. These ‘silly’ activities were described as constituting some of the most useful and ‘powerful’ applications of data within the school:
We’re gathering data to make learning more visible – using ‘confidence’ based questions in quizzes or just really silly things like ‘thumbs up’, ‘thumbs middle’, ‘thumbs down’. In the middle of a class everyone will put their thumb up and we take photos. We have some signals. If you understand a number of things, you can put three fingers up. So this kid here with the one finger, I know he understands one of the things I’ve asked him. This kid’s got three fingers up so he understands several. The kids who’ve got their hands joined up here, feel confident that they understand how these things link together. That for me is really powerful data. It’s not recorded officially but it is recorded in the photograph and it creates visible learning… People often don’t think that’s data. ‘Oh that’s not data, it’s not in a spreadsheet’ or ‘It’s not a number’. But it’s pretty powerful [9].
The limits of data-based accountability
All the examples described so far illustrate the diverse but often mundane ways that Northside and Westside had become orientated towards the everyday use of digital data as a means of accountability. This everyday-ness was reflected in the different ways that this data was presented and performed throughout the schools. In material terms, graphs, charts and summaries were frequently laminated and pinned to noticeboards and walls in communal spaces. Data reports and summaries were regularly printed out on paper and circulated amongst staff and parents. The presentation of data was also a key element of many staff meetings – as one teacher put it, ‘they put up the charts on the big screen in the staff meeting and … we’re expected to engage with the data’ [8]. School leaders and managers were therefore keen to foreground data as an integral element of their institutional cultures. As one Assistant Principal in Westside contended, data was a key feature of school ‘talk’:
We talk about it as data. So we’re increasing the talk. We need to talk more but data’s everywhere. So we had a staff meeting two weeks ago where we just talked about attitudes to the school survey and the data that’s thrown up by that [2].
While mindful of the prominence of data within the schools’ managerial cultures, some teaching staff remained unconvinced of its practical significance. As one teacher reflected on Westside’s ‘data culture’:
Well management certainly really likes data and ostensibly are quite data driven. But the actual data that we have and we actually use and analyse I think is quite limited… there’s no hiding the fact that it’s more individual people and individual faculties doing their own thing [13].
Indeed, when discussing their own practices, interviewees would sometimes inadvertently point to the limited ways that data was being used. As one manager conceded, little was done with a regularly collected set of ‘student reflection data’ other than ensuring that students had entered the data: ‘All we use it for at the moment is to check the quality and… to check… check they’ve done it. [Laughter]’ [12]. Another teacher recounted her regular surveys of student opinion in similarly ineffectual terms:
We do some surveys. So I do the learner voice, I go in and I record all of those [responses] and then share it back with the maths department so they get that feedback… [PAUSE]… Actually, just by verbalising that, I realize that we don’t count that data in…
Doubts were also raised by teachers and managers over the usefulness of different forms of data being generated within the schools. Often these doubts related to a perceived lack of relevance to future work. As such, the ‘backward’ looking data generated through annual, semester or monthly activities was described as being of little use in future planning:
Currently data is used on a more analytical ‘Let’s look backward’ basis… we don’t actually have the data to ‘Let’s look forward’… kids get surveyed, we get the results then we look back and try and interpret it as a school… And inherently with semester reports, by the time [people] get the reports that’s not necessarily useful. Yes, it’s a great record, and all of that sort of stuff, but if you wanted to identify areas of growth for students in a better way it would
Also recurring throughout the interviews was acknowledgement of the partial representativeness of the data being generated within the schools. For example, teachers talked of using ‘traffic light’ data to identify students, teachers or classes that were regularly ‘coming up red’ or ‘flagging red’ as ‘cases’ that required further investigation and intervention. For example, as was described in Westside:
Our Assistant Principal for student empowerment will work on picking out those kids who have got lots of red… So all of this triggers events at a different level [9].
While prevalent across both schools, this data work was often acknowledged as a crude distillation of what were initially more nuanced measures of well-being, behaviour and performance. As another Westside teacher described the school’s use of the traffic lights: ‘it’s not particularly rich and it’s very blunt’ [13].
Most interviewees therefore conveyed a resigned sense that data generation within the schools was partial in its coverage. This was notable, in particular, in terms of what was not being recorded:

[Oracle] is not a perfect system for behavioural problems… I would say only 70% of behavioural stuff ends up on there. I still get teachers just stopping me in the hallway and saying, ‘so-and-so told me to fuck off’. And I say ‘can you put it on Oracle, please?’ And it doesn’t always get on there [11].
Interviewees were uncertain, however, whether the reductive nature of these data practices could be improved. As one Assistant Principal bemoaned, ‘I need good data about what’s happening with the kids academically, behaviourally, socially, spiritually… everything’ [1]. Generally, it was accepted that compromised data practices such as ‘flagging red’ and maintaining partial records of behaviour were pragmatic means of directing staff attention towards areas of concern and/or interest. As one senior teacher concluded, ‘[partial recording] is an on-going problem with data management in schools… How do you document it all? Oracle is definitely our best option, but as I say, it doesn’t always get used’ [11].
Discussion
As is the case across many national and regional education systems, the schools in our study were sites of various forms of data-based self-evaluation, comparison, accountability and general answerability. From high-stakes examination results to in-class surveys of student ‘well-being’, digital technologies were being used to generate and circulate various data-based ‘indicators of teacher, school or system performance’ (Sellar, 2015). On one hand, much of what we found could be seen as reflecting the forms of technology-based ‘mundane governance’ that have begun to take hold across most aspects of late modern society – from municipal waste disposal to airport security (see Woolgar and Neyland, 2013). In particular, much of the data work described in this paper would appear to fit the general logic of technocratic governance. Both case study schools were certainly sites where data was being used to measure, monitor and record key variables of ‘performance’ with the implicit intention of disassembling complex social situations into solvable (or at least improvable) problems (Kitchin, 2014). In this sense, our findings could be explained simply as broad forms of managerialism that are prevalent throughout society coming to bear on schools.
On the other hand, elements of our research did point towards more localized and particular dynamics that complicate this reading.
While digital data was obviously an important element of how the schools were operating, we need to develop a balanced understanding of the relationships between digital data and school governance. In one sense, then, much of what has been reported in this paper reflects well-established trends of measurement, monitoring and auditing that sustain the ‘neoliberal corporate accountability’ of contemporary schooling (Ranson, 2003). Data technologies and techniques were certainly integral to the enactment of established power relations between state government and schools, as well as between school leaders, teachers and students.
These logics were also being reinforced through staff concerns surrounding what makes for appropriate and useful data. For example, the concern with using data to ‘look forward’ replicates – albeit unwittingly – dominant understandings of ‘performance’ where one is judged only in terms of current and future performance rather than what has passed (Ball, 2003). Similarly, the desire to use data as a means of anticipating ‘bumps in the road’ and ‘intell’ replicates dominant logics of risk management and a ‘risk prepared, professionalized’ culture (Wilkins, 2015: 195) that pervades many contemporary education systems. As such, the data work and data infrastructures currently prevalent in schools certainly contribute to the continuation of long-standing conditions of managerial control and management of educational performance.
That said, our findings do seem to point to some differences in the particular ways that this data-based governance was being enacted at the local level.
As suggested earlier, this reductionism was also apparent in what was not being captured at all:

…experience off-the-grid [is] lost, it escapes regimes of measure without the harness of the archive or database. The ephemerality of experience beyond accountability.
At this point, it is also worth noting the restrictive and reductive presence of the specific digital platforms and software that were being used within each school. The ways that computer code and programming can shape and condition human action has been well discussed in the social studies of digital media. As Lev Manovich (2013: 2) puts it, ‘software has become our interface to the world, to others, to our memory and our imagination – a universal language through which the world speaks, and a universal engine on which the world runs’. From this perspective, the digital technologies being used within our case study schools were clearly setting the tone for how data was being used and circulated. This was apparent, for example, in the relatively impermeable configurations of systems such as CASES and Oracle. Even more significant, perhaps, were the limited data practices, circulations and materialisations that resulted from the standard settings and default options offered by the ordinary ‘off the shelf’ software packages being used by school staff.
In particular, Microsoft’s Excel spreadsheet application appeared to constitute a ‘meta-language’ (Manovich, 2013) through which data was being operationalized and acted upon in both schools. This was certainly the case in terms of what was being calculated, and how these calculations were visualized and spoken about. In particular, the presentation of school data appeared to owe much to the default graphing and formatting options of Excel. The preference in both schools towards the visual ‘resolution of complex information’ (Gregg, 2015: 39) was understandable. Staff in both schools were certainly willing to be led by Excel’s offer of ‘conditional formatting’, which converts separate data cells from numbers into colours. While a standard feature of Excel, this raises an obvious tension between ‘the fantasy of command and control through seeing’ (Gregg, 2015: 37) and the limited insights actually being acted upon in schools as a consequence of ‘flagging’ a student as ‘red’, ‘yellow’, ‘green’ or ‘blue’.
Conclusions
Our investigations found digital data to have been established within these Australian school contexts as a key tool of self-evaluation, comparison, accountability and means of sustaining managerial forms of governance. As this paper has illustrated, data-based governance continues to be enacted ‘system-wide’ by state agencies that hold schools as institutions to account for their actions. Yet data-based governance is also apparent in the ‘in-house’ way in which school leaders and managers are holding individual staff and students to account. Moreover, this paper has highlighted the ways in which digital technologies reinforce existing trends in educational governance while also acting to shape local enactments of this governance along restrained and reductive lines. Of course, it is worth reminding ourselves that the overall result of these practices was one of increased complexity rather than a simplification of schooling. As Sellar points out, the complexities of schooling that are glossed over in commensurate digital data are not superseded by these reductive measures and metrics, but further complicated by them:
…[education] data are constituted as simplified abstractions from complex qualities, but these simplifications are also added to the world. We are always left with more, not less. We are left with the complex qualities subject to commensuration and the simplified representations produced through this process. (Sellar, 2015: 133)
In short, then, the limited data processing technologies and techniques being deployed within our case study schools could be seen as an additional layer that supports a distilled – and therefore even more pervasive and punitive – enactment of school governance. That said, it is important not to position the state government officials, schools and teachers in our study as passive and/or unthinking in their actions and responses. As reflected throughout this paper, most interviewees retained a degree of scepticism towards the different forms of data work being enacted within the schools. If anything, staff appeared to be positioned in a double bind by the data systems and data regimes in their schools – obliged to work with (or, at best, work around) dominant arrangements and understandings rather than directly challenge or refuse them altogether. As with most forms of audit culture within public sector organizations, the majority of individuals within each school were attempting to engage with these systems in as ‘morally proper’ a manner as was possible (Keddie, 2014). Indeed, these conditions of data-based governance appeared to be accepted resignedly as a relatively ‘good thing’ by most staff. As Kitchin (2014: 164) concludes:
It is not a case… that data are used simply in either good or bad ways; it is more complex than that. Often seemingly opposing outcomes are bound together so that people can be both liberated and coerced simultaneously – they gain personal benefit at the same time they become enmeshed in a system that seeks to gain from their participation.
In this light, critical commentators need to consider the possibilities of ‘thinking otherwise’ about schools and digital data. If we are at odds with the conditions described in this study then what alternatives are there? From what has been observed in this paper, it is certainly unhelpful to suggest that schools simply reject the notion of digital data outright. Yet while accepting that the data regimes and conditions described in this paper cannot be refused outright, it is still worth considering how better arrangements might be reached. How, then, might digital data be used to support the work of schools and teachers, rather than simply to hold them to account?
Acknowledgements
The author would like to thank Michael Henderson and Shu-Hua Chao, as well as the participating schools and interviewees. The author would also like to thank the editors and anonymous reviewers for their comments on earlier drafts of the article.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the Spencer Foundation (award number SG201400114).
