Abstract
In this paper, we investigate the temporal implications of data in law enforcement and border control. We start from the assumption that the velocity of knowledge and action is defined by heterogeneous formations and interactions of various actors, sites, and materials. To analyze these formations and interactions, we introduce and unpack the concept of “data temporality.” Data temporality explicates how the speed of knowledge and action in datafied environments unfolds in close correspondence with (1) variegated social rhythms, (2) technological inscriptions, and (3) the balancing of speed with other priorities. Specifically, we use the notion of data temporality as a heuristic tool to explore the entanglements of data and time within two case studies: Frontex’ Joint Operation Reporting Application and the predictive policing software PRECOBS. The analysis identifies two key themes in the empirical constitution of data temporalities. The first one pertains to the creation of events as reference points for temporally situated knowledge and action. And the second one pertains to timing and actionability, that is, the question of when interventions based on data analysis should be triggered.
Introduction
Over the past decade, datafication in security organizations has given rise to allegedly accelerated forms of governance and control. In law enforcement, data analysis is supposed to provide instant insights into ongoing and future criminal activities, enabling police departments to come up with quick and flexible crime prevention measures (Perry et al., 2013; Beck and McCue, 2009). Similarly, in border control, datafication is considered key in establishing real-time awareness of mobility flows and migratory movements, allowing for the efficient “management” of people on the move (Dijstelbloem and Meijer, 2011; Leese et al., 2022). The underlying idea in both cases is that if knowledge about relevant phenomena can be produced more quickly, then corresponding interventions could be carried out more effectively and efficiently (Wilson, 2019; Glouftsios and Leese, 2023; Aradau and Blanke, 2017). The ultimate case of such acceleration through data would be what Walters (2017: 797) has, in regard to border monitoring, deemed “live governance,” that is, a “set of arrangements that aspire to monitor and act on processes and events in near real time.” The rationales of law enforcement and border control thus seem to align with broader social theoretical diagnoses of modernity as characterized by a general desire to speed things up (Rosa, 2013; Tomlinson, 2007; Wajcman and Dodd, 2017).
Others have, however, voiced skepticism as to the capacities of data and algorithms to provide significant speed advantages in everyday practice. As Lohmeier et al. (2020: 1522) argue, “even the most efficient and powerful technologies cannot provide real-time results or seamless temporal transmission.” What they call “temporal leeway” highlights how the velocity of knowledge and action hinges on a wide array of sociotechnical relations. Such a stance ties in with empirical studies on data and acceleration. In border control contexts, Sontowski (2018) has for example shown how automated gates for faster border crossings have instead led to frictions and congestion, and Pollozek (2020b) has illustrated how data circulation among border control agencies causes “turbulences” that decelerate rather than accelerate intervention capacities. Similarly, in law enforcement contexts, Andrejevic et al. (2020) and Leese (2020b) have demonstrated how, within police organizations, data become tangled up in conflicting logical and organizational rationalities that tend to slow processes down rather than speed them up.
In this paper, we investigate the temporal implications of data in law enforcement and border control. We start from the assumption that the velocity of knowledge and action is defined by heterogeneous formations and interactions of various actors, sites, and materials (Weltevrede et al., 2014; Kitchin, 2023). To analyze these formations and interactions, we introduce and unpack the concept of “data temporality.” Data temporality explicates how the speed of knowledge and action in datafied environments unfolds in close correspondence with (1) variegated social rhythms, (2) technological inscriptions, and (3) the balancing of speed with other priorities. Specifically, we use the notion of data temporality as a heuristic tool to explore the entanglements of data and time within two case studies: Frontex’ Joint Operation Reporting Application (JORA) and the predictive policing software PRECOBS. The empirical analysis identifies two key themes in the constitution of data temporalities. The first one pertains to the creation of events as reference points for temporally situated knowledge and action. And the second one pertains to timing and actionability or, in other words, the question of when interventions based on data analysis should be triggered.
The analysis contributes to the literature by addressing a thus far understudied aspect of data assemblages. Ideas of accelerated knowledge and action have arguably been mostly implicit but seldom explicated themes in digitization narratives. Although attention has been paid to the speed of the availability of data (as witnessed, for example, by the velocity dimension of Big Data definitions) (Kitchin and McArdle, 2016) and to the temporal dimension of datasets themselves (Letondal et al., 2009), the sociotechnical forms of temporality engendered with and through data have not yet received systematic scholarly attention (for exceptions, see Coletta and Kitchin, 2017; Kitchin, 2014). Our analysis implies that data temporalities come into being in complex and situated forms that command close empirical attention. While in a vacuum, data and algorithms might in fact significantly speed up knowledge and action, these capacities tend to be undercut by practical entanglements.
The paper proceeds as follows: First, we introduce our case studies and revisit the methodological considerations and empirical data that inform the analysis presented here. We then develop the notion of data temporality. The subsequent empirical analysis illustrates how data temporalities in JORA and PRECOBS come into being through the production of events as well as trade-offs between organizational preferences. Engaging the particular empirical forms of constructing bureaucratically treatable phenomena and resolving frictions between data validation and actionability across the two cases allows us to identify and compare how different solutions are crafted, applied, and result in differentiated forms of speed. The paper concludes with a call for more empirical research on complex and at times contradictory data temporalities.
Methodology and empirical data
The analysis we present in this paper builds on qualitative empirical research that has investigated the use of Frontex’ JORA system in the context of “Operation Poseidon” (the Frontex mission in Greece) as well as the implementation and use of the predictive policing software PRECOBS (PreCrime Observation System) by six police departments in Germany and Switzerland.
Work on JORA was carried out in the context of a sociological research project on the bureaucratic and organizational rationalities of the work of Frontex in the Mediterranean, with a particular focus on how Frontex operationally supports Greek authorities in the areas of border surveillance as well as the identification and registration of migrants (Pollozek, 2020b; Pollozek and Passoth, 2019). JORA is an IT system designed to provide informational support for Frontex operations, both in terms of reporting and situational awareness at the EU external borders. Its main features include, among others, a reporting tool for border-relevant incidents, an administrative workflow that allows for the informational and operational management of recorded incidents, and a dashboard that is supposed to allow for seamless information flow throughout involved agencies. In this capacity, the system contributes to Frontex’ aspirations of “live governance” by providing “actionable information and analysis to enable the effective and efficient functioning of the European Border and Coast Guard, based on integrated risk analysis and vulnerability assessment.” 1 The focus of JORA is on criminal events and irregular border crossings encountered by Frontex and its partner organizations. Its main rationale, as Tazzioli (2018: 282) puts it, is to enable preventive strategies to “respond to future migratory risk, migrations and changes in the logistics of crossing” by providing timely information to border control actors.
PRECOBS was studied in the context of a criminological research project on the implementation and use of data and algorithms for targeted crime prevention (Egbert and Leese, 2021). PRECOBS is a commercial software tool for the prediction of increased crime risk that is used by police departments in Germany and Switzerland. It was one of the first predictive policing applications in Europe to be integrated into everyday police work on a regular basis. The purpose of PRECOBS is to provide police departments with insights into ongoing criminal activity and enable them to reallocate their resources for crime prevention in a targeted fashion, in the best-case scenario preventing further offenses from occurring in the first place (Schweer, 2015; Balogh, 2016). The basic rationale of the software is to algorithmically identify patterns in crime data, enabling timely intervention strategies into ongoing developments (Kaufmann et al., 2019). To do so, it mainly focuses on residential burglary and builds on near-repeat victimization theory, rational choice, and situational crime prevention as conceptual underpinnings of its functionality (Egbert, 2017).
The selection of JORA and PRECOBS as case studies in the context of data temporalities arguably warrants some justification. Although law enforcement (and especially domestic policing) and border control are usually legally separated domains and pertain to different empirical phenomena, we believe there are several commonalities in their larger organizational and operational logics that are targeted by datafication. Both police agencies and Frontex are public sector organizations whose existence is directly linked to the production and maintenance of a particular, legally prescribed form of social order. To do so, they both seek to identify and manage events that could threaten the social order that they are tasked to uphold. In law enforcement, these events take on the form of crime, whereas in border control, they take on the form of irregular border crossings. Both types of events are increasingly supposed to be discovered, managed, and/or prevented through the production and analysis of digital data.
Most importantly, both JORA and PRECOBS are strongly driven by the aspiration to accelerate knowledge and action processes within security organizations. JORA is supposed to provide “a constantly updated picture of the irregular migration situation at the external borders of the EU” (Frontex, 2014: 35) and foster increasing “reaction capability” (European Commission, 2008) for border control actors by interlinking various data streams and information systems. Similarly, PRECOBS comes advertised as a tool that allows police departments to move from reactive to preventive action, accelerating crime analysis and subsequent targeted prevention capabilities to such an extent that the police would be able to intervene in ongoing events (Schweer, 2020). Both JORA and PRECOBS are thus driven by the ambition to instill security organizations with a heightened sense of awareness and increased reaction capacities that are as close to real time as possible. In the context of this paper, the juxtaposition of the two case studies allows us to show how in everyday practice, different forms of data temporalities come into being in contextualized forms that depend on the ways in which security organizations create “events” and decide on actionability requirements.
Empirical data in both contexts consist of interviews with involved practitioners, participant observation, and document analysis. The resulting datasets have been separately analyzed via in-vivo coding, resulting in topical clusters that allowed for the identification of pertinent themes around datafication and its implications for organizational forms of knowledge and action. In both cases, temporality featured as a prominent cluster, inspiring us to engage with data and speed across the two case studies. Although the research design should not be mistaken for a comparative study in a formal sense, engaging with data and speed in JORA and PRECOBS allows us to identify similarities and differences in how temporalities come into being through the interplay of different social rhythms, technological inscriptions, and the balancing of speed with other organizational priorities.
Data temporality
Before turning to the empirical analysis, in the following we introduce and unpack the concept of “data temporality.” As opposed to “time” as a universal concept, temporality describes concrete and contextualized manifestations of time and their relations and ordering effects (Ho, 2021: 1668; Hoy, 2009: xiii). In this capacity, temporality draws attention to the relational dynamics that determine who gets to make sense of time and in which ways (Sharma, 2014). Importantly, temporality is closely entangled with the social and technical tools that are used to bring relationships with time into being. As Koselleck (2002) puts it, there is no temporality independent of the devices through which time is conceived and experienced. Temporality is thus the product of an interplay between social and technical elements that needs to be studied both with regard to the interactions that surround it and its technological registers (Münkler, 2009). Data temporality, in this sense, describes the temporal ordering effects that are produced with and through data within larger sociotechnical environments. Drawing on literature from criminology, critical data studies, and STS, we introduce three registers that affect how data temporalities come into being and unfold meaning.
The first register pertains to the variegated social rhythms within which digitally mediated knowledge and action are inevitably embedded. The most pertinent definition of rhythm in regard to social life arguably goes back to Lefebvre (2004: 15), for whom it is constituted by the “interaction between a place, a time and an expenditure of energy.” The social sphere is thus, from a temporal point of view, characterized by a multiplicity of simultaneously existing rhythms and the ways in which they blend into or clash with each other. Felson (2006: 6), in applying the concept of social rhythms to crime and urban life, has noted how “the daily life of a city provides the targets for crime and removes them,” as for instance “the daily movement of activities away from residential areas makes burglary easier.” For him, the temporality of crime is thus deeply affected by the rhythmic interactions that form the “metabolisms” of both the city and the crime itself.
In regard to how data temporalities intersect with variegated social rhythms, Coletta and Kitchin (2017) have retraced how data-driven “smart city” programs in practice enter complex temporal relations with different forms of social activity in urban environments. To account for these relations, they decompose specific forms of time-related status or change (e.g. speed, acceleration, or slowness) into the rhythmic forms of the phenomena to which they relate. The proverbial “heartbeat of a city,” so they argue, thereby provides the resonance frame within which data-driven interventions in urban governance must be integrated. What Coletta and Kitchin (2017: 12–13) term “algorhythmic governance” then describes the analysis and making of rhythms, “measur[ing] and reveal[ing] the polyrhythms of the city [and] actively […] mediating and calibrating repetitions and rhythms in the world.” Consequently, an analysis of data temporalities needs to include the sociotemporal ordering of the phenomena into which predictive analytics seek to intervene.
The second register concerns the inscription of temporalities into technological tools. The notion of inscription is used by STS scholars to capture how science practices involve particular forms of written and visual communication, calculation, and other technologically mediated ways of doing and saying (Latour, 1990). Importantly, each mode of inscription comes with its own set of rules that define which statements can be formulated within the boundaries of a given system (e.g. technical drawings or sensor measurements) and how they must be expressed. Inscriptions thus prestructure the ways in which scientific facts are assembled and stabilized. In this capacity, the notion of inscription can also provide important hints at the role of technological tools in the constitution of temporality.
In regard to how data temporalities intersect with technological inscriptions, Bourne et al. (2015) have analyzed how a portable integrated CBRNE (Chemical, Biological, Radiological, Nuclear, and Explosives) detection device assembles and mediates multiple temporalities. As they show, it is through a “dialectic [design process] of resistance and accommodation” (Pickering, 1995: 22) that different chemical reaction times of substances, international standards for radiation and nuclear detection, estimated durations of border practices of detection and identification, as well as tempi of data processing regarding different forms of detection are stitched together and produce a “Total Detection Time” (Bourne et al., 2015: 317−318). Those distributed temporalities then “prescribe” (Latour, 1990) forms of data processing in advance by the “everyday decisions of scientists and engineers in the laboratory, the security experts they engage, and the material components of the device itself” (Bourne et al., 2015: 307).
The third and final register of data temporality concerns the balancing of speed with other organizational priorities, requiring an analytical focus on how different bureaucratic and operational needs are articulated and related to each other in particular contexts. When different priorities cannot be aligned without frictions, balancing is often formulated in the logic of a trade-off, that is, the idea of a reciprocal influence between two or more concepts, whereby an increase on one side would automatically mean a decrease on the other side(s) (Da Silveira and Slack, 2001). Although trade-offs are a common framing for apparent clashes such as between speed and the reliability and accuracy of data (Maguire and McVie, 2017; Mayhew, 2014), they can also pertain to less obvious relations such as between accelerated data processing and legal data protection obligations (Bellanova, 2017).
In regard to how data temporalities intersect with trade-offs, Mackenzie (2017) has for example shown how in high-frequency trading the development of data infrastructures is affected by tensions between considerations of speed, cost efficiency, and reliability. And Pollozek (2020a) and Pelizza (2020) have pointed to the different operational priorities of national border guards and EU forces in border control and migration management, illustrating how some actors are in favor of accelerated registration and data production to resolve administrative logjams and quickly clear overcrowded camps, while others articulate concerns about resulting sloppy data entries into European migration control databases and conflicts with larger EU policy goals.
Data temporalities in JORA and PRECOBS
The remainder of this paper retraces empirically how data temporalities come into being in JORA and PRECOBS. Using rhythms, inscriptions, and trade-offs as analytical sensitivities, the analysis identifies and engages two key modes of how data temporalities emerge in sociotechnically mediated ways. The first one pertains to the creation of events as temporal reference points for knowledge and action. And the second one pertains to timing and actionability, that is, the question of when knowledge should be turned into action.
Events
As discussed earlier, the work of both law enforcement and border control to a considerable degree revolves around the detection, identification, and treatment of events. In JORA, events are irregular border crossings or other criminal events in relation to the EU external borders. And in PRECOBS, events are criminal offenses, particularly in the form of serial criminal activity. The notion of the event, in abstract terms, provides a bureaucratic reference point that renders empirical reality processable and manageable within the scope and instruments of an organization. For Anderson (2010), from an ontological perspective, events can be understood as disruptions of the established order of things – and this disruptiveness challenges routine ways of governing. Involved actors – usually those organizations that carry out security-related tasks – thus need to address and resolve events to prevent the further unsettling of social and political order.
At the same time, the event itself is a necessary condition for government, as it marks a point of departure for either the reestablishment of the previous order or the erection of a new order (Anderson and Gordon, 2017). Ingram (2019: 166) notes in this regard how “in the course of an event, the orders of materiality, politics and publicity themselves intersect with, and cross over into, each other.” For him, the disturbance created by events is particularly relevant for how we think about security, thus speaking closely to the social ordering functions of security actors such as law enforcement and border control organizations. From a security point of view, as Anderson and Gordon (2017: 160) argue, events then need to be “drained of their eventfulness,” that is, their disruptive nature needs to be tamed through the assembly of information that can inform interventions that subsequently fold any disturbances back into the desired course of things.
Understood through the notion of the event, the purpose of both PRECOBS and JORA is to provide the possibility for the accelerated creation of events based on data as a baseline for subsequent action. Notably, as we have argued earlier, a central idea in both cases is to speed up knowledge production to an extent that interventions are made possible early on and events can be drained of their eventfulness as – or even before – they emerge. Analytically, a major focus must thus be on how events are made up and rendered treatable through the production, assembly, and validation of data. As we explicate below, the creation of an event is not an automated process but is accomplished by a variety of different activities that are in turn informed by rhythms, inscriptions, and trade-offs.
In the case of PRECOBS, our empirical data show how multiple social rhythms and inscriptions interact in the production of events. After a crime has been reported, police departments usually dispatch a patrol car to the crime scene where police officers collect evidence and trigger a bureaucratic process that creates an official record to which data about the characteristics of the event as well as metadata are added. A first complication thereby tends to occur in relation to the uncertainties that surround the temporal classification of an offense. The point in time when a domestic burglary has happened, for example, can often only be approximated, as the resident was absent during the offense and discovered it only upon returning from an absence that can range from a couple of hours to multiple weeks. In the context of PRECOBS, the time of an offense is an important analytical variable due to the aspiration of intervening into still ongoing serial criminal activity. Outdated data, in other words, rather quickly lose their analytical value as resulting interventions would already be too late for preventive purposes. A key task for police officers is thus to substantiate the temporal characteristics of an event as quickly as possible throughout the course of the investigation, for example based on the statements of witnesses or other information sources (Leese, 2020a; Leese, 2020b).
Once new information on the temporal dimension of criminal offenses is available, crime data need to be subjected to practices of updates and corrections. What is usually referred to as quality control, that is, the activities that are supposed to make sure that data are as accurate and complete as possible such that they can be trusted as the foundation for further knowledge and action, is, however, in itself determined by the rhythms of police work. Crime data are, for example, often not subjected to checks right away but only with considerable delay. As our empirical data show, most burglary cases are usually reported between 16:00 and 02:00, that is, during the night shift when fewer officers are on duty at the station, leaving little time for quality control. We also found that in many police departments, the officers who produced the data in the first place are tasked with corrections and updates. Depending on how work schedules are handled internally, they might, however, only be on duty again multiple days later, thus effectively delaying the correction of errors and/or updates of crime data based on new insights gained in the meantime (Leese, 2022).
Moreover, the creation of events in PRECOBS is prestructured by the technical devices and interfaces that police officers use for reporting and data production. At a crime scene, the ways in which crime is turned into data are mediated by the reporting templates and underlying classification system that structures police databases and the production of aggregate statistics. To produce uniform records that can be aggregated and analyzed, reporting templates define criminal events mostly by means of standardized categories that specify the characteristics of an offense – in the case of residential burglary, for example, the location, stolen items, and the so-called “modus operandi,” that is, a detailed description of how the burglar gained access to a dwelling (Egbert and Leese, 2021: 86). In regard to temporality, reporting templates require a data value for the (exact or approximated) point in time when the offense took place. Temporal variables, although often subject to change, are thus fixed – at least in a preliminary fashion – through the inscriptions that mediate how crime data come into being and make crime analyzable and governable in a systematic fashion.
Importantly, once crime data have been produced and put into the central police database, they branch off into two related but operationally separate versions. The first version is used for process management, containing all relevant information about how the investigation is handled by the police. The second version is the actual case file that contains all relevant factual information that is produced throughout the process. The case file is part of the police database that would be intuitively associated with knowledge, as it contains detailed information about the characteristics of a particular offense. In relation to data temporalities, the branching off of crime data into two related yet separate files means that data need to be subjected to separate update processes. These processes, in turn, can take multiple iterations, rendering the event and its representation in data contingent on recursive adjustments (see Figure 1).

Figure 1. Criminal events and recursive updates.
This should, however, not be mistaken for a smooth or straightforward activity. Rather, as our empirical data show, practices of investigation, reporting, and updating touch upon the domains of multiple specialized units within police departments and accordingly require substantial forms of alignment and coordination. Notably, whereas technical personnel and crime analysts are concerned about data quality and the importance of corrections and updates, frontline officers often have to deal with a massive overall workload that includes many different tasks that they might (have to) prioritize over quality control and updates (Egbert and Leese, 2021: 93). Hence, the loose collaboration of actors coming from different social worlds with differing activities, backgrounds, relevancies, and pragmatics makes data updates challenging and error-prone (Star and Ruhleder, 1996).
In summary, a perspective on data practices in PRECOBS sheds light on how absent criminal events are made present through a lengthy process of manual labor that operates at the intersection of different rhythms and inscriptions. The creation of crime events as defined by the reporting template usually demands an ongoing investigation that extends beyond a single patrol officer’s shift. Throughout this process, the temporal characteristics of a burglary are continuously updated and the corresponding event is readjusted. The interplay of the classification scheme inscribed into the reporting template, the constraints and pragmatics of patrol officers, and the contingent course of the police investigation is thus key in shaping the temporal unfolding of data collection. The recursive process of investigation and the rhythms of data collection, reporting, data validation, and data upload produce frictions that lead not only to incomplete data but also to severe temporal delays in terms of how quickly crime data can be analyzed and inform interventions (Leese, 2020a; Leese, 2020b). It is in this sense paramount for police departments to decide when data are “ready” to be analyzed. We will engage with this question in more detail below.
In JORA, events carry equal temporal significance but take shape in a slightly different way. A “border event,” that is, a border crossing that warrants more or less timely intervention on the side of border control agencies, is here not represented in a centralized fashion but in a dispersed way that includes multiple actors. A second key difference is that, unlike police officers who reconstruct an unknown phenomenon ex post, border guards are usually directly involved in the events that they report. Rather than updating, the key practice can here thus be described as synthetization: the assembly of various reports to create a single “master” version (Pelizza, 2016) of a border event. Analytically, the focus must then be on the coordination of distributed activities of reporting across various events, actors, and places.
Generally speaking, a border event is created through the JORA “incident report” template, an online entry mask featuring more than 60 data fields. Within this classification system, border events are split into several parts and coordinates, relating to different border guard units that carry out different tasks. Importantly, the reporting template asks for specific geographical and temporal data that refer to the detection of a vessel at sea (which can be done by thermo-vision, land patrol units, boats, or airplanes). Another item battery specifies time and place of the “interception of migrants,” which is conducted by sea vessel or land patrol units. It also asks for the condition and equipment of the boat, as well as for smuggled goods, which implies that border guards on site should inspect the boat, search it for drugs, cigarettes, or other goods, and produce documentation, for instance, take pictures of marine engines. Notably, data on “facilitators” or “operators,” including photographs, are also requested (Frontex, 2016).
Moreover, there are categories concerning the identification of migrants, including nationality, age, gender, or country of departure. For these, additional documents are checked and migrants are interviewed. In the context of Operation Poseidon, these tasks are conducted by the so-called Frontex Advanced Level Document Officers (ALDOs), Frontex screeners, and Frontex fingerprinters deployed in registration and identification centers. There is also an entry field that inquires about the “impact level” of a border event, which requires an assessment of the “level of intervention, technical tools, people and money that are estimated to be necessary for managing a certain migration phenomenon” (Tazzioli, 2016: 566). Last but not least, the incident reporting template asks for details regarding any search and rescue (SAR) operations in relation to the event.
What becomes clear from these categories is not only how the characteristics of border events come into being in technologically mediated ways but also how border events epistemically hinge on heterogeneous information from different sites, practices, and actors of a border monitoring mission. Similar to the rhythmical intersections of crime, police work, and data production discussed above in the context of PRECOBS, in JORA the temporal dimension of event data intersects with the complex and layered rhythms of a border mission. Frontex activities in the context of Operation Poseidon include the coordination of coast guard maritime units, aerial units, and thermo-vision vehicle units that are instructed to detect incoming boats and subsequently inform the control center which then dispatches units for interception. After migrants have been brought to shore, screened, and provided with first aid, they are taken to the closest registration center where they are identified, fingerprinted, medically screened, and legal documents are issued (Tazzioli and Garelli, 2018). The rhythms of each phase of a border monitoring mission may, however, evolve in very different ways.
First of all, the detection and interception of migrants is entangled with the rhythms of migratory movements and patrolling. Patrolling is about the positioning and movement of border guards through terrain for the “tactical domination of space” (Nail, 2016). From an operational point of view, the question is thus where and when to position border guards and which patrol routes to take. At the same time, migrants and smugglers are usually aware of established patrol routes and seek to evade the tactics of border guards (van Reekum, 2019). Empirical data collected in the context of Poseidon activities on Lesvos show how the rhythm of migratory movements from Turkey to Lesvos undergoes continuing alterations. During some weeks, boats primarily headed to the north of the island, while during other times southern landing points were preferred. Likewise, the times of day of border crossings shifted from the morning hours to the night. News about pushbacks by Hellenic border guards led to intensified attempts to avoid Turkish and Hellenic coast guards altogether, or to ignore Hellenic coast guards for as long as possible in order to reach Lesvos shores.
In response, border authorities adjust to the changing rhythms of border crossings by reorganizing shift schedules and patrol routes. To keep track of the overall situational picture, trend maps are created and discussed in the daily and weekly meetings of border officials. A particularly noteworthy aspect in this regard is the reorganization of border work along the temporal dimensions of border crossings. In the context of Operation Poseidon, local coordination center officers usually consult different officials from the Hellenic coast guard, Hellenic police, and Frontex units, and align differing working hours and availabilities. However, during shift changes, when boats return to harbors or headquarters in Mytilini, there can be some hours when borderzones are effectively not monitored at all. As a consequence of the interplay of different social rhythms, there can thus be cases in which border guards are not capable of detecting and intercepting migrants at sea.
The rhythms of a border operation, coupled with the number of involved border guard units from different authorities, complicate the creation of border events as reference categories for governance and interventions. From a practical perspective, incident reporters act as coordinators who must communicate with all units involved in order to receive and distribute information. They do so via various communication channels (e.g. mobile phones, messenger apps). Moreover, they “negotiate” with involved actors when data need to be delivered such that the characteristics of border events can be substantiated and acted upon. The required synchronization between different parts of border monitoring activities can, however, often be complicated by conflicting relevancies and prioritizations. First aid, SAR activities, the securing of a landing zone, or other tasks must, for example, often be prioritized by operational forces – while “paperwork” such as filing reports is pushed back and taken care of only when there is an opportunity to do so. As a consequence of such prioritization, border guards sometimes create their reports only after their deployment in a mission and subsequently submit them to the incident reporter with considerable delay.
Furthermore, the production of border events tends to be slowed down additionally by the mandatory nature of many data categories in the incident reporting template. The template requires, for instance, details of the identified and registered migrants, which can only be collected at the very end of a border mission, that is, after migrants have been detected, intercepted, transported, registered, and fingerprinted (Pollozek, 2020b; Pollozek and Passoth, 2019). Through the interplay of the reporting template, the course of a border mission, and the data collection practices of the border guards, incident reporters coordinate a “crooked” process of data collection that can take up to a full day and involves the handling, cross-checking, and consolidating of different accounts into a single version of the incident report in JORA (Pollozek, 2020b). The process of creating a border event should thus be primarily understood as a form of synthetization that creates a unified narrative that – although preliminary and still subject to validation – provides a common reference point for further knowledge and action (see Figure 2).

Figure 2. Border events and synthetization.
In summary, events in PRECOBS and JORA can be considered mutable phenomena that are contingent on how they come to be represented as data. Their temporal aspects, in particular, are shaped by the social rhythms that they relate to and the technological inscriptions that they are based on. As we have seen, the temporal character of both criminal events and border events is generally considered volatile and thus needs to be subjected to different forms of updating (PRECOBS, Figure 1) or synthetization and validation (JORA, Figure 2). In both cases, digital reporting templates to a large extent define the data properties of an event and require forms of data production that are socially (i.e. among many involved actors) and/or temporally (i.e. extending beyond the original point in time) distributed. Understood through the concept of data temporality, processes in PRECOBS and JORA illustrate the entanglement of the social rhythms of criminal events and border crossings with larger organizational contexts of law enforcement and border control, especially shift work and availabilities, as well as cycles of data production, updating, and synthetization. Interactions between these different elements produce waiting times and delays as well as frictions and errors, raising the question of whether and how data can be further processed and put to use. In the following, we retrace how decisions about the optimal timing for data analysis and resulting interventions come into being.
Trade-offs and actionability
As outlined earlier, both law enforcement and border control are driven by the idea of faster and more efficient interventions through data. This requires a minimal time lag between data production, analysis, and ensuing action. However, as we have shown in the previous section, the creation of events and their subsequent substantiation require coordination and take considerable time in the first place. Additionally, doubts about the trustworthiness of the data created in the context of crime or border events interfere with the imperative of fast analysis and accelerated action. As a result, the alleged speed of data in law enforcement and border control slows down considerably and might undercut the actionability of data, that is, the timely creation of knowledge that enables interventions before it is too late. In practice, this situation is usually framed as a trade-off between data speed and data quality, suggesting that, due to the duration of validation processes, data can either be quickly available or trustworthy, but not both at the same time. This tension is dealt with pragmatically by seeking a balance between “good-enough” data and data that are not “too outdated” – with tendencies toward one or the other side depending on organizational priorities and ecosystem requirements. Below we reconstruct how this trade-off is resolved differently across our two cases.
In regard to PRECOBS, crime analysts are faced with the question of whether it is preferable to analyze data at a preliminary stage, enabling early operational measures while accepting the possibility of missing values and inaccurate classifications – or whether it might not be better to wait a little longer and work with updated data but run the risk that insights into ongoing criminal activity might already be (partially) outdated once corresponding crime prevention measures would be implemented (Leese, 2022). In practice, this dilemma is resolved through the data infrastructures that police departments use to manage and process their data. As already briefly discussed earlier, crime data are usually branched off into separate process management and case management files. The former primarily serve administrative purposes and consist of a unique identification number to which primary information about the event as well as metadata about the bureaucratic treatment of the event are attached. Process management files are created automatically once a citizen gets in touch with the police and triggers an administrative process by reporting a crime, filing a complaint, or otherwise creating a task that must be documented and tracked. The aim of process management files is to produce rudimentary knowledge as quickly as possible and to enable the police to manage work processes internally.
Case files, on the other hand, are geared toward traditional knowledge production throughout criminal investigations. They are structured in a similar way, that is, they consist of a unique case file number to which information, reports, documents, and other media files can be linked. Compared to process management files, case management files are less formalized but the data they contain are generally considered more reliable. They are, however, also considerably slower, as new data might only be added over the course of a (lengthy) investigation. When deciding when to trigger algorithmic data analysis, police departments are thus faced not only with the question of at which point in time data can be considered sufficiently trustworthy but also with the question of which system to extract them from.
Our empirical data show that police departments using PRECOBS generally considered the need for action more prevalent than the need for accuracy. In other words, they were willing to implement data-based crime prevention measures in the “wrong” neighborhoods (i.e. areas that would not have been flagged as susceptible to increased criminal activity based on validated data) rather than carrying out crime prevention on a completely randomized basis (Egbert and Leese, 2021). In practice, it is usually left to the discretion of crime analysts to run analyses on existing data at different points in time and eventually decide when data can be considered “good enough” to guide action. In doing so, several police departments found that non-validated process management data could, despite their shortcomings in terms of trustworthiness, in fact be used as input for predictive policing software. This means that for the use of PRECOBS, they could opt for the repurposing of data that were not originally meant to provide the basis for analytical purposes (Egbert and Leese, 2021: 84).
Notably, when process management data provide the primary data input for PRECOBS, police departments put the knowledge that is produced in crime analysis on shaky epistemic foundations. Process management data have a deliberately provisional and volatile character that speaks to the uncertainties that can hardly be avoided in police work and the production of crime data. The choice to work with potentially unreliable data, however, is largely preconfigured by the theoretical assumption that the likelihood of follow-up crime decreases rapidly after the first 72 hours following the initial incident (Townsley et al., 2003). In other words, the need for timely data to intervene in ongoing criminal activity essentially renders the more reliable yet slower case management data useless. At the same time, police departments are fully aware that the potentially precarious data foundation of predictive policing means they must pay even more attention to quality control and that the analyst must act as an additional fail-safe who double-checks the plausibility of algorithmically created alerts against the underlying data(base) (Egbert and Leese, 2021: 85).
In the case of JORA, the temporal trade-off between “good-enough” and “not too outdated” data is dealt with rather differently. Data on border events are used for several tasks that come with different demands regarding data quality. As all Frontex units refer to the same data, any trade-off thus needs to take into account different use cases for these data. Data on border events are, for example, key to an overall situational picture for purposes of planning and adjusting staff requirements, work shifts, and patrol routes and schedules. For this purpose, data are required not only on current events but also on recent ones. Only then, according to the rationale at play, could trends regarding the movements, routes, times, vessels, and tactics of migrants be accurately deduced. Although a “quick and dirty” form of data collection might thus be considered sufficient for operational management, such an approach would clearly be at odds with other use contexts and the global data quality requirements of JORA.
Data uploaded to the JORA database are, for example, displayed on an interactive map where all border events appear within operational areas in the form of dots, indicating the number of migrants intercepted and the expected level of the impact of operations. This map not only provides an overview of entire operational areas but is also an important tool for (re)allocating personnel, vessels, and equipment, as well as for demanding more resources from the European Commission or from EU member states (Paul, 2017). Additionally, data on border events are used by the Frontex risk analysis unit that produces monthly, quarterly, and annual reports and other outputs. The unit also shares risk analysis with other member state authorities through the Frontex Risk Analysis Network. For these contexts, accurate and reliable data are required.
To ensure the trustworthiness of border event data, Frontex has set up a validation cycle that balances data quality and speed. To this end, the online system coordinates a validation process comprising multiple iterations of crosschecks. In the context of Operation Poseidon, after the JORA incident reporter has created and submitted a report at the local coordination center, it is forwarded to the Hellenic coast guard for incident validation. Upon receipt, a credibility check is conducted by scanning the data entries for obvious mistakes (e.g. implausible temporal or geographic coordinates). Moreover, the incident report is compared to the accounts of involved border guards for consistency. Only then is the JORA report sent to the Frontex situation center in Warsaw, where an “incident validator” (usually a border officer) conducts a final check on the data – however, without being able to consult the shift reports of the involved border guards. The design of JORA thus aims to streamline distributed practices of data production and validation by means of setting deadlines and time markers (see Figure 2).
However, as outlined above, the creation of border events remains contingent on actual border missions and is difficult to schedule. Moreover, incident validators at the international coordination center are not permanently available, thus further complicating the time frame for “finalizing” a report and often extending the timeline until the next morning or day (Pollozek, 2020b). A further complication is presented by the reversibility of the quality control cycle. In practice, inconsistent or implausible data can be rejected and sent back to the incident reporter, who then needs to revise and resubmit the report (Pollozek, 2020b). Through this reversibility, a resolution of the trade-off toward quality is essentially inscribed into JORA's validation procedure. There are, however, loopholes that make it possible to undercut the prioritization of high-quality data. Technically, JORA provides the possibility for selected users to access data that have been saved in the system but not yet finalized. This means that even though these data are not considered trustworthy enough for risk analyses or for the compilation of a situational picture, they can be used informally in operational contexts (Pollozek, 2020b).
In summary, we can see that in both PRECOBS and JORA, the resolution of temporal trade-offs is based on balancing and prioritization choices that correspond with organizational preferences and the use cases that data need to be subjected to. While in PRECOBS the recursive process of data updating is contingent on the course of an ongoing investigation (see Figure 1), in JORA data quality is defined and accomplished through a procedure that considers data “good” if the different reports are consistent with each other and after the data have been reviewed in at least two validation steps (see Figure 2). Another important aspect concerns the nature of data and the tasks they are produced and mobilized for. In the context of PRECOBS, data are branched off into different databases early on, allowing for different quality standards that speak to internal management processes on the one hand, and judicial proceedings and official statistics on the other hand (see Figure 1). Notably, police departments prefer less trustworthy but more quickly available data for analysis, as their analytical value might otherwise be outpaced by the rhythm of crime itself. In the case of JORA, having only a single database for multiple use cases means that validation must be prioritized over early availability – however, with a notable loophole that nonetheless provides informal access to non-validated data (see Figure 2).
Conclusions
In this paper, we have critically engaged the notion of doing things faster with and through data. We have suggested that from an empirical point of view, despite the growing availability of data and computing power, we are unlikely to encounter any such thing as straightforward acceleration or speed. Rather, we have proposed studying the temporal ordering effects that come into being with and through data by means of the notion of data temporality. The concept highlights how the sociotechnical speed of data comes into being in relational ways within complex ecosystems. To study the emergence of data temporalities empirically, we have proposed engaging with the social rhythms that they correspond with, the technological inscriptions that enable representations of time, and the prioritizations to which they become subjected in sociotechnical settings. To illustrate the benefits of this approach, we have analyzed two cases where the aspirations of acceleration and resulting intervention capacities loom large: law enforcement and border control. Studying the software tool PRECOBS for the identification of and intervention in ongoing criminal activity used by police authorities as well as the database system JORA for border monitoring and awareness used by Frontex, our analysis has shown how the temporal ordering effects of data are contingent on the interactions between rhythms, inscriptions, and trade-offs.
Specifically, we have shown how events are brought into being in relation to the rhythms of the phenomena that they correspond with, the duration of data production processes, and not least the technological scripts of how data about crime and border crossings are imagined and enunciated. By relating data to the larger ecosystems within which they are embedded, we have demonstrated how processes are slowed down, delayed, and kept in loops. In PRECOBS, we have retraced how criminal events take shape through an iterative process of updating over the course of a criminal investigation, resulting in temporal relations that clash with claims of instantaneous analytical capabilities and real-time situational awareness. Rather, recursive processes of updating here create frictions with regard to the coordination of the work practices of patrol officers and analysts, resulting not only in delayed processes but also in erroneous and/or misclassified data that need to be subjected to additional quality control processes. In the case of JORA, on the other hand, we have seen how events come into being as the result of a synthetization process that assembles and coordinates several reports created by different units involved in a border mission. While an event can in principle be constructed based on the detailed accounts of border guards, the practical challenge here is to coordinate all involved actors, their priorities, and their paces of data production and incident reporting.
Working through the notion of the trade-off, we have also shown how executive decisions around data revolve around the need for actionability and different use cases in relation to the length and depth of validation processes. As we have seen, in both cases there is a tension between speed and data quality that is framed as a trade-off between “good-enough” data and “not too outdated” data. While in PRECOBS this trade-off is resolved with a priority for early availability, in JORA we have seen a prioritization of validity. Both instances are, however, closely entangled with the technological design of the information systems within which they are embedded – in both cases allowing for pragmatic choices. In PRECOBS, the database structures used by police departments enable analysts to work with non-validated data rather than wait for updates and corrections. And JORA includes an “early access” option that gives selected users insights into preliminary data, such that these data can already be mobilized for purposes of operational management without interrupting or corrupting the actual validation process.
Overall, as our analysis demonstrates, data temporalities should be understood and conceptualized as intricate and multilayered processes that add up and intersect within particular institutional ecologies – and that must be subjected to empirical study in order to understand their implications for how fast or slow things can be done. There is a clear need for more research into the complex, twisted, and seldom straightforward relations between data and speed, if only to curb the sometimes over-enthusiastic takes of technology providers and policymakers regarding the capacities of data as a tool for social and political ordering. The domain of security is a particularly pertinent one in this regard but by no means the only one. It is our modest hope that our work can serve as a starting point for a more substantiated debate on data and temporality within critical data studies and beyond.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
