Abstract
How do border security practitioners engage with data and technology, and what difficulties or limitations arise from these engagements? Responding to calls for critically examining how technological ‘solutions’ are enacted, we analyse the notion of e-Borders in the UK context as an assemblage comprising abstract conditions, concrete objects, and agents whose roles often manifest themselves through perceptions and practices. We draw upon interviews with former and currently serving senior staff from the UK Home Office, UK Border Force, intelligence services, and private sector suppliers. Practitioners’ reflections reveal how political, social, and human factors—including intuition and management cultures—both construct the e-Border assemblage and introduce discontinuities and frictions within it. Using a more tightly specified theory of assemblage, we highlight how human agents contribute to datafied phenomena like border control. In total, our study emphasises how assemblages are dynamic, never entirely coherent, and always being re-made.
Introduction
One idea behind e-Borders is that the name of each person who enters or leaves the United Kingdom will be checked against the lists of criminals and terrorists kept by the security services and the police. It sounds simple, but the problems with making it work have turned out to be formidable. One of them has more to do with human fallibility than computer problems …. (Palmer, 2014)
Both the abstract notion and practical implementation of e-Borders in the UK context provide ways to consider how data and technologies feature in modern border security. At one level, accessing and using personal data for security purposes highlights well-established concerns about privacy. But at another level, the fact that these intentions manifest themselves at international borders makes them especially powerful for states in political and symbolic ways: ‘“the border”, whilst the skin of the state literally, rhetorically is at its heart’ (Megoran et al., 2005: 735).
We aim to link developments in border studies and critical data studies to show how border security practices and their associated technologies are embedded within political, social, and interpersonal contexts. Borders increasingly function as selectively permeable ‘filters’, allowing some people through while restricting others. To an extent, this echoes Megoran et al.’s (2005) image of physical skin: data and technologies contribute to the integrity of states’ ‘bodies’. But our study also reveals discontinuities and frictions in border security by presenting a closer examination of how practitioners use data and technologies to enact this filtration function. For example, it finds that being free of data-based risk flags or suspicious records—having ‘clean skin’ in the words of an informant—is not always enough to successfully pass through borders.
Taking cues from approaches in critical data studies that locate people and politics in contemporary algorithmic life (Dalton et al., 2016), we connect with ongoing conversations about the changing nature and composition of border security. Several recent studies compellingly show how data and technologies of surveillance leave traces on, and exert power over, border-crossers (Adey, 2009; Amoore and Hall, 2009; Pötzsch, 2015). These traces and practices do not even have to correspond with the formal border, instead being implemented virtually and remotely (Ajana, 2015; Guiraudon, 2003).
Such changes are increasingly theorised within the framework of ‘assemblages’ (Deleuze and Guattari, 1988), an approach emphasising how heterogeneous elements combine to produce contingent arrangements. For example, Haggerty and Ericson (2000) called the interaction of objects and practices relating to security, especially when they involve tracking human bodies, the ‘surveillant assemblage’. More recently, scholars of critical data studies have used the concept of assemblage to demonstrate how data contribute to wider social structures in domains beyond security or borders (Iliadis and Russo, 2016).
Given the growing attention given to assemblages as ways to explain social worlds (Anderson et al., 2012; Dittmer, 2014), we also aim to illustrate the value of a more tightly specified version of assemblage theory comprising conditions, elements, and agents (Nail, 2017). In the case of UK border security, these generic elements correspond with the rhetorical and intended policy programme of e-Borders; data and technologies used in bordering efforts; and the people who select, enact, and engage with the other assemblage elements. In particular, we argue that greater focus on the perceptions and practices of those tasked with actually implementing the idea of e-Borders gives valuable insight into the ways that border security assemblages change and might be challenged.
Situating border security in relation to critical data studies
In the context of transnational flows of people and capital, borders are subject to contestation, representation, and social construction. Numerous scholars have signalled this feature via concepts such as ‘borderwork’ (Rumford, 2012), ‘border narratives’ (Pickering, 2006), and ‘bordering practices’ (Van Houtum and Van Naerssen, 2002). Expanding the concept of a border to include new spaces and socio-political relations complicates how borders are understood, where they can be found, and what they are made of (Johnson et al., 2011; Vollmer, 2017a). These developments re-open questions about identity, the state, and how people make sense of places (Ernste et al., 2009). As a concept continually in flux, the ‘border’ holds elements of people, politics, practices, and possibilities in time- and space-specific groupings. Thinking in this relational way demands that scholars pay attention to the differences, inconsistencies, and frictions inevitably thrown up by real-world border experiences (Allen, 2015).
Security, or insecurity, often characterises these experiences. Freer movement of people and goods has raised the stakes for states’ perceived security: ‘state border strategies reflect an attempt to reconcile the economic imperatives of globalization and regional integration with mounting political pressures to erect more exclusionary barriers’ (Andreas, 2003: 84). As a result, policed borders and policing strategies, where the goal is to filter undesirable non-state actors (Pickering and Weber, 2006), display increasingly multi-sited, symbolic, and physical aspects.
Data and technologies contribute to these aspects in several important ways (Vaughan-Williams, 2010). Here, the term ‘data’, and in particular ‘Big’ data, refers to machine-readable information that can be aggregated and analysed (Mayer-Schönberger and Cukier, 2013; Taylor, 2016). Information about passengers and migrants in the forms of facial photographs (Sparke, 2004) and biometric data (Amoore, 2006; Amoore and Hall, 2009) plays a crucial role in identifying potentially problematic individuals. This information is often collated and linked to create flags, maps, and scores—‘data derivatives’ (Amoore, 2011)—that define and identify risky behaviours: ‘what can be imagined and inferred about who we might be—on our very proclivities and potentialities’ (Amoore, 2011: 28). Furthermore, data collection and decisions concerning migrants are moved away from the border proper in a process of ‘remote control’ (Guiraudon, 2003; Zolberg, 1999). This control takes several forms: for example, carrier sanctions that place responsibility on private transport companies (such as airlines, railways, and trucking) to identify unauthorised passengers (Gilboy, 1997); or legal practices occurring within detention centres (Hall, 2012).
Critical data studies, comprising ‘diverse sets of work around data’s recursive relationship to society’ (Dalton et al., 2016: 1), is a potentially useful field in which to locate studies involving contemporary border security. The ‘critical’ element draws attention to the ways that a host of actors—individuals, governments, corporations—deploy and curate data or algorithms to exert power in many areas of life (Iliadis and Russo, 2016). At the same time, testing the limits of, and identifying the practical gaps created by, data and technologies invites opportunities to ‘contest the creation, commodification, analysis, and application of data’ (Dalton et al., 2016: 1). This signals the need to attend to serious questions about how data enable and disenable relations and activities that have uneven—even divergent—effects (Dalton and Thatcher, 2015; Dalton et al., 2016).
Combining these ideas from border studies and critical data studies has implications for how scholars can understand border security practices. First, they suggest data and technologies enable and sustain borders’ filtration functions—not only based on border-crossers’ past actions but also on their future, potential selves. Fingerprints, photographs, travel histories: by harnessing the ‘volume, velocity and variety’ (Laney, 2001) of these and other data to infer patterns and establish ‘normal’ or ‘acceptable’ levels of danger, border security rhetorically and practically becomes about risk management and future projection (Ajana, 2015; Amoore and Piotukh, 2016). This exemplifies Adey’s (2009) point that modern border control is simultaneously about individuation and massification. It identifies specific people and their unique characteristics, but also groups individuals together to establish norms and make predictions based on that constructed normality. These data are sometimes intentionally collected as part of standard border operations, while at other times they are by-products of travellers’ interactions with technologies.
Second, the objects arrayed within, at, and around borders—gates, passports, computer chips—interact with processes of decision-making to produce ‘datafied’ encounters with security. This is seen in practices relating to pre-screening and categorising migrants using documentation and data gathered elsewhere. Such encounters can centre on human bodies themselves, as Pötzsch (2015) observes in his study of how borders become individualised through technologies. They also rely upon various devices and objects such as scanners and databases (Amicelle et al., 2015).
Third, technologies and data are social: they do not exist in apolitical vacuums, but rather are implemented, negotiated, and sometimes re-directed or even subverted by humans for diverse purposes. Risk profiling, as shown in Amoore’s (2006) study of US border practices rolled out as part of the so-called ‘war on terror’, attempts to distinguish legitimate mobilities—tourism, leisure, business—from illegitimate (and presumably dangerous) ones. Others have extensively studied how devices, data, and cameras contribute to surveillance objectives (Ajana, 2015; Boyce, 2016; Pickering and Weber, 2006). But, as political and social enterprises, these efforts are neither uniformly enacted nor achieved: ‘to assume that they are is to overstate the power, scope and coherence of new systems of security’ (Walters, 2011: 59).
Reflecting on these three points, there is a need to theorise border security in ways that account for (1) a growing emphasis on borders’ filtration function to preserve ‘normality’ and ‘security’ as constructed via data and technologies; (2) the physical technologies that border-crossers engage with as part of their security experiences; and (3) the perceptions and practices of those who enact or contribute towards these functions. Other scholars have also commented on the need to particularise border security and reveal the roles that people play in its creation. Amoore, for example, observes that

… in the security domain, because the entire array of judgments made—their prejudices, their intuitions, sensibilities and dispositions—are concealed in the glossy technoscientific gleam of the risk-based solution, there is a place for critical thought to retrieve this array and arrange it differently. (Amoore, 2011: 38)
Border security assemblages: Conditions, elements, agents
‘Assemblage theory’ originates in the work of Deleuze and Guattari (1988) and was later taken up by DeLanda (2006). Assemblages have three basic elements: conditions, objects, and agents (Nail, 2017). ‘Conditions’ are the abstract, governing ideas and sets of relations that connect objects and agents in meaningful ways. ‘Objects’ are the concrete parts that get arranged in particular ways. ‘Agents’ are those people who arrange the objects according to the prevailing conditions. All three elements appear together and constitute one another in assemblages: ‘[e]ach one presupposes and is immanent to the other’ (Nail, 2017: 28). Therefore, analyses that use assemblage theory must be aware of the interdependent tensions and processes contained within every assemblage. To use assemblage is to try to understand how ‘spatial forms and processes are themselves assembled, are held in place, and work in different ways to open up or close down possibilities’ (Anderson et al., 2012: 172).
Haggerty and Ericson (2000) applied this idea to security when they developed the concept of the ‘surveillant assemblage’. For them, practices and physical objects—monitoring, tracking, cameras, sensors—combine to form larger wholes that leave deep impressions upon, and exercise control over, human beings. Previously disparate flows of people, signals, data, and desire are ‘fixed temporarily and spatially by the assemblage’ (Haggerty and Ericson, 2000: 608). So, the concept ‘introduce[s] a radical notion of multiplicity into phenomena which we traditionally approach as being discretely bounded, structured and stable’ (Haggerty and Ericson, 2000: 608). Meanwhile, Sohn (2016), exploring what the concept of assemblage might contribute to border studies, observes that physical components of border control, including surveillance technologies, interact with symbolic and expressive features such as beliefs, rituals, or policies. For him, assemblage theory provides a way of understanding how ‘practices connect and disconnect, placing borders in the process of unfolding and becoming’ (Sohn, 2016: 186).
We both build upon and differentiate our contribution from these two interventions by using Nail’s (2017) more tightly specified explanation of assemblage theory. Using this specification, we argue that analysing each element of the UK’s e-Border assemblage, particularly its agents, who are often hard to access, reveals how border security unfolds in contingent ways. Our empirical material shows how these agents’ perceptions and practices play important roles not only in enacting security as conditioned by the state through physical and datafied objects, but also in producing moments of discontinuity within border security. These moments—where, for example, technologies suggest one outcome but instincts suggest another—are important parts of the e-Border security assemblage.
Methods and data sources
We draw upon 28 in-depth interviews that were undertaken between 2014 and 2016 with officials whose work directly connected them with UK border security operations. These included senior Home Office management officials, regional and local immigration enforcement staff, and former and currently serving high-ranking officers of UK Border Force as well as the intelligence services. They also included private contractors and collaborators with UK Border Force and UK Visas and Immigration (UKVI) who were involved in developing and producing biometric scanners, such as fingerprint and face recognition devices, as well as maritime sensors.
Access to these officials depended upon gatekeepers who were either already known through previous research or met by attending security technology fairs. Sometimes, participants agreed to be interviewed at these events. In most cases, interviews took place outside the participants’ current or former organisations in favour of more comfortable, neutral environments.
Since border security can be a politically delicate area, particularly when asking about practices and strategies, participants were thoroughly informed about the nature of the research and its objectives. Data produced by, as well as collected for, the study—notably audio recordings—were treated confidentially, and all informants were offered anonymity. Furthermore, during some interviews, the confidentiality agreements were revisited when participants were communicating potentially sensitive information or data. In some cases, this resulted in conversations either not being recorded or being deleted after the interview. From an ethical point of view, this kind of arrangement was important for protecting informants.
After transcribing these interviews, or notes where full audio was not available, we used NVivo software to organise and code the qualitative data. This revealed portions of conversations that focused on e-Borders—either in rhetorical, policy, or practical senses—as well as technologies and data. Fitting with the objective of identifying how these ‘agents’ arrange and make sense of the ‘objects’ that constitute the e-Border assemblage, we paid particular attention to how these officials perceived and engaged with technologies in the course of their border work.
There are limitations to these methods, data sources, and research design. Interviews with higher ranking officials, whether formerly or currently serving, potentially access only particular perspectives on an organisation, although interviewing regional and local staff helped round out the picture. Furthermore, as a qualitative, single-case study of the UK, the research design does not claim to generalise the experience of these officials to other country contexts. What the study does contribute are in-depth reflections from the perspective of border practitioners themselves on the ways that data and technologies figure in day-to-day border operations. This kind of access is itself an important empirical contribution. At a theoretical level, these interviews also show how human and social factors, alongside the objects and conditioning promises of data and technologies, contribute to the e-Border assemblage.
UK e-Border conditions and objects: Changes in policy and practice
The first task is to identify the conditions and objects in the UK’s e-Border assemblage. ‘Conditions’ are the abstract sets of relations that animate the assemblage. In this case, the title of e-Borders captures how technologies, data, and officials are arranged to produce a particular assemblage of border security. But what wider societal and political factors have motivated this idea in the first place? Over recent decades, states have introduced many changes to border institutions, policies, and practices, largely with the stated intent of controlling migration flows (Vollmer, 2016, 2017a). Restrictive legislation, department reorganisations, modifications to social and welfare policies: these developments in many European states place border security front and centre in the imaginations of policymakers and members of the public (Vollmer, 2017b). ‘Border security’ has become both an indicator of how well states are physically controlling movements of people and goods, and a means of communicating the potential impacts of threats such as terrorism. These reasons, from the perspective of states, demand and justify expanding security measures (Vollmer, 2014).
The UK is no exception to these changes: commentators note how Britain’s borders have extended, intensified, and deepened (Loftus, 2015). One way this has happened is by expanding the operational mandate of border institutions beyond the mere counting of border-crossers. In 2007, the Immigration and Nationality Department was replaced by the Border and Immigration Agency (BIA), which was subsequently replaced by the UK Border Agency (UKBA). By April 2013, the UKBA was abolished and split into two new directorates: UK Visas and Immigration (UKVI) and Immigration Enforcement. The purposes of these institutions are diverse, from securing the UK border and controlling migration in the UK to facilitating legitimate international trade and travel. However, of these objectives, scholars argue that the emphasis has been on securing and protecting the physical border—a wider trend noted by Huysmans (2000) as the ‘securitisation of migration’. But these numerous institutional changes, along with vast investments of intellectual and financial effort, have attracted frequent criticism from politicians and independent bodies. Recently, the Independent Police Commission cited ‘organisational failure and malpractice’ and a ‘dysfunctional structure’ (2013: 27) on the part of police at the border.
Given these failures, ministers and managers have turned to machines, technologies, and (Big) data in the hopes of improving efficiency and performance (Vaughan-Williams, 2010). Facilitated by the UK Border Act 2007, new powers gave priority to technologies such as biometric registration procedures. In the words of the Home Office, this was in response to two sets of challenges: ‘mobility that globalization has brought to our country’, and ‘identity fraud, illegal immigration, organized crime, and international terrorism’ (Home Office, 2007: 2–3). It is important to observe how the integrity of the physical border was—and still is—elided with the movement of people. The government called these changes the e-Borders programme, representing its largest effort to harness the perceived power of data and technologies for enhanced border security. In this shifting assemblage, e-Borders became the organising condition—the named rationale—that changed how technologies and data related to each other.
Moreover, these changing conditions and objects made clear how new kinds of extra-territorial measures and sophisticated technologies could also create ‘a new offshore line of defense’ (Home Office, 2007: 2). These technologies offered new ways of biometrically controlling and observing border-crossers (Budd et al., 2009). Furthermore, non-state organisations, including private companies, have increasingly developed, maintained, and run these systems. This aligns with broader patterns across many states, where ‘power is nowadays exercised by delegating practices of state sovereignty to local, transnational and private actors outside the state apparatus and away from traditional state actors’ (Côté-Boucher et al., 2014: 196).
Describing the conditions and objects in the UK’s e-Border assemblage reveals important insights into the nature of modern border security, as well as the roles that data and technology play in shaping that nature. First, it tells the story of a complex and shifting assemblage that is extensively resourced with intellectual and institutional investment from states and private corporations. Second, it exemplifies broader beliefs in both the problem-free workings of machines and in the necessarily ‘better’ innovations introduced through technologies. But whether these assumptions actually work in the ways stated is still debatable: as the next sections show, agents—who bring their own perceptions and practices—also constitute the e-Border assemblage by selectively engaging with its conditions and objects.
Agents of the e-Border assemblage: Practitioners’ perceptions of borders, data, and technologies
In assemblages, agents ‘are the mobile operators that connect the concrete elements together according to their abstract relations’ (Nail, 2017: 27). We suggest these connections manifest themselves as agents’ perceptions and practices. As Bergson (1912) argued, perceptions are not simply objective, fixed views on an object or phenomenon. Instead, they are active ways of bringing something into view: ‘the carving out of a series of still images’ (Amoore and Piotukh, 2015: 344). Similarly, practices as developed by Bourdieu (1977) involve ‘the acting out of social life’ (O’Reilly, 2012: 9) through daily or routine actions. These include explicit rules and procedures as well as implicit worldviews, heuristics, or shared assumptions (Wenger, 1998). So, practices are ways of ‘knowing (and working out) how to go on in given circumstances suspended within networks of other people and groups’ (O’Reilly, 2012: 9). These understandings emphasise how both perceptions and practices, as objects of study, are not permanent. Instead, they depend upon the temporal, relational, and contextual situations in which they exist. The following two subsections focus on each of these aspects to illustrate the roles that agents play in the UK’s e-Border assemblage.
Perceptions of assemblage conditions and objects
Practitioners perceived data and technologies in different ways. On the one hand, officials recognised how the UK’s borders are sites of filtration, drawing upon massive amounts of data that are often gathered via physical objects like passports:

The border is really the pinch point where all this data is checked. At the modern day border you’re expected to know everything about everybody really and that’s why sometimes there are queues at the border because we have to put everybody’s passport in a slot in a machine-readable zone which matches against a whole raft of watch lists to determine whether anybody has got any interest in you. So it’s a much more complicated process than it used to be. (Former senior management staff, UKBA)

… economic refugees or people who are coming to the country for worst reasons like terrorists and everywhere you look there could be a hidden threat and a need to control either by CCTV, video surveillance systems or constant checks of identity at different places. So you’re aware when you look at people and you think ‘oh my god is this person legally here or not?’. (Former staff, UKBA)
Data and technology—the concrete objects in the assemblage—play their roles in mitigating this danger. Here, again, the organising assumption of secure, policed borders draws these objects together via prospects for deterrence:

Technology is not always about catching people, it’s about deterring people from trying to do it in the first place. That’s why you have lots of security built into bank cards and passports. It’s about trying to deter people from producing false passports so you try and make the document as secure as you can. [I]f you subsequently find [fewer] forged passports, you don’t necessarily say we’re doing something wrong, you might actually think we’ve done something right because we’re deterring people. (Senior management, Independent Chief Inspector of Borders and Immigration)

[P]eople will want to have security in airports and everywhere so that is an inevitable direction where you can have questions on privacy. But if you’re travelling, I think you will be monitored maybe even in the future from your house right up to the airport, right up to the aircraft, on the aircraft right up to your hotel in your destination. And there will be so many sensors along the way that the government can perfectly reconstruct your movement. (Private sector security and surveillance representative)

To use human beings to do it [conduct security checks at borders] is a big expense … Now however e-Borders: that’s got to be the answer because you have a passport number, you have a name and a date of birth, you know when they arrived because API [advance passenger information] tells you that, and you know when [they] leave. And then you can run algorithms that tell you ‘[when] did this person come here, oh they came here on a visit and they had a 6 month visa, did they arrive and depart within the 6 months, yes or no’. Then you can start getting some idea about the number of people who have overstayed because they haven’t left, but those statistics are not available…. (Senior management, Independent Chief Inspector of Borders and Immigration)

The e-Borders programme has been going on for years and hasn’t really delivered much. We do not count in and count out people electronically: that is e-Borders and as per the Australian model and others. Although we are trying to get that, we’re nowhere near having it so … there are no exit checks at all! We are a long, long way from knowing actually who came in and who has stayed beyond their authority to stay, and I think that is a major drawback for the Home Office. (Former senior management staff, UKBA)
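The overstay check this informant describes—matching advance passenger information (API) arrival and departure records against a visa's permitted length of stay—can be illustrated in outline. The following is a minimal, hypothetical sketch of that logic only; the record structure and function name are our own illustrative assumptions, not a description of any actual Home Office system:

```python
from datetime import date

def flag_possible_overstays(movements, visa_months=6):
    """Flag travellers whose records suggest a possible overstay.

    `movements` is a list of (person, arrival_date, departure_date)
    tuples, with departure_date set to None when no exit record
    exists. This is an illustrative data model, not an official one.
    """
    flagged = []
    for person, arrived, departed in movements:
        # Last permitted day of stay: visa_months after arrival
        # (day clamped to 28 to avoid invalid calendar dates).
        months = arrived.month - 1 + visa_months
        limit = date(arrived.year + months // 12,
                     months % 12 + 1,
                     min(arrived.day, 28))
        # Flag anyone with no exit record, or an exit after the limit.
        if departed is None or departed > limit:
            flagged.append(person)
    return flagged
```

As the informant notes, such an algorithm only estimates the number of possible overstayers; without reliable exit checks, the inputs themselves are incomplete.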
Practices involving data and technologies at the border
Perceptions manifest agents’ attempts to relate and connect objects under certain conditions. Bordering practices—comprising sets of assumptions, actions and behaviours, either spoken or physical, that authorities engage in as they go about their border work (Allen, 2015)—are another important window onto agents’ roles in assemblages.
Border security staff described several kinds of practices across the interviews. Some mentioned how physical examination had become less important, expressing appreciation for the practical advances that technology and ‘data science’ conferred on their work:

It’s a bit Luddite to say ‘oh no, it’s much better to have everybody examined by a human being’. That’s the past. It’s getting us to the point where we’ve got a proper system. (Former senior management staff, UKBA)

… 20 years ago it was much more about looking up and dealing with the person in front of you and what they were saying and how they looked, whereas now you have to do a certain amount of looking down as well to look at the finger print machines and so on and so forth. With the increase in visa regimes and the amount of people who are now pre-cleared overseas, the role has gone a lot more from determining whether somebody should be granted entry or not to actually just verifying that is the person who has already been granted entry. (Senior management staff, Home Office)

There is a danger that you assume because all the boxes are ticked that they’re not a problem, whereas just because somebody has never been here before and is the person who applied for the visa you still need to satisfy yourself as to their credibility. I don’t think that’s a problem with the technology. I think that’s just a training issue. You just need to make sure you reinforce the message with staff that the absence of negative information is not necessarily a positive thing…. (Senior management staff, Home Office)

We have our own databases, we have watch lists, and we have our fingerprint machines that we would use to biometrically verify people which again would point to particular suspicions around somebody. You have your experience and your training so some of that is visual; body language will give people away on occasion, if people are giving certain signals in terms of their nervousness … [O]bviously with EU nationals you don’t really have that ability to ask more than one or two questions because they’re not subject to control in the same way foreign nationals are but with foreign nationals you can pretty much ask them what you want for as long as you want, and part of that package is to assess their credibility: what they’re saying, does it add up? (Senior management staff, Home Office)
Discontinuities and frictions in the e-Border assemblage
Earlier, we argued that searching for discontinuities and frictions in the ways that the notion of e-Borders has been enacted, practiced, and perceived reveals important aspects of assemblages. Some of these have been touched upon in the previous two empirical sections, but the issue warrants further discussion on its own.
One important discontinuity is the gap between what data and technologies promise for ‘better’ border control, and how practitioners actually go about their work. Amoore (2011) called this the ‘risk-based solution’ for border security. But interviewees expressed a degree of scepticism towards this idea. One informant tellingly used the term ‘clean skin’ as a way of explaining their approach:

The worry I think is we become a slave to a machine and we forget actually the data is only as good as the people who put it on there. [I]f you’ve got rubbish in you’re going to get rubbish out and actually some of the best catches we’ve had have been on what we would call ‘clean skins’. These are people who don’t come up on the computer but by talking to them and by interviewing them and by examining their baggage you can identify things that only a human being can really do. (Former senior management staff, UKBA)
Another important friction occurs in the ways that enhanced, datafied borders are rationalised by higher-level policymakers, yet enacted and dealt with in practice by officials at the border proper. These rationales, which exist in political and social contexts that demand some outcomes while foreclosing others, are also part of the assemblage of control built up around, by, and through border security. From the perspective of the government, ‘secure’ borders result from reliable decisions, taken remotely and automatically, about whether to allow people to enter. But security is not just about creating an environment of safety—perceived or real. It is also a concept with political and economic dimensions. The UK’s independent public auditor argues that ‘increasing the automation of border processes and making earlier and better-informed decisions about those wanting to cross the border have the potential to bring both financial and security benefits that are essential in the current environment’ (National Audit Office, 2015: 11).
What are these kinds of benefits? A former member of UKBA senior management confirmed the existence of ‘a huge industry which has built up around the border’. Although ministers justified simultaneous reductions in border staff and increases in border technologies on grounds of time and cost savings, practitioners expressed reservations about this approach:

[I]f I look at a border, I look at it more from a strategic viewpoint about what the main problems facing a country [are] before advising them to invest in some new IT architecture because a lot will depend on the geo-political state of the country. (Former senior management staff, UKBA)
But in practice, these technologies and systems possess significant vulnerabilities stemming both from their actual use and their rationalisation. For example, one practitioner observed how ‘the scope for breakdown of all these complex systems is immense as anybody who has tried getting through the electronic gates at an airport knows’. Also, as shown above, border authorities themselves exercise a degree of discretion that remains fallible—despite the intention of identifying ‘clean skins’.
Finally, another discontinuity exists within the bureaucracy and organisation of the border itself. Interviewees raised concerns about differences between management cultures that seemingly conflicted with practitioners’ reality:

[M]anagement culture and style … is a very crude business culture of outputs, outcomes, targets, and you’re managing a very complex situation. So they’re always wanting to put a simple solution to a complex problem, and actually you’ve got to engage with the complexity and broader strategies. (Former senior management staff, UKBA)
Conclusion
In 2014, well past the stated deadline for implementing the UK’s e-Borders programme, commentators and independent auditors alike observed how the project faced ‘formidable’ problems largely connected to human failings rather than the technologies themselves (National Audit Office, 2015; Palmer, 2014). By documenting the conditions, objects, and agents (comprising their perceptions and practices) involved in the UK’s e-Borders assemblage, we aimed to reveal some of the frictions that arose from its implementation. These frictions highlight how the notion of e-Borders, viewed as an assemblage that is dynamically constructed through heterogeneous physical and symbolic elements, is neither coherent nor stable.
Our interviews with practitioners suggest that technologies and data support the commonly observed function of borders as a filter (O’Dowd, 2010). But, importantly, so do ‘irreplaceable’ human judgment and perceptions. Border staff expressed ambivalence towards, even criticism of, a technologised organisational culture by questioning the efficiencies and capabilities actually afforded by machinery. These perceptions and practices are situated in managerial, political, and social contexts—not least of all within an empowered private security industry working in collaboration with government. Yet these contexts are not entirely deterministic, as illustrated by the mismatch between the rhetorical promise of e-Borders to count migrants in achieving politically motivated ‘net migration’ targets, and the practical reality among officials who were actually implementing this programme.
What implications does the e-Border assemblage have for border-crossers as surveilled subjects? This is an important question to consider, especially in light of the growing imperative from critical data studies to examine ‘where the interpellation of the individual emerges in algorithmic culture and, through that, where the cracks and seams, the spaces for resistance and alternatives, might be found’ (Dalton et al., 2016: 1). In some ways, the UK border has never been more ‘secure’ in the sense of observing, controlling, and monitoring border-crossers in great detail. But this mostly technological view of border security is rather limited. Just as the field of border studies has placed the notion of ‘the border’ under intense scrutiny to move it away from purely territorial concerns (Johnson et al., 2011), so too do social scientists have an opportunity to view border security as relational, existing in political contexts, and subject to perceptions and practices (Haggerty and Ericson, 2000).
Our study also contributes to assemblage theory by focusing attention on agents’ perceptions and practices as manifestations of how they relate and link the conditions and objects present in every assemblage. Responding to Amoore’s (2011) call to strip away the gloss of techno-scientific, risk-based solutions, we examined how senior authorities—themselves often difficult to access—both express and constitute the e-Border assemblage. A key theoretical and empirical challenge going forward is to consider what other security assemblages look like from alternative perspectives—from non-citizens (Anderson, 2013) and from those who are themselves surveilled.
Re-orienting our thinking to include migrants’ and border-crossers’ own practices, perceptions, and subjectivities is an important part of understanding how individuals’ changing contexts and positions contribute to different kinds of migration politics. For example, Squire (2016) advances an analysis of ‘acts’ to highlight how unauthorised migrant crossings disrupt established scripts, create new ones, and generate different political subjects in the process. Moreover, these disruptions have temporal aspects: informal, everyday acts of resistance ‘may last or they may lose their significance as time goes by or as surveillance authorities learn how to extinguish [them]’ (Trimikliniotis et al., 2015: 1040). A well-specified theory of assemblage—one that acknowledges and seeks out seemingly ordinary moments and acts from multiple perspectives—can readily accommodate these aspects.
Having a clearer picture of how these situations are constructed and changing is important not only for developing critical data studies as a field, but also for transforming assemblages into tools for reflective action: ‘once we understand how the assemblage functions, we will be in a better position to … direct or shape the assemblage towards increasingly revolutionary aims’ (Nail, 2017: 37). In this regard, concepts such as ‘sousveillance’ (Mann et al., 2002), or surveillance from below, open up valuable avenues for exploring how assemblages can be challenged, resisted, and changed (see Bakir et al., 2017).
At their broadest levels, assemblages question how stable or permanent a given phenomenon may be, inviting reflection on the possibilities of new kinds of politics that may emerge from contingent relationships (Vollmer, 2016, 2017b). They ‘loose[n] the deadening grip abstract categories hold over our sense of political possibility’ (Anderson et al., 2012: 186). But, remembering Bergson’s (1912) argument that perceiving involves actively bringing objects into view, perceiving new kinds of politics requires being open to creative ways of seeing. Combining different modes of enquiry at the multiple levels of individuals, organisations, and states is likely to be part of responding to that challenge in the future.
Acknowledgements
The authors would like to acknowledge feedback on an earlier version of this article from attendees at the ‘Coding/Decoding the Borders’ conference (Brussels, Belgium; 13–15 April 2016), as well as helpful suggestions provided by the anonymous reviewers and editorial team.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported by the Leverhulme Trust, grant number ECF-2012-248, and the German Research Foundation (DFG), grant number FOR 2496.
