Abstract
Digital twins—virtual representations tied to material objects, processes, or environments via two-way flows of information—are hyped as the backbone of a new industrial revolution. We argue that digital twins propose and pursue a logistical mode of prediction that seeks to operationalize simulated futures and, in doing so, subordinate human labor, decision-making, and governance to computational systems tied to material environments. Even when digital twins fail to work, or merely sustain the frictions of the underlying systems they claim to resolve, their value is tightly bound to the question of who can own, control, coordinate, and capitalize on the logistical architectures of data and computation that anticipate as-yet unrealized returns from future probabilities. Following a genealogical examination of twinning techniques, the article examines Alibaba’s City Brain platform in Xiong’an, China, and the predictive capabilities of Palantir’s Foundry platform to show how digital twins seek the preemptive capture of futures in the present.
Think tanks, management consultancies, and tech companies have all breathlessly touted digital twins as the next and best permutation of technological “smartness,” promoted as the “backbone of Industry 4.0” (Colombo et al., 2017) and “the foundation of the industrial metaverse” (McKinsey, 2022). Some of the world’s largest technology companies, from Microsoft to NVIDIA, have made enormous investments in products and services to support digital twinning, while smaller firms and startups entering the digital twin market space have attracted large amounts of venture capital. Governments, particularly across the United States, Europe, and Asia, have also made strategic and resource commitments toward advancing digital twin research, while global consultancies and think tanks have been major drivers of their discursive heat. Despite the scale of both investment and hype, digital twins are often described in nebulous terms. Consulting firm McKinsey’s (2023) definition exemplifies this: a “digital representation of a physical object, person, or process, contextualized in a digital version of its environment.” This kind of definitional blurriness contributes to the challenge of distinguishing digital twins from other simulation technologies, but it also points to an industry effort to expand the concept to new domains, from manufacturing prototypes to Internet of Things (IoT) dashboards to urban governance operating systems.
Digital twin startups and tech giants alike market their technology as enabling design, diagnostic, experimental, and analytical procedures to be run on identical virtual proxies and then implemented “in the real.” The value proposition of digital twins resides in coordinating and synthesizing digital and physical environments to make complex systems run more autonomously, optimizing performance at lower risk and reduced cost. This article argues that digital twin projects are best understood as logistical media: commodified operating systems underpinned by a discursive and material shift from real-time monitoring and calculation to prediction, from daily management to forward planning, and from dependency on multiple data streams and tools to their holistic fusion. While predictive analytics are hardly new, what marks digital twins as novel is their effort to advance a mode of prediction predicated on a coordination of real-time data, simulation, and management that “fuses” physical and virtual environments into a single operational timespace.
In practical terms, digital twins can be understood as virtual models linked to sensors that allow physical objects, persons, and processes to be projected into detailed and dynamic digital environments. A digital twin of an aircraft engine, for example, might be a three-dimensional (3D) computational model that represents how the engine operates, the effect of wind and air pressure, and stress placed on its materials. With sensors embedded in the engine and throughout the aircraft, real-time data on its operation through take-off, flight, and landing can be collected, analyzed, and (at least to some extent) directly acted upon. While the term was first used in the high-cost and high-stakes world of spacecraft design at NASA (Shafto et al., 2010), digital twins have diverse antecedents: ancient granary records that enabled long-distance trade (Hockenberry, n.d.); the rise of statistical management with Charles Babbage’s Difference Engine, electronic calculators, and early computers (Lewis, 2007); simulations of physical systems and nonhuman ecologies (Edwards, 2010; Vehlken, 2019); nuclear wargaming at RAND Corporation (von Hilgers, 2012); the emergence of three-dimensional graphical environments (Gaboury, 2021); the adoption of enterprise resource planning (ERP) software by global firms (Rossiter, 2016); and the predictive step change of high-powered computation and machine learning (ML) algorithms (Mackenzie, 2015). These diverse precursors are reflected today in the numerous applications for which digital twins are now proposed or implemented in manufacturing (Colombo et al., 2017), population health (Armeni et al., 2022), ecological sustainability (Rothe, 2024), and urban traffic management (Alibaba Cloud, 2018), to name a few. Among the more ambitious are those applied to complex and explicitly logistical problems, such as the long-term project to develop and implement a digital twin to automate the Port of Rotterdam, the busiest in Europe (IBM, 2019).
This article argues that digital twins enact an emergent logistical mode of prediction, whose value is tightly bound to the question of who can own, control, coordinate, and capitalize on the logistical architectures of data and computation that anticipate as-yet unrealized returns from future probabilities. We examine how digital twins—as global logistical media—anticipate and materialize new circuits of global capital and computation. These circuits are forged through an interplay between (1) logistical affordances of digital twins, understood through their operations of prediction; (2) practices of coordination, calculation, and investment by diverse political-economic interests; (3) extractive dynamics aligned to predictive technology markets and speculative operations of capital; and (4) the mobilizing effect of industry hype on firms, institutions, states, and other actors.
To demonstrate this interplay, we turn to two digital twin projects that show how the logistical mode of prediction operates within administratively, politically, financially, and geographically complex and high-stakes contexts. First, we examine the integration of this logistical mode of prediction into urban planning and governance, a trajectory in which China—and the “city brain” of Xiong’an—is at the forefront. As we show, a heady mix of imagination, aspiration, investment, speculation, and, of course, failure animates the world of urban digital twin implementation and their orientation to a set of wider political and economic dynamics determining urban resources and stakeholders. Next, we show how this logistical mode of prediction underpins the digital political economy through an examination of the digital twin market and Palantir’s Foundry product, which seeks to move digital twins from bespoke product to platform service, twinning an entire firm or operation. Here, we focus on the electrical supply infrastructure of PG&E, whose poles and wires stretch across the state of California at significant risk to people, places, and more-than-human environments. In both cases, our analysis is not trained on the technical viability of these projects but on how digital twins produce a coordinating substrate for surplus value generation by global tech firms and their political allies.
We show that, even where this pursuit largely falls short, if not fails altogether, the hype and investment continue largely unabated. In their predictive operating mode, twins do not even need to function, or arrive, at all. Digital twins can be at once underwhelming tech products and seductive coordinating architectures, promising a computational layer for the modern firm, city, or state that tames the complexity of the present and the radical uncertainty of the future. As such, digital twins depend upon continued frictions and contradictions within the underlying systems they seek to coordinate, both to justify their own existence and to offset the significant costs incurred in their implementation. Despite their questionable efficacy, the scale, ambition, and momentum of commercial digital twin projects make their critical interrogation necessary, especially in terms of what their political and financial propositions mean for the people and places actually or potentially subject to their coordinating imperatives. As our case studies of Alibaba’s City Brain and Palantir’s Foundry platforms show, digital twins thrive on ongoing frictions within cities, workplaces, data inputs, and much more. Rather than reduce the underlying messiness of contexts such as urban governance or electrical infrastructure, digital twins introduce an operative computational layer that promises to make the mess manageable, even profitable, but in doing so flattens out the rich textures of life in the name of optimized logistical management.
Coordinating urban futures
Xiong’an is a “technology and innovation” city currently being built from scratch in the hinterlands of Hebei Province, about 100 km south of Beijing. From its inception, the project was envisioned as fully cyber-physical urban space. The planning, construction, and development of both the city and its digital twin have been organized around the sometimes-uneven coproduction and technical coordination of digital infrastructure, information management platforms, data and management standard systems, and city-level intelligent collaboration and operation capabilities (Liu and Tian, 2023). Central to the integration of this scheme is an urban operating center, a $156m Urban Computing Centre (UCC) or “city brain,” to host the models and computing infrastructure to support Xiong’an’s urban digital twin. This twin is envisioned as a virtual, spatiotemporal “mirror” of the city, integrating maps and models of fixed physical assets (buildings, roads, landscape features, etc.) and dynamic systems (human and transport mobility, energy, waste, water, etc.) into a single computational environment, conjoined by four informational infrastructures: integrated data, video networks, information modeling, and an IoT network (Lei, 2023). Both the city and its twin are still in development; the realization of a full-scale, unified twin, much like other urban projects, has been limited by technical and social capabilities on the ground. Urban digital twin projects have been announced across Europe, North America, and Asia; today, some of these projects linger in early stages (Ferré-Bigorra et al., 2022), some have progressed but never lived up to their operational promises (Jeddoub et al., 2023), while others such as Deloitte’s Optimal Reality twin of Melbourne, Australia, have failed to materialize much beyond defunct websites and flashy press releases. Still, several of Xiong’an’s urban functions—namely traffic and energy—are now operational and managed by digital twin technology.
Xiong’an presents an environment where the practice of urban digital twins—preliminary as they might be (Liu and Tian, 2023)—is in full swing. We consider the rise of digital twins in China, and the case of Xiong’an in particular, in order to understand how digital twins, conceived as logistical media, generate the conditions of possibility for modeling, simulating, and predicting urban futures, and how they produce value by coordinating the interests of those who seek to stake claims (and make returns) on their ideological and material commitments toward future urban—and even global—governance.
While not all digital twins originate in explicitly logistical contexts such as supply chain management, digital twins in general can be understood via their logistical function. Logistical media are “technologies, infrastructure, and software” that “coordinate, capture, and control the movement of people, finance, and things” (Rossiter, 2016: 4–5). Rapid advances in ML, high-speed computing, and accelerated access to powerful graphics processing have produced a surge in hype about the capacity for twins to do more than simply mirror or even support the management of material operations. Much of the excitement around digital twins is now animated by the powerful and seductive fantasy of a cooperative human-machine decision-making platform for managers and planners of all stripes, to coordinate decision-making across space and time. Even early digital twin initiatives, which tended to be limited in scope and bespoke in their development, were heavily indebted to this logistical logic—a way of thinking and knowing that presumes a calculable ontology that can be audited, optimized, managed, and coordinated by automated “operating” systems.
Urban contexts, then, present a logical leap for digital twins and were an early speculative domain for their application. “Intelligent” automation is anticipated, if not already integrated, in many of the material and social dynamics of urban life (Barns, 2021). Public agencies are learning to govern, plan, and manage complex urban settings through these virtual simulations built using real-world data (Cugurullo, 2021; Luque-Ayala and Marvin, 2020). Urban design and planning have become fundamentally logistical in their orientation and organization (Shapiro, 2020). In many cities, sensors and the “logical linkages of software” (Mitchell, 1996) slip over a physical matrix already primed by complex conventions (land use), protocols (transport), and management regimes (waste, energy, etc.). In rapid urbanization projects like Xiong’an, this logistical vision of infrastructural and informational integration and technological management is being designed in from the ground up. Through its technical arrangements, Xiong’an and its digital twin are being coproduced by an “algorithmic coordination of productive processes in space and time” (Mezzadra and Neilson, 2013: 10), whose primary function is “to extract value by optimizing the efficiency of living labor and supply chain operations” (Rossiter, 2016: 4). These capabilities are central to the enthusiasm for digital twins, seen as key technologies in the planning and management of more “optimized,” “resilient,” and “forward-facing cities” (Nochta et al., 2021).
The rendering of urban space as “a calculative machine” is central to its capacities to coordinate real-time monitoring, accounting, simulation, and visualization of event-based urban data, wherein digital modeling tools become both drivers of urban futures and a way of governing the urban decision-making process (Luque-Ayala and Marvin, 2020). This is a step change from the datafication logic underpinning “standard” practices of smart urbanism, which have aimed to improve the efficiency of urban governance, where “greater transparency (e.g., open data platforms), visibility (e.g., data visualizations), ease of use and . . . were arguably the main drivers for the digitalization of the urban” (Luque-Ayala and Marvin, 2020: 130). In the predictive logics of digital twins, “the future” recedes from the strategic horizon of city planning toward a present condition of real-time decision-making organized by computer-simulated and analyzed “what-if” scenarios. In Xiong’an and other Chinese cities, digital twins are coordinating infrastructure for local governors to mobilize urban resources and respond to urban events in an “anticipatory” fashion, such as the dynamic monitoring and management of Xiong’an’s electricity grid. In Haidian, for example, Xu et al. (2024) explain how flooding is managed by sensors that anticipate impending emergency, delivering “real-time” instructions to drainage department officials who receive automatic alerts, via urban dashboards, with information on incident location, images, priority level, and so on. Rainfall data and surface runoff simulation are used to predict future flooding risks, though at the time of writing these predictive capabilities were limited to a window of a few hours. The day-to-day management of energy in Xiong’an also demonstrates how digital twins direct and allocate urban resources.
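The anticipatory alerting loop described above (sensor readings, simulated runoff, automatic dashboard alerts carrying location and priority) can be sketched in a few lines of Python. Every name, threshold, and field below is our own illustrative assumption, not a description of the actual Haidian system:

```python
from dataclasses import dataclass

# Hypothetical reading produced by sensors and surface-runoff simulation.
# All thresholds here are illustrative assumptions, not Haidian's values.
@dataclass
class RunoffReading:
    location: str
    water_level_cm: float
    rainfall_mm_per_hr: float

def classify_priority(reading: RunoffReading) -> str:
    """Map a reading to an alert priority level."""
    if reading.water_level_cm > 50 or reading.rainfall_mm_per_hr > 80:
        return "critical"
    if reading.water_level_cm > 30:
        return "elevated"
    return "normal"

def build_alert(reading: RunoffReading):
    """Build the kind of dashboard payload the article describes:
    incident location, priority level, and a short predictive window."""
    priority = classify_priority(reading)
    if priority == "normal":
        return None
    return {
        "location": reading.location,
        "priority": priority,
        "predicted_window_hours": 3,   # predictions limited to a few hours
        "action": "dispatch crew" if priority == "critical" else "monitor",
    }

print(build_alert(RunoffReading("Station-12", 55.0, 20.0)))
```

The point of the sketch is how little "anticipation" the dashboard itself performs: the predictive window is an output of upstream simulation, while the alert layer simply routes thresholded results to officials.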
Schneider Electric worked with the state power provider to develop a digital twin of the city’s 10 kV switching substation (Liu et al., 2022a). The twin uses sensor data and 3D modeling to produce a full-scale digitized model of the substation’s equipment and operating environment. Sensors detect maintenance issues and relay information to the twin’s diagnostic systems, which provide “early warning” alerts and auxiliary decision-making around potential faults within the system. Looking to the future, simulations generate predictions based on “life-cycle” analysis. Twins transform substation equipment operation and maintenance from “passive preventive maintenance” to “active predictive maintenance” (He et al., 2022), and they integrate the control and management of workers within their flows of anticipatory decision-making, autonomously pushing alerts, providing instructions, and directing personnel onsite. If smart cities proposed a planning environment that moved “beyond policy-based decisions to reshape cities with insights gained from data” (Vanolo, 2014: 890), digital twins are envisioned as the planning environment—the central decision-maker and planner, not just managing the present but enacting anticipatory urban governance (Xu et al., 2024), laying claim to what is required for urban futures.
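The shift from scheduled “passive preventive” to condition-based “active predictive” maintenance can likewise be illustrated with a minimal sketch: a crude linear trend is fitted to sensor readings and extrapolated to a projected fault, which then triggers an early-warning alert. The degradation model, thresholds, and variable names are simplifying assumptions of ours, not Schneider Electric’s actual diagnostics:

```python
def remaining_life_hours(temps_c, limit_c=90.0, hours_between_samples=1.0):
    """Fit a crude linear trend to evenly spaced temperature samples and
    extrapolate how long until the equipment limit is crossed."""
    n = len(temps_c)
    if n < 2:
        return float("inf")
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(temps_c) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, temps_c))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den                  # degrees C per sample
    if slope <= 0:
        return float("inf")           # no upward trend, no predicted fault
    return (limit_c - temps_c[-1]) / slope * hours_between_samples

def maintenance_alert(temps_c):
    """Push an early-warning alert when projected time-to-limit is short."""
    hours = remaining_life_hours(temps_c)
    if hours < 48:
        return f"early warning: projected limit breach in {hours:.0f} h"
    return None

print(maintenance_alert([70.0, 72.0, 74.0, 76.0]))  # temperatures trending up
```

Even in this toy form, the logistical character of the operation is visible: the prediction exists in order to direct labor, dispatching a crew before the fault arrives rather than after.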
Like other forms of urban artificial intelligence (AI) experimentation in China, digital twin implementation is not only framed by the central government’s AI strategic vision but needs to be understood in terms of its complex unfolding on the ground within the context of corporate and public interests and actors (Carter and Crumpler, 2019; Ding, 2020). The city has been hailed by China’s Central Government and President Xi as a project of national priority, developing on schedule according to a multidecade development plan (ChinaDaily, 2024), but it has also been accused of being a near “ghost town” whose so-far limited technical accomplishments have far preceded its population (Bloomberg, 2024). Five years into its development, Xiong’an represents a more than $94 billion investment in AI-driven urbanism (CGTN, 2024) as well as a complex experiment in multistakeholder, AI-led urban management (Liu et al., 2022b). Startups, scale-ups, and China’s major technology firms have all sought to generate business models from new hardware and software platforms for AI-driven urban service delivery and management. Here, we argue that, beyond its direct operating context, Xiong’an’s digital twin has become both a technical and discursive mechanism to manage multilateral interests, coalition building, and competition in the urban field.
Within the day-to-day operation of the city, the digital twin has not so much replaced older modeling, simulation, and prediction systems as united, organized, and integrated them within a framework packaged from off-the-shelf solutions delivered by China’s biggest AI “vendors”—Alibaba, Huawei, Baidu, Tencent, and JD.com among others. Alibaba has been a major financial and technological driver of digital twin technologies in China, and its proprietary City Brain platform provides the conceptual and technical foundation for the entire city (Zhang et al., 2019). City Brain is a commercial urban AI platform developed by Alibaba Cloud and described as a digital twin technology that uses virtual models, simulation, and prediction for urban management, including traffic, public safety, environmental monitoring, and overall city operations (Alibaba Cloud, 2018). In Xiong’an, City Brain currently manages traffic using prediction and automation. Most of China’s other major AI vendors have also signed cooperation agreements with Xiong’an’s local government to test new products and services that will be coordinated and controlled in part or whole by the digital twin, including self-driving vehicles. The city’s digital twin has, then, not only been central to a massive urban virtualization and simulation project but is also a project of standardizing and disciplining urban information (formats, protocols, etc.), corporate actors (proprietary systems, technical expertise, etc.), and local stakeholders (policies, frameworks, etc.) into a unified whole. This unified whole seeks—even if it has not yet succeeded—to resolve the so-called “information island” problem and work toward a system made fluid by the imposition of compatible formats and logics.
As governments in China and abroad rush to define and establish frameworks, set policy, and invest capital in urban digital twins, the rationale behind these decisions goes far beyond the planning, organization, and governance of urban assets. Xiong’an’s “digital twin” points to an assemblage of operations, applications, and protocols that seek to leverage and coordinate computational models, abundant data, and high-speed processing for simulation capabilities that enable its urban stakeholders to lay claim to a “predictive window” on the future. This form of urban governance seeks to mobilize AI technologies as much to predict the future as to steer the present toward a given political ideal through strategic intervention (Anderson, 2010; Guston, 2014). Alibaba’s City Brain, for example, is an urban operating system developed and marketed as a smart city product that is city-specific but can also be modified and applied to other urban contexts. Alibaba’s AI “Brain” products are now sold and deployed domestically and exported internationally and have been designed expressly as a Chinese digital innovation product for global urban development markets. As Alibaba Cloud President Zhang Jianfeng said at the launch of City Brain 3.0,
A hundred years ago, London exported the subway to the world, Paris exported the sewer, and New York exported the power grid. Today, the value of digital leadership in Chinese cities has been highlighted, and the establishment of a new digital foundation has become the foundation for the evolution of future cities. (Cavanaugh, 2022)
The value of urban digital twins is, therefore, not only extracted through the coordination of practices of calculation, investment, and negotiation to make decisions and manage uncertainty within specific cities; it also lies in the opportunities they generate to become a stakeholder in the decision and policy making of global futures (Zhang et al., 2019). Here, cities are envisioned as scalable units: foundational building blocks in the transformation of nations and regions, and interoperable technical elements of a larger territorial system. Liu Feng, Dean of Yuanwang Think Tank Digital Brain Research Institute and deputy director and secretary-general of the Urban Brain Special Committee of the Chinese Society of Command and Control, put it this way:
The digital brain will gradually expand from the city brain, to the provincial brain, national brain, and finally the [world digital brain (世界数字大脑) or a world digital nervous system]. The construction of the world’s digital brain will be the third important opportunity to establish the world’s technological ecological standards and systems after TCP/IP and the World Wide Web. (Weber, 2023)
In Malaysia, Alibaba’s City Brain is now part of Kuala Lumpur’s smart city management platform, complementary to the company’s e-commerce and logistics initiatives in Southeast Asia and China’s broader export-driven Digital Silk Road ambitions. Indeed, Alibaba’s digital twin products became politicized when they were drawn into sensitivities about their coordinating capacities for China’s soft and “hard-wired” territorial aspirations (Gordon and Nouwens, 2022). Outside of China, the possibility of scaling city digital twins to establish a national network has already been studied in the United Kingdom (Rogers, 2019) and Finland (Ruohomäki et al., 2018). Companies like 3DS or Bentley Systems (whose OpenCities Planner was used to create urban-scale digital twins for cities across Scandinavia, including Stockholm, Helsinki and Gothenburg) are primed and positioned as regional integrators, even though the technical and security challenges of delivering interoperability and integration remain staggering (Cureton and Dunn, 2021). While platforms are ecosystems of value extraction and capital accumulation, we recognize that they also generate the mundane connectivity and interaction of the urban everyday (Barns, 2018; Leszczynski, 2020; Van Dijck et al., 2018), which can be understood beyond political-economic approaches that “risk reducing how we ‘think the urban’ to its transactional logics” (Barns, 2018 in Leszczynski, 2020: 190).
Xiong’an reveals the ambitious desires that animate digital twins in urban environments, in which seamless technical architecture and control enable the management and coordination of uncertain urban futures. As logistical media, digital twins are enrolled in an ideological commitment to circulation and efficiency as ways of seeing, engaging, managing, and governing through the optimization of flows. On the ground, this digital twin urbanism is still very much in formation, unfolding within the context of still-emergent practices and public and private interests that hinge on data and finance capital flowing across circuits materialized between local government, China’s leading tech companies, and new tools like Alibaba’s City Brain. Even when their technical circuitry falters, digital twins can offer a framework to integrate and coordinate the interests and activities of urban stakeholders. As logistical media of “orientation,” digital twins serve not only as technical infrastructure, but as devices of cognitive, social, and political organization and control (Peters, 2012). In urban governance, these ideological components manifest not as profit-seeking but rather as the optimization of social, political, and economic control through infrastructures of logistical prediction. The case of Xiong’an also demonstrates how digital twins can generate value for private firms and governments, even when their operational ambitions have yet to be realized.
Who owns the means of prediction?
Prediction, of course, has enormous value beyond urban governance. Understanding incipient changes in markets before they arrive makes extracting value easier, whether trading in stocks, goods, or services. Prediction holds value within capitalist economies across timescales, from the microseconds at stake in algorithmic trading (MacKenzie, 2019) to macro shifts in global economies. As a result, the capacity to predict transformations of a social, economic, political, and technological nature holds the allure of shaping investments by business and government—and accruing lucrative fees in the process. Global management consultancy firms such as Booz Allen, Deloitte, and Accenture offer clients services that enable them to capture new markets, extract new value from existing customers and contracts, and rationalize operations by applying interpretive frameworks and creative thinking. Business-to-business technology providers like IBM and Oracle sell database and predictive analytics services that allow other organizations to capture, organize, and interpret data about themselves, their customers, and the markets in which they operate. More recently, they have been joined by the rest of big tech through the expansion of cloud services such as Microsoft’s Azure, Amazon Web Services (AWS), and Google Cloud, as well as surveillance and analytics company Palantir.
Co-founded by chairperson Peter Thiel, CEO Alex Karp, and several others, Palantir is emblematic of the business of prediction and the value of coordinating its production. As a software-as-a-service (SaaS) company, Palantir’s principal client base is in government—the Central Intelligence Agency and the Pentagon are among its oldest and most loyal (Biddle, 2017)—and its role in national security, intelligence gathering, and predictive policing has been examined by surveillance studies scholars (Brayne, 2020; Egbert, 2019; Iliadis and Acker, 2022; Knight and Gekker, 2020; Munn, 2017). For Palantir, however, future expansion depends upon significant inroads into more prosaic commercial arenas. Following an IPO in 2020, Palantir’s market capitalization at the time of writing was US$45.86 billion on the New York Stock Exchange. In 2022, the company generated US$1.9 billion in revenue. With a reported 132 clients in 2022, the crudest of calculations suggests that they paid on average roughly US$14.4 million each for Palantir’s services. Its current and future value depends in no small part on how successfully it can fulfill its coordinating function by delivering more modular and accessible predictive products.
Integral to these efforts is Foundry, the commercial equivalent to its better-known government services platform Gotham. Palantir (2023a) describes Foundry as an “operating system for the modern enterprise” centered on the Ontology, a pretentious name that is nonetheless revealing of the ambition animating the company’s products. Here, it is worth quoting Palantir at length:
The Foundry Ontology is an operational layer for the organization. The Ontology sits on top of the digital assets integrated into Foundry (datasets and models) and connects them to their real-world counterparts, ranging from physical assets like plants, equipment, and products to concepts like customer orders or financial transactions. In many settings, the Ontology serves as a digital twin of the organization, containing both the semantic elements (objects, properties, links) and kinetic elements (actions, functions, dynamic security) needed to enable use cases of all types. (Palantir, 2023b)
As Palantir points out, its Ontology operates as a digital twin: a computational representation of the spatial, temporal, and relational arrangement of an enterprise that enables analysis, simulation, and implementation of decisions. While digital twins have typically been applied to the processes and actors within distinct physical environments such as the factory, warehouse, port, or city, Foundry aims to also twin the more abstract, informational, and distributed systems that make up the firm or institution. In doing so, the Foundry twin gives operative and interactive form to what might otherwise be disaggregated or invisible. Palantir calls Foundry an operating system because it is one that users can customize and evolve via modular, drag-and-drop tools that require little or no coding (although online forums report mixed results and variable ease of use). This, then, is the fundamental business proposition of Palantir: out-of-the-box digital twins as enterprise operating systems that are scalable, mobile, and adaptable.
In 2021, Palantir began working with Pacific Gas & Electric (PG&E), California’s largest investor-owned utility, to use Foundry’s digital twin capabilities for asset management, electric operations, and grid safety and reliability. PG&E was responsible for more than 1500 wildfires in the period from June 2014 to December 2017 alone due to equipment failure and mismanagement, leading to over a hundred deaths, tens of thousands of homes destroyed, and hundreds of thousands of acres of burned forest (Gold et al., 2019). After being required to pay $25.5bn in lawsuits, fines, and other costs, PG&E entered bankruptcy and emerged in 2020 with a commitment to pay out $4.5bn immediately and place 22% of its stock in trust for fire victims. In this context, contracting with Palantir to predict and manage probable futures through the Foundry system holds clear appeal—and, for Palantir, arrived at a time when it was facing investor pressure to diversify its client base (Daniel, 2021).
As the logistical media operating system for PG&E, Foundry’s digital twin pulls in data from and about the company’s electrical infrastructure. In a marketing video (Palantir, 2023c), PG&E’s risk management and data science staff describe the breadth and diversity of data: between 8 and 10 billion daily data points, 25,000 miles of electrical cable in high fire risk areas of California’s forests, and incomplete information about the type and distribution of grid management devices throughout the system. In logistical terms, this is a process of standardization that responds directly to the multiple sources and formats of data within the system, as well as its incomplete or inconsistent quality. As in Xiong’an, PG&E’s organizational data are spread among many devices and systems and held in various formats, existing in abundance but without cohesion or interoperability. A huge amount of work is required to overcome this “data friction” (Edwards, 2010) and fit those datasets together, both in the technical matching of formats and standards and in the labor of organizing stakeholders and their potentially competing interests.
Through Foundry, data and models feed into the PG&E Ontology, within which real-world objects, actions, relations, sites, and functions are situated and connected in spatial representations of the electric network. Here, discrete processes that might otherwise be bound to individual locations (a plant, a section of power lines, and an individual switch) or functions (generating, distributing, and halting) become interoperable such that data now become mobile. PG&E thus generates fidelity across the system between data sources and between the system and the real world, much like barcodes and other scanning and tracking systems maintain fidelity within material logistics. On top of the PG&E Ontology, apps provide purpose-built functionality: bouncing electrical signals down lines to identify switch types, custom apps that can swiftly switch off entire sections of the grid in response to incipient fire data, and the capacity to test those tools via simulation.
Switch failure, line stress, wildfires, and a host of other potential futures constitute threats of varying severity, always hovering in the future even if they fail to arrive. Threat’s constitutive futurity means that it can never be banished, dismissed, or fully prevented, only preemptively governed through management of the present (Massumi, 2015). As technologies of ontopower, digital twins mediate future virtualities into actualities as they come into being. Uncertainty is not tamed so much as rendered the site of value extraction, enabled through the operative qualities of Foundry’s Ontology. Its combination of standardization, interoperability, and fidelity means that Foundry can promise a degree of stable scalability, in much the same way that a robust logistical apparatus might scale delivery from city to region to globe. Foundry, as a digital twin operating system, enables the production and circulation of logistical data mobilities that can modulate material actions within PG&E’s network of electrical infrastructure as it threatens, or is threatened by, environmental conditions. Safety, ecology, and community harm manifest as risk that can be calculated and mitigated via twin modeling and modulation of the material grid.
A litany of similar case studies can be found online and on the Palantir website, spanning an array of industries from automotive (component monitoring to reduce warranty expenditure) to health (working with the United Kingdom’s NHS to optimize COVID-19 vaccine distribution). What is striking across these examples is the push to sell Foundry as simultaneously readily translatable and intensely customizable. In contrast to the bespoke, slow development of early digital twins, Palantir offers a modularized product portfolio that reduces friction in implementation and thus makes both the purchase and sale of twins more accessible. The what of prediction does not matter if the right tools can be packaged and sold as a platformatized product for prediction and preemptive control. Foundry and products like it thus aspire to be the computational media that exist “prior to and form the grid” (Peters, 2012: 41) through which logistical world-making can take place. This logistical mode of prediction enables Foundry to operate preemptively, relying on “total information capture in order to arrest emergent processes” (Andrejevic, 2019: 39). Foundry thus collapses emerging futures into present mechanisms of control, coordination, management, and decision-making, promising the power to bring futures into being, or what Brian Massumi (2015) calls “ontopower.” Palantir packages this capacity—or, rather, the desire for it—into a logistical software solution that can travel into any context to deliver modeling, simulation and, crucially, prediction. Provided, of course, that Palantir remains on contract to provide support services to maintain the system the organization is now locked into as a steady stream of updates rolls out.
As Alibaba’s City Brain demonstrates, Palantir is not alone in recognizing the potential value to be reaped from the logistical coordination of data, analytics, and operations. Services from two of the largest cloud computing service providers—AWS TwinMaker and Microsoft’s Azure Digital Twins—demonstrate the formation of a market in which digital twins appear as a service that can be bolted onto cloud storage, data analytics, and AI/ML services to provide what industry refers to as a single source of truth (SSOT) within an organization. IBM partnered with Palantir in 2021 to become a reseller, a move that boosted the share price of both companies based on the potential for Foundry to supplement IBM’s Watson and Cloud Pak for Data services (Noonan, 2021). For Palantir, the deal provides access to IBM’s 3500 global sales staff, as well as its existing analytics and cloud customers (Stone, 2021). A similar deal with AWS provides Foundry for manufacturers already dependent on cloud services (Manufacturing Business Technology, 2023). Reviewing the digital twin offerings of these companies, the stark similarity of the products stands out. This similarity points both to the market value of digital twins (big cloud and analytics companies all play the game) and to the fundamental lack of differentiation in the products themselves. For now, there is no dominant global force in digital twin services or products. Alongside the tech giants, Palantir, and other early twin companies, management consultancies such as Booz Allen and Deloitte play an important intermediary role in advising organizations on technological solutions but themselves typically lack the expertise to actually provide digital twins and instead direct clients to SaaS providers like Palantir or IBM.
Within the digital twin economy, consultancies and market research firms like Gartner fuel the hype in two ways: by extolling the financial benefits of digital twins for companies that use them to optimize operations and control risk, and by publishing reports and blog posts that expound the current and future value of the digital twin market itself.
In engineering and manufacturing, digital twins are more entrenched as part of the so-called fourth industrial revolution that centers on the digitalization of manufacturing, optimization of processes, machine sensors, real-time data rendering and analysis, and virtualization of industrial activities. German behemoth Siemens (revenue of €72bn in 2022) sells a range of SaaS products to other engineering and manufacturing companies, particularly in highly complex sectors such as aerospace, semiconductors, and energy. In a similar vein, Nvidia’s “industrial metaverse” platform Omniverse contains a suite of products pitched as an interoperable solution that standardizes digital representations and allows for the construction and management of twinned assets, processes, and environments in the familiar logistical media arrangement. Unlike other market players, Nvidia stands to gain in both directions: owning a universal platform that provides control over the standard and profiting from the future hardware sales that achievement would lock into place.
Digital twin products are typically offered by mature firms carving out a new space by leveraging their existing infrastructures, expertise, and market power. Enterprise digital twins thus provide an alternative case study in Silicon Valley innovation: not disruptive startups but mature technology and consulting firms vying to sell services that put data to use in computational forecasting. Digital twins distill those promises of prediction, and the move to make them modular and accessible as the operating systems of enterprises seeks to contain market power within a loose ecology of already big players. Whether twinning a city or an enterprise, access to the best and most powerful predictive models now depends on owning the means to coordinate, coalesce, and process data that are already captured but not operationalized by large enterprises. The quality of simulation and prediction relies on globalized conditions of constant “intelligence amplification” and agglomeration that exceed the capacities of most universities and government agencies. But the contradiction at the heart of all this is that the burgeoning digital twin industry depends upon the continuation of friction, infidelity, incompatibility, and non-scalability. After all, digital twins do not displace the data streams of existing enterprise software or the proliferation of local technological solutions within large organizations. Quite the opposite. Whether or not they work in practice, digital twins promise to introduce a new operative, singular, and ontological layer within organizations that conceals and overcomes problems of data collection, management, analysis, decision, and action without ever addressing those problems directly. Owning the means of prediction thus becomes justification for claiming to be the operating system of the present.
Conclusion: the logistical mode of prediction
Even if digital twins have not yet become the global industrial and managerial operating system their purveyors desire, they have proved quite capable of organizing and coordinating corporate profit and political agendas as well as significant resource investments. Digital twin platforms such as City Brain and Foundry produce technical configurations that normalize the corporate-led, top-down, tech-mediated, algorithmic governance of things, society and space. This is reason enough to be wary, as it is abundantly clear that their models, simulations, and predictive outputs threaten to sideline the intricacies of life and flatten its layered realities. But digital twins are valuable—very valuable—as a coproduction of tech firms and governments vying to lay stakes in the process and maintenance of a technical architecture of urban and territorial governance that extends the predominance of corporate power and specific, contingent, and sometimes competing visions of urban, industrial, and enterprise futures. This strategic value is intimately tied to the computational engines that drive them.
Built into the logistical media apparatus of digital twins such as City Brain and Foundry, ML prediction allows for probabilistic modeling of potential futures rather than the determination of singular outcomes based on the application of frameworks for analyzing past events. Generating multiple futures differentiated based on computed probability brings uncertainty and potentiality into the fray of the present and makes operable the uncertainties of the future. Projecting probable futures enables actions in the present to actualize or foreclose multiple strands of potentiality. Emergence rises to the fore: the aim is to extrapolate, model, and map futures rather than predict a singular future, to gain a grip on the many probable (and less probable) possible futures that hinge on the choices made. Rather than the future singular of oracular prediction, it pays to think probabilistically of many futures and to possess the logistical architecture required to mobilize and integrate those futures with systems for the management of the present. In their orientation toward multiple future horizons, digital twins are in the vanguard of a logistical front that aims to capture futures on behalf of capital and to bring about those most desirable through the preemptive coordination of resources in the present.
Digital twins propose a convergence of technological and political commitment—a throughline between big tech and governments’ prior investment, ambition, and competing global futures. As saleable products, digital twins pull together a host of promises and platforms, but they also depend upon established discourses, technology companies, and consulting firms. Digital twin software provides the logistical media necessary to enact the smartness mandate within complex physical, social, and ecological environments. Such platforms aim to coalesce desires and demands for smartness from the countless “smart” devices, sensors, tools, analytics, and policies into logistical media platforms that privilege efficiency, optimization, autonomy, and predictability as fundamental mechanisms for controlling emergent futures. Palantir’s Foundry and Alibaba’s City Brain attempt to operationalize ontopower as a platform product: adaptable, scalable, and interoperable across the messy flux of existing data systems, with an orientation toward the preemptive capture of futures in the present.
Digital twins are animated by the proposition that those who predict the future shape it. By modeling the impact of change within the system, digital twins promise to turn the future into a probabilistic environment from which decision-makers can make choices, as if from a menu. They are speculative projects that yearn for technical conditions of the near future, where they can become a management software layer for political and territorial integration and consolidation. Digital twins propose and pursue a logistical mode of prediction and, as such, are all too valuable no matter how well they work in practice. Or so the companies and consultancies that build and sell them would have us believe. As we have argued throughout this article, digital twins are contradictory entities: they promise to capture futures and yet often fail to function in the present; they seek to optimize operations of all kinds and yet depend upon friction within existing systems for their value; they offer to tame emergence in all its uncertainty and yet necessarily flatten complexity to make it computationally actionable. And still the market for digital twins continues to grow—not despite these contradictions but because of them. The lure of digital twins lies not in their smooth operation but in the power they promise: the power to yoke prediction to the logistical coordination and control of the present.
Acknowledgements
The authors would like to thank Professor Mark Andrejevic, Dr. Luke Heemsbergen, and Professor Janet Roitman for invitations to share early versions of this work at workshops and conferences, as well as Professor Ned Rossiter, Professor Brett Neilson, and Dr. Thao Phan for their comments on early drafts. They also thank the editors of this special issue for the opportunity to include this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was conducted by the ARC Centre of Excellence for Automated Decision-Making and Society, and funded partially by the Australian Government through the Australian Research Council grant CE200100005.
