Abstract
This article looks at how the emergence of the disinformation field in Aotearoa-New Zealand is indicative of the territorial power of American platforms in policy settings. The country has been a key flashpoint in the debates around platform harm. Former Prime Minister Jacinda Ardern was a champion of technocratic and compassionate leadership in navigating the country through the Christchurch mosque terror attack and COVID-19. At the heart of Ardern's political legacy was
Keywords
Introduction
Since Trump's first election victory in 2016, the problematic of disinformation has been at the heart of populist convulsions and diminishing trust in democratic institutions. In Aotearoa-New Zealand 1 the Labour government of Jacinda Ardern (2017–23) became the epicentre of global discourse around platform harm, following the livestreaming of the March 15th 2019 Christchurch mosque terror attack. Ardern became a global symbol of technocratic governance and Western liberalism's moral crusade against disinformation with her post-March 15th advocacy and subsequent management of the COVID-19 pandemic through restrictive public health measures. During her administration, national security and communications policy in Aotearoa-New Zealand was shaped by the emerging field of disinformation expertise. It was a policy response that privileged partnership with Silicon Valley platforms through the use of big data epistemologies at the expense of public interest media and a national platform strategy. It also reified a technocratic ‘warification’ (Eberle and Daniel, 2023: 17) of political unrest in which disinformation experts are given definitional authority over the complex forces of the political. When the populist backlash eventually struck in the form of a vociferous anti-vaccine, anti-government movement in 2022, the culture war framework for politics had been forged in the terms of platform imperialism. Both the disinformation technocrats and the liberated subjects of self-publishing are ontologies that eschew structural critiques of platform power.
This article analyses the emergence of disinformation expertise in Aotearoa-New Zealand by placing it within the context of platform geopolitics and the synchronous emergence of ‘disinformation’ as the key signifier of social media harm. Coined by Jin (2015; see also Bannerman, 2024), the term ‘platform imperialism’ directs critical attention to how platform governance and infrastructure ‘are built upon … within and on historical international power structures’ (Bannerman, 2024: 1817). The concept thereby highlights the close connections between the governance of digital infrastructure and the geopolitical utility of the internet for warfare and economic dominance (Pohle and Voelsen, 2022). Disinformation studies functions as an administrative research paradigm (Pickard, 2021) that reifies the political economy and cultural hegemony of American platforms. This research paradigm creates consensus between government, academia and platforms through big data epistemologies that can attribute specific effects to malevolent actors disrupting normative democratic values. This consensus was crucial to undermining structural and antitrust critiques during the ‘Techlash’ but has since frayed with the return of Trump to the presidency in alliance with Silicon Valley.
The formation of disinformation as a field has relied upon cohering a technocratic cultural capital which permeates different fields, from academia and journalism to government. The methods and doxa of the field view political identity, antagonism and truth as trackable through dashboards and analytics. This desire for a strategic vista over the social space appeals to all fields in which tactical, communicative interventions are a dominant form of capital. It also derives from a history of administrative research for the national security state that views communication ‘as an instrument for imposing one's will on others’ (Simpson, 2015: 45). The hybrid war thesis (Eberle and Daniel, 2023) updates Cold War theories of communication with rival cyber powers threatening social disharmony via social networks. As the disinformation field is uniquely able to grapple with this hybrid war ‘reality’, it cuts across the field of power, conflating social media harms and a democratic malaise within a national security framework. Disinformation expertise is a practice that remains within the spatiotemporal power of platforms, as Silicon Valley coheres the academics and NGOs that define the field through access and funding.
We chart the emergence of
This diagnosis of social pathologies comes at the expense of digital and broadcast policies that might shield democratic processes in Aotearoa-New Zealand from the vagaries of American platforms. The ultimate closure of TDP following a change of government and a right-wing backlash marks the passing of this policy prescription in favour of a libertarian self-publishing ontology.
Platform imperialism
This article uses ‘platform imperialism’ as its core theoretical framework (Jin, 2015; Bannerman, 2024) for the global emergence of disinformation studies as a policy prescription. Platform imperialism is understood as a form of political economic and territorial power that reflects the centrality of platforms to American geopolitical interests. It is an approach that sits alongside other uses of imperial and colonial concepts in the digital economy. Mejias and Couldry (2024) deploy ‘data colonialism’ to describe the extractive frontier logic that informs big data capitalism. Kwet (2019) has identified the capture of economies in the global south by platforms party to imperial state surveillance systems. McElroy (2024) writes of Silicon Valley imperialism as a spatial logic of capitalist mobility that remakes cities for the privileged class of digital nomads. What the formulation of platform imperialism offers is continuity with the legacies of political economy approaches to communication and empire (Bannerman, 2024; Fuchs, 2015; Jin, 2015; Jutel, 2013; Mansell, 2017; Schiller, 1975; Simpson, 2015; Thompson, 2010, 2019) that provide a structural critique of propaganda and disinformation. The political economy tradition has long been concerned with global multinationals entering and controlling new markets through privatization and regulatory capture. This has meant the cultural imperialism of American media industries, the extraction of profit through intellectual property regimes and now structural dependence through ‘cloud first’ (GCSB, 2023) procurement policies favouring Amazon, Google and Microsoft. There is a convergence of economic, cultural and geopolitical forms of imperialism through the spatial logics of platforms. As Morozov puts it, ‘technology is geopolitics by other means’ (Bherer, 2023).
What constitutes the new dynamic of platform imperialism is a capture of the digital economy that allows for a ‘spatiotemporal projection’ of power and control ‘beyond ownership’ (Rikap, 2024: 1). This encompasses regulatory settings, cultural imaginaries and the infrastructural power of Silicon Valley firms. Specifically, this means creating new economic territories, social spaces and affordances that are governed by American platforms. Jin's formulation draws on theories of the ‘new imperialism’ (2015: 36) in describing relations between states as governed by the spatial logics of finance, technological dependence and forms of cultural soft power embedded in platform governance. Notions of network openness and Web 2.0 democracy have been potent forms of American platform power that have greatly diminished the imaginary for public media policies and national digital sovereignty. The spatial and uneven contours of this imperialism are a reflection of the platform as ‘a convergence of different socio-technical phenomena’ (Narayan, 2024: 2). Through government partnerships in network infrastructure, territories of surveillance and power projection have taken shape on a ‘compressed timescale’ (Narayan, 2024: 2). As a hybrid territory of corporate, state and network power, platforms are prone to the convulsions of finance capital, geopolitical competition and shifting regulatory environments. The transition from the era of Web 2.0 network openness, the Techlash around the first Trump presidency and now the open alliance of Silicon Valley oligarchs and Trump is characteristic of these contested class and national interests.
It is the contention of this article that disinformation studies has emerged as an ‘administrative research’ (Pickard, 2021) paradigm that corresponds to the imperialist logics of Silicon Valley firms during the Techlash. Lazarsfeld (1941) famously distinguished two types of communication research: critical research, which ‘scrutinized media systems’ structural bases (Pickard, 2021), and administrative research, which is concerned only with their effectiveness. Not only does disinformation research elide the consideration of political economy, but it marshals academic, civil society and state resources both to protect the political economy of platforms and to produce a platform space technocratically governed by the rubric of disinformation harm. These notions of harm and propaganda rely upon analytic concepts that reify myths of American exceptionalism, epistemic consensus and a Cold War civilisational enemy (Kuo and Marwick, 2021). It sees commercial and American platform domination as ‘part of the natural order’ and so eschews structural criticisms of ‘breaking up’ monopolies and ‘building out’ public infrastructure (Pickard, 2021). The emergence of TDP in Aotearoa-New Zealand is emblematic of the compressed timescale of platforms and their perceived political impacts. It also provides rationales for state action that are guided by administrative research, platform partnerships and big data epistemologies.
Platform imperialism shapes governance and the space-making practices of online life while ultimately retaining decisive enforcement and infrastructural power in corporate hands. The process is uneven, as resource conflicts between American platforms, rival jurisdictions and industries have created a patchwork. The Australian government has forced some revenue sharing for local media through the News Media Bargaining Code. The EU's Digital Services Act requires platforms to disclose governance practices, alongside the Digital Markets Act aimed at increasing competition in the digital economy. Brazil's internet bill of rights has enabled courts to suspend Twitter/X for compliance failures. Brazil's case is exceptional in framing national digital sovereignty in opposition to the social harms of American platforms, disinformation and oligarchy (Meir, 2024). More recently, a US federal antitrust case against Google has signalled the potential for a breakup of Google search from Chrome and the smartphone market. However, what continues to define this patchwork is a begrudging compliance that allows platforms to retain corporate control over their spatiotemporal domain of power, particularly cloud platforms (Lambach, 2020: 499).
This patchwork is also subject to the fits of American political contestation and shifting priorities between state and corporate power. The era of Web 2.0 openness, the Techlash and the Trump/Silicon Valley alliance attest to Mansell's (2017) typology of differing governance imaginaries. The socio-technical governance of platforms may oscillate between notions of digital civil society, national security and the regulatory state, or the free market (Mansell, 2017). Within these contested policy responses a libertarian self-publishing ontology remains a consistent ideal of American platforms. We may think of this in terms of the regulatory carve-out of CDA Section 230 that made the platform economy and this ontology possible; while also embodying John Perry Barlow's screed 2 against this very regulation as government tyranny. As Golumbia has identified, the ‘deep incoherence’ (2024: 27) of cyberlibertarianism matters very little when wedded to platform power and notions of American exceptionalism. Trump's alliance with Elon Musk and use of social media publics deploys this self-publishing ontology of libertarian freedom. Musk is typical of the cyberlibertarian posture of free speech and free markets while being a government contractor who threatens political enemies with reprisals. Crucially for disinformation research, its national security and regulatory governance imaginary has been vociferously attacked by the MAGA-tech alliance. Musk regularly attacks disinformation research (Kuklychev, 2023), researchers are now denied API access to X, and Meta has discontinued its key research tool CrowdTangle (Ortutay, 2024). As an administrative paradigm, disinformation research depends on platform openness and partnership while forgoing the structural critiques that help explain the current conjuncture of Silicon Valley and Trump.
For policy makers and researchers concerned with platform harm and digital sovereignty there are important legacies of anti-imperialism within the political economy tradition. This critical field is indebted to the 1980 MacBride Commission under the auspices of UNESCO. Developing and post-colonial nations identified American domination of global telecommunications and media as a neocolonial threat to national and cultural autonomy (Fuchs, 2015). The threats identified by the MacBride Commission are prescient today in the light of large language models that exploit developing world labour and are built upon a language of super imperialism that threatens to make users ‘prisoners of American culture war’ (Morozov, 2023). In Aotearoa-New Zealand the fragile commitment to indigenous self-determination, bi-culturalism and efforts towards indigenous data sovereignty (Ruckstuhl, 2022) are undermined by platform imperialism. The state may lack the current capacity to build out infrastructure for national platforms and data localisation, but thinking critically through platform imperialism and ‘relational sovereignty’ should be a policy priority (Bannerman, 2024). More recently, political economy scholars have set out a research and policy agenda that echoes the MacBride Commission in calling for a ‘digital non-aligned movement’ to reclaim digital sovereignty (Rikap et al., 2024).
Aotearoa-New Zealand, disinformation and political economy
Aotearoa-New Zealand is a useful case study of platform imperialism as a Five Eyes partner that is integrated in the spatiotemporal territory of American platforms and whose politics have taken on the characteristics of platform-driven culture war. The country's populist rupture came in 2022 with the occupation of Wellington Parliament's grounds by anti-vaccine mandate protesters. This conflict has been understood by local disinformation researchers as platforms being weaponised by foreign threats. Cold War theories of communication have been reheated with domestic politics and protest made subservient to questions of disinformation and hybrid war (Jutel, 2023a). This has the effect of making the disinformation expert the agent of universal truth and consensus reality. Conversely, the populist right have railed against public media and disinformation experts as woke censorship. Culture war has emerged in Aotearoa-New Zealand within the terms of platform imperialism; technocratic post-politics against populists championing a self-publishing ontology.
Disinformation's emphasis on geopolitical analysis is an awkward fit for the local political context. The online counter-publics that emerged to protest Jacinda Ardern's government during the month-long occupation of Parliament's grounds in early 2022 were a heterogeneous mix and part of long-standing right-wing currents. They included the followers of prosperity gospel evangelist Brian Tamaki; the anti-communist farmers of Groundswell; a network of alternative health influencers, ‘Voices for Freedom’; and the online platform Counterspin Media, hosted on Miles Guo and Steve Bannon's GTV. These forces have not made for an easy coalition but reinvigorated the established parties of the right during the subsequent 2023 election. The incoming coalition government channeled populist sentiment while explicitly forcing out list candidates who espoused anti-vax conspiracies. Their rise has been aided by ‘earned media’, business activism and negative polarisation as opposed to disinformation and online mobilisations. 3
This populist environment is also aided by the country's media being among ‘the most deregulated and heavily commercialised media markets in the world’ (Thompson, 2010: 3). Multinational corporations and private equity have long dominated while public broadcasting is in perpetual crisis, increasingly becoming the target of right-wing demagoguery. The overshadowing of national media and platform policies by disinformation concerns came into stark relief in 2024 with rounds of mass layoffs in local media dubbed the ‘Media Apocalypse’ (Edwards, 2024) following the axing of Newshub by Warner Brothers Discovery. The collapse of the news industry put the right-wing government under pressure to revive the Labour Party's failed ‘Fair Digital News Bargaining Bill’, loosely modelled on Australian legislation. The bill passed one reading before being withdrawn following pressure from Google (Currie, 2024).
Aotearoa-New Zealand's media and democratic challenges can be viewed within the long-standing concerns of the political economy of the media tradition in which deregulation leads to corporate oligarchy, the slashing of journalism and the ascendancy of a free market “common sense”. This is the story of Newscorp, Fox News and the rise of an American right-wing populist style globally (Jutel, 2013, 2018a, 2018b; Moffitt, 2016; Peck, 2019). Conservative media networks and political infrastructure have flourished under these conditions (Meagher, 2012), along with counter-publics of hyper-individuation. One of the distinctive features of the new far-right populist movements is the self-publishing ontology of platforms as
Disinformation studies as administrative field
Disinformation studies emerged as a professional, administrative field to shape platform governance and policy during the Techlash. It is a field in the sociological sense defined by Pierre Bourdieu (1996), producing the habitus and cultural capital of a seemingly autonomous profession distinct from the field of power (i.e. the state and economic capital). Disinformation research claims access to a unique social scientific capacity for understanding the networked and weaponised social terrain. The innovation is in taking up platform and big data methods which reify the spatiotemporal power of platform imperialism in understanding a perceived new reality. The claims to use new digital methods to gain a vista over politics and online spaces are attractive to other fields (business/government/journalism) that value strategic communication and threat appraisal. The boundaries between academic disinformation research and private consultancy in the field are porous, as demonstrated in the case of TDP. In this way, disinformation research lends itself to administrative fields and research, accentuating crisis communication, public relations and national security concerns in democratic political processes.
As an administrative field it naturalises the political economy of platforms and the media, while operating within long-standing effects research paradigms that have been at the heart of American imperial propaganda (Simpson, 2015). In describing this administrative paradigm we draw a distinction between the field that embodies platform epistemologies and national security imperatives; and critical disinformation studies (Marwick et al., 2021; Pickard, 2021; Young, 2021). The critical denotes an integration of critical theory and ‘the structural and social aspects of misinformation’ (Young, 2021: 1). Our intention is to take up the
This administrative field considers disinformation and fake news as empirically provable objects with measurable effects on the polity (Miró-Llinares and Aguerri, 2023). In this way it engages in myths of an American ‘epistemically consistent past’ (Marwick et al., 2021) treating social antagonism as pathological and offering political actors in the field the pretense of ‘post-politics’ (more below). The disinformation expert embodies the technocrats’ aspiration to universality as an actor of public and civic integrity beyond reproach (Bourdieu, 1996: 382–383). Disinformation expertise is increasingly hegemonic across the field of power as a new discipline engendering an ‘organic solidarity’ (Bourdieu, 2020: 40) between fields (government, academia, business, national security) around its normative view of democratic threats.
As an outgrowth of platform imperialism, it is a discipline shaped by American geopolitics, institutions and international alliances in creating ‘complex circuits of legitimating exchanges’ (Bourdieu, 1996: 386). American universities, national security think tanks and Silicon Valley NGOs are prominent in developing its doxa and methods. Platform partnerships, shared epistemologies and a revolving door between organisations grow out of the organic solidarity of the field. The NATO-aligned think tank the Atlantic Council is prominent in the field through its Digital Forensic Research Lab (DFRLab), which has partnered with Meta in superficial election integrity efforts (Jutel, 2023a). Prominent disinformation researchers John Kelly and Ben Nimmo created the consultancy firm Graphika and are strategic advisors to the Oxford Internet Institute. A co-founder of DFRLab, Nimmo has subsequently served as lead threat intelligence investigator at Meta and then OpenAI. Another key node in the field is the Stanford Internet Observatory (SIO). SIO was founded by Facebook's former chief of security, is under the directorship of the former US ambassador to Russia, and its research director was a prominent disinformation policy entrepreneur. 5 Located in Palo Alto with Silicon Valley benefactors such as the Omidyar Network, the SIO describes its work as ‘developing a new field of study and professional practice’ (2022: 9) by working with industry and academics on data sets and tools to understand threats to democracy.
The intra-field processes of legitimation mean that academic research in disinformation is shaped by the functions of platforms within American geopolitical power. Social harm is understood in national security terms of hybrid war and draws on a long history of administrative communication theory. Communication research in the service of America's Cold War efforts established psychological warfare theories that viewed communication as principally a form of coercion and domination (Simpson, 2015). At its most triumphalist, services like Voice of America were viewed as part of a ‘push-button’ mode of warfare where messages could be constructed mathematically and transmitted to win the Cold War (Simpson, 2015: 66). With the disinformation paradigm this model has been inverted to theorise hybrid war threats. In the New York Times, Shoshana Zuboff writes of a new algorithmic push-button reality in which ‘you are now remotely controlled’ and that modern information warfare ‘arrives carrying a cappuccino, not a gun’ (Zuboff, 2020). Notions of fake news as a virus or contagion (Jutel, 2023b) have flourished as part of the lexicon of the discipline, embodying the hysteria of a reanimated anti-communism (Marwick et al., 2021). The threats to democracy that emanate from network communication are seen as attempts by nefarious foreign actors to enlist and weaponise local dupes (Eberle and Daniel, 2023: 5–6).
In insulating American platforms from a political economy critique, administrative disinformation studies reifies platform epistemologies and ontologies. This research adopts the large US platforms' ‘algorithmic episteme’ (Fisher and Mehozay, 2019) – a behaviourist way of understanding humans only through their online traces, while negating any other theories of the human subject. It shares conceptual categories and ontologies deployed by platforms ‘to measure what its users … do in very particular ways’ (Anderson, 2021: 55). Miró-Llinares and Aguerri's meta-analysis of disinformation and fake news studies found that 90% of studies on disinformation do not conduct internal analysis (2023: 364). This means disinformation is identified ‘not by its content but by its source’ (Miró-Llinares and Aguerri, 2023: 364). This is at odds with discourse theory concepts of politics (more below) as it ascribes discrete political beliefs to sources, content and channels as opposed to looking at how fascism is articulated through libidinal inflections, chains of equivalence and half-truths, predominantly articulated in legacy media channels (Mouffe, 2005; Phelan, 2019). These determinations are made by disinformation indexes created by fact-checking organisations in the field; however, these ‘lists are not transparent … [or] scientific since none of these organisations principally carries out scientific activities’ (Miró-Llinares and Aguerri, 2023: 367). These reductionist methods reinforce a monosemic and coercive model of communication but also insulate mainstream media from critiques of propaganda and populist demagoguery.
This approach also obscures the continuum between right-wing media networks, corporate propaganda and what is defined as disinformation. Guess et al. (2020) have identified consumers of fake news as elderly, voracious consumers of traditional conservative news such as Fox and talk radio. The determining effect of disinformation as opposed to traditional right-wing propaganda is unclear. This continuum goes a significant way in explaining Aotearoa-New Zealand's challenges with right-wing populism. Among the ranks of culture war and conspiracy media leaders are prominent former traditional broadcasters: Liz Gunn, Peter Williams and Sean Plunket. 6 It should also be noted that New Zealand's right-wing activists, business lobby and even the office of former Prime Minister John Key have a track record of using radical online outlets and culture war invective as a key political communication strategy (Hager, 2014). These are the same business lobby groups and political figures, globally connected through the Atlas Network, that have railed against disinformation research as an attack on free speech (Parsons, 2024).
Cold War post-politics
Disinformation studies is able to present its seemingly ‘objective’ technocratic expertise as authoritative and decisive by articulating a post-political, morally charged discourse (Jutel, 2019). Chantal Mouffe (2005) describes politics in the moral register as signifying post-politics, because there can be no engagement between opposed ideas, only a moral ‘struggle between right and wrong’ (p. 5). Mouffe sees politics as increasingly the exclusive terrain of a self-referential technocratic elite. Dissenting voices are increasingly excluded from democratic debates as anti-democratic extremists (Mouffe, 2018). For Mouffe, simply removing ideas that may be found distasteful, such as those of the far-right, from the public sphere (as is often encouraged by disinformation research), does not make them go away – they fester into antagonism between enemies, contributing to a vicious cycle which encourages polarised, extremist views, furthering the growth of far-right populism (Salter, 2022). Post-politics also suffers from a lack of interest in the work necessary to sustain and reimagine a robust, agonistic public culture and media policy.
Alongside her late former partner Ernesto Laclau, Mouffe pioneered a highly influential theory of discourse (Laclau and Mouffe, 1985) which argues that the discursive construction of phenomena cannot be separated from its material effects. Any hegemonic formation, such as that which sustains disinformation studies, must be articulated around a shared ‘empty signifier’ – which in this case is disinformation. Such a hegemonic formation would include tech companies, governments, academic researchers and NGOs in comprising the field of disinformation studies. These groups may hold very different understandings of what disinformation actually means, but those are cancelled out through a shared disidentification against a common antagonist: the agents of disinformation. Through the articulation of this figure of moral corruption, a ‘them versus us’ antagonistic frontier begins to form. Once the other side is cast as morally abhorrent, then any discourse on the subject of disinformation increasingly must select one side or the other.
Thus, it is the antagonist,
Discourse theory as method
Having outlined the political economic forces and logics of the political that shape the disinformation field, this article turns to the policy environment of Aotearoa-New Zealand. This analysis will centre on how disinformation expertise and methods permeate the fields of government, media and civil society groups. Following this, our analysis turns to TDP as key policy entrepreneurs that operate across the field of power and reproduce hybrid war notions of social relations and public life.
In terms of research methods, we undertook a discourse analysis of a corpus of policy documents, TDP research reports, journalistic stories, and television programs (including 2 documentaries), covering a 4-year period (2020–23 inclusive). However, rather than seeing data and theory as necessarily and ontologically distinct, we follow a discourse theory approach which focuses on the ‘articulation’ of theory and data in conjunction (see Glynos and Howarth, 2007; Howarth, 2005). While compatible with methods such as discourse analysis, the articulation approach is more problem-driven, perceiving instances of language in use as ‘meaningful objects of analysis’ (Howarth, 2005: 317) through the deployment of theory to a political/social problem. Being problem-driven draws on Foucault's (1985: 318) problematisation, which ‘begins with a set of pressing political and ethical problems in the present, before seeking to analyse the historical and structural conditions which give rise to them’.
The identification of research objects to come under analysis is therefore not driven so much by the demarcation of particular genres (as in some forms of discourse analysis), but by the political work that is being initiated by the authors. Of particular interest are objects that are key to the construction of political identities within a particular field, and/or that attempt to foment or sustain hegemonic formations, and/or that construct social antagonisms and establish political frontiers (Howarth, 2005).
Hence, the overarching aim is to account for how the contours of the disinformation field emerge through the process of discursive articulation, that is, in discourse theory terms, through the struggle for hegemony in which normative democratic values and truth are constructed as imperiled by political pathologies that are germinated by nefarious external actors (see Glynos and Howarth, 2007). The heightening of certain forms of political antagonisms (disinformation dupes) and the repression of others (platform imperialism) is key to the legitimation rituals between fields that create disinformation expertise as a form of cultural capital able to speak for universal truth.
Disinformation studies in Aotearoa-New Zealand
The Christchurch mosque attacks of 2019 7 put the country at the centre of Techlash questions about platform harm, disinformation and terrorism (Hoverd et al., 2021; Thompson, 2019). Prime Minister Jacinda Ardern's compassionate leadership and creation of The Christchurch Call was emblematic of a post-political moral politics through a “multi-stakeholder” approach to platform governance at the heart of the disinformation field. Much of Ardern's international prestige was premised upon this articulation of socially responsible leadership, and later, with COVID-19, as an evidence-based technocrat. This continues to be her legacy as Ardern now resides in fellowship at Harvard's Berkman Klein Center for Internet & Society to ‘further the mission of the Christchurch Call’ (Hinds, 2023). It's a role that also brings her into contact with the SIO in framing the disinformation research agenda (Morgan, 2023). In spite of this high-profile intervention, the policy response of more efficient flagging of harmful content, support for academic research into disinformation and a broad digital civil society approach were ‘initiatives [already] being implemented … in the policy pipeline or under deliberation well before the Christchurch terror attacks’ (Thompson, 2019: 99).
The approach of partnering with American platforms through academia and civil society limits the scope for regulation and accountability. A focus on technological challenges ‘produces messaging that shields social media platforms’ and offers ‘positive branding for the global digital FAANG platforms … [while] discourage[ing] scrutiny of how their business models encourage online and offline hate’ (Hoverd et al., 2021: 4). What Ardern's leadership articulates is a relationship between the heights of the field of power (heads of state, tech, national security elite) and an active civil society supported by a network of Silicon Valley NGOs like the Centre for Humane Technology and the EFF (Christchurch Call, 2023). Symbolising this move into the NGO space, since Ardern left office, the Christchurch Call has transformed from a government initiative to a philanthropic NGO. Ardern has claimed in the Washington Post that the Call represents the best approach to governing various tech issues, from disinformation to AI. Her rationale is characteristic of Silicon Valley NGOs; rather than ‘regulate in haste’ and harm the open internet, the Call models ‘companies, government officials, academics and civil society [coming] together not only to build consensus but also to make progress’ (Ardern, 2023). This fuses the governance imaginaries of the security state with a redemptive civil society engaged in media education or open-source research. It is a seemingly moral and humane approach to the warification of social relations. However, this post-political consensus is imperiled by the abhorrent antagonistic subject of populism, extremism and terrorism who seeks to ‘drive social discord, grievances and fears … undermining trust in democratic institutions’ (New Zealand Government, 2023: 21).
The Silicon Valley policy blueprint for multistakeholder platform governance has been advanced in the Aotearoa-New Zealand context through a ‘whole of society approach’ to disinformation (New Zealand Government, 2023: 21). In the aftermath of March 15th the Government Communications Security Bureau (GCSB), under the auspices of the Department of Prime Minister and Cabinet (DPMC), organised He Whenua Taurikura Hui, a series of community, government and academic meetings designed to counter extremism and security threats. This culminated in the New Zealand government's first ever public national security strategy document Secure Together/Tō Tātou Korowai Manaaki. Disinformation features as one of 12 ‘core national security issues’ to be met with a ‘civil society-led group to advise government’ on disinformation, a fund for community projects which respond to disinformation and ‘a new intelligence priority focused on the national security implications of disinformation’ (New Zealand Government, 2023: 21).
Under this ‘whole of society approach’, agents from across the field partake in shared practices that sediment the definitional power of the disinformation field. For all the worry about the health of, and trust in, democratic institutions, there is no concomitant recapitalising of the state, simply disinformation vigilance. The DPMC's disinformation resilience fund has partnered with InternetNZ, the NGO which manages the .nz domain, to lead community-based education to combat disinformation targeted at the elderly, indigenous people and other at-risk groups. The DPMC has commissioned reports on disinformation and extremism from the cyber-security field, through consultancy firm
This classification of disinformation pathologies is echoed in other government ministries. An academic report for the Department of Internal Affairs’ review of online harms mobilises the concept of ‘information disorders’ (Lips and Eppel, 2022). This report cites TDP's use of the concept (Hannah et al., 2022), in a manner that is highly normative and, as we explore in the following section, not based in any body of academic work. This ill-defined concept is rather the product of policy entrepreneurialism in settings such as the American centrist think tank the Aspen Ideas Institute. 8
The Disinformation Project
These issues need not be understood through the disinformation rubric, but this is the cultural capital that coalesces in the field of power. A key agent that has been able to work across the fields of policy, media and civil society in Aotearoa-New Zealand has been TDP. The private research organisation originated within a broad interdisciplinary network of university researchers working on disinformation at the cusp of the COVID-19 pandemic in 2020. Their research director previously partnered with Twitter's #DataForGood initiative that responded to the Christchurch Call for partnership. As the former Australasian head of Twitter public policy described it, ‘we want to listen, learn, and put [disinformation research] insights into action’ (Hinesley, 2020). TDP have received funding from the DPMC (2023) and the private sector, and have worked with community groups to ‘understand and meaningfully respond to information disorders’ (The Disinformation Project, 2023). While other private research organisations such as Logically and HEIA received more funding from DPMC, TDP's importance was in furnishing the media field with disinformation expertise (dominating this area between 2021 and 2023).
As prominent pundits and embodiments of the field, the TDP have been subject to intense vitriol and harassment from the political right. The aforementioned culture war media figures, right-wing lobbyists and the parties of the right have railed against TDP and other disinformation researchers. 9 This reactionary atmosphere contributed to the closure of TDP in October 2024, much like SIO, which has seen cuts and key departures (Coffey, 2024) following legal attacks from the American right. It is also symptomatic of the shifting character of platform imperialism as Trump and Silicon Valley founders and VCs are now in staunch alliance. It is not the intention of this article to make light of the harassment faced by disinformation researchers, but to insist on a political economy and platform imperialism critique in this current conjuncture. Without it we are left with an antagonistic frontier between the populist right championing a self-publishing ontology and a post-politics that seeks to manage the public sphere within the confines of platform imperialism and its methods.
TDP were emblematic of what Bourdieu identifies as hybrid media-intellectuals that function as ‘entrepreneurs who need to preserve, and increase, their symbolic capital’ (1998: 5). The majority of TDP's work consists of regular media appearances that reproduce a lexicon of ‘infection’ and the cultural capital of disinformation studies. This included RNZ's series ‘Undercurrent’ in which the purveyor of disinformation is an ‘unknown threat … quite similar to the Christchurch terrorist’ (Hannah, 2023). Between 2022 and 2023, TDP were the principal source in New Zealand's two leading newspapers. In articles where disinformation was the primary focus [
The defining moments of media capital and public visibility for TDP were as the disinformation experts for two publicly funded documentary films released in the aftermath of the Wellington Parliament Occupation; Stuff's
However, it is in the state broadcaster TVNZ's
While TDP invoke a mediatised aura of academic expertise, their published work is more characteristic of a policy entrepreneurialism. TDP's single academic publication for
Their central conceptual category ‘information disorders’ (Hannah et al., 2022) is both highly normative and unsupported by academic literature. It is a disinformation studies term of art that has principally emerged in policy documents. 11 Hattotuwa's assertion that TDP's work provides ‘a profound insight into the mindset of the individuals on these [disinformation] channels’ (Pemberton, 2022) is wholly unsupported. Lacking conceptual clarity, the text's main contribution is in advancing the metaphors of infection. Disinformation becomes interchangeably a ‘complex nebulae’, ‘miasma’, ‘contagion’ with ‘irrigation patterns’ (Hannah et al., 2022). Elsewhere in the TDP corpus, disinformation is a ‘pulsating pathology’ or ‘digital Novichuk’ (Hattotuwa, 2021). The recourse to metaphor is characteristic of the impasses within the field. The pathologisation of antagonism and disavowal of structural democratic crises leads to the technocracy invoking the tropes of techno-horror and subversion in Red Scare terms (Jutel, 2023b).
This language of infection and subversion is a product of the ‘warification’ (Eberle and Daniel, 2023: 17) of the social, engendered by platform imperialism and American geopolitics. Social antagonism and crises are seen as vectors of subversion, a logic expedient to the political right and the national security state. Throughout TDP's work they have sought to identify Russian disinformation at the heart of their analysis, rendering domestic politics subservient to this geopolitical lens. This has perverse effects on the public institutions needed to rebuild democratic consensus in this fraught political conjuncture. In June 2023 TDP played a crucial role in leading a Red Scare along with the populist right against the public broadcaster RNZ, after complaints of Russian bias from the Ukrainian expatriate community. The accusation that an online editor altered wire stories to reflect a purportedly ‘pro-Russian’ viewpoint created an international furore and RNZ's chief executive was quick to denounce the stories as ‘Pro-Kremlin garbage’ (Manhire, 2023). Hattotuwa's media and policy entrepreneurialism kicked into gear, with an editorial in
While this intervention by Hattotuwa positioned TDP's work at the centre of the story and media policy debate, it also unwittingly aided far right attacks on RNZ as ‘Red Radio’ (Manhire, 2023). Attacking public media and journalism was a key plank of the populist right's 2023 election campaign, 12 in keeping with long-standing policies of championing deregulation and private media ownership. These are the same forces of the political right that have ruthlessly attacked TDP. The histrionic language about Russian ‘infection as narrative injection’ (Hattotuwa, 2023), and the prominent disinformation documentaries, effectively muscled out and dismissed pre-existing work and policy debates. 13 The damage to RNZ from the political right and the disinformation field is not hypothetical. Following the scandal RNZ assembled a blue-ribbon panel to assess editorial policies and practices. The panel found that ‘the vast majority of stories edited by the journalist were edited appropriately’ (Akel et al., 2023: 13) and there was ‘no evidence to suggest the individual intended to insert misinformation … [or] engage in some kind of pro-Russian propaganda campaign’ (Akel et al., 2023: 15). They also concluded that the chief executive in acceding to the disinformation narrative was ‘unhelpful’ and contributed to ‘the loss of public confidence in RNZ as a source of trusted news’ (Akel et al., 2023: 16). While TDP's normative politics has championed mainstream journalism, their work has undermined trust in public media and misidentified foreign threats in the place of the traditional business activism of the right.
Conclusion
The impact of the disinformation field in Aotearoa-New Zealand has been to redirect discussions of platform harms and public media policy into an administrative paradigm that imposes a Cold War lens over domestic politics. The field's concerns and methods are characteristic of platform imperialism in foreclosing anti-trust or national platform policies. The emphasis on hybrid war threats discounts existing right-wing media/political networks and the corporate degradation of journalism, while treating platforms as unalterable entities rather than dependent on regulatory carve-outs and arbitrage. This framing of platform harms elides questions of infrastructural dependence and the autonomy of Aotearoa-New Zealand's public sphere. While ‘mainstream media’ is uncritically held up as a default institution of consensus, there is no interest in developing public media or platform policies that might achieve a degree of ‘relational sovereignty’ (Bannerman, 2024) in the face of platform imperialism.
Aotearoa-New Zealand's major contribution to the field has been the former Prime Minister Jacinda Ardern's role as moral agent, bringing together civil society, national security bureaucrats and platforms around the Christchurch Call. She links the domestic field to Silicon Valley funded NGOs and research institutes. This stakeholder approach helped sustain the myth of openness and network collaboration at the heart of American platform imperialism. Disinformation expertise and civil society collaboration backed by big data platform affordances are articulated as the technocratic restoration of democracy. Crucially this eschewed structural and anti-trust critiques during the post-2016 Techlash. The fragility of this discursive formation and the failure to confront platform power is evidenced by the recent rollback of platform data and API access, effectively imperiling the field. This is a powerful demonstration that the disinformation field is not autonomous from the field of power, i.e., an administrative paradigm, and that confronting platform political economy and oligarchy is essential to addressing platform harms.
While TDP's role in media-driven policy entrepreneurialism is finished, their dichotomies and discursive formations persist in our understanding of social media and politics in Aotearoa-New Zealand. Communication is understood as a form of hybrid warfare or a defense against those who threaten a consensus reality and liberal institutions. This moral and national security politics replaces a material politics and disavows agonistic political struggle as a way to overcome antagonism. It also ascribes political beliefs as discrete categories with a linear valence rather than a discursive process of articulation and contradiction. It is in this way that the disinformation expert can position themselves above the fray as a moral technocrat. This moral politics also contributes to the broader culture war (Phelan, 2023) framing of online discourse in which warring camps remain structurally co-dependent. The post-political attempts by disinformation experts to excise communication pathologies have successfully been framed as censorious tyranny by a now ascendant tech-oligarchy. Thus, platform policy discussions remain within the contours of platform imperialism; between national-security threats and a libertarian self-publishing ontology. In contrast, the success of former US Federal Trade Commissioner Lina Khan, both in policy and political alliance building, is suggestive of an approach capable of breaking this deadlock.
Further to this entrenchment of culture war and moral post-politics, TDP has furnished the fields of journalism, academia and government with a lexicon of infection, infiltration and subversion. The recourse to metaphor and invocations of terror speak to the deadlocks of this field in understanding the constitutive role of antagonism in political identity. Russian narrative infection, miasma and all manner of biological analogies speak to the ultimately paranoid, Cold War politics of this paradigm. As a discursive structure it is a natural fit for the political right and the national security state. TDP's attacks on the public broadcaster RNZ, unwittingly in concert with the populist right, speak to a retreat from politics outside of their own cultural capital and a national security framework. This field that claims to be grounded in the new social science of communicative warfare is inseparable from the post-political fantasies and paranoiac horrors that animate it.
Footnotes
Declaration of conflicting interests
The authors declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.
Funding
The authors received no financial support for the research, authorship, and/or publication of this article.
