Abstract
In this article, we review the epistemological boundaries of disinformation studies and argue that they are informed by network and transmission models in which the unit of analysis (i.e., disinformation) is assumed to follow contagion growth patterns typical of population models. This framework reduces disinformation to a behavioral problem and downplays its participatory and ritualistic dimension, which we argue cannot be addressed, and therefore cannot be corrected, by targeting individual behavior. We review seminal contributions to information and communication studies to foreground disinformation as a set of de facto alternative social contracts that organize the overflow of information into meaningful narratives. We conclude by arguing that disinformation studies would benefit from tracing the resonance of narratives informed by lived experiences to arrive at a higher-level principle that can negotiate conflicting realities.
Introduction: Disinformation as transmission
Disinformation, and problematic content broadly defined, became integral to the management of information ecosystems following the discovery of Russian influence operations targeting the 2016 US presidential election and the ensuing onslaught of Covid-19 conspiracy theories during the pandemic. These catalyst events triggered a wave of detection and mitigation measures to ward off disinformation online, from fact-checking programs and stricter content moderation systems to the enforcement of community guidelines aimed at preserving the health of public debate (Stencel and Luther, 2021). Fact-checking in particular was heralded as a natural antidote to disinformation by providing evidence to rebut the inaccuracies advanced to mislead individuals (Burel et al., 2020).
During this period, social platforms put together extensive trust and safety teams, an effort that was followed by the consolidation of several companies dedicated to the identification and removal of problematic content, such as Alethea and Graphika, but also NGOs and policy institutes like the Atlantic Council’s Digital Forensic Research Lab, which operate in cooperation with social platforms and a vast network of initiatives dedicated to disinformation defense operations, such as Factmata, Mandiant, and Logically. In this emerging market, disinformation is defined as “the new malware” (Hsu, 2022), with the targeted clients being social platforms and national governments fearful of contentious and polarizing communication that can lead to social upheaval. Clients also include corporations and public companies targeted by coordinated inauthentic behavior intended to influence public perception of their brands and effect changes to their stock price. The definition of the problem eventually spilled out of the political arena as the range of online harms managed by such companies also expanded (Hsu, 2022).
These initiatives added to the information infrastructure managing the accuracy and facticity of information online, a technobureaucracy largely managed by automated knowledge systems such as Google's Knowledge Graph. Outliers flagged by these systems may be referred for human moderation, but real-time semantic networks effectively store representations of objects and events that are promptly distinguished from fantasies and inaccurate information (Ford, 2022). Facts are of course verifiable, but verification depends on human practices, including the very production of traces and silences. The production of facts, and indeed the production of history itself, is thus invariably embedded with bias, as known facts in history are subjected to interpretations contingent on political decisions, chief of which is the practice of silence itself (Fasolt, 2004). The opposition between factual and non-factual events is therefore problematic because facts are not ontologically objective but socially produced through consensus building (Hacking, 1999). The notion of facticity informing disinformation studies, however, downplays how events are constructed out of ruptures that play a central role in how individuals experience everyday life (Wagner-Pacifici, 2017).
Central to this framework is the assumption that information carries intention, the defining factor that separates misinformation—or information identified as inaccurate but unwittingly consumed and shared (Karlova and Fisher, 2013)—from disinformation, or the intentional distribution of fabricated stories to advance political goals (Bennett and Livingston, 2018). Mis- and disinformation are therefore indelibly attached to agency and intentionality, an arrangement that extends to the very notion of information, even if intentionality and agency are categories of limited import in online social networks. The emphasis placed on agency was a necessary, if questionable, development that separated information from signals in early communication models, which depicted information flow with the image of a tube: something would go in at one end of the tube and come out at the other. What circulated through these tubes could be water, gasoline, or information, in which case the tube was in fact a wire. The process was assumed to be reversible, linear, and largely predictable (von Foerster, 1980).
These signal-processing models of communication described how unqualified signals traveled over unreliable channels, measuring the speed of transmission until the signals' eventual conversion into meaningful information. In these early models, information and signals were indistinguishable because human intervention was assumed to be predictable, with deviations in the outcome attributed to system failures driven by noise, compression, or loss of information rather than to the agency of individuals sharing information (Shannon and Weaver, 1962). But communication models would eventually evolve to implicitly assume that information carried human agency. The example often used by von Foerster (1980) was that of a library, which can store data as books, documents, films, etc. But one must read its books and documents to obtain information from a library. This nuance all but disappeared as libraries came to be commonly regarded as information storage and retrieval systems. By attaching human agency and intentionality to the very notion of information, the vehicle of information was effectively merged with the effects of information. Given this critical development, it is perhaps unsurprising that libraries would find themselves in the political crosshairs in the US (Cowan, 2023).
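The indifference of these early signal models to meaning can be stated precisely. The Shannon–Hartley theorem, a standard result of the signal-processing tradition discussed above, is reproduced here as an illustrative sketch:

```latex
% Channel capacity: the maximum rate (bits per second) at which
% signals can be reliably transmitted over a noisy channel of
% bandwidth B with signal-to-noise ratio S/N.
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```

The point for our argument is that the formula quantifies transmission while remaining entirely indifferent to meaning: nothing in the model distinguishes an accurate message from a fabricated one, which is precisely the nuance that was lost when information was later fused with agency and intentionality.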
Another contribution to this epistemological quandary was provided by Carey (2009), particularly the distinction between communication as transmission and communication as ritual. These alternative but complementary definitions contrast the transmission of communication with the ritualistic, ancient practice of sharing and participating. The transmission view stems from a metaphor of geography and transportation and is commonly described with the words imparting, sending, transmitting, or giving information to others. It is permanently connected to the dream of increasing the speed at which messages travel in space, and of sending and receiving signals and messages over distance for the purpose of control. Due to its enormous impact on businesses and society, communication quickly became associated with transmission. Communication became a process of sending and receiving information from one mind to another (Carey, 2009).
The fundamental insight of the transmission view of communication was that information moves independently of and faster than physical entities and can therefore be a simulation and control mechanism for what has been left behind. Carey (2009) traced the development of this powerful metaphor to the nineteenth century, when the movements of goods, people, and information were indistinguishable processes broadly described as communication. This discovery was first exploited in railroad dispatching in England in 1844. Before the use of the telegraph to control switching, the Worcester Railroad kept horses every five miles along the line, and they raced up and down the track so that riders could warn engineers of impending collisions. The introduction of telegraph lines would eventually allow for centralized control along many miles of track, as information could travel faster than galloping horses and needed no face-to-face interaction. The use of the telegraph in conjunction with the railroad allowed for universal systems governing transmission and control, with communication increasingly becoming synonymous with transmitting, imparting, sending, or giving information to others—an epistemological assumption that would be shared by disinformation models based on transmission and control and whose unit of analysis is informed by population models.
Disinformation as rituals of lived experience
Despite the tangible threats posed by disinformation campaigns, the epistemological framework informing disinformation and the ontological boundaries defining the facticity of events remain stubbornly unclear (Vinhas and Bastos, 2022). More tangible projects include automated, supervised, targeted, and recruitment-based interventions designed to build on media literacy skills to correct information that is inaccurate, false, or misleading, including the ability to identify unverified information (Boyd, 2017). This media literacy project, however, was unironically adopted by conspiracy entrepreneurs who counseled their audiences in developing the skills of “independent researchers,” a development that was salient in the uptake of conspiratorial narratives aided by the substantial increase in social media use over the course of the pandemic (Birchall and Knight, 2022). While social media platforms have been relatively successful in limiting the spread of some of these communities, notably QAnon, the underlying narratives remain reportedly popular, merely adapting slogans and hashtags to the strictures of a more outwardly censorious environment (Birchall and Knight, 2022).
Conspiracy-theorizing can in fact thrive in environments where disinformation is defined by the behavioral and epistemic assumptions of data scientists employed by social platforms (Marres, 2018). Such mechanistic models relate individual-level responses (vital rates, to use the terminology of demography) to changes in population density and structure (Maltby et al., 2001), which are assumed to follow contagion growth patterns (Lerman and Ghosh, 2010). Fact-checkers and media literacy initiatives are brought in to regulate socially dysfunctional platforms that nonetheless operate as designed (Walker et al., 2019). The disinformation framework is therefore informed by metaphors of transmission and information pollution—as in the WHO's infodemic—with limited capacity to unpack alternative practices of sensemaking that tend to reify established hierarchies of dominance and oppression (Marwick, 2018).
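The population-model assumption referenced above can be made concrete. Contagion models of information diffusion typically treat sharing as an infection process; a minimal logistic sketch (the notation is ours, for illustration only) reads:

```latex
% Logistic contagion growth: I(t) is the number of "infected"
% (exposed) users, N the population size, and beta the contact rate.
\frac{dI}{dt} = \beta \, I \left(1 - \frac{I}{N}\right)
```

Note what the formalism presupposes: the unit of analysis is the circulating message, and individual meaning-making enters only as a fixed rate parameter. The participatory and ritualistic dimensions of sharing have no term in the equation, which is the epistemological reduction this article takes issue with.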
Consequently, the study of disinformation as a form of participatory knowledge-making remains relatively absent from this research agenda. This would require embedding such narratives in communities of practice (Lave and Wenger, 1991) and unpacking their ontological dimensions as “fluid adaptations” to regimes of attention online (Venturini, 2022). It should also foreground how these affective regimes of attention are optimized for circulation in social platforms designed for content that triggers moral outrage (Crockett, 2017). Such affordances include mechanisms for producing a “mimetic rivalry” whose resolution is situated in the ritualized spectacle of scapegoating (Shullenberger, 2016). Disinformation is also constructed out of narratives that piece together identities, loyalties, and social relationships that bypass the belief in an objectively verifiable truth to favor stories that have currency within the community. Unlike the verifiable facts that underpin the work of fact-checkers, they allude to the notion of total social facts: rituals of lived experience that weave together strands of social and psychological life with implications that are simultaneously social, economic, legal, political, and religious (Mauss, 2002).
Much of this is encapsulated in the ritual or constitutive view of communication, a proposal that contrasts with the transmission or informational model central to disinformation studies (Carey, 2009). Rituals of communication are directed toward the maintenance of society in time and focused on the representation of shared beliefs. The archetypal case of communication under a ritual view is the sacred ceremony that draws people together in fellowship and commonality. The ritual view of communication is also indebted to religion, but to religious practices that downplay sermon, instruction, and admonition, highlighting instead prayers, chants, and ceremony. Carey (2009) conceives of the ritual view of communication as the highest manifestation of the construction and maintenance of an ordered, meaningful cultural world that can serve as a control and container for human action. To partake in a ritualistic view of communication is to be part of a community, with the expulsion of members of the Christian Church from participation in the sacraments and services being appropriately referred to as excommunication.
The design of social platforms is particularly conducive to the flourishing of ritualistic communication and oral folk cultures, which are thought to value antagonism over facticity and the retaining of attention over consensual communication (Venturini, 2022). This design supports the proliferation of online folk theories like QAnon, which grafts contemporary anti-elite populist sentiment onto the rootstock of primeval antisemitic conspiracy theories. QAnon is firmly rooted in the obscure milieus of online antagonistic communities characterized by their use of memes and language games used to ritualistically perform in- and out-group distinctions. These fringe signals are in turn amplified by social media influencers. Under concerted pressure, social platforms eventually acted against such communities and substantially diminished QAnon's reach.
But post-QAnon folk theories adapted to this moderated environment. This adjustment has seen right-wing one-world government conspiracy theorists find common cause with left-wing anti-vaxxers through their shared antagonism toward figures like Bill Gates and the World Economic Forum. Reformist visions for a more sustainable form of global capitalism, such as the World Economic Forum's Great Reset proposal, are interpreted through this lens as sinister plans for “population control.” But whereas the ritual communication of QAnon arguably had “fictioning” at its core (Zeeuw and Gekker, 2023), creating alternative worlds based around far-fetched fantasies like “Pizzagate,” the Great Reset conspiracy theory is more difficult to untangle from legitimate critiques (Tuters and Willaert, 2022). As the critique of power often resorts to metaphors, efforts to fact-check post-QAnon conspiracy theorizing can backfire because metaphors cannot be fact-checked. These post-QAnon developments foreground the inherent limitations of the mis- and disinformation framework based on the transmission of information.
Conclusion: From transmission to resonance
The reduction of disinformation to the transmission of false information fails to acknowledge the compelling social role of communing and participating in meaningful rituals. By providing a stage for drama, disinformation often portrays an arena of contending forces and actions that invites participation on the basis of our assuming, often vicariously, social roles within it. Disinformation under a transmission view of communication introduces behavioral models focused on the dissemination of false information in larger and larger packages over increasingly greater distances. Questions arise as to the effects of this on audiences: disinformation can be debilitating to an informed citizenry because it obscures reality; it can change or harden attitudes toward political deliberation; it can also, finally, breed doubt and debase the credibility of democratic organizations. Analyzing disinformation under the ritual view, however, requires tracing narratives that resonate with the community. It entails reading and partaking in stories that are not units of information traveling from sender to receiver, but a collective narrative effort similar to attending a mass: a situation in which nothing new is learned and the accuracy of information is beside the point, but in which a particular view of the contending forces in the world is portrayed and confirmed.
With reading and sharing performed as ritual and dramatic acts, the constitutive view of communication accounts for communication exchanges detached from the accuracy or truthfulness of statements. This ritualistic dimension conjures alternative social contracts often at odds with public discourse based on objectivity, an aspirational notion hinging on nonpartisanship, impartiality, and skepticism, a method originally and unironically devised for contexts in which facts could not be trusted (Schudson, 1981). But in contexts of competing and contested realities of reference, the quest for truth and information is ultimately informed by lived experiences and personal narratives that have limited recourse to moral universalism. Within this epistemic shift, facts and truth are driven by individual and group advocacy—a perspective through which factual objectivity is often a distortion of experienced reality. This is not exclusive to outlandish conspiracy theories; newsrooms across the Western world are experiencing a similar epistemic shift, whereby objectivity is increasingly perceived as incongruous with truthful reporting informed by one's backgrounds, experiences, and points of view (Downie and Heyward, 2023).
Mis- and disinformation studies, finally, would benefit from acknowledging that such ritualistic narratives, particularly where social cohesion is contested, can resonate with communities and activate latent reservoirs of ideas that are often extremely reactionary (Paasonen, 2011). This process of methodological empathy could amount to a significant effort in counteracting the harms of disinformation and finding a higher-level principle capable of providing a compromise among conflicting realities.
Footnotes
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This study was supported by University College Dublin and the OBRSS scheme (grants R21650 and R20825).
