Abstract
This article examines the use of generative metaphors in the context of interference operations, particularly focusing on trolling and disinformation. It begins by emphasising the crucial role of metaphors in shaping perceptions of cybersecurity issues and subsequent government policies. To demonstrate this, the study delves into two case studies – the Philippines and Australia – analysing how their historical and political contexts have shaped the metaphors they employ to address trolling and disinformation. The article evaluates the effectiveness of these metaphors in both cases, considering their impact on policy formulation. It employs Allan McConnell's methodology to assess process and program success, ultimately concluding that, while the virus metaphor conveys urgency, it falls short in addressing the root causes of trolling. Conversely, the industry metaphor, as exemplified in the Philippines, promotes accountability and regulation.
The choice of language (and metaphors) within government documents is often deliberate, directed to the policy end being sought. Policies always have advocates prior to their adoption, and the justifications made about them afterwards may differ from the original problem frames in an effort to adapt to the politics of the moment. The actions taken and the political situation are often factors that influence language choices rather than the other way around. John Kingdon's multi-stream framework argues that policy solutions often float around waiting for the convergence of a policy problem and appropriate political licence. 1 This convergence can sometimes occur through the analogies and metaphors used. These are often called generative metaphors – those which 'generate mental models that carry over associations from one domain to another' 2 – and they can be used to explore the policy and legal implications of emerging technology. This is particularly so in areas of emerging technologies and threats.
There is a widespread adoption of metaphors by national governments, international legal bodies, and private companies to describe cybersecurity issues and activities. 3 Examples within the cyber domain include 'cyber weapons, bot armies, and virtual arsenals'. 4 Underpinning these is a wider metaphor of 'war' that applies to cyberspace; it could just as readily be constructed as an 'information environment' where cyber weapons are pollution.
Such metaphors play a crucial role in shaping stakeholders' reactions to current cybersecurity conditions and problems, influencing their perceptions of responsibility and liability in the face of threats and breaches. 5 A critical argument is that the language used often creates limits and blind spots in policy, limiting its effectiveness. This article questions the efficacy of these metaphors by examining how they have shaped Filipino and Australian approaches to trolling and disinformation. The history of democracy in these two case studies has informed the metaphors they use to address the novelty of trolling; their policies against disinformation, in turn, depend on their perception of democracy and the generative metaphors they use. This article questions the generative metaphors currently used in the trolling debate and asks whether the policy frameworks should shift.
Interference operations and trolling
To accurately assess the generative metaphors used by the governments of Australia and the Philippines (through their government documents and statements by government officials), trolling as a tool needs to be critically assessed. Trolling has led to what Hannan calls 'post-truth politics', wherein public discourse is now driven by lies, inappropriate behaviour, and deeper polarisation. 6 Scholars, military professionals, reporters and politicians have used a host of terms to describe the threat: fake news; 7 computational propaganda; 8 information warfare; 9 influence operations; 10 strategic communications; 11 active measures; 12 hostile social manipulation; 13 hashtag warfare; 14 unrestricted warfare; 15 malign cyber operations; 16 psychological operations. 17 Some of these are military metaphors; others are not. For the most part, these terms focus on specific and visible techniques, tools or modes of military action while ignoring the larger and more opaque manipulation of civilian populations. The exception, of course, is espionage – a distinct threat separate from that covered in this article. At the core of espionage are acts related to the theft of information – from industrial and trade secrets through to official, classified government secrets. 18 The focus of this work is actions taken to achieve mass influence on the opinions and/or actions of individuals, governments and/or publics. 19
The conflation of terms is understandable: the use of information as a resource, environment and weapon within the 21st century is an emergent capability, 'still seeking both language and concepts to become normative for discussions of warfare'. 20
But it does have some consequences – such as a set metaphor that has been adopted by States or regional blocs. As Antulio J Echevarria argues:

While the original aim of such labelling, or re-labelling, may have been to draw the attention of busy policymakers to rapidly emerging security issues, it has evolved into something of a culture of replication in which the labels are repeated more out of habit than conscious reflection. This habit has led to a wealth of confusion that has clouded the thinking of policymakers and impaired the development of sound counter-strategies. 21
The naming also risks conflating two broad forms of strategy: 'an all-encompassing effort to use all measures short of war; and the more targeted and specific approach of employing information to achieve disruptive effects'. 22 Part of the difficulty, therefore, is the lack of a set definition. 23 Nor is there an accepted policy response, even down to the metaphors used.
This study adopts Slupska's argument, wherein she evaluates the effectiveness of prevailing metaphors in guiding policy formulation. Slupska argues that current metaphors relating cybersecurity to 'war' do not capture the fullness of issues that governments experience. 24 She advocates for governments' heightened awareness of the metaphors they use and recommends that they adopt those which foster accountability for all stakeholders involved. 25 These include alternative metaphors such as an information ecosystem (where misinformation is the equivalent of littering); data as a valuable goldmine to be explored (or data as uranium, which can be powerful or radioactive); an industry to be regulated (as in the Philippines); or a virus to be disinfected (as in Australia). These metaphors will be explored below.
Case study: Philippines
Philippine politics has a 'patron–client factional framework', 26 also called 'mutual aid relationships', or a bond of exchange between wealthy providers and supporters who pledge their loyalty and support to particular political parties. 27 This results in political parties that 'continue to be candidate-centred coalitions of provincial bosses, political machines, and local clans, anchored on clientelistic, parochial, and personal inducements rather than on issues, ideologies, and party platforms'. 28
A 2010 study found that Philippine trust and governance levels were low due to 'political instability, the failure of the political leaders to deliver the goods and combat corruption effectively, and its unfavorable policy context'. 29 Philippine democratic processes are claimed to be neither 'participative nor equitable'. 30 Disenchanted Filipino voters have developed subversive tendencies against the educated elites, feeling disrespected by the presumption that they lack agency. 31 Rather, the poor tend to vote on a 'moral economy' – who will offer the most benefit to their local community? 32 Politicians have appropriated this sentiment in empathetic campaigns, framing themselves as fighting the elite. 33
In 2016, former president Rodrigo Duterte introduced a revolutionary campaign strategy by hiring social media advertising coordinators. 34 This produced an unprecedented surge in pro-Duterte social media interactions. 35 It sprang from a mechanised online network: individuals took on multiple user accounts to spread pro-Duterte information and exponentially multiply its exposure, in a move described through the generative metaphor of 'troll farms'. 36 Duterte admitted to engaging these services, 37 highlighting the growing industry. 38 Pro-Duterte sentiments grew widespread; his presidential statements received a correspondingly positive reception while anti-Duterte statements were condemned. 39
Incumbent president Ferdinand Marcos Jr is alleged to have applied the same tactics, but he denies it. 40 Soon after his victory in the 2022 Philippine presidential elections, an anonymous individual called a national radio station, claiming to be a member of a trolling think tank that contributed to Marcos' winning social media campaign. 41 He said he was paid ₱2.5 million (around AUD67,987) for trolling work that year. 42 Many workers were from top Philippine universities. 43 The caller felt guilty for contributing to the victory of a former dictator's son and wanted to come clean. 44 He claimed to know the troll farm operators' names and office addresses. He said Marcos was the biggest of their political clients. Before the elections, they bought Facebook accounts and pages with around 300,000 followers that shared trivial content. 45 Each account or page ranged in cost from ₱2,500 (around AUD68) to ₱1 million (AUD27,193). 46 After buying these accounts, trolls shared funny, shocking and entertaining memes for re-sharing and attracting new followers. 47 Come the election period, trivial content was replaced by material promoting candidates. 48 The revelation brought with it cries for regulation – underpinned by the generative metaphor of trolling as an industry, reflecting the multiple supply chains and stakeholders 49 including social media influencers, bloggers and digital freelancers. 50
Case study: Australia
By comparison, the Australian democratic experience and construct has shaped the metaphors it uses. The Australian Constitution supports an active exchange of 'political communication' between the people and the members of government. 51 This recognised, but constitutionally implied, freedom of political communication has been subject to recent judicial scepticism as to its existence. 52 Perhaps because such a freedom is undefined and unclear (and indeed is not a personal right), metaphors around regulation have been overtaken by metaphors around trolling and disinformation as a virus.
The desire to protect the information 'environment', specifically with respect to elections, is not new. In the 1912 case of Smith v Oldham, 53 the validity of legislation prohibiting newspapers and other publishers from publishing anonymously written articles on matters of the election was questioned. Isaacs J scathingly remarked that:

the public injury, so far as political results are concerned, is as great when the opinion of the electorate is warped by reckless, or even careless, misstatements, as when they are knowingly untrue; in each case the result is falsified. 54
The recent High Court case of LibertyWorks v Commonwealth (LibertyWorks) confirms this. 55 In LibertyWorks, the compulsory provisions within the new Foreign Influence Transparency Scheme Act 2018 (Cth) ('the FITS Act') – imposed as a precondition to engaging in political communication with the public, or a section of the public – were challenged as unduly restricting the implied freedom of political communication. The Australian government's intent was for the 'sunlight' of truth 56 to act as a 'disinfectant' to disinformation 57 alongside other lines of effort. This strategic framework mirrors that of the United States (US) in the late 1930s and can be titled illumination. 58 A majority of the Court found in favour of the provisions and their constitutionality.
This generative metaphor of a 'sickness' that must be 'disinfected' has permeated government rhetoric in Australia. Former Northern Territory Chief Minister Michael Gunner decried international trolling activity originating from the US, United Kingdom and Canada that spread fake news claiming that Aboriginal people had been captured by the army and forcibly jabbed with the COVID vaccine. 59 Former Australian Prime Minister Scott Morrison, responding to the 2019 cyber interference in the Australian Parliament House computer network (although it is unconfirmed whether it came from foreign entities), gave assurances that the government would take on a serious fight against cyber-attacks, noting that 'malicious actors are constantly evolving' and that his government would take a 'proactive and coordinated approach to protecting Australia's sovereignty, economy and national security' 60 by investing in and strengthening its cybersecurity agency.
COVID-19 clearly enhanced the attractiveness of the metaphor. The Australian Communications and Media Authority also adopted the World Health Organization's coined term 'infodemic' as it studied misinformation origins and activity in its June 2020 position paper, Misinformation and News Quality on Digital Platforms in Australia, stating in its conclusion:

The COVID-19 infodemic has also brought home that combating malicious behaviour from state actors and scammers is only one facet of misinformation, which is a far broader issue requiring a multi-pronged response. 61
The metaphor makes sense, particularly against the backdrop of increased global health awareness. Yet it is not new. The term 'infodemic' traces back to a 2003 Washington Post column by David Rothkopf discussing the Severe Acute Respiratory Syndrome (SARS) outbreak, in which he combined the terms 'information' and 'epidemic'. He wrote:

What exactly do I mean by 'infodemic'? A few facts, mixed with fear, speculation and rumor, amplified and relayed swiftly worldwide by modern information technologies, have affected national and international economies, politics and even security in ways that are utterly disproportionate with the root realities. It is a phenomenon we have seen with greater frequency in recent years – not only in our reaction to SARS, for example, but also in our response to terrorism and even relatively minor occurrences such as shark sightings. 62
But long before the COVID-19 pandemic, disinformation caused real-life harm on a significant scale – deeper hate and polarisation of citizens, persecution of public officials, and even homicide. 63 Massachusetts Institute of Technology (MIT) data scientists discovered that fake news spreads 'faster, deeper, and more broadly' than true news. 64 The swiftness is attributed to 'novelty': creators make fake news items that shock, surprise, and thus become more shareable. 65 Underpinning this is an economy; there is money to be made in shocking news stories (hence, clickbait). What, then, should be done about the metaphors being used?
Evaluating the metaphors
This article has sought to outline what generative metaphors are, and how they have potentially shaped (or been shaped by) policy outcomes. It is, of course, near impossible to measure how these metaphors have impacted the legal frameworks of the two countries in any definitive way. The closest criteria that could be relied upon are found within the methodology of political scientist Professor Emeritus Allan McConnell. 66 McConnell defines success as occurring when policy 'achieves the goals that proponents set out to achieve and attracts no criticism of any significance, or support is virtually universal'. 67
McConnell then breaks 'success' into three categories:

• process (where government identifies a problem, considers potential solutions, consults with stakeholders, and makes a policy decision);
• program (how government implements its statement of intent); and
• political (what the consequences of the policy are for the government's reputation, and how its electoral chances affect the policy's funding and programs).
Ultimately, the metaphors arise from a desire for political success – but they seem to impact process and program success. Rather than working through McConnell's criteria, this article looks at which metaphor (virus or industry) might be preferable within an Australian policy context – chosen because this article is written for an Australian journal and an Australian audience. The same analysis could occur in reverse, to see which metaphor is best placed for a Filipino audience. By asking these questions, this article seeks to expose not only how the mental shortcuts inherent in metaphors may illuminate a society, but also the aspects of a society that a metaphor omits.
The value of industry
The industry metaphor holds actors accountable for their contributions. It goes to the heart of a nation-State's role – to regulate for the good of the people – and seeks to protect those within the industry. Knowing the production levels and relationship networks can inform penalisation and its severity. Public officials may receive heavier penalties for undesirable or unsafe behaviour than the general public, as they hold positions of public trust and confidence.
This was the focus of the Senate Select Committee on Foreign Interference through Social Media. Dr Andrew Dowse, Director of RAND Australia, submitted to the Select Committee that regulation was key to achieving:

a series of interventions, ranging from addressing the motivation of actors to addressing structural issues in social media networks, to various ways of reducing the likelihood of the audience believing or amplifying false content. In my view such interventions should be priorities for our government, as otherwise the risks and potential consequences of interference through social media will just continue to get worse. 68
The Committee agreed, pushing away from a 'whack-a-mole' regulatory approach in favour of a more comprehensive regulation of the industry. 69 It is not clear, however, that such industry regulation is well placed in Australia. Since 1788, Australian society has been particularly wary of government intervention in industrial matters. The storming of the Eureka Stockade has captured and divided public opinion within Australia for over 160 years. Generally, government intervention in industrial action (or industry more generally) is characterised by 'deeply held, even if imperfectly understood, reservations'. 70 While government intervention in industrial action is neither novel nor unique, 71 it remains an understudied area of the law in Australia, and this historical aversion has perhaps prohibited the use of an industry metaphor within Australian policy responses to trolling. Since the advent of national, collective bargaining, the role of the military in assisting the civil authority has become increasingly controversial 'in a democracy committed to solving labour-management disputes through collective bargaining mechanisms'. 72 This is all to say that Australia has had historically high levels of union membership, and the use of the industrial relations and corporations powers under our Constitution might lead to a different expectation of what industry regulation looks like in Australia. 73
The value of a virus
The virus metaphor is particularly compelling, as it encapsulates both the risk of disinformation and the bid to contain it. Clinically, trolling is propagated by users with higher trait psychopathy and lower affective empathy – they can predict their victim's potential emotional suffering. 74 Making offensive comments is also contagious. 75 Stanford University cyber-risk researchers studied how disinformation proliferation by Russia during the 2016 US elections mimicked a virus' spread, modelled on Ebola. 76 They aimed to 'find the most effective way to cut transmission chains, correct the information if possible and educate the most vulnerable targets'. 77
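The 'transmission chain' framing lends itself to a simple epidemic-style sketch. The following toy branching-process simulation is purely illustrative – the network size, follower counts, share probabilities and function names are assumptions for the sketch, not figures or methods from the Stanford study. It shows how lowering the re-share probability (the modelling analogue of 'cutting transmission chains') can push spread below the epidemic threshold:

```python
import random

def simulate_spread(n_users, avg_followers, share_probability, seed=0):
    """Toy branching-process model of an item spreading through a network.

    Each user who shares the item exposes a random sample of followers;
    each exposed follower re-shares with probability `share_probability`.
    Returns the total number of users who shared the item.
    """
    rng = random.Random(seed)
    shared = set(range(5))          # five seed accounts post the item
    frontier = list(shared)
    while frontier:
        next_frontier = []
        for _ in frontier:
            for follower in rng.sample(range(n_users), avg_followers):
                if follower not in shared and rng.random() < share_probability:
                    shared.add(follower)
                    next_frontier.append(follower)
        frontier = next_frontier
    return len(shared)

def mean_reach(share_probability, trials=10):
    """Average final reach over several randomised runs."""
    return sum(
        simulate_spread(10_000, 20, share_probability, seed=s)
        for s in range(trials)
    ) / trials

# 'Cutting transmission chains' (fact-check labels, friction on re-sharing)
# lowers the effective re-share probability; once each sharer produces fewer
# than one new sharer on average, chains fizzle out instead of saturating.
unchecked = mean_reach(0.10)   # ~2 new sharers per sharer: self-sustaining
checked = mean_reach(0.02)     # ~0.4 new sharers per sharer: dies out
```

On this toy model, average reach collapses from thousands of users to a handful once the expected number of re-shares per sharer drops below one – the same threshold logic that drives epidemic containment.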
A virus metaphor reduces the government's role to efficiently stopping disinformation from spreading and providing rehabilitation for users who have suffered from trolling. From a process perspective, however, a virus metaphor fails to engage with the notion that disinformation propagation stems not from digital organisms like bots, but from genuine individuals who must be held accountable. The metaphor is limited to reactive methods: diligent fact-checking, reacting to fake news, using social media's artificial intelligence systems to sort fake from real news, and reporting disinformation occurrences. Unfortunately, the source of the 'virus' still seems unknown to law enforcement officers when, in fact, such networks are traceable (with effort). 78
The suite of legislation responding to the 'virus' of misinformation has been built around the concept of 'sunlight' – a disinformation 'disinfectant' that aims to 'ensure activities are exposed'. 79 As noted above, this is based around the idea of illumination. The importance of illumination as a central tenet of countering Information Operations (IOs) was reinforced in 2018 with Australia's Counter Foreign Interference Strategy, operationalised by the National Counter Foreign Interference Coordinator within the Department of Home Affairs. 80 The strategy, in acknowledging the need for 'convincing foreign interference actors that their actions will have costs', 81 clarified that this would occur by 'showing foreign interference actors that their actions can and will be revealed'. 82
Illumination would appear to be founded on the doctrine of the 'marketplace of ideas' or 'counterspeech'. 83 These concepts denote the philosophical rationale for freedom of expression, using an analogy with the economic concept of a free market, where ideas can be traded and accepted. It is the underlying concept of Australia's implied freedom of political communication. 84
The marketplace of ideas, and thus illumination, is premised on a rational audience: individuals exposed to the same information, who are able to distinguish between true and false information, will place more value on the truth. John Milton, arguing against British censorship laws, stated in 1644:

And though all the winds of doctrine were let loose to play upon the earth, so Truth be in the field, we do injuriously by licensing and prohibiting to misdoubt her strength. Let her and Falsehood grapple; who ever knew Truth put to the worse in a free and open encounter? 85
It is important to note that this rational audience, also known as the 'wisdom of crowds' or 'wealth of networks', has been subject to sustained criticism since at least the advent of the broadcast-era model of information distribution. 86 One critique aptly notes that the model is 'undeniably elegant and compelling, an Enlightenment-era cocktail of Bayesian opinion formation, free speech, and capitalism. Unfortunately, its most foundational premise is false.' 87 This fatal flaw has crystallised in an algorithmic marketplace of ideas, and the efficacy of counter-speech has been questioned by former Prime Minister Kevin Rudd. 88 It seems, then, that the virus metaphor has been built around an even older metaphor of free trade ('the marketplace'). Is it fit for purpose?
Within the limited scope available, this article suggests it is. The rise of the virus metaphor has resulted from a shift in global events and changing expectations around how governments can respond to a virus. Importantly, the meaning of the virus metaphor has unexpectedly shifted – and the Australian government has sought to embrace both its old and new meanings.
It is suggested that, since the Australian federal government's response to COVID-19 and mass vaccination highlighted that global pandemics can be controlled and responded to, there has been a shift away from the idea that a government has little role in stopping a virus from spreading. This is a marked change from the intent of federal government – as AV Dicey once noted, 'federal government means weak government'. 89 There are very limited options for Australia to take steps domestically, outside of funding state and territory responses or utilising the 'nuclear option' (another generative metaphor) of calling in the Australian Defence Force to enforce Commonwealth laws. 90 Yet the experience of COVID-19 highlighted that co-operative federalism can and does work. As such, within the emerging technology space, digital 'hygiene' can be comprehended and sold in policy responses; individual resilience to a wider 'virus' can be requested by a government; and responsibility can be devolved from government to individuals, organisations, and states and territories (in a federal construct).
Conclusion
Although an industry metaphor allows for accountability at all levels of disinformation production and dissemination, the virus metaphor allows governments and people to understand its urgency and its ability to damage an interconnected population. This article suggests that, although the virus metaphor was inappropriate prior to COVID-19, collective experiences of a federated system responding to a national emergency have changed this to the extent that the virus metaphor can now catalyse private and public action. This is to be compared with adopting an industry perspective – which, against the backdrop of Australian historical experiences of industry regulation, might not be appropriate.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
