Abstract
This article articulates the intersection of wellness communities and anti-vaccine (‘anti-vax’) groups to demonstrate how vaccine misinformation and pseudoscience can propagate. This misinformation is often pushed by wellness influencers. One recent example is wellness figure Pete Evans, a celebrity chef and self-described ‘qualified health coach’. By 2020, however, Evans had developed anti-vax views and began to promote fake COVID cures, anti-vax misinformation, and COVID conspiracy theories from QAnon. This contribution examines this overlap to demonstrate how wellness influencers spread misinformation that fuels vaccine hesitancy. Evans is just one example; journalists have reported on yoga teachers in California protesting against lockdowns and on wellness influencers claiming that a ‘“shadowy cabal” of scientists and companies’ were responsible for COVID. These examples demonstrate how community intersections can amplify misinformation, pseudoscience and anti-vax views to a motivated and highly receptive audience.
Introduction
The COVID-19 pandemic has highlighted how preventative measures such as vaccines are essential to protecting individual health and preventing health care systems from being overwhelmed. At the same time, the pandemic also revealed how disinformation and misinformation can spread in moments of social angst and upheaval, undermining preventative measures. In the case of the COVID-19 pandemic, for example, disinformation fuelled vaccine hesitancy and resistance to vaccine mandates. This disinformation was often promoted by wellness influencers who relied on pseudoscientific claims and their reputations as fitness or wellness experts to amplify the perceived veracity of anti-vaccine content.
Using the example of Pete Evans, an Australian celebrity chef and wellness figure, this article articulates how the intersection of wellness communities and anti-vaccine (‘anti-vax’) groups can lead to the propagation of vaccine misinformation and pseudoscience in unexpected ways. A concept central to this discussion is Manuel Castells’ ‘network society’, in which networks are the predominant organisational form. I begin by providing a brief overview of the ‘dimensions’ of network society which, I argue, enable the spread of dis- and misinformation. The next section details Pete Evans’ transformation from chef and health and wellness influencer to anti-vaccine figure and conspiracy theorist. The third section discusses the role of emotion or affect in influencers’ anti-vax content, and how these affective appeals leverage fear and uncertainty to increase vaccine hesitancy, as well as the psychological responses that make it difficult to counter or debunk this disinformation, such as cognitive dissonance, motivated reasoning, and commitment bias. The fourth and final section outlines how Evans is emblematic of a repeated convergence between wellness and anti-vax discourse and far-right conspiracy theories that undermines public trust in vaccines, scientists, and governments.
Castells’ ‘network society’
According to Castells, the term ‘network society’ describes the era resulting from a shift from an industrial age to one centred on information. Castells introduced the concept for two primary reasons. First, he considered calling the post-industrial era an ‘information society’ to be inaccurate since ‘information, in its broadest sense, e.g. as communication of knowledge, has been critical in all societies’ (Daniels, 2009: 21n31). Second, he suggested that the network is the primary driver of social development starting in the latter half of the twentieth century. The network society comprises four ‘dimensions of social change’ that make possible the new social structure Castells (2000c: 693) outlined. In addition to ‘globalisation’ and the ‘demise of the sovereign nation-state’ (Castells, 2000c: 694), one of the primary dimensions of the network society is a ‘new technological paradigm, based on the deployment of new information technologies’ which ‘allow the formation of new forms of social organization and social interaction along electronically based information networks’ (Castells, 2000c: 693). He ultimately argued that information networks are now the ‘predominant’ form of social organisation (Castells, 2000a: 16).
The fourth key dimension of social change, according to Castells (2000c: 694), is the ‘enclosing of dominant cultural manifestations in an interactive, electronic hypertext’. He argues that culture is now ‘organized primarily around an integrated system of electronic media’ which enables a fluid and interactive exchange of content (Castells, 2000a: 12). Interactions between people are key to this dimension (Castells, 2000a: 6), and ‘offline’ practices, beliefs and processes are stored, replicated and shared via networks. While not ignoring the potential downsides of these dimensions (see, e.g. Roberts, 2019: 54), Castells (2000b: 71) also saw great potential in the network society, stating that it was necessary to ‘structure the unstructured while preserving flexibility, since the unstructured is the driving force of innovation in human activity’. He further asserted that ‘the Internet embodies the culture of freedom’ (Castells, 2012: 231). Online content shared during the COVID-19 pandemic, however, also shows that the network society can support the spread of dis- and misinformation that can pose a threat to people's health, safety and wellbeing.
The difference between disinformation and misinformation is a seemingly minor but ultimately important distinction. Misinformation refers to the inadvertent sharing of factually incorrect information (Farkas and Schou, 2018: 299; Tandoc et al., 2018: 140). In other words, with misinformation, there is no intent to deceive. Rather, in online networks such as social media platforms, misinformation usually takes the form of sharing or re-posting false information. Disinformation, however, refers to ‘the deliberate creation and sharing of information known to be false’ (Tandoc et al., 2018: 140). Bennett and Livingston (2018: 124), for example, define disinformation as ‘intentional falsehoods spread as news stories or simulated documentary formats to advance political goals’. The reason for highlighting the differences in these terms is simply to emphasise that any discussion of online falsehoods needs to recognise two separate issues: first, where false information or disinformation originates and, second, how and why it spreads as misinformation.
Since the start of the pandemic, social media platforms have been flooded with disinformation concerning COVID vaccines, including conspiracies about the motives of companies developing them, government mandates to get vaccinated, and the efficacy of the vaccines. In what might seem like a counterintuitive development to some, this disinformation is often disseminated by wellness influencers who mix health and lifestyle advice with pseudoscientific discourse that casts doubt on the safety or efficacy of the vaccines, putting lives and public health in danger. One example is self-styled wellness figure and former celebrity chef Pete Evans. Evans originally made a name for himself as a restaurateur in Sydney, Australia, but primarily gained fame as a judge on the cooking show My Kitchen Rules on the Seven Network in Australia between 2010 and 2020. His popularity led to regular recipe columns in Australian magazines (Valentine, 2020), and he also published twelve diet and recipe books, many of which touted the benefits of diet regimes such as paleo and keto diets.
Pete Evans: from wellness guru to anti-vaxxer
In his 2014 recipe book Healthy Every Day, Evans (2014: 7) claims he is ‘a qualified health coach with the authority to speak on the topic of healthy eating’. However, despite his success and celebrity, some information he shared was critiqued and, in some cases, ridiculed for its pseudoscientific character. In 2012, for example, Evans was roundly mocked on social media and Australia's morning show Today after sharing an example of his daily diet in the newspaper The Sunday Age. The listed foods included ‘alkalised water with apple cider vinegar’ and ‘activated almonds’, the latter of which was the most frequent target of jokes on social media platforms such as Twitter (Starke, 2012). He would eventually become an avid proponent of the controversial paleo or ‘caveman’ diet that encourages consumption of meats, nuts and seeds while discouraging grains, beans, sugar, and dairy products; in recent years, this diet has been linked to increased risk of heart disease (Molloy, 2018).
In 2015, he raised eyebrows with a paleo diet cookbook for babies and toddlers called Bubba Yum Yum: The Paleo Way, which included a recipe for a broth that was ‘described as a replacement for baby formula’ (Molloy, 2020). Pan Macmillan originally intended to publish a hard copy of the book but dropped those plans after Australia's Federal Health Department opened an investigation into the book and multiple health experts expressed concern that the diet plan could harm, or even lead to the death of, infants (Davey, 2015). Evans eventually self-published the cookbook online.
In the following years, Evans’ claims became more controversial and pseudoscientific in nature. In 2016, medical professionals again critiqued his advice after he advised one of his Facebook followers, who said she had osteoporosis, to cut dairy from her diet because ‘calcium from dairy can remove the calcium from your bones’ (Evans qtd. in Molloy, 2020). Evans (qtd. in Molloy, 2020) also commented that ‘most doctors do not know this information’, a claim that both asserts his own expertise and undermines the knowledge and authority of medical experts. In a documentary called The Magic Pill, which aired on Netflix in 2018, Evans claimed without evidence that a keto diet ‘could treat a range of chronic health issues, including cancer, autism and asthma’, leading the Australian Medical Association (AMA) to petition to have the documentary removed from Netflix for promoting misinformation (Farrukh, 2020). To support these claims, Evans regularly shared ‘testimonials’ (i.e. comments on social media) from followers who claimed to ‘heal themselves through food’ (Molloy, 2020). However, Evans did not limit his advice to food and nutrition. In 2014, he publicly announced his support for the group Fluoride Free Western Australia (Farrukh, 2020) after falsely claiming that ‘fluoride in drinking water is harmful to health’ (Hopcraft, 2017). These examples are not exhaustive, but they demonstrate how disinformation shared by popular wellness figures such as Evans can lead people to question health advice from qualified professionals, and share their scepticism in technologically supported communities, an example of a new ‘social interaction’ enabled by what Castells described as a ‘new technological paradigm’.
By 2020, as the COVID-19 pandemic was wreaking havoc globally, Evans had begun to share anti-mask and anti-vax posts on his social media accounts. He had hinted at his anti-vax beliefs even before the COVID pandemic, sharing a link on his Facebook site in March 2019 to a podcast by anti-vaxxer Paul Chek and thanking him for ‘asking the questions that need to be asked about vaccines and medicine’ (Evans qtd. in Molloy, 2020). However, he made his anti-vax and COVID-denial views more explicit during the pandemic. Throughout 2020, Evans repeatedly claimed that COVID-19 was a ‘scam’ and lockdowns were part of ‘a sinister plan’ (Templeton, 2020), shared anti-vax and COVID conspiracy theory content from QAnon adherents on his Instagram account (Aubrey, 2020), and posted a link to an interview with ‘Plandemic’ believer (and Holocaust denier) David Icke to his Instagram bio (Wilson, 2020). Evans would eventually be banned from Facebook in December 2020 after repeatedly sharing COVID-related disinformation (Kaye, 2020), followed by an Instagram ban in February 2021 for the same reason (Gladstone, 2021).
Evans would eventually announce his candidacy for a Senate seat at an anti-vaccine rally in Melbourne, Australia in February 2021 as a part of the libertarian Great Australian Party (GAP) before quietly ending his candidacy in October of the same year (Lewis, 2021). He also began to promote fake COVID cures through online platforms. In April 2020, for example, he was fined AU$25,200 for breaches of the Therapeutic Goods Act after claiming on a Facebook live stream that a light-emitting machine called the BioCharger, which he was selling for AU$14,900 on his website, could treat the coronavirus, as well as sharpen mental clarity, restore strength and stamina, and aid recovery from injury and stress (Doherty, 2020).
It would be tempting, perhaps, to dismiss Evans as a swindler. Ali Breland (2020) notes that some wellness influencers are ‘taking advantage of people's fears to hawk unproven supplements and drive traffic to their sites’. The language Evans used to promote his BioCharger and other wellness products, and in his anti-vax posts, features pseudoscientific discourse meant to capitalise upon his reputation as a chef and wellness ‘expert’ to promote those products and views. However, Evans also amplified conspiracies popularised by conspiracy groups such as QAnon which not only cast doubt on vaccines, but also undermined confidence in medical experts, governments, and others working to promote the vaccine.
Evans demonstrates how the exchange of ideas in the shared ‘interactive, electronic hypertext’ of the network society, as Castells described, not only drives innovation, but also allows for potentially harmful dis- and misinformation to freely flow between different communities. His online content, for example, shows how disinformation from far-right conspiracy groups such as QAnon can be channelled to wellness communities through prominent wellness figures, and there is evidence that Evans is not an isolated example. Journalists have reported on yoga teachers in California protesting against COVID lockdowns and encouraging people to burn their masks (Wijeyakumar, 2020), and on wellness influencers who were ‘promoting a video that claimed a ‘shadowy cabal’ of scientists and companies were linked to the rise of the virus’ (Aguirre, 2021). Other examples include yoga instructor Stephanie Birch, who regularly posted QAnon hashtags on her now-deleted Instagram account, and wellness influencer Krystal Tini who, at the time of writing, has over 172,000 followers on Instagram and has ‘consistently posted anti-vaccine content, including one post that compared lockdowns to the horrors inflicted on Polish Jews in the Warsaw ghetto’ (Kale, 2021).
Comparing vaccines to atrocities such as slavery, the Spanish Inquisition, or the Holocaust, or predicting government-imposed limits on individual freedom, is a common theme in both wellness and anti-vax discourse during the COVID pandemic, one often echoed by followers of wellness figures (see, e.g. Colgrove and Samuel, 2022). For example, protestors at the February 2021 anti-vax rally in Melbourne held signs with slogans such as ‘Let me learn to think for myself’, ‘The decision should be “mine,” not yours’, ‘The so-called vaccine is a permanent DNA modification’, and even ‘My body, my choice’ (Loomes, 2021), appropriating a slogan often used by freedom of choice advocates. As clinical psychologist Doreen Dodgen-Magee (2021) summarises: Given the ease with which people can find misinformation and faulty pseudoscientific ‘evidence’ that supports their initial bias against the COVID-19 vaccine, it's not surprising that a burgeoning community exists to elevate and adulate those who have championed not getting vaccinated as an expression of their ‘protected freedom.’
Other wellness influencers, such as chiropractor Joseph Arena, promote conspiracy theories including the QAnon-backed idea that the US Federal Emergency Management Agency (FEMA) would use the COVID pandemic as a pretence to put people in government-run concentration camps and institute martial law (Breland, 2020). Arena also encouraged his over 40,000 followers to visit QAnon websites to discover ‘the truth’ for themselves. Travis View (qtd. in Breland, 2020), host of the podcast QAnon Anonymous, notes that wellness figures ‘are generally anti-establishment and anti-mainstream narrative and distrustful of authority, which lines up with QAnon's populist message’. These examples highlight how Castells’ claim that the network society embodies a culture of freedom can manifest in unexpected ways in this confluence of wellness and anti-vax discourse. Evans even called himself ‘one of the catalysts for a conversation about such an important topic (as) freedom of speech’ after being banned from Facebook in December 2020 (Kaye, 2020).
Another temptation might be to question the impact a single influencer can have on vaccine initiatives. However, the Center for Countering Digital Hate or CCDH (2021: 6) published a report in March 2021 that details how the vast majority of anti-vaccine misinformation on Facebook and Twitter – 65% to be exact – can be traced back to twelve influential posters they call the ‘Disinformation Dozen’. As Sirin Kale (2021) summarised in The Guardian: Included within the CCDH's ‘disinformation dozen’ are Joseph Mercola, a US wellness entrepreneur called the ‘most influential spreader of Covid-19 misinformation online’ by the New York Times; Dr Christiane Northrup, a wellness expert who helped popularise the notorious Covid pseudo-documentary Plandemic by sharing it with her 560,000 Facebook followers; and Kelly Brogan, a contributor to Gwyneth Paltrow's Goop wellness platform. Mikki Willis, the director of Plandemic, is well known in the California yoga scene, while David ‘Avocado’ Wolfe, a conspiracy theorist and raw food advocate, is a regular figure at anti-vaccination protests across the US.
Similarly, research from The Disinformation Project in New Zealand, part of the Aotearoa New Zealand Centre of Research Excellence Te Pūnaha Matatini, details how, on just one day of a 23-day-long anti-vaccine mandate protest and occupation of Parliament grounds in the capital, Wellington, ‘73% of interactions in the mis- and disinformation ecology were generated by just a dozen accounts’ (Hannah et al., 2022: 9).
Unfortunately, the spread of disinformation from this small number of sources in what Castells described as an ‘interactive hypertext’ is significant. Because it resonates more strongly with already-held opinions and tends toward the shocking or sensational, disinformation is shared more frequently and spreads faster than factual news stories (Silverman, 2016; Vosoughi et al., 2018). A 2018 study from MIT, for example, which analysed 126,000 stories shared by 3 million users on Twitter, shows that ‘falsehoods were 70% more likely to be retweeted than the truth’ in part because ‘users were more likely to retweet information that was more novel’ (Vosoughi et al., 2018: 1149). Their findings also suggest ‘that false news spreads farther, faster, deeper, and more broadly than the truth because humans, not robots, are more likely to spread it’ (Vosoughi et al., 2018: 1150). This dynamic can be seen in the case of anti-vaccine disinformation shared by wellness influencers, which is then amplified by others who share it as misinformation.
Vaccine disinformation and affect
Engagement with and sharing of disinformation is often driven by an emotional or affective response to the content. While noting the relationship between the two terms, Megan Boler and Elizabeth Davis differentiate between affect and emotions; while emotions are personal and individual, they argue that affect ‘may be understood as emotions that are collectively or intersubjectively manifested, experienced, and mobilized, out of the “private,” individual realm and into shared, even public spaces, which may then be channelled into movement(s)’ (Boler and Davis, 2018: 81). Affect is, therefore, political in nature, and can play a significant role in persuading people of the supposed truth of information.
Boler and Davis (2018: 78–79) argue that the believability of content such as news has an aesthetic (tonal) and affective element. They cite Arlie Hochschild, who describes affect as a ‘deep story’ or a ‘feels-as-if story’ (Boler and Davis, 2018: 78, emphasis in original). According to Hochschild (2016: 135–139), a deep story shapes or informs ‘feeling rules’ that describe the relationship between various social contexts, demographic characteristics, and expected emotional responses. In short, something must ‘sound’ true and relate to the ‘deep story’ of a person to be believed and considered ‘true’ within a specific social circle or context. Content that evokes affective responses is thus integral to the spread of disinformation as misinformation and makes that misinformation, and the affective response to it, seem sensible or legitimate because of affect's collective character.
Affect and emotion, perhaps not surprisingly, also play a role in the spread of anti-vaccine content which contributes to vaccine hesitancy. Miyazaki et al. (2021: 86) note that anti-vaxxers looking to persuade others to question the value of COVID-19 vaccines ‘tend to use more emotional expressions, including negative words, than pro-vaxxers and neutral users’. The inclusion of emotional expressions is a particularly effective method of influencing people's views on vaccines. Betsch et al. (2011: 750), for example, found that narratives about vaccines that incorporate emotional language are more likely to leave a lasting impression on readers than texts that contain detailed information and statistics. Moreover, Miyazaki et al. (2021: 8) cite neurological research from Grandjean et al. (2005) that demonstrates that emotional appeals can ‘effectively affect users who just witnessed the message even though they are not directed at them’.
Previous research on persuasion on social media platforms shows that influential content tends to include expressions of negative sentiment (Quercia et al., 2011; Xiao and Khazaei, 2019). As Miyazaki et al. mention, negative expressions are common in anti-COVID vaccine content as well. One common tactic is to play upon the fear, anxiety and uncertainty caused by the pandemic to increase doubts in COVID vaccines and the governments promoting them. In addition to referring to COVID-19 as a ‘scam’ and a ‘hoax’ on Facebook, for example, Evans also called COVID vaccines ‘poison’ and discouraged people from getting tested for the virus in an Instagram post in 2020, which remained accessible until he was banned from the platform early the following year (Kaye, 2020). Following the death of popular Australian cricket player Shane Warne of a heart attack in March 2022, Evans seemingly suggested during a public Zoom call that Warne died as a result of the COVID vaccine, stating, ‘[S]o many doctors I’ve interviewed have been screaming for the last year-and-a-half, saying the vaccines are going to cause death like we’ve never seen across the planet’ (qtd. in Lefroy, 2022). The false link between Warne's death and the vaccine was repeated by several anti-vaxxers on multiple social media platforms after the cricketer's death (Lefroy, 2022).
Considering his self-branding as a wellness expert and ‘qualified health coach’, Evans’ advice had greater potential to incite fear and distrust, and there is at least anecdotal evidence this is the case. For example, at the February 2021 anti-vaccine rally in Melbourne, at which Evans announced his Senate candidacy, one speaker discussed how she felt ‘so alone’ during the COVID lockdowns in the city and unable to question the government's actions (qtd. in Rolfe, 2021). This dynamic is visible in other wellness circles as well. Sirin Kale (2021), a reporter for The Guardian, tells the story of a woman named Ozlem Demirboga Carr, who was ‘unusually anxious during the UK's Covid lockdown in March 2020 and, like many people, decided to practise yoga as a way to de-stress’. Carr followed Instagram wellness influencers Phoebe Greenacre, popular for her yoga videos, Kelly Vittengl, a ‘women's empowerment and spiritual mentor’, and other wellness accounts on the platform looking for guidance. Kale's story describes how the content Carr experienced shifted from wellness and self-care to discussions of personal freedom, government distrust, and conspiracy theories: ‘The conversation and tone of their posts shifted’, [Carr] says. ‘At first it was all about self-care and being part of a community that is caring for each other. But then they started to speak more about how there should be a choice when it came to vaccines’. […] Carr watched as Greenacre posted an Instagram story describing vaccine passports as ‘medical apartheid’. Vittengl went further. In a post in July, Vittengl, who is unvaccinated, compared vaccine passports to the social polarisation witnessed during the Holocaust and spoke about the ‘mess’ caused by the ‘ideology of the western medical system’. ‘We aren’t being shown the full picture’, Vittengl concluded, in a post that was liked by Greenacre. (Kale, 2021)
This brief summation includes some elements common to the anti-vax discourse discussed above, such as comparisons to the Holocaust, references to government limits on personal freedom (i.e. ‘medical apartheid’), and discourse that undermines trust in the medical system. Azadeh Ghafari, a licensed psychotherapist, notes that scepticism of governments and medical practitioners is a reasonable reaction from members of minority groups, particularly communities of colour, who have been mistreated by those governments and medical systems in the past. However, wellness communities, both online and offline, tend to be ‘dominated by well-off white people’, indicating a different dynamic is at play. She points to an affective element to explain this difference, namely fear. As she states, ‘When I see this behavior happening, there's a level of fear going on, particularly in the white community. They feel out of control, they don’t know who to trust, and it's manifested in this manner’ (Ghafari qtd. in Breland, 2020).
Fear, uncertainty, and receptive audiences
That uncertainty and sense of lack of control contributes to an audience that is highly receptive to disinformation because it is often actively seeking advice and guidance. McKinley and Lauby (2021: 4254) note that, with health information, uncertainty is ‘the initial driver of information seeking action’. People seek out health advice to address gaps in knowledge and ‘reduce the cognitive and emotional discomfort induced through this ambiguity’ (McKinley and Lauby, 2021: 4254). However, misinformation can often fill that gap instead, often by sharing or promoting conspiracy theories that minimise threats (Romer and Jamieson, 2020: 2) or promote alternative measures to deal with risk. These conspiratorial framings can not only increase vaccine hesitancy (Hornsey et al., 2020: 1), but may also help clarify why vaccine sceptics turned to alternative treatments such as the horse-deworming medication ivermectin to treat COVID despite a lack of evidence that it is effective. The link between conspiracy theories and vaccine hesitancy can also help explain why wellness influencers such as Evans amplify such conspiracies while promoting ‘safer’ and ‘less invasive’ COVID treatments such as the BioCharger.
The power of affective responses means factual details about the effectiveness and safety of vaccines may be less convincing than disinformation and conspiracy theories about vaccines that include emotional appeals, which in turn increases vaccine hesitancy in those uncertain about their safety. Moreover, these factually incorrect, emotionally charged messages – which, again, tend to spread faster than factually correct content in an interactive hypertext – can potentially inspire vaccine hesitancy in those who may not even be seeking information about vaccines. Johnson et al. (2020: 230) argue, for example, that although anti-vax communities are relatively small in size, they ‘manage to become highly entangled with undecided clusters in the main online network’. Miyazaki et al. (2021: 83) note that anti-vaxxers will even purposefully engage those with pro-vaccine views or, especially, ‘neutral’ views in an attempt to persuade them to question vaccines. These interactions, and the spread of dis- and misinformation they enable, mean people are more likely to base their beliefs, opinions and decisions on factually incorrect stories crafted to incite strong emotional reactions. As Kraft et al. (2015: 125) argue, these reactions are ‘triggered unconsciously, followed spontaneously by the activation of associative pathways that link thoughts to feelings to intentions to behavior’. In this case, affective reactions can trigger anti-vaccine behaviours, presenting a significant challenge to health practitioners and governments promoting vaccine initiatives.
Persistence of dis- and misinformation
However, there are also significant challenges when it comes to countering disinformation. A complete review of these issues is beyond the scope of this article (for a thorough discussion, see Caled and Silva, 2022), but some challenges are worth highlighting. One reason for the persistence of dis- and misinformation online is arguably financial, entangled in the ‘economic structure of the Internet’ itself (Farkas and Schou, 2018: 303). Social media platforms have faced increased scrutiny and critique in recent years for their role in spreading disinformation and hate speech and lowering trust in governmental institutions and traditional media (Bradshaw and Howard, 2019). However, they also profit from user generated content (UGC) and interactions (see, for example, Fuchs, 2012) and thus have little motivation to limit content that is sensationalised and controversial because it increases engagement. Anti-vax content is no exception. The CCDH estimated in 2020 that ‘the anti-vaccine movement could realise US$1 billion in annual revenues for social media firms. As much as $989 million could accrue to Facebook and Instagram alone, largely from advertising targeting the 38.7 million followers of anti-vaccine accounts’ (Burki, 2020: e504). It can also be time-consuming and potentially expensive to debunk falsehoods, perhaps requiring a large, dedicated team of researchers.
The ease and speed with which digital content can be replicated is another challenge, and proposed remedies for identifying disinformation, such as automatic detection or user reporting, can fall short or even undermine corrective efforts. Automated tools, for example, can incorrectly flag and block content from those working to combat disinformation and misinformation, such as activists and researchers. User reports also yield inconsistent results. In 2020, the CCDH (2020: 11) reported that only 5% of reported anti-vaccine content was removed from the major social media platforms Facebook, Instagram, Twitter and YouTube. Moreover, as Matt Hatfield (2021) of the Canadian-based internet advocacy group OpenMedia argues: Any user report system can and will be abused by organized communities of malicious users who troll the Internet looking for opportunities to ‘brigade’ other users who don’t share their identity or politics. These groups abuse reporting systems by mass reporting content from those they don’t like as offensive, or trawling through their past social media history to find anything that might cross some part of a platform's community rules.
There is evidence that anti-vax communities are engaging in this kind of abuse. ‘Team Halo’, a UN-backed group of scientists working to amplify certified health professionals sharing accurate information on COVID vaccines on social media, garnered over 80 million ‘organic views’ of their videos on TikTok alone in the first eighteen months of the COVID-19 pandemic (Bender, 2021). Team Halo's organisers encouraged scientists and researchers creating videos to focus less on crafting a ‘perfect’ message and instead generate many messages in a variety of formats to increase reach and engagement. Videos range from scientists debunking other TikTok videos, to meme-based videos, to ‘dancing under information’ style videos that are seemingly emblematic of TikTok. Maddie Bender (2021) reports that some contributors even take an ‘identity-based approach, either because they want to or because they have to, in order to engage distrustful people in their community’. While a full analysis of Team Halo's videos is beyond the scope of this article, these anecdotal examples suggest that some experts are aware of and even appropriate the kind of personal and affective appeals seen in some content spreading misinformation. However, members of the initiative have reported that they have ‘had their own reputable, data-supported content suspended and flagged for review by the platforms’ (Agustin, 2021).
Efforts to discredit disinformation can also inadvertently bring additional attention to disinformation and misinformation (see, for example, Ball, 2017: 1–13). Bennett and Livingston (2018: 125) argue that journalistic sources are sometimes responsible for this added visibility, stating that disinformation ‘often passes through the gates of the legacy media, resulting in an “amplifier effect” for stories that would be dismissed as absurd in earlier eras of more effective press gatekeeping’.
Actions such as removing, flagging, or ‘fact-checking’ anti-vaccine content are reactive and, as such, do not address the underlying reasons why this disinformation spreads. Moreover, research from psychologists suggests that reactive attempts may come too late to make a difference because attempts to debunk disinformation can actually strengthen belief in that false information. The presentation of fact-based ‘counter-attitudinal’ information, that is, information that conflicts with a person's already-held beliefs, can result in cognitive dissonance, a condition in which someone simultaneously holds two conflicting thoughts. Social psychologists note that people tend to reduce this cognitive dissonance by rejecting one set of ideas through a process called motivated reasoning (Kunda, 1990: 481). With motivated reasoning, Kraft et al. (2015: 125) note that ‘exposure to dissonant information will trigger a negative affective response, leading to distrust of the scientific community, counterarguing and attitude polarization.’
Dodgen-Magee (2021) identifies a similar psychological response in relation to anti-vax information, which she calls commitment bias: ‘our tendency to adhere to positions we’ve taken, especially publicly, regardless of information that challenges them’. She further explains: The loud social media stances which many vaccine-hesitant individuals have powerfully asserted may well be fed by this bias – unconsciously driving them, along with their followers, to maintain a white-knuckle hold on their position, regardless of any evidence to the contrary. (Dodgen-Magee, 2021)
In other words, people who are presented with facts that conflict with their worldview will double down on their belief in false information to reduce the discomfort of cognitive dissonance. Moreover, the stronger someone's belief or attitude is, the more likely they are to deny the incongruent evidence (Kraft et al., 2015: 122). This cognitive response plays a role in both anti-vax beliefs and climate change denial, and it explains why people continue to share misinformation despite a sometimes overwhelming amount of evidence that what they are sharing is wrong.
This reaction is, of course, not a new phenomenon. Writing in the 1950s, the psychologist Leon Festinger and his colleagues (1956: 3) observed: A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point. We have all experienced the futility of trying to change a strong conviction, especially if the person has some investment in his belief. We are familiar with the variety of ingenious defenses with which people protect their convictions, managing to keep them unscathed through the most devastating attacks. […] [S]uppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting other people to his view.
Because affective reactions are so convincing, Jonathan Albright (2017: 87) stresses that, ‘fact-based evidence is not relevant to a growing segment of the populace’. Instead, there is a persistent belief that facts are open to interpretation (Sarasin, 2017). Even when facts are considered, there is a tendency towards confirmation bias or ‘unwitting selectivity in the acquisition and use of evidence’ (Nickerson, 1998: 175). In other words, people can selectively pick information that confirms their existing beliefs or ideas and ignore contradictory information in order to justify their beliefs and, in some cases, actions. Thus, for an increasing number of people only ‘facts’ that are in line with their own sense of the truth and political world view are actually true. As a result of the ‘infodemic’ that developed in the wake of the COVID-19 pandemic, these ‘facts’ are inspiring a vaccine hesitancy that can undermine global efforts to control the pandemic (World Health Organization, 2020). Wellness influencers like Pete Evans are key players in that infodemic.
Discussion
Evans’ influence and role in expanding vaccine hesitancy may ultimately be limited. After all, as highlighted above, he lost access to his major social media accounts because of his repeated sharing of disinformation there. However, Evans is a demonstrative example of how the network society, and its dimensions as outlined by Castells, can lead to unexpected community intersections that amplify misinformation, pseudoscience, and anti-vax views to a motivated and highly receptive audience. More concerning is the increasing overlap of wellness communities, conspiracy theorists, and far-right content, which suggests the potential for wellness influencers to nudge people towards beliefs even more harmful than vaccine hesitancy.
Evans is again a key example. As Lucy Valentine (2020) notes, in addition to regularly posting anti-vaccine content, he also posted photos of himself ‘wearing MAGA [Make America Great Again] hats and sharing cryptic content related to QAnon conspiracy theories’. In November 2020, Evans posted a cartoon to his Instagram account featuring a caterpillar in a MAGA hat talking to a butterfly with a neo-Nazi symbol, the Sonnenrad, on its wings. This symbol was featured in the manifesto of the perpetrator of the 2019 terrorist attack on mosques in Christchurch, Aotearoa New Zealand, which left 51 people dead (Carey, 2020), and in the manifesto of a white supremacist who killed ten people in Buffalo, New York in May 2022 (Feuer, 2022). The repetition of this symbol in various contexts demonstrates how the flexibility of online networks and participation in what Castells described as a shared hypertext enable overlaps between conspiracy theory groups like QAnon, anti-vax communities, wellness groups, and extremists, and support the exponential spread of extremist ideologies. The post generated significant backlash, prompting Evans to delete it and apologise. In the process, Evans claimed he was unaware of the symbol's meaning (Carey, 2020), a claim complicated by the fact that, when a commenter pointed out the neo-Nazi symbol on his Instagram post, Evans replied that he was ‘waiting for someone to see that’.
The mix of wellness and self-help discourse with extremist ideas was also on display in the ‘freedom convoy’ protest in Aotearoa New Zealand, which one reporter described as a ‘leaderless mass of harmless hippies, Hari Krishnas and homeopathy believers on one side and angry anti-vax and maybe incipient assassins on the other’ (Peacock, 2022). A 2021 report from The Disinformation Project notes that ‘Covid-19 disinformation was being used as a Trojan Horse for norm-setting and norm-entrenchment of far-right ideologies within Aotearoa New Zealand’ (Hannah et al., 2021). In sum, wellness influencers like Evans, and the disinformation they share, are increasingly intertwined with theories and groups that pose different, but equally significant, threats to society.
Although it seems as if disinformation is the source of many current political and social issues, the journalist James Ball (2017: 13) argues that it is more a symptom than a cause. Factors contributing to the generation and spread of mis- and disinformation include long-standing societal issues such as misogyny, racism, nationalism, hyper-partisanship, distrust in government, and anti-intellectualism. For example, Jennifer Reich (2021), a professor of sociology at the University of Colorado Denver, says that ineffective treatments like ivermectin and hydroxychloroquine are popular among vaccine sceptics ‘precisely because those treatments haven’t gone through the same process of scientific and expert review that they distrust.’
Ball's observation points to a path forward for those looking to combat vaccine hesitancy. Rather than debunking or blocking vaccine disinformation, approaches that, as noted above, come with several complications, work needs to be done to rebuild trust in government, scientists, and medical practitioners. That task is, of course, much easier said than done. However, the work of groups such as Team Halo, who post to popular social media platforms such as TikTok and Instagram and, in essence, work to become wellness influencers themselves, is one possible model others can adopt in the future.
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.
Funding
The author received no financial support for the research, authorship and/or publication of this article.
