Abstract
Health misinformation, a major public health challenge, is increasingly spread through social networking sites such as WhatsApp, which is popular among culturally and linguistically diverse (CALD) communities, including the African migrant and refugee community, a relatively disadvantaged minority in Australia. Knowledge remains limited about how health misinformation spreads through WhatsApp in this community. The present study explored the mechanism of health misinformation circulation on WhatsApp and the ways members of the African community in Southeast Queensland (SEQ) respond to it. Findings point first to a technological dimension of WhatsApp: affordances such as sharing and forwarding buttons facilitate the spread of health misinformation. At a user or individual level, trust in significant others favours the reception and sharing of unverified health information with WhatsApp contacts and group members. Although WhatsApp group members, especially leaders, usually set up rules to moderate content, including health misinformation, primarily to preserve harmony in groups, absent or suboptimal content moderation on WhatsApp exacerbates its spread among community members, whose responses vary. Responses include fear and mistrust, which can confuse members and hinder acceptance of, and compliance with, public health measures from credible sources such as governments. It is therefore essential that public health stakeholders acknowledge and foster the information-sharing culture on WhatsApp in the African community. They should also raise awareness among community members and train them in dealing with health misinformation. The training could focus on reducing negative individual and social influences by improving literacy and self-efficacy in detecting health misinformation and by decreasing echo chamber effects.
Additionally, the training could emphasise health misinformation management on WhatsApp by leveraging African community leaders’ gatekeeping role and involving them in content moderation.
Introduction
The world faces serious public health issues due to misinformation (Finegold, Asaria, & Francis, 2013; Naghavi et al., 2017), and health misinformation can lead to the underestimation of, and resistance to, critical public health measures supported by governments. Health misinformation can be defined as “a health-related claim that is based on anecdotal evidence, false, or misleading owing to the lack of existing scientific knowledge” (Suarez-Lledo & Alvarez-Galvez, 2021, p. 2). It is also important to differentiate between misinformation and disinformation. The former refers to false information shared without malign intent to harm, while the latter has a more cynical connotation, as authors deliberately fabricate false information intending to harm others (Suarez-Lledo & Alvarez-Galvez, 2021). Health misinformation, especially during epidemics and pandemics, can negatively impact outcomes in populations, including disadvantaged communities (Birukila et al., 2017; Wagner, 2014; Siddiqui et al., 2020). Negative health outcomes, including deaths, could potentially be lessened with adapted and effective public health communications. Health misinformation has long been an issue in healthcare systems. However, its rapid spread in recent times, including through social networking sites and other social media, is a major concern that needs to be addressed by public health stakeholders (Bode & Vraga, 2018). Various communities, especially those in and from regions of the Global South, including Africa, Latin America, Asia, and Oceania, prefer social networking sites such as WhatsApp for their daily digital communications (Baulch, Matamoros-Fernández, & Johns, 2020). People from the Global South have practices and experiences of online health communication that differ from those in the Global North and deserve scholarly attention.
In Australia, culturally and linguistically diverse (CALD) communities are reported to face difficulties in accessing healthcare services mostly because of differences in languages and cultures (Agu, Lobo, Crawford, & Chigwada, 2016; Jatrana, Pasupuleti, & Richardson, 2014; Manderson, & Allotey, 2003). These cultural differences prevail in most cases despite the provision of universal health care (Medicare) for all Australian citizens including those from refugee backgrounds with lawful immigration status. Despite available subsidised healthcare services, members of the CALD communities often express dissatisfaction because of unmatched expectations of medical service delivery (Ziaian, de Anstiss, Antoniou, Puvimanasinghe, & Baghurst, 2016; Franks, Gawn, & Bowden, 2007). Thus, the literature often mentions the difficulties of access to health care by members of CALD communities because of socio-cultural challenges. However, the use of digital technologies to seek and access health information and services is less documented in the literature.
The African migrant and refugee community is a minority Australian community with an estimated 400,000 people (58% white South African and 42% black sub-Saharan African) in 2020 (Counted & Renzaho, 2021). People of African descent represent about 1.6% of Australia's total population of 25.69 million. Many members of the African community in Australia, especially those from humanitarian crisis backgrounds and sub-Saharan African countries, face resettlement challenges due to linguistic and cultural barriers (Ikafa, Hack-Polay, Walker, & Mahmoud, 2022).
This study explored how health misinformation spread occurs through WhatsApp and the responses it induces among members of the African migrant and refugee community in Southeast Queensland (SEQ). WhatsApp was almost universally identified during focus groups as the most popular instant messaging app in the community. The study revealed insights into the roles of key actors and components involved in the circulation of health misinformation on WhatsApp and how community members react to that phenomenon. Understanding African community members’ experience of health misinformation through a major instant messaging app, like WhatsApp, helps to identify key elements to be leveraged in countering the phenomenon and more effectively promoting public health. To understand members of the African migrant and refugee community's experience of health misinformation in SEQ via WhatsApp, this study took a qualitative, ethnographic approach. The study specifically employed focused ethnography ‘to explore specific cultural perspectives held by sub-groups of people’ (Higginbottom, Boadu, & Pillay, 2013, p.1) about experiences of health information including misinformation on WhatsApp within the African community in SEQ.
This study sought to investigate the following primary research question (RQ): In what ways do African migrants and refugees in SEQ spread and respond to health misinformation using WhatsApp?
The article begins by presenting a theoretical framework for analysing how health misinformation spread occurs on WhatsApp and is responded to among members of the African community in SEQ. Next, I share the methods used to collect and analyse the data including thematic and semiotic analyses of focus groups and scroll back interview transcripts and WhatsApp screenshots. I outline my positionality statement for conducting the research and I present the study findings and discussion. Finally, my conclusion summarises key components of the ways that health misinformation is spread and responded to by community members.
Theoretical framework
This study applied a broad theoretical framework to investigate how misinformation is spread and responded to on WhatsApp. The framework combines theories and concepts from different disciplines, including the social sciences, media and communication, and the health sciences, because the phenomenon of misinformation remains complex and requires multiple perspectives for a thorough examination.
Misinformation spread mechanism in society
Social conformity shapes human behaviour. People usually thrive as a collective society with shared cultural values and acceptance of common rules and norms. Asch's (1956) seminal work on conformity underscores the fact that individuals may seek to conform to other people's behaviours despite contrary beliefs. Individuals would seek to satisfy society's expectations to keep their status and the attached utility benefits (Bernheim, 1994). This collective way of life induces social homogeneity and preserves harmony in society. Because of that quest for homogeneity, people act in a way to comply with others’ perceptions and avoid any action that could be disruptive and perceived as offensive in public opinion.
Conformity also underpins different degrees of networks among people, who develop loose or close relationships and ties. Network theories, for example, allow us to understand how health misinformation spreads through ‘social influence, social learning, and social contagion’ (Bessi et al., 2015; Radzikowski et al., 2016, as cited by Wang, McKee, Torbica, & Stuckler, 2019, p. 6). People also like to share information with like-minded others, a form of homophily whereby members of given circles are likely to share information among themselves (McPherson, Smith-Lovin, & Cook, 2001). Individuals, being part of different loose and close circles, share information with others. Unfortunately, information whose origin is not, or cannot be, verified is also frequently shared. Information spreads along circles of friends and family members and circulates in a ‘cascade’ within relatively homogeneous groups who believe, or are susceptible to believing, in it (Wang, McKee, Torbica, & Stuckler, 2019, p. 2).
Misinformation spread mechanism at an individual level
Besides social mechanisms that drive an information-sharing culture, misinformation can be explained at an individual level. Various studies have highlighted the psychological aspects of misinformation. According to Chua and Banerjee (2017), individuals with a low epistemological level (epistemology being the science of knowledge) are likely to share unsubstantiated health information. Rumour theory also posits that unless a story or rumour has personal relevance to individuals, it may not be worth their attention or worth spreading (Allport & Postman, 1947). People are known to be drawn and sensitive to sex, anxiety, hope, desire, and hate. They are therefore captivated by the sex scandals, graphic and violent images, selfish and intolerant stories, and life dreams that are commonplace on social media such as WhatsApp. Individuals are receptive to stories that echo their inner thoughts. In that sense, the echo chamber effect (Cinelli, De Francisci Morales, Galeazzi, Quattrociocchi, & Starnini, 2021), with its corollary of selective exposure driven by personal interest, contributes to the acceptance and spreading of misinformation among like-minded peers. Bruns (2019) likewise insists on the key role of human actors and the impact of homophily in information selection and acceptance by digital media users, while questioning the overstatement of technological determinism in the literature on echo chambers and filter bubbles.
Misinformation spread agents and responses
Another theoretical understanding of misinformation concerns the components involved in its spread mechanism. Wardle and Derakhshan (2017) identify the agent, the message, and the interpreter of misinformation. Agents of misinformation include message authors and sources, even though these are usually unknown, which is typical of misinformation carrying little or no verifiable information about its authors. Importantly, message authors’ careless or deliberate communication acts mislead others, with potentially negative outcomes. The second key component is the message itself: the way it is crafted using specific emotions, especially negative ones such as fear, anger, and sadness (Aquino, Donzelli, De Franco, Privitera, Lopalco, & Carducci, 2017), and how long it remains in circulation. Finally, the interpreter's role in believing and sharing biased information cannot be underestimated in the spread mechanism of misinformation.
Methods
I employed a focused ethnographic approach, relying on data collection techniques such as focus groups and scroll-back interviews, to account for African community members’ experiences of health information, including misinformation, through WhatsApp. I then analysed the data using thematic and semiotic analysis.
Focus groups
I undertook focus group interviews with fifteen (15) community members (seven men and eight women) to investigate their attitudes, knowledge, opinions, and experiences of health information and care via digital technologies, including social apps. Group discussions such as focus groups are in line with natural social gatherings and can parallel regular community meetings in the African community. However, because members of the African community, especially men and women, might experience digital technologies for accessing health information differently, I organised focus groups with men and women separately to avoid culturally based gender biases (Eisenhauer, Mosher, Lamson, Wolf, & Schwartz, 2012). Differences in gender roles in traditional African households potentially lead to different ways of, and purposes in, seeking and accessing health information via WhatsApp. Many studies on the adoption of technology by Africans in areas like health care focus on individuals and communities without paying much attention to gender relationships in engaging with technology and community agencies (Stamp, 1990). To collect data with African women, I trained an African female volunteer moderator to conduct a focus group and scroll-back interviews, while I carried out the same research activities with men in the community.
Scroll-back interviews
I utilised the scroll-back interviewing method (Robards, & Lincoln, 2017) with seven (7) community members (three men and four women from focus groups) as follow-up research activities. Scroll-back interviewing is a participatory method that places media research participants as key contributors in the analysis of health-related information accessed and used on their mobile phones. The scroll-back interviewing method was named as such when deployed in media research on young people's use of Facebook. Robards and Lincoln (2017) presented the method as a combination of techniques, such as the swiping or ‘scrolling’ (p. 9) of screens by users during in-depth interviews, to assist interviewees in commenting on ‘traces’ or archives of captured moments from their past.
In this study, scroll-back interviewing techniques were used by the female volunteer and me with community members who agreed to participate in individual in-depth discussions following initial focus groups. These were intended to gather richer information about interviewees’ experiences accessing health information via WhatsApp. Participants scrolled back through health-related messages on their mobile phones via WhatsApp and commented on those messages during interviews. They explained, for example, their motivations for using WhatsApp, what they liked and disliked about messages including health misinformation, and general challenges in using WhatsApp. Participants took screenshots and shared them with the trained female facilitator and me. By commenting on health information including misinformation received and shared on WhatsApp, they reflected on their experiences of the instant messaging app.
Thematic and semiotic analyses
The analysis of the data collected from members of the African community consisted of a thematic analysis of the focus group and scroll-back interview transcripts and a semiotic analysis of WhatsApp screenshots. Several themes emerged from the data about Africans’ experiences of health information and services through digital technologies in SEQ; however, only those relating to health misinformation are discussed here.
I carried out a thematic analysis of the transcripts using NVivo software for coding, applying Thomas’ (2006) six-step coding guideline. First, I cleaned and prepared the raw data in the transcripts. Second, I conducted a close reading of the raw data from the prepared and deidentified transcripts. Third, I created categories through the identification and description of primary or key themes in NVivo. Fourth, I checked the coding results and sent the transcripts to two educated members of the African community for sense checking of the key themes. Fifth, I reviewed and further refined the categories, identifying new ones that had initially been overlooked. Sixth, and finally, I wrote up the research findings, with primary and secondary themes as headings and subheadings, and participants’ actual phrases and words as quotations and illustrations in the report. Spontaneous and consistent themes that emerged from the data beyond the research questions were also included in the report as key findings.
For the WhatsApp screenshots, I used the conceptual semiotic framework of Roland Barthes (1977). Barthes develops an understanding of signs at two main levels: denotation and connotation. Denotation consists in reading the sign using common-sense description, while connotation refers to interpreting the sign according to the reader's cultural background (Bouzida, 2014). The semiotic analysis focused on language, including texts in English and other languages such as Swahili, and on images (e.g., WhatsApp profiles, photos, colours, emojis, emoticons, forward buttons, audio, and video links) as archived in WhatsApp.
Positionality statement
I am both an insider and an outsider with regard to the African community in SEQ. I am an insider because I am an African, born and raised in Cote d'Ivoire, West Africa. Apart from the key healthcare stakeholders consulted for this study, who support newly arrived migrants’ and refugees’ health and are mostly from different cultural backgrounds, most of my research participants from the general community come from Africa. We share cultural heritage and values because of our origins in Africa, and my understanding and interpretations of participants’ views are likely to be similar to theirs. However, I am also an outsider because of my position as a researcher trained as a health communication and digital health academic to conduct research to ethical standards, especially when data collection involves migrants and refugees. The latter, for example, are considered vulnerable in research settings (Ellis, Kia-Keating, Yusuf, Lincoln, & Nur, 2007; Mackenzie, McDowell, & Pittaway, 2007), which requires higher ethical standards.
Findings and discussion
The first part of the findings refers to the thematic analysis of the focus groups. Different major themes emerged from the focus groups; however, only the themes and subthemes related to health misinformation are reported here. The following findings discuss the spread mechanism of health misinformation at different levels, including the technological, user or individual, and community leadership levels, which entail mixed and problematic responses by members of the African community in SEQ.
At a technological level: affordances via information sharing features
Members of the African migrant and refugee community rely on the instant-messaging app WhatsApp to circulate and obtain information, including health information, in the community. This reliance is shaped by different aspects of the messaging app, including its technological affordances. Technological affordances are the perceived range of possible actions available through digital technologies (Bucher & Helmond, 2018). WhatsApp's most important technological affordances are its range of options for creating, copying, sending, and receiving messages, all of which can be done individually or in groups. Features allow users to send and receive messages with individuals and groups, make voice and video calls, and share and forward messages (Lu, Vijaykumar, Jin, & Rogerson, 2022). At a technological level, these communication features help to rapidly circulate an enormous amount of information, including false health information. Information goes viral on social media through sharing and forwarding, as many people receive that information in a short period (Goel, Anderson, Hofman, & Watts, 2016). The capacity to rapidly spread information via WhatsApp also limits opportunities for incorrect information to be countered or contested before it reaches large audiences. Thus, technological affordances play a key role in the spread of health misinformation via WhatsApp in the community. The following quote shows how easily community members send and obtain information, as individuals or in a group, on WhatsApp:

Like in a group, if you send a message to people in the group everybody will get the message. If you are not in a group, if that person is your sister, and if you want to send her the message, you want to select the people, these people, then they get the message. All of them. (Ayan, male community member, 37 years old)
At a user or an individual level
Another key aspect of health misinformation spread on WhatsApp within the African community in SEQ relates to the influence or esteem that the message source (e.g., the message sender, sharer, or forwarder) holds for message receivers. Considering the immediate source of health misinformation, and the power they exert on message receivers who in turn circulate false information, is critical given the collective nature of African communities, where family and community members are trusted sources of information. Community members who receive unverified health information might decide to share it with others, perpetuating the cycle of misinformation spread. The following quote, for example, is from a community member whose mother encouraged her to protect herself against COVID-19 by drinking a non-conventional medicinal beverage. Traditional remedies, such as herbal beverages to cure diseases, are commonplace in an African context:

I know my Mum. She always likes…she sends me health videos. They are like you need to do this; you need to do that. For example, through WhatsApp, during the pandemic of COVID-19, videos will be about like use ginger, use this, put lemon to help you overcome. (Monaa, female community member, 24 years old)
Yeah. And this [video on WhatsApp] shows how we can make something to drink and make yourself feel good. Yeah, and that video helps a lot. Okay. And we try, to help other people here. (Hadiza, female community member, 32 years old)
Community members make medical decisions, such as drinking herbal or traditional medicines, based on significant others’ advice, which can be detrimental to health. Informal medical advice based on unverified evidence can harm individuals’ health.
At a community leadership level
I found that African community members who play key gatekeeping roles, and who are trusted information sources in their communities more broadly, also informally help set user rules to exert some level of content moderation in associated community WhatsApp groups. Such informal content moderation primarily serves to preserve community harmony, but it also helps counter toxic and problematic content in community WhatsApp groups, including health misinformation. Community leaders also countered misinformation by directly sending material backed by official health bodies and other authorities to their WhatsApp connections, as Eyze explains below:

Community leaders or Secretary sent us important messages including health information individually through WhatsApp. Also, sometimes, within a WhatsApp group, leaders would share information, and everybody would see it. (Eyze, female community member, 33 years old)

In my community, WhatsApp can help us. There are rules here [the community]. He [the community leader] can send us messages and call us for a meeting. Rules are important. With WhatsApp, everything can be easy. (Odilon, male community member, 40 years old)
These quotes also reveal the communication structure whereby African leaders play a key information-source and gatekeeping role on WhatsApp. They show that WhatsApp is a key communication channel used by African leaders to inform about and discuss issues in the community.
Mixed and problematic responses to health misinformation
After discussing how health misinformation spreads, the following theme relates to the mixed responses of message receivers. First, there is fear caused by health information that is not always based on scientific evidence. The unknown authors and senders of health misinformation usually frame messages with frightening language and images, using rhetoric that triggers strong emotions such as fear and despair in receivers. This fear can be disproportionate, with a dreadful effect that competes with public health messaging, which usually focuses on effective, specific health behaviours to prevent further infections. Health messaging by governments and other verifiable sources is, moreover, usually delivered in English, with limited reach into CALD communities. In the quote below, Farai, a community member, describes the moment he was petrified by sensationalised content about the COVID-19 pandemic, with negative consequences:

COVID-19 was very difficult because I was scared…. I was scared. Oh. It's coming! Too many things [pointing at his head]. I missed some appointments to go to the doctor and a driving license. Because when I was scared, I couldn't go there. So, I missed appointments like that. (Farai, male community member, 26 years old)
I heard [saw and listened to videos on WhatsApp] people sending videos of how people are dying like flies. So, when you start seeing this type of video on your phone, you say ah. What type of disease is this? If you touch someone, maybe you will fall down. (Jengo, male community member, 33 years old)
The other response to health misinformation among members of the African community in SEQ was mistrust. Community members have doubts about the veracity of some health information, and there is a risk that some might lose trust in legitimate health information because of the lack of credibility of WhatsApp and other social media platforms. On the other hand, some members show more discernment and health literacy, as the quote below from Monaa illustrates. This indicates that some community members can assess the value of health information based on factors including their levels of education, health awareness, and literacy. Members are also aware of health misinformation as a major social phenomenon across social media, which increases mistrust of online information.
For me, in terms of health, I don't understand. I don't really trust the stuff that I get from it. When it comes to telling about regarding health remedies, I need more solid research to back it up. (Monaa, female community member, 24 years old)
In sum, the thematic analysis of the focus groups highlights the components of health misinformation spread, including the technological affordances that facilitate it and key agents such as message senders, sharers, forwarders, and receivers, which together induce mixed reactions among African community members. Misinformation on WhatsApp can cause community members to react disproportionately with fear, anxiety, and mistrust. Community members reported that they and their peers reacted with fear and anxiety to any information regarding COVID-19 after seeing sensationalised material and misinformation via community groups on WhatsApp. Repetitive exposure to health misinformation could not only compromise health awareness and literacy but also hinder the reception of true and official health information by community members.
The second part of the data analysis outlines the semiotic analysis of screenshots of health misinformation on WhatsApp. I found that low literacy in English can cause misunderstanding of legitimate health information. There are also different severity levels of health misinformation on WhatsApp, with potential negative health outcomes. The following three figures illustrate the experience of health misinformation within the African community. Figure 1 illustrates a misunderstanding of credible health information by community members due to limited English skills and knowledge.

Figure 1. A misunderstood COVID-19-related message in an African community WhatsApp group.
Low literacy and increased risk of misunderstanding health information
Figure 1 shows a video related to the COVID-19 vaccine. The image shows a blurred man in a suit and tie who looks professional and official. It is unclear which country the video refers to, even though the DRC and Australian flags are displayed in the group profile. The accompanying description states that ‘Vaccine nationalism will prolong the COVID-19 pandemic, not shorten it’. The link to the video indicates that the message was sourced from Facebook. An interviewee shared this message as an example of a health conspiracy theory.
However, this health message about vaccine nationalism, referring to the behaviour of rich nations with disproportionate financial resources rapidly buying and securing available COVID-19 vaccines for their populations at the expense of low-income nations, was a genuine concern of specialists worried that such practices could worsen the COVID-19 pandemic by leaving entire regions in the world unprotected. Katz, Weintraub, Bekker, and Brandt (2021, p. 1281), for example, criticised vaccine nationalism as ‘short-sighted, ineffective, and deadly’. Our interviewee's understanding of this material as misinformation or part of a conspiracy might be due to several factors, including insufficient English skills to comprehend the nuance of the message. English is the second, and in many cases, the third language, spoken by a large proportion of the African community in SEQ.
Other signs in this video, such as the DRC and Australian flags, evoke bi-national identity. While the African community members in this study live in Australia, their new country, they remain deeply attached to their home countries, including the DRC. By displaying both flags on the group profile, community members express a bi-nationalism and an attachment to both countries. This social and emotional attachment to both countries (one potentially a victim of vaccine nationalism and the other a beneficiary), alongside the ready capacity to forward this material from Facebook, where it was sourced, highlights how platform features quickly contribute to making content widespread or ‘viral’ on WhatsApp and across other platforms. boyd (2011, p. 239) has referred to this phenomenon using terminology such as ‘replicability’ and ‘scalability’, describing these as ‘high-level’ affordances of contemporary social media. From a public health perspective, however, content such as misinformation or conspiracy theories about health issues can have a large negative impact on populations, especially those with low health literacy, through the quick replication and potentially far larger reach of baseless claims in a platformed society. The next figure, Figure 2, shows how unknown authors with obscure political agendas try to manipulate public opinion to defy essential public health measures amid a major health crisis. A study by Wilson and Wiysonge (2020) revealed that anti-vaccination campaigns on social media were actively organised by foreign misinformation manipulators who significantly contributed to vaccine hesitancy in Western countries.

A sarcastic message about the COVID-19 vaccine.
Public opinion manipulation and risk of public health defiance
Figure 2 shows a message about COVID-19, composed of an image of the former Prime Minister of Australia, Scott Morrison, and accompanying text. The text uses a sarcastic tone to inform the audience about the implications of future COVID-19 vaccines (vaccines were not yet available), announcing that the vaccine would curtail the freedom of those who refuse it. The former Prime Minister is presented as a hypocritical authority figure implementing public health measures that reduce individual freedom in Australia. The message appeared in a WhatsApp group whose members are clearly from the DRC, as indicated by the flag in the group profile.
Figure 2 is also an example of politicised health information amid a pandemic. The message makes questionable claims, as the Australian government had not made the COVID-19 vaccine ‘mandatory’. Although some concerns might sound legitimate, claims such as ‘no welfare payment’, ‘tax refund withheld’, and ‘kids cannot attend schools’ seem exaggerated; at the time of the interview, none of these consequences was in place. The text uses a sarcastic and inflammatory tone to campaign against the COVID-19 vaccine with little evidence. By setting the headline and conclusion in red font, the sender/producer of the message sought to create a sense of urgency in viewers/readers about what they believed to be at stake with future COVID-19 vaccines.
Importantly, the creators of the post used the image of Scott Morrison, the former Australian Prime Minister, as a symbol of power acting against individual freedom and freedom of choice. The post was part of an emerging counter-campaign against COVID-19 vaccines that preceded their global rollout. The interviewee who shared this screenshot pointed out that a ‘message like this confuses’ people on ‘the internet’. As this community member noted, health misinformation disseminated on WhatsApp undermines users’ trust in COVID-related information. Community members rely on peers and other members as their main sources of information and as the means through which they try to understand local and international matters. Open access to information technology, in terms of both content creation and consumption, reinforces freedom of expression, which is valuable in a democratic society. The downside, however, is that such open access can create confusion and a toxic environment that reduces the impact of public health strategies. Health promotion campaigns could face serious competition from adversaries skilled at designing and communicating misleading messages through widespread social apps like WhatsApp, attracting public attention and fuelling resistance to public health measures. Figure 3 presents another instance of a health misinformation message, with an even more threatening tone, by unknown authors.

An illustration of a COVID-19 conspiracy theory shared in a Congolese community WhatsApp group in SEQ.
More malicious health misinformation and conspiracies
Figure 3 is a forwarded message sent to a WhatsApp group of African community members originally from the DRC who now live in Australia; both countries are symbolised by the flags displayed in the WhatsApp profile. It is a long message in French, from an unknown source, addressed to the people of the DRC. It sounds the alarm that future stocks of chloroquine intended to treat COVID-19 (there was no vaccine against COVID-19 at that time) would be poisoned by ill-intentioned people wanting to kill black Africans. Specifically, it alleges that the poisoned chloroquine stocks would be sent to Sub-Saharan African people, particularly those in the two Congos: the DRC (capital city: Kinshasa) and the Republic of Congo (capital city: Brazzaville), to harm them during the COVID-19 pandemic. To lend the text some credibility, the sender evoked the well-known French physician Professor Raoult as a ‘friend’ of the message's unknown authors. The message lacks substantial evidence readers could rely on to check its credibility, as both the source and the original authors are unknown. The community member who shared this screenshot during a scroll-back interview described it as an illustration of ‘bizarre facts that have been shared around’, similar to conspiracy theories intended to manipulate people in Africa, particularly those from the two Congos, for obscure gains.
There is a possibility that many people in the African community take at face value all the information that circulates through social apps like WhatsApp. A study by Flintham et al. (2018) revealed that a significant number of social media users did not apply any verification strategies to fake news and believed it. African Australians in SEQ might similarly not question health disinformation circulating on WhatsApp because of factors such as trust and intimacy in small community groups. There is, therefore, a high likelihood that some members of the African community will become victims of this kind of health disinformation. This material is clearly more intentional and malicious than the issue highlighted in Figure 1, where a legitimate health message was misinterpreted.
In summary, widespread health misinformation is a major public health risk because it jeopardises the efforts of public health stakeholders, who often struggle to convince community members of the merits of health solutions such as vaccines. This is particularly the case in environments where low levels of written language and digital literacy prevail, such as parts of Africa. Misinformation and public manipulation, for example, sparked an anti-polio vaccine campaign in northern Nigeria (Birukila et al., 2017), where religious leaders used digital technologies such as Bluetooth to spread anti-vaccine messaging to their people. Fortunately, public health officials led a counter-offensive, promoting the vaccine with a video distributed via Bluetooth in an African language, and successfully countered the anti-vaccine campaign. This study suggests similar efforts are needed by Australian public health officials to ensure that members of migrant and refugee communities, such as SEQ's African community, are appropriately reached by their messaging.
Conclusion
This study has demonstrated that the spread of health misinformation on WhatsApp in the African migrant and refugee community in SEQ occurs because of a range of diverse factors – some universal, such as technological affordances allowing for the quick sharing and forwarding of material, and others community specific, such as low levels of English, digital, and health literacy. Individual susceptibility and the strength (or lack) of informal content moderation by African leaders, who usually set acceptable parameters for engagement in WhatsApp groups to preserve community harmony, are also important factors. Beyond the spread mechanism, the findings revealed community members’ responses, which include fear, anxiety, and distrust. Politicised health information tends to create distrust in governments and health officials, beyond mere confusion, in populations and particularly in disadvantaged minority CALD communities, and subsequently hinders public health efforts. A further insight is that health misinformation in the African community stems from complex personal and external factors. On the personal side, members of the African community experience low levels of English literacy, yet English is the language in which most health information is delivered in Australia; the use of English in mainstream and digital media is therefore problematic for many community members, especially those from refugee backgrounds with little or no literacy. On the external side, community members are victims of public opinion manipulation by unknown authors with obscure agendas. Because of the competition created by the persistent noise and toxicity of health mis(dis)information, public health stakeholders might have difficulty delivering genuine health messages and addressing public health issues.
This study contributes to research on the global spread of health misinformation and responses to it, especially negative emotions (e.g., fear, anxiety, mistrust), with a focus on a CALD migrant and refugee community in Australia, an industrialised nation that faces health disparities due to social inequalities. Based on the study findings, public health stakeholders must ensure African community members understand health messages through interpretation (of video or audio) or translation (of text) into the language most users are familiar with on WhatsApp. Public health stakeholders also need to acknowledge the important influence the African community exerts on individuals and reduce any negative echo-chamber effects that favour the reception and sharing of unverified information. They should further integrate into their strategies the information-sharing culture on WhatsApp within the African community; this collective mode of rapid access to information could otherwise lead to the viral spread of false and harmful health information. That is why public health stakeholders need to empower community members through literacy and self-efficacy, training them to detect and counter health misinformation. Public health officials should also leverage African leaders’ gatekeeping role in content moderation and health promotion; these leaders are respected, have some control over the content shared on WhatsApp, and can help counter health misinformation.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship and/or publication of this article.
