Abstract
Digital innovations are transforming newsroom practices across the globe. The advent of artificial intelligence (AI) tools is upending and altering newsmaking and news distribution practices. Whilst there is existing literature on how AI tools are reshaping journalism practices in the Global North, scholarship on how these digital innovations are shaping African newsrooms has remained scant. This study uses the case of Alice, an AI-powered newsreader in Zimbabwe, to explore audience perceptions of this digital innovation. Drawing upon Afrokology and cultural studies as theoretical frameworks, this qualitative study generated data through digital ethnography and in-depth interviews. Findings demonstrate that there are mixed feelings about the appropriation and use of AI-driven news anchors. On the one hand, some audiences applaud the use of AI news anchors as an innovative storytelling technique. On the other hand, audiences are concerned with Alice’s lack of human emotion and poor accent, and perceive her as a threat to traditional journalists’ jobs. Some of the resistance towards Alice indicates a need to decolonise AI tools in newsrooms.
Introduction
The advent of new digital technologies is increasingly transforming the processes of gathering, producing, disseminating and consuming news. Newsrooms are adopting new innovative tools in ways that are both advancing and upending journalism practices. These new technologies are helping news organisations that seek innovative ways of storytelling and user-engagement. Artificial Intelligence (AI) tools have become central to these digital transformations in newsrooms as they are shaping the processes of news content production and distribution (Jamil, 2021). Automated content creation and machine-written stories are increasingly becoming popular, especially in Western newsrooms (Jamil, 2021). It is, therefore, not surprising that research on the adoption of AI in newsrooms has predominantly focussed on the media landscape in the Global North (Kothari and Cruikshank, 2022). As such, ‘little information’ exists about how African newsrooms are adopting and utilising AI technologies (Kothari and Cruikshank, 2022: 17). In other words, there is a ‘glaring lack of research’ on how AI tools are being adopted in African newsrooms (Munoriyarwa et al., 2023: 1374).
Given that the scant literature on the adoption of AI in African newsrooms (Kothari and Cruikshank, 2022; Moyo et al., 2019; Munoriyarwa et al., 2023; Ogola, 2023) does not focus on AI-powered news anchors, it is timely to examine the audience perceptions of AI-driven news presenters in the contexts of developing countries. This study uses the case of Alice, an AI-powered news anchor in Zimbabwe, to explore audiences’ perceptions of this news content. Alice was unveiled by the Centre for Innovation and Technology (CITE), a digital media start-up in Zimbabwe, in May 2023. In light of the assumption that AI is a ‘Western phenomenon’, it is important to explore CITE’s audiences’ experiences and perceptions of this AI-generated news reader. A study on audiences’ perceptions of Alice provides insights into the adoption and reception of AI tools within the specificities of African newsroom contexts.
The nexus of machines, humans and culture remains highly contested in media and cultural studies (Natale and Guzman, 2022). Advancements such as AI tools are reconfiguring our understanding of the relationship between technology and culture (Natale and Guzman, 2022). This study affirms a belief that whilst machines and humans are independent actors, they are deeply entrenched in the processes of meaning-making (Natale and Guzman, 2022). As such, this research draws upon the perspectives of media and cultural studies (Hepp, 2020; Natale and Guzman, 2022) to examine audience perceptions of the integration of Alice into newsroom practices in Zimbabwe. Whilst acknowledging the implications of ‘algorithmic technologies’ and the ‘datafication of society’ (Natale and Guzman, 2022: 628), the human element should not be overlooked in making sense of these technological transformations. In essence, cultural studies enable researchers to ‘reclaim the human in machine cultures’ (Natale and Guzman, 2022: 629). I consider AI-driven newsreaders as embedded in culture, as meanings assigned to them by journalists, audiences and other actors are context-specific.
Background
The media sector in Zimbabwe is dominated by state-controlled media such as the Zimbabwe Newspapers Group (Zimpapers)-owned news outlets (Herald, Chronicle and Sunday Mail) and the Zimbabwe Broadcasting Corporation (ZBC). Privately-owned news outlets such as NewsDay and Daily News also have a share in the media market. However, of late, peripheral news actors such as CITE, ZimLive, 263Chat and The NewsHawks are also transforming the media ecology in the country. CITE is one of the digital media start-ups in Zimbabwe which are shaping the journalism landscape. Its AI-generated female newsreader was launched in May 2023, and in September 2023 the AI presenter was given a complete makeover. Alice has a human-like appearance, but is not a ‘real’ human. The robot’s name came about after the CITE team asked audiences on social media platforms to name the AI presenter. One audience member suggested the name Alice on 12 May 2023, and CITE selected it from the list of suggestions.
Alice presents CITE’s programmes such as Rate your councillor, Meet your candidate, The Brief Bulletin and This week on CITE co-hosted with Zenzele Ndebele. The AI-driven news anchor seeks to enhance news delivery and promote user-engagement, and this is demonstrated in how Alice introduces herself: My name is Alice and I am an AI presenter at CITE. With rapid technological advancements, we are continuously exploring innovative ways to enhance our news delivery and engage our audiences. I present a program Rate your Councillor. . . As an AI presenter, I deliver news content dynamically and interactively with natural language processing and speech synthesis capabilities. My current role is to present. I will soon provide news stories, conduct interviews and engage with audiences seamlessly. Through machine-learning, I mimic human-like speech patterns, intonation and facial expressions to deliver presentations. I help the newsroom reduce the time required for editing, sub-titling and sharing content (@Aicitezw, 9 June 2023).
CITE’s Social Media Manager asserts that Alice serves to augment the work of traditional journalists in the newsroom: The news reporters gather news and write it and Alice then reads the top three stories selected by the editor. We make use of Flexclip, a video editing tool which has the text to speech function, to package the stories and have Alice read them. Alice introduces the programme through the text to speech function. The stories are logged into the video editing tool. Alice also makes use of subtitles (Interview, 5 October 2023).
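The workflow the Social Media Manager describes — an editor selecting the top three stories, which are then packaged with synthetic speech and subtitles — can be sketched in outline. The snippet below is purely illustrative: FlexClip is a proprietary graphical tool whose internals are not public, so every function name, the reading-speed constant and the cue format here are assumptions, not CITE’s actual pipeline.

```python
from dataclasses import dataclass

# Hypothetical sketch of the editorial pipeline described above:
# the editor picks the top three stories, and each script becomes
# timed subtitle cues for a text-to-speech pass. The reading rate
# below is an assumed value for illustration only.

WORDS_PER_SECOND = 2.5  # assumed synthetic-speech reading rate

@dataclass
class Story:
    headline: str
    script: str

def select_top_stories(stories, n=3):
    """Mimic the editor selecting the top n stories for the bulletin."""
    return stories[:n]

def to_subtitle_cues(stories):
    """Convert story scripts into (start, end, text) subtitle cues."""
    cues, clock = [], 0.0
    for story in select_top_stories(stories):
        # Estimate how long the synthetic voice takes to read the script.
        duration = max(1.0, len(story.script.split()) / WORDS_PER_SECOND)
        cues.append((round(clock, 2), round(clock + duration, 2), story.script))
        clock += duration
    return cues
```

The sketch only shows the logical shape of the workflow (story selection, timing, subtitling); in practice these steps are handled inside the video-editing tool itself.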
This AI-driven innovation at CITE should be located within the broader digital transformations in newsrooms.
Digital transformations in newsrooms
Digital disruptions are shaping newsmaking practices in African contexts (Bosch, 2010; Mabweazara, 2015; Makwambeni et al., 2023; Mare, 2014). The day-to-day journalistic cultures and routines are being shifted and upended due to these digital changes. African journalists are appropriating and utilising social media applications to generate story ideas, engage with audiences and break news events (Mare, 2014; Sibanda and Ndlovu, 2023; Verweij and Van Noort, 2014). In the Zimbabwean context, these digital technologies are being harnessed and domesticated in ways that are advancing and changing newsroom practices and cultures (Mpofu et al., 2023; Sibanda and Ndlovu, 2023; Tshuma et al., 2022). Digital innovations such as citizen journalism and readers’ comments are continuously shaping the questions of journalistic identity, user-engagement and newsroom practices. The most popular social networking site in Zimbabwe is WhatsApp, followed by Facebook (Mare, 2018). WhatsApp contributes ‘over 44 percent of all mobile internet usage in the country’ (Media Institute of Southern Africa [MISA], 2020: 19). Most of the users in Zimbabwe rely on mobile phones to connect to the Internet.
In Zimbabwe’s political landscape, digital platforms serve as alternative voices or counter-hegemonic sites (Matsilele and Ruhanya, 2020; Mpofu et al., 2023; Tshuma et al., 2022). Digital tools such as news websites (Moyo, 2007), Twitter (Tshuma et al., 2023) and WhatsApp (Ndzinisa et al., 2021) are being utilised by users to counter government propaganda. Whilst these new digital tools are contributing to democratic cultures, there are notable challenges regarding the appropriation of these tools and these include a declining rate of Internet access, use and freedom (MISA, 2020). The inhibiting factors are the mobile data costs and connectivity fees (MISA, 2020). Further, digital authoritarianism in the form of state-ordered Internet shutdowns also undermines the use of these online platforms for democratic engagement (Mare, 2020). This study focusses on audience perceptions of AI-driven news anchors.
AI tools are rapidly transforming the journalism practice (Heiselberg et al., 2022; Jamil, 2021; Moran and Shaikh, 2022; Sun et al., 2022). Terms such as ‘automation’, ‘automated journalism’, ‘algorithmic journalism’, ‘robotic journalism’ and ‘computational journalism’ tend to be used to describe the use of algorithms to complete human-like tasks (Jamil, 2021; Kothari and Cruikshank, 2022). AI is about imitating human aspects of intelligence (Jamil, 2021). In the context of journalism, AI denotes a ‘series of algorithmic processes which produce and disseminate text and images (including videos) for public consumption usually with little human oversight’ (Moran and Shaikh, 2022: 1756). These AI tools in newsrooms include machine learning, moderation, automated content creation and speech-to-text programmes (Kothari and Cruikshank, 2022). Some newsrooms are using AI applications to analyse data from different sources, produce news, generate leads, monitor breaking news, create data visualisations, fact-check and engage users (Jamil, 2021; Kothari and Cruikshank, 2022; Moyo et al., 2019; Stray, 2019). AI tools are also being utilised in investigative journalism (Stray, 2019). Tools such as Heliograf, News Tracer or CrowdTangle are being used by news organisations to track breaking news and viral stories (Kothari and Cruikshank, 2022). As such, journalistic routines and work are being transformed within newsrooms.
AI tools are also being used by journalists to transcribe audio and video interviews. Chinese newsrooms such as Xinhua are utilising AI to produce news (Kothari and Cruikshank, 2022). Work bots or communicative robots are automatically generating journalistic content (Hepp, 2020: 1415). In 2018, this Chinese news agency developed the first AI news anchor which ‘appears as a man, reads text naturally and learns from videos’ (Kothari and Cruikshank, 2022: 19). In some instances, the human role of a communicator is being altered (Jamil, 2021). Discourses on AI are also anchored on a notion that journalists may not remain the ‘authority for creating and disseminating news’ (Jamil, 2021: 7). Of course, proponents of AI in newsrooms contend that these tools can help to free up journalists to partake in more ‘serious work’ (Moran and Shaikh, 2022: 1765) and ‘in-depth reporting’ (Kothari and Cruikshank, 2022). Thus, AI is pitched as a ‘liberator’ rather than a threat to professional journalists (Moran and Shaikh, 2022: 1765). However, some studies show that there is a concern that AI tools will render the jobs of professional journalists obsolete (Jamil, 2021; Moran and Shaikh, 2022). There is a belief that journalists may be substituted by machines (Moran and Shaikh, 2022; Jamil, 2021). Journalists argue that AI-generated text is poor in quality, and not ‘real’ journalism (Moran and Shaikh, 2022: 1768).
AI-generated newsreaders are gaining popularity across the globe. The first country to unveil an AI news anchor was China, in 2018, through the state-run Xinhua news agency. Other AI news anchors include Sana and Lisa (India), Snezhana Tumanova (Russia), Fedha (Kuwait) and Nadira (Indonesia). Not much is known about AI-led transformations in low-income countries such as Pakistan (Jamil, 2021). A study on AI in Chinese newsrooms indicates that the public’s emotions and perceptions towards AI news anchors were positive (Sun et al., 2022).
The progress in AI integration in African newsrooms has been slow. In other words, most African newsrooms are not yet fully utilising AI (Kothari and Cruikshank, 2022). Whilst AI tools have the potential to promote public interest media in Africa, their adoption by news organisations in the continent is ‘lacking’ (Kothari and Cruikshank, 2022: 24) and remains relatively low. Ogola (2023: 5) adds that: The big well-resourced media have invested in several premium AI systems and are also developing several custom-built AI tools. However, most of the smaller media organisations have either not adopted AI into their newsroom processes, or where they have done so, they rely largely on open-source tools.
Of course, there are some news outlets which have adopted these digital innovations for news gathering, processing, distribution and audience engagement (Ogola, 2023). Despite these opportunities offered by AI in journalism, there are notable challenges. There are concerns that AI tools can be abused to spread disinformation (Hajli et al., 2022). Further, AI represents the cultural norms and biases of the programmers: When such a system is applied in a different country or culture, the values encoded in the AI tools do not automatically adjust to new realities. African countries, which already rely on technology developed in either the Global North or China, will either face challenges in trying to localize the assumptions coded into the AI systems or will end up reinforcing similar biases. (Kothari and Cruikshank, 2022: 21).
Given that the algorithms are developed in Western contexts, there are concerns about their adoption in African newsrooms (Kothari and Cruikshank, 2022). The utilisation of natural language processing technology can also pose challenges. There is a need for scholarly reflections on how AI tools can be decolonised and localised (Kothari and Cruikshank, 2022). This study explores how audiences are responding to the use of an AI-driven news anchor in Zimbabwe.
The scarcity of skilled personnel and newsroom resources are some of the challenges of incorporating AI in journalism. Most journalism schools in Africa lack resources to equip students with skills on utilising AI tools in newsrooms. There are also concerns that the adoption of AI will have a negative bearing on the ‘journalist’s place in the newsroom’ (Kothari and Cruikshank, 2022: 21). Other scholars are concerned with issues of authorship and responsibility for mistakes (Kothari and Cruikshank, 2022). Cognisant of these challenges, this study examines the perspectives of audiences towards CITE’s AI presenter.
Theoretical reflections
This study is informed by scholarship on the need to decolonise, de-Westernise and Africanise media and communication studies (Mano and Milton, 2021a; Moyo and Mutsvairo, 2018; Rønning and Kupe, 2000). It specifically draws upon Afrokology as a theoretical premise to understand the audience perceptions of Alice in Zimbabwe (Mano and Milton, 2021a, 2021b). Digital technologies should not reinforce global inequalities, but instead should reflect local imaginations and aspirations. Afrokology is an intellectual inquiry that calls for the theorisation of communication and the media from African experiences and epistemes (Milton and Mano, 2021). It advocates the centring of African epistemologies and ontologies that have been silenced and marginalised (Mano and Milton, 2021a). Afrokology believes in the reciprocity, interconnectedness, pluriversality and conviviality of knowledges (Mano and Milton, 2021b). It is thus premised on the idea that hegemonic Global North epistemes are incomplete (Milton and Mano, 2021). Thus, Afrokology advances the idea of a ‘pluriverse of knowledges’ (Mano and Milton, 2021a: 2) as it disrupts and challenges epistemological hierarchies and asymmetries. New digital innovations tend to reinforce Western cultural biases. As such, it is important to understand the adoption and use of AI tools from a particular African culture, place and history. The appropriation and use of digital tools such as AI should be able to reflect the voices, experiences and knowledges of Africans. Various scholars reflect on the ways of decolonising AI by engaging issues of ‘data colonialism’ and ‘digital colonialism’ (Adams, 2021; Couldry and Mejias, 2023; Mumford, 2022). They critique the ways in which AI tools reinforce questions of race and coloniality in relation to Western hegemonic epistemes (Adams, 2021). Ricaurte (2022: 726) adds that hegemonic AI perpetuates epistemic violence.
There are conceptual debates on the nexus of human agency, culture and technological transformations (Gaw, 2022; Natale and Guzman, 2022). Whilst other studies consider the intersection of humans and machines from the perspectives of ‘algorithmic culture’ (Natale and Guzman, 2022) and ‘algorithmic logics’ (Gaw, 2022), this research is located within cultural studies. Algorithmic culture denotes that it is not only humans who are producing culture, but also AI-driven applications (Natale and Guzman, 2022). Gaw’s (2022) work utilised the notion of ‘algorithmic taste’ to explore the processes and mechanisms that govern the construction of cultural taste through Netflix’s recommender system. This study is situated within media and cultural studies (Natale and Guzman, 2022) as it seeks to make sense of how culture is constructed by humans and machines. The cultural studies tradition is about meaning-making, as it focusses on how people create meanings in order to make sense of their everyday lives (Strelitz, 2000). Scholars argue that the cultural studies tradition provides an important vantage point for exploring issues of communication and culture in AI technologies (Natale and Guzman, 2022). Further, cultural studies theorists view culture as a site of ideological struggles where asymmetrical power relations are sustained and challenged (Dahlgren, 1997). In this regard, AI technologies are construed as tools that can serve to reinforce biases and social inequalities (Natale and Guzman, 2022). Gaw (2022: 708) uses the notion of ‘algorithmic power’ to demonstrate that algorithms not only ‘ascribe meaningfulness’ to cultures, but also ‘enact power’.
Methodology
This qualitative study utilised digital ethnography, in-depth interviews and focus group discussions to gain insights into audiences’ perceptions of ‘Alice’. The interviews were conducted in both online and offline spaces. Digital ethnography enabled the researcher to observe and participate in online discussions, and to document the findings. Purposive sampling was utilised to deliberately select participants for the study. The participants included CITE’s audiences who are members of the news outlet’s WhatsApp groups. As a researcher, I posed questions to members of the WhatsApp groups and then recorded their responses. Besides collecting data from CITE’s WhatsApp groups, the researcher also gathered data from other WhatsApp platforms where the topic of Alice was trending. Thirty individual face-to-face in-depth interviews were also held with members of these WhatsApp platforms. Digital ethnography was also extended to Facebook and Twitter (X) as a method of making sense of audience perceptions of the AI news presenter.
On Twitter (X) and Facebook, the researcher analysed user comments on content pertaining to Alice. The data were collected from the social media pages of CITE, Alice and Zenzele Ndebele, covering the period May 2023 to November 2023. I selected social media content (tweets and Facebook posts) which attracted a high number of responses from audiences. Data were also collected from Twitter (X) and Facebook posts by Zenzele Ndebele, the Director of CITE, related to Alice. Zenzele Ndebele is actively involved on social media and has 150,000 followers on Twitter (X). Over 200 tweets and Facebook comments were analysed in order to understand audiences’ perceptions of Alice. The aim of the study was to generate deeper insights into audiences’ reception of Alice. The news anchor has a Twitter (X) handle (@AiCITEzw) which currently has over 576 followers. Users comment on the AI-powered news that is shared on Alice’s Twitter (X) handle and on CITE’s main Twitter (X) account. The AI-generated news is also shared on CITE’s Instagram (over 4000 followers), TikTok (over 9600 followers) and Facebook (113,000 followers) accounts. Data from social media posts and responses of participants were analysed using qualitative thematic analysis. The following section analyses audiences’ perceptions of CITE’s introduction of Alice to deliver news and engage with audiences; in other words, it shows what audiences think of this AI-powered news presenter.
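The thematic analysis in this study was conducted manually, as is standard in qualitative research. Purely as an illustration of how a researcher might organise a corpus of over 200 comments before manual coding, the sketch below pre-groups comments under candidate themes via keyword matching; the theme labels and keyword lists are assumptions introduced for demonstration, not the study’s actual coding frame.

```python
# Illustrative only: this sketch pre-groups social media comments by
# recurring keywords to support (not replace) manual qualitative coding.
# The theme labels and keywords are hypothetical examples.

THEME_KEYWORDS = {
    "human touch": ["emotion", "human", "real person", "fake"],
    "job threat": ["job", "unemployment", "replace"],
    "accent/pronunciation": ["pronounce", "accent", "mispronun"],
}

def pre_group(comments):
    """Assign each comment to every candidate theme whose keywords it mentions."""
    grouped = {theme: [] for theme in THEME_KEYWORDS}
    for comment in comments:
        lowered = comment.lower()
        for theme, keywords in THEME_KEYWORDS.items():
            if any(keyword in lowered for keyword in keywords):
                grouped[theme].append(comment)
    return grouped
```

Such keyword grouping can only surface candidate clusters; the interpretive work of defining and refining themes remains a manual, context-sensitive task.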
Audience perceptions of CITE’s Alice
Findings suggest that there are mixed feelings regarding Alice and automated news reading in general. On the one hand, some audiences had positive remarks as they believe that Alice offers new and innovative ways of storytelling. On the other hand, some participants had negative responses as they are sceptical about Alice and AI-driven news presenters in general. The following sub-sections discuss the diverse audience perceptions of AI-driven news presenters.
‘Oblivious’: audiences who thought Alice was human
There are participants who strongly believed that Alice was a ‘real’ human. They assumed that the AI-powered news was being presented by a human being, instead of a machine. Machines and humans have become intertwined to an extent that some participants are failing to differentiate the two. Such a perspective reinforces Heiselberg et al.’s (2022) understanding that there are ‘oblivious’ audiences who believe that AI-driven news presenters are human beings. Audience reactions on social media platforms (Twitter/X, Facebook and Instagram) reflected the misconception that this newsbot was a ‘real’ human. As a result, some of these users reprimanded Alice for the mispronunciation of local names in ways that seemed as if they were addressing a ‘real’ human. This led to some of the users ridiculing people who did not know that Alice was an artificial intelligence presenter: Some people commenting on this tweet missed the aspect that Alice is an artificial intelligence presenter, not a real person. Why are they fighting with a robot? (@CITEZW, 6 September 2023).
Other Twitter/X users were asking if Alice was married, which shows that they were regarding her as a ‘real’ human (@CITEZW, 6 September 2023). In essence, some online users focussed on Alice’s physical attributes, such as hairstyles, which undermines the prospects of adopting AI news anchors for news delivery and audience engagement.
‘The future is here’: Optimism surrounding Alice
The way some audiences are acclaiming Alice is evidence of the positive reception of the AI news presenter. Some participants celebrated that the ‘future is here’ as they urged newsrooms to embrace AI tools in ways that address contextual needs (Interview, 26 July 2023). Instead of dismissing Alice, some participants believe that Zimbabwean newsrooms should create opportunities around AI tools in order to enhance journalistic practices. In this regard, online users extolled CITE for the ‘brilliant work’ of ‘moving with technology’ (Interview, 28 July 2023). They applauded CITE for the good initiative and creativity in leveraging AI tools to enhance news distribution. In essence, the participants believe that AI is ‘the future’ and applaud CITE for ‘moving with the times’ (Interview, 5 August 2023). Of course, some participants noted the shortcomings of AI-driven news presenters such as the lack of ‘human touch’.
AI news anchors lack a human touch
Some audiences were worried that Alice lacks the ‘human touch’ which makes it difficult for them to connect with the news content and also with the news presenter (Interview, 21 August 2023). Whilst AI anchors mimic humans, the participants argue that these robots are not human enough. Thus, the concern is that Alice lacks a human element. One of the participants said that the ‘problem with a robot presenting news is that it lacks a human interaction’ (Interview, 26 July 2023). Other participants added that newsreaders should have ‘human expressions’, which Alice supposedly lacks (Interview, 26 July 2023). In other words, the participants argue that AI newsreaders lack a ‘human command’ (Interview, 26 July 2023). Heiselberg et al. (2022) assert that robots that are ‘more humanlike’ generate positive attitudes from audiences. This is evident in this study as some participants question the authenticity of Alice due to perceived lack of ‘human touch’: Alice is not genuine. She is not real. She is fake (Interview, 26 July 2023).
Some participants were adamant that they prefer ‘real’ people instead of AI news presenters. Whilst acknowledging that Alice is ‘quite an innovation’, another participant stated that he prefers to ‘watch a real person’ presenting news (Interview, 26 July 2023). Others believe that despite mimicking human-like speech patterns and intonations, Alice is failing to provide these human expressions. Such comments reinforce the belief that AI newsreaders do not provide a ‘human touch’ (McCoy, 2023). Some scholars affirm that AI news anchors ‘will never have the gravitas, the engagement or the raised eyebrow of reporters’ (McCoy, 2023). Closely related to the issue of ‘human touch’ is the concern of voice emotionality.
Discourses on voice emotionality and credibility
Concerns over voice emotionality and credibility demonstrate participants’ understanding of the nexus between humans and machines. The dominant view is that machines are falling short of conducting humanlike tasks as they lack certain attributes. Some participants focussed on what Heiselberg et al. (2022) term ‘voice emotionality’ to judge the credibility of Alice. In other words, CITE’s audiences were concerned with the appropriateness of Alice’s voice. This confirms Heiselberg et al.’s (2022) findings that audiences in Denmark found the voice of an AI news presenter ‘unnatural’ and ‘alienating’. Heiselberg et al. (2022) assert that audiences tend to express disappointment when appropriate feelings are not reflected in the voice of an AI-driven news presenter. One of the participants expressed disappointment with Alice: News reading is art, and at some point you ought to instil some emotions just by how you present a story. That thing is plain, no emotions whatsoever (Interview, 27 July 2023).
The main issue is that audiences feel that Alice does not exude emotions and hence fails to connect with them.
Issues of authenticity and believability determine audiences’ assessment of the appropriateness of AI-driven news presenters (Heiselberg et al., 2022). Participants assert that Alice ‘does not give the listener the assurance that it is real news’ (Interview, 26 July 2023). Thus, some participants do not view Alice as a credible news source: News is about trustworthiness. Audiences have to believe both the news organisation and the news presenter first before believing the news themselves (Interview, 27 July 2023).
The challenge, as noted by participants, is that Alice is not ‘genuine’ and ‘real’. Other perceptions were related to the implications of AI tools automating the tasks traditionally assigned to journalists. As such, AI tools such as Alice are helping to reimagine the role and place of journalists in the newsroom.
Alice and the politics of accent
The reception of Alice is being shaped by the socio-political and cultural environment in the Matabeleland region where CITE’s offices are located. Some of the criticisms of Alice should be understood within the political and cultural settings of Matabeleland. To understand the concern with Alice’s mispronunciation of local names, it is important to note the politics of language in the region. There is a concern about language marginalisation in Zimbabwe, where local languages such as IsiNdebele are annihilated and denigrated through misspellings (Ndhlovu, 2007; Nxumalo, 2022). Alice’s mispronunciation evokes memories of the perceived and real linguistic and cultural subjugation of minority groups in Zimbabwe.
Whilst AI tools have imitating capabilities, they lack cultural sensitivity, as evident in Alice’s failure to pronounce specific local terms. This shortcoming has triggered expressions of annoyance and disappointment from some audiences who perceive Alice as undermining local cultures. As such, these AI tools reinforce what Mano and Milton (2021a) consider as epistemological hierarchies and Western cultural biases. Some of the reactions to Alice were related to how the AI news presenter was failing to pronounce local names and surnames. AI tools should not only imitate human mannerisms but also local voices. In essence, the lack of contextual awareness in the programming of Alice illustrates the global inequalities in data flows (Couldry and Mejias, 2023; Ricaurte, 2022). User comments on social media pages were replete with concerns about the mispronunciation of local names/surnames. The comments below from the @CITEZW Twitter (X) page show the concern that Alice pronounces the surname ‘Ncube’ as ‘Cubes’:
Is this lady sure she can’t pronounce the surname Ncube properly? Or is it just drama?
Lawe wena yini leyo Cubes ungasijwayeli kabi why ungabizi ibizo lakhe kuphela ma isibongo sikwehlula? (what is this Cubes [instead of Ncube]? Respect us. Why don’t you just mention the first name if you can’t pronounce the surname).
Pronunciation is a bit awkward.
Alice needs to brush up her Ndebele pronunciation.
As already noted, some of the participants were oblivious that Alice was a robot. They responded by attacking Alice for the mispronunciation of local names. Another user was adamant that ‘people should stand up together in unity and defend their language’ (@CITEZW). Other mispronunciations by Alice include ‘Bhuluweyo’ instead of ‘Bulawayo’. Such concerns about the knowledges of the Other being made ‘invisible’ indicate that ‘hegemonic data culture’ promotes global hierarchies and epistemic violence (Ricaurte, 2022: 730).
One of the participants raised a concern that AI tools are made from a Global North context: The tone and pronunciation become a disconnect from how we as Zimbabweans are used to say certain words (Interview, 28 July 2023).
This affirms how ‘algorithmic governmentality’ is widening the Global North-South systematic asymmetries (Ricaurte, 2022: 727). The processes of datafication and ‘algorithmisation of culture and society’ constitute epistemic violence (Ricaurte, 2022: 730). Other participants queried the naming of Alice and requested that the AI news anchor be given a local name such as ‘Nolwazi’ and ‘Sakhile’ (@zenzele, 6 September 2023). One of the online users commented on Zenzele Ndebele’s tweet: Can you make her more native? Native name and short hair (@zenzele, 6 September 2023)
The idea of making Alice ‘more native’ reflects a call to indigenise, localise and Africanise the AI news anchor. There is an assumption that Alice is ‘Western’ and hence perpetuates Western hegemonic epistemes. The suggested Ndebele names for the AI tool not only reflect a disconnect between the news anchor and local audiences, but also a call to decolonise the digital innovation. The adoption and utilisation of AI tools in newsrooms should take into account the local and cultural contexts. The concern is that AI tools are a Western phenomenon and hence do not resonate with the lived experiences of indigenous communities. AI tools such as Alice should not reproduce and perpetuate coloniality, or asymmetrical global power matrices, but instead reflect local situations. Such concerns affirm and reinforce the calls to decolonise AI tools (Adams, 2021; Couldry and Mejias, 2023; Mumford, 2022). An unintended consequence is that audiences end up focussing on Alice’s failure to pronounce local names, instead of the news content being presented by the AI tool. Such concerns undermine the credibility of the AI news presenter.
‘One job gone’: Is Alice a threat to journalists in the news sector?
There are debates on whether AI news presenters like Alice are freeing journalists from routine tasks or threatening their jobs. Sceptics assert that these robots are replacing journalists in the workplace. Such questions underpin some of the audience perceptions of Alice. On the one hand, some participants felt that AI news presenters like Alice will have implications for ‘imisebenzi yethu’ (our jobs) (Facebook.com/CITEZW, 6 September 2023). There is a belief that Alice is taking away jobs and perpetuating the economic exploitation of traditional journalists. On the other hand, participants noted that Alice makes economic sense as newsrooms are ‘hard-pressed by uncertain financial times’ (Interview, 3 August 2023). The participant added: ‘if CITE wanted, Alice can work even at 2am and she won’t complain. You won’t get that from us (humans)’ (Interview, 3 August 2023). Some audiences are concerned that, in a country with a high unemployment rate, news organisations should not be using AI to present news: ‘We don’t have jobs and you are giving them to AI’ (@zenzele, 6 September 2023); ‘Alice is taking away a job or five from our people’ (@zenzele, 6 September 2023); ‘One job lost’ (@zenzele, 6 September 2023).
The intersection of machines (AI) and humans (journalists) is at the centre of these debates. Some participants assert that human agency is being undermined by AI technologies. In essence, there is some resistance to Alice due to the perception that the AI news anchor is taking ‘our jobs’. Such comments affirm findings from scholarly works suggesting that AI technologies may take the jobs of traditional journalists (Jamil, 2021; Moran and Shaikh, 2022). However, other participants argue that traditional journalists can co-exist with AI news anchors. Certain tasks, such as going into the field to gather news, still have to be conducted by traditional journalists. Thus, AI news presenters complement the work of traditional journalists by curating information and distributing news more efficiently. Some audiences did not reject Alice, believing instead that AI must ‘complement our jobs/efforts’. They called for a synergy between traditional journalists and AI tools in order to produce and distribute relevant news. The belief, then, is that AI tools should augment the work of humans rather than replace them: ‘It is better for humans to use machines rather than have machines render humans as dispensable’ (Interview, 27 July 2023).
There is no doubt that AI technologies are reformulating labour relations (Hepp, 2020; Ricaurte, 2022). Hepp (2020: 1415) adds that work bots ‘operate as something more akin to colleagues: companions in content production that reveal transformations to the journalistic work process’. Moran and Shaikh (2022) posit that AI technologies are freeing up journalists to conduct other tasks.
Given the concerns that Zimbabwean newsrooms are under-staffed and journalists are over-worked, other participants argue that AI tools can promote financial sustainability. However, one of the participants raised a concern that newsrooms should not be exploitative but should commit to paying journalists fair wages. The fear among some participants is that traditional journalists will soon be out of work due to the ‘takeover’ of machines like Alice. Participants are concerned that newsrooms may begin to replace human workers with AI machines. In the midst of these technological advancements, participants argue that journalists should always remain central to journalistic processes that incorporate AI tools such as Alice. They state that AI tools should assist with news gathering, analysing frequencies and generating statistics, and not serve as news anchors.
‘Kumele umuntu abale I news azincediswa ngu Alice ukubhala’ (a human must present the news; Alice must assist humans to write the news) (Interview, 3 August 2023). ‘Well they say we should adapt to change but at what cost? I think to put Alice in front is wrong, you can use AI for other things like editing and other backroom stuff but in front is like rubbing in; in a country with such high unemployment’ (Interview, 3 August 2023).
Conclusion
As CITE embraces an AI news anchor, there are pertinent questions about what it means for the future of journalism in the country. Whilst some participants are celebrating this digital innovation, other audiences are raising concerns about Alice’s accent, her lack of human touch and voice emotionality, and the threat to jobs. Alice denotes an attempt to merge human and machine cultures. Evidently, there are mixed feelings about this nexus between humans and machines. On the one hand, audiences applaud CITE for an innovative and creative way of storytelling. On the other hand, there are concerns about poor pronunciation, the lack of human touch and emotion, the perceived credibility of AI news, and the implications for the news sector. Given these technological advancements, debates on the nexus of machines and humans in the newsroom will persist. CITE’s management is adamant that Alice will not replace traditional journalists, but rather augment journalistic work. However, concerns have been raised about the appropriation of these digital innovations in African contexts. Questions about the naming and accent of Alice reflect concerns about the social inequalities surrounding AI (Gaw, 2022; Natale and Guzman, 2022), and the need to decolonise and localise AI technologies (Kothari and Cruikshank, 2022).
