Abstract
Questions of data privacy in Africa are imbued with complexity. We examine a slice of this complexity by putting the concept of contextual integrity into dialogue with Africa's plural-legal contexts to explore data privacy within Africa's emerging digital landscape. The conceptual insights are empirically illustrated based on a case study in Ghana, involving content analysis of policy documents and interviews with a sample of residents, cultural leaders (among the Akan ethnic group), subject matter experts, and digital entrepreneurs (e.g. fintech firms). Our findings illuminate a nuanced and contextually rooted understanding of privacy, focusing on the complementarities and tensions around data anonymization for privacy; the multiplicity of information spheres that result in a complicated terrain of privacy breaches; and the individuality, mutuality, and collectivity of privacy harms and remedies thereof. Our findings challenge prevailing discourses in the literature that suggest either (a) misalignments between locally-rooted privacy norms and European-imported statutory regulations on privacy in African countries or (b) privacy myopia among Africans—an underestimation of the value of their personal information and of the need to protect it. We suggest shifting away from such privacy discourses from above (meta-narratives, generalizations, and inherent assumptions about African states and societies) to conversations from below. That is, refocusing on the particularities of place, culture, and norms that complicate meta-narratives on privacy. This requires centering the everyday privacy norms that result from co-existing, and sometimes oppositional, customary-informed and state-stipulated information norms within Africa's emerging digital landscapes.
Introduction
Questions of data privacy in Africa are imbued with complexity. The nexus of digitalization and data privacy remains a contested terrain, and Africa is no stranger to the unfolding privacy concerns within the data-driven digital age. This is especially the case in the emerging frontiers of digital capitalism in Africa, where digital data subjects are increasingly made visible, knowable, and predictable through the widening panopticon lenses of the digital world (Dalberto et al., 2018; Frimpong Boamah and Murshid, 2019; Srinivasan et al., 2019). It is of little surprise that the often celebratory tone around data-driven digital solutions is met with critical questions around privacy in areas such as public service delivery (Gamage, 2016; Mayer-Schönberger and Cukier, 2013), healthcare (Dash et al., 2019; Galetsi et al., 2020), precision decision-making in agricultural and food systems (Biermann et al., 2021; Carolan, 2017), and smart and connected urban environments (Al Nuaimi et al., 2015; Löfgren and Webster, 2020; Söderström et al., 2020). As emerging digital data solutions are embedded in socio-institutional contexts, the meanings and practices around data privacy become complicated and often contested.
Approaches to understanding data privacy's complexity range from conceptual to empirical. On the one hand, there is a purely conceptual departure point, which risks oversimplifying human behavior for theory's sake. Take, for instance, arguments that, with “weak empirical evidence” (Makulilo, 2016: 193), claim that there is a privacy myopia among Africans, who supposedly underestimate the value of their personal information and the need to protect it (see Bakibinga, 2004; Makulilo, 2016). On the other hand, there is a purely empirical approach, focusing on accounts of lived reality without conceptual guidance. Between these theoretical and empirical poles lies a contextual approach, which we take as the departure point in our analysis. We need contextualization rather than simplification to understand the lived reality of privacy. The following is an example of a documented privacy concern in using fintech or mobile money: “If A suspects that a partner B is cheating, and A finds a number that B calls regularly, A sends money to the credit of that number and finds who subscribes to that number. However in order for A to remain unknown to the person calling B regularly, A sends either an amount that falls below the allowable minimum credit or sometimes an amount that exceed his credit balance. In either case a report is generated even if the transfer fails. This report normally discloses the name of the third part calling B and his phone number” (Makulilo, 2015: 376).
This article contributes to these ongoing data privacy conversations, particularly around the contextual integrity (CI) of privacy in emerging digital-data solutions. Specifically, it engages emerging calls for context-informed conversations around data privacy (Nissenbaum, 2004, 2010) by probing: how do Africa's socio-institutional contexts illuminate the challenges, prospects, and lessons for data privacy within emerging data-driven digital spaces? For instance, some have called for context-sensitive criteria for data processing and re-purposing as a possible data “governance fix” to the flexible regimes afforded to health data processing, especially for Big Tech platforms, in the European Union's General Data Protection Regulation (GDPR; Regulation 2016/679) (Marelli et al., 2021: 8). In a cautionary note about possible privacy infringements due to the large-scale deployment of technology-enabled COVID-19 datafication and verification measures in Africa, Beduschi (2022: 4) reminds us that “[…] technologies do not evolve in a legal vacuum, the existing laws and regulations must be respected”.
In this article, we conceptually and empirically explore CI within Africa's plural-legal contexts to map the tensions, complementarities, ambiguities, and lessons around data privacy resulting from co-existing and interacting customary-informed and state-stipulated information and distribution norms. We pursue this goal in the following sections. The next section maps the conceptual landscape by putting CI and privacy into dialogue with the literature on plural-legal contexts in Africa, distilling the key debates and gaps in the literature on data privacy and regulatory frameworks designed to ensure such privacy. We illustrate this conceptual landscape with empirical moments based on a case study in Ghana involving content analysis of policies/regulations on privacy and privacy-related matters, as well as interviews with a sample of residents, digital platform entrepreneurs, and cultural leaders in Ghana. The empirical moments speak to how the co-existence of customary-informed and state-stipulated information norms around privacy illuminates three key issues: the tensions and complementarities around data encryption and anonymity, the multiplicity of information spheres that result in a complicated terrain of privacy breaches, and the individuality, mutuality, and collectivity of privacy harms and remedies thereof. We conclude by reflecting on the implications of these three findings for data privacy discourses in Africa.
What's in a context? Privacy within Africa's plural-legal contexts
Privacy is variously defined but its contextual nature has often been misinterpreted in Africa. Generally, privacy is most commonly associated with notions of “information control,” “non-interference,” “limited accessibility,” and intimacy (Makulilo, 2016: 195). In his seminal work on information privacy in the U.S. context, Westin (1968: 5), for instance, defines privacy as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.” Despite its definitional variation and manifold applications, in the decades following Westin's (1968) interpretation of the term, privacy's scholarly analysis has often remained a Eurocentric endeavor. Arora (2019: 3), for example, has called for a decolonization of privacy towards a definition of the concept that does not anchor itself in the universalization of Western cultural assumptions and the privacy attitudes of the “Western-based, white, middle-class demographics.” Similarly, Makulilo (2016: 193) has highlighted how the conceptualization of privacy as fundamentally linked to individuality has led to the argument that Africa exhibits a “privacy myopia.” According to this view, African societies’ focus on collectivity and community circumvents the meaningful formulation and prioritization of privacy. In contrast to this narrative, Makulilo (2016) argues that in today's Africa, there is in fact neither an absence of privacy law nor a stagnant attitude towards privacy. That is to say, privacy in Africa, but also more generally, is constantly evolving. Some may also argue that the purported privacy myopia in Africa is in fact shared worldwide, including in the very countries whose regulatory ideas have dominated the sphere.
That notwithstanding, the particularities of how such myopia (or not) would play out in other contexts present empirical questions of importance, which require investigation in various socio-legal contexts to avoid the flattening of specific local privacy expectations.1 What follows is the need for context-informed conversations, which bring to light the nuances of data privacy in the African context.
Helen Nissenbaum's theory of CI presents a relational view of privacy, technology, and society. It is grounded in rejecting a series of dichotomies in how privacy is discussed and enacted in research and policy: access to information by government vs. non-government entities, sensitive/confidential vs. non-sensitive/non-confidential information, and public vs. private spaces/spheres (Nissenbaum, 2004, 2010). Deploying these dichotomies universally, with minimum attention to the particularities of contexts, erases nuanced conversations around how people's interactions with each other, technologies, and institutional environments structure their privacy preferences and decisions (Nissenbaum, 2010; Vitak and Zimmer, 2020). Here, the theory of CI challenges previous theories of privacy as information control (Fried, 1990; Moore, 2003; Westin, 1968) or restricted access to sensitive information (Allen, 1988; Bok, 1989; Gavison, 1980; Wacks, 1989) by connecting issues of control and restricted access to the context norms that guide behavior and decisions: “The framework of contextual integrity reveals why we do not need to choose between them [control and restricted access]; instead, it recognizes a place for each. The idea that privacy implies a limitation of access by others overlaps, generally, with the idea of an informational norm […] Control, too, remains important in the framework as one of the transmission principles” (Nissenbaum, 2010: 147–148).
CI is violated when norms of appropriateness, norms of distribution, or both are transgressed within a context (Nissenbaum, 2004). That is, what constitutes a data privacy violation or compliance is deeply context-defined, thus rendering universalisms in privacy discourses unproductive. For instance, some argue that one useful element of CI is spotlighting the context in which personal information flows to determine whether normative privacy protections are needed (Grodzinsky and Tavani, 2005), which helps to avoid simply applying universal normative protection clauses in every instance of data privacy. For some, balancing between what is context-sensitive and what may be seen as universal normative privacy protections illustrates the tensions in the EU GDPR, especially when considered within the growing world of techno-data solutions (Politou et al., 2018; Wachter, 2018; Wachter et al., 2017; Zarsky, 2016). Another important aspect of CI theory, detailed in Nissenbaum's 2010 book, is its assessment of the dynamic nature of norms by looking at the moral value of practices. CI has an inherent conservative bias, since it prioritizes existing informational norms in assessing whether contextual norms are upheld (159). One way to more appropriately evaluate practices that counter pre-existing informational norms is to probe how these practices contribute to the “values, ends, purposes, or goals” (166) of the respective social domains.
Probing privacy through CI also illuminates the plural-legal contexts of societies, especially in Africa. Legal pluralism refers to the co-existence of multiple, mutually constitutive legal forms or normative orderings in the same contexts or social fields (Merry, 1988; Tamanaha, 2008, 2011). For some, the social field is integrally plural, wherein higher-order normative or legal orderings, such as statutory laws, evolve and are mutually reconstituted through their interactions with other normative orderings operating within the field (Fitzpatrick, 1983; Frimpong Boamah and Walker, 2016). In Moore's (1973) theorizing of social fields as semi-autonomous, we are reminded that the plurality of rules or normative orderings is the norm, not the exception, of all societies because a social field, “[…] has rule-making capacities, and the means to induce or coerce compliance; but it is simultaneously set in a larger social matrix which can, and does, affect and invade it, sometimes at the invitation of persons inside it, sometimes at its own instance” (Moore, 1973: 720).
Although legal pluralism is by no means a uniquely African phenomenon, it plays a significant role in various African contexts, where pre-colonial, colonial, and post-colonial legal and quasi-legal systems co-exist and co-develop alongside the newfound influence of EU standards on the continent. The focus on “Africa” is thus not meant to imply a flattening of the multiplicity of contexts, but rather to stress the value of a broader lens, particularly in response to claims that Africa is a particularly privacy-myopic place.
For CI within Africa's plural-legal contexts, the issue is not only about the degree or extent to which EU privacy laws may be influencing (positively or negatively) statutory laws on privacy in Africa, or what some have discussed within the context of digital colonialism (Coleman, 2018). The issue is also about the inadvertent disregard of, or glossing over, the historically contingent processes through which various information norms of appropriateness and distribution have existed and evolved within weak or deep plural-legal contexts in Africa. Hence, some conversations around data privacy in Africa tend to assume away those internally generated (and sometimes customary-informed) information norms guiding behaviors within distinct African cultures, often leading to conclusions such as Africans (a) have a weak legal culture in data privacy laws (Abdulrauf and Fombad, 2017; Alunge, 2019), and/or (b) are privacy myopic—have the tendency to underestimate the value of their personal information and the need to protect it (Bakibinga, 2004). These conclusions do not sufficiently account for the role of non-statutory normative orderings in shaping how Africans of distinct cultures perceive, define, modify, negotiate, and evolve what constitutes informational norms of appropriateness and distribution. Makulilo (2016) addresses an aspect of this privacy-myopia problematique by arguing that, absent careful empirical work within African communities, we often miss the point about the evolving privacy consciousness among Africans as their traditional, internally generated information norms confront external influences.
Thus, a more productive conversation, which we conduct in this study, is probing the tensions, ambiguities, and lessons on privacy through the interaction of customary-informed and state-stipulated information and distribution norms. Such probing shifts the conversation from above (meta-narratives, generalizations, and inherent assumptions about African states and societies) to conversations from below: the particularities of place, culture, and norms that complicate meta-narratives on privacy in Africa. In doing so, we contribute to the literature on privacy complementarities, tensions, ambiguities, and lessons (Makulilo, 2016; Reviglio and Alunge, 2020) within Africa's semi-autonomous, plural-legal field with co-existing customary-informed and state-defined laws and practices on informational norms. Next, we use the case of the fintech sector in Ghana to map three conceptually grounded empirical moments that capture these tensions, complementarities, ambiguities, and lessons.
Three empirical moments of privacy within Ghana's plural-legal context
We situate the conceptual arguments presented above in concrete empirical examples based on a case study in Ghana. This case study offers a context-situated understanding of the everyday lived experiences and understandings of privacy among different stakeholders within Ghana's plural-legal contexts. Specifically, the case study helps to illustrate the semi-autonomous field of privacy as embodying complementarities and tensions around data anonymization, the multiplicity of information spheres that result in a complicated terrain of privacy breaches, and the individuality, mutuality, and collectivity of privacy harms and remedies thereof. We focus on the Akan ethnic group, mostly living in West African countries such as Ghana, Ivory Coast, and Togo. According to the 2021 Population and Housing Census, the Akan are the majority ethnic group in Ghana, constituting 45.5% of the population (Ghana Statistical Service, 2022: 30).
First, a brief commentary on the data and methods employed in this case study analysis. This case study is exploratory (Yin, 2009), based on an initial pilot study in Ghana, involving interviews with conveniently sampled residents in Kumasi (n = 20 residents across urban, peri-urban, and rural communities) and purposively sampled stakeholders (n = 12, including cultural leaders, subject matter experts on Ghanaian Akan philosophy, privacy, culture, and technology, and digital entrepreneurs, mainly within the fintech sector). We also relied on secondary data, including newspaper articles, reports, and relevant regulatory instruments: the 1992 Constitution of Ghana; 2012 Data Protection Act (Act 843); 2003 Ghana Information and Communications Technology for Accelerated Development (ICT4AD) Policy; 2019 Payment Systems and Services Act (Act 987); 2016 Securities Industry Act (Act 929); 2020 Cybersecurity Act (Act 1038); 2020 Anti-Money Laundering Act (Act 1044); 2008 Electronic Transactions Act (Act 772); and 2016 Banks and Specialised Deposit-Taking Institutions Act (Act 930). As a pilot exploratory study, we prioritized depth over breadth by using our interviews and policy reviews to delve deeper into topics such as (1) historical, cultural, and philosophical meanings, framings, and practices around privacy within the Ghanaian (Akan) context, (2) the mediating role of Ghana's plural orderings (statutory and customary) in structuring information norms around privacy, (3) the impacts of digital technology on meanings and practices around privacy in Ghana's plural-legal contexts, and (4) the prospects and challenges of designing data privacy laws within the rapidly evolving digital landscapes of plural-legal contexts in Ghana. This initial case study account is deliberately neutral, siding neither with residents nor with other stakeholders. We write as arm's-length observers, utilizing these multiple data sources to ground conceptual and philosophical conversations around data privacy in Ghana.
Complementarities and tensions around information norms on data anonymization for privacy
Discussing the CI of privacy within Ghana's plural-legal context requires interrogating the (mis)alignments in statutory regulations and customary-informed norms and practices around privacy. First, there are similarities in how statutory regulations and customary-informed norms and practices frame information norms on privacy. Information norms around data anonymization serve as valuable examples here. In data privacy, data anonymization involves obscuring or removing personally identifiable information (PII) to safeguard private or sensitive information of individuals in the storage, transfer, communication, and other legitimate use of the data. Three data anonymization techniques align well with historically contingent customary-informed norms and practices around data privacy in Ghana. The first is data masking, which involves de-identifying or removing PII (e.g. names, addresses, phone numbers, emails) or encrypting the data, including using cryptographic techniques to translate PII or other sensitive data into a code or ciphertext that can be accessed or read by those with a secret key or password. The second is data pseudonymization, which anonymizes data by replacing PII with pseudonyms, and the third is data swapping, which uses fictitious data to replace actual data values.
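For readers less familiar with these techniques, the three strategies can be sketched in a few lines of code. This is a minimal illustration, not an implementation from the study: the record, field names, and the pseudonym "Adwoa" are hypothetical, and SHA-256 truncation stands in for whatever masking method a real system would use.

```python
# Illustrative sketch of the three anonymization strategies discussed above.
# The record and field names are hypothetical examples, not study data.
import hashlib
import random

record = {"name": "Ama Mensah", "phone": "+233201234567", "age": 34}

def mask(rec):
    """Data masking: de-identify PII by replacing it with a one-way hash."""
    out = dict(rec)  # copy so the original record is left untouched
    for field in ("name", "phone"):
        out[field] = hashlib.sha256(out[field].encode()).hexdigest()[:12]
    return out

def pseudonymize(rec, alias):
    """Pseudonymization: replace the identifier with a pseudonym
    (cf. the Akan day name, a deliberately low-resolution identifier)."""
    out = dict(rec)
    out["name"] = alias
    return out

def swap(rec, rng=random):
    """Data swapping: replace a sensitive value with a fictitious one."""
    out = dict(rec)
    out["age"] = rng.randint(18, 80)  # fictitious value stands in for the real age
    return out

masked = mask(record)
aliased = pseudonymize(record, "Adwoa")
```

Each function returns a copy, mirroring the point that anonymization governs how data is distributed, not what the data subject privately knows.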
Data masking, pseudonymization, and swapping are coded in statutory data privacy regulations but are also exemplified in certain cultural norms in Ghana. The three data anonymization strategies are enshrined, albeit loosely or indirectly, in statutory provisions stipulated in the Data Protection Act (DPA), 2012 (Act 843) and the Cybersecurity Act, 2020 (Act 1038). For instance, the DPA states, “A data controller shall destroy or delete a record of personal data or de-identify the record at the expiry of the retention period” (DPA 24: 5). The Cybersecurity Act provides guidelines for service providers or owners of critical information (e.g. fintech organizations) to protect data, including “blocking, filtering, and taking down” data to ensure “[…] the protection of children; the public safety; […]; the protection of reputation or the rights of an individual; the prevention of the disclosure of information received in confidence […]” (Cybersecurity Act 87: 2). For some entities, such as fintech firms, the Bank of Ghana requires compliance with supranational data security standards, including International Organization for Standardization (ISO) 27001-certified information security management systems. The ISO 27001 standard specifies a cryptographic controls policy, including encryption and decryption mechanisms that convert information from plaintext to ciphertext and vice versa. One of the fintech stakeholders interviewed also noted that “complying with the ISO 27001 standards and the PCI DSS (Payment Card Industry Data Security Standard) meets the Bank of Ghana's rigorous data protection requirements in issuing fintech licenses” (Respondent 01, field interview).
These statutory-coded anonymization strategies reflect certain cultural norms on privacy in Ghana, particularly among the Akan. Customarily, data anonymization is highly privileged in communication or record keeping, partly because “It's considered a violation of privacy to mention a person's full name, especially in their absence or without their explicit permission” (Respondent 04, field interview). Instead, data pseudonymization (replacing with pseudonyms), data swapping (using fictitious data), or data masking (de-identifying or encryption) are the accepted culturally informed norms and practices. For data pseudonymization, customarily in Ghana, particularly among the Akan, the day naming system generates pseudonyms for individuals during conversations, especially when such individuals are not present in those discussions. The day naming system works as follows: “A person may be called Kojo, Adwoa, Kwame, or Kofi, but these are not their full names [but names derived from the day of the week they were born]” (Respondent 04, field interview). For instance, for the Akan, a female born on a Monday is called Adwoa [or Adjoa], and the male counterpart is called Kwadwo [or Kojo]. Referencing “Adwoa” in dialogues or recorded documents thus offers a less meaningful identifier, which helps conceal the individual's real identity. Data swapping or replacing actual data with fictitious data is also a culturally appropriate norm or practice for anonymizing sensitive information, especially age and income data, “[…] asking people about their age is not always culturally appropriate. That is why some people intentionally provide a false age to conceal their real age. And the same applies to other things, such as asking people how much money they have” (Respondent 06, field interview).
Further, data masking through encryption remains a cherished cultural norm to ensure privacy. Data encryption manifests through the cultural practice of converting plain or easy-to-understand data (ideas, secrets, PII) into not-so-easily discernible linguistic or poetic forms, such as proverbs, metaphors, and didactic stories. For one respondent, “The Akan language is dominated by proverbs because of information privacy. We can speak volumes, and an outsider will not understand a word of what we are saying” (Respondent 05, field interview). The norm of encryption suggests a contextually appropriate practice of transforming sensitive information into linguistic formats for secured and appropriate distribution to the appropriate recipients, a way to exclude those considered “outsiders” and include those viewed as “insiders” in the information value and transmission chains. Drawing from the world of data encryption, the norm of encryption here is not simply about encrypting identifiers of individuals, but converting the entire information or plaintext into ciphertext, allowing only authorized individuals to decipher the ‘hidden’ meanings conveyed. Some examples of encryption manifest in information distribution around death and the traditional authority, “[…] certain information about our chiefs is kept private, and they are sometimes communicated in proverbs or metaphors to maintain privacy. For instance, the death of a chief is communicated in a coded language, such as, ‘the chief has gone to the village’” (Respondent 04, field interview). Similarly, “[…] culturally, the death of a relative is often not announced plainly to everyone, they may say, the person slept on his left side; so that if children are in the house, they may not understand the message in plain words” (Respondent 06, field interview).
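The analogy drawn above, converting an entire message (not just identifiers) into a form only "insiders" can read, can be sketched with a one-time-pad XOR cipher. This is purely an illustrative stand-in of our own choosing, not anything described by respondents or required by Ghanaian regulation; production systems would use a vetted cipher such as AES.

```python
# Minimal sketch of whole-message encryption (plaintext -> ciphertext),
# paralleling the proverb analogy: only holders of the shared key
# ("insiders") recover the meaning, while "outsiders" see ciphertext.
# A one-time-pad XOR is used purely for illustration.
import secrets

def encrypt(plaintext: str, key: bytes) -> bytes:
    data = plaintext.encode("utf-8")
    assert len(key) >= len(data), "one-time pad: key must cover the message"
    return bytes(b ^ k for b, k in zip(data, key))

def decrypt(ciphertext: bytes, key: bytes) -> str:
    return bytes(b ^ k for b, k in zip(ciphertext, key)).decode("utf-8")

message = "the chief has gone to the village"   # coded phrase from the interview
key = secrets.token_bytes(len(message))          # shared only with "insiders"

ciphertext = encrypt(message, key)
recovered = decrypt(ciphertext, key)             # key holders read the meaning
```

The design point mirrors the interview material: the whole utterance is transformed, and access to meaning depends on membership in the group that holds the key, not on hiding the utterance itself.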
Moments of tension within the semi-autonomous field of privacy also manifested during the fieldwork. Such tensions could be observed in some respondents’ dilemma regarding disclosing personal information to fintech/telecommunication companies and the government. In the past three years, the Government of Ghana rolled out two key policy directives that required all citizens to (1) register their mobile SIM cards with their telecommunication/fintech companies, and (2) provide and verify their personal details to obtain their Ghanaian ID cards (Ghana Card). Failure to comply with these two directives meant those without registered SIM cards and/or the Ghana Card would not have access to telecommunication and fintech services and would face significant barriers in service or business transactions. For some respondents, these directives may contradict culturally-rooted information norms around data anonymization (masking, pseudonymization, and swapping). For instance, the directives subvert the culturally appropriate information norm of replacing actual data about oneself (e.g. age and income) with fictitious data (data swapping) to help conceal information from those one may not trust. For others (also discussed further in the third empirical moment), recent incidents of receiving fake/scam calls and text messages from people they don’t know make them wonder whether these fintech companies and/or the government are failing to take seriously, or are even subverting, some of the culturally appropriate information norms, such as pseudonymization (e.g. using the day naming system to generate pseudonyms for people) or data masking (e.g. encrypting data).
These concerns are not entirely unfounded, especially given reported investigations into banks in Nigeria for unlawful disclosure of, access to, and processing of personal data (Akintaro, 2023), or recent allegations, reported by the Foundation for Investigative Journalism, that Nigeria's National Identity Management Commission, which stores the digital identity data of Nigerians, is selling the data to a private company (Burt, 2024; see also Borokini and Oloyede, 2021). In a sober reflection on some of these tensions, one of the fintech entrepreneurs interviewed during our fieldwork noted, “[…] one of our biggest concerns about data privacy is not about people being bad. Yes, we have scammers, but we are also concerned about us [fintech firms] and the government being good actors. Now, the government's new regulation of linking your SIM card to your Ghana Card offers some huge possibilities to do good things. But are we also looking at the bad actors? Are we looking at future instances where successive governments may have bad actors or those who do not believe in such digitalization initiatives?” (Respondent 20, field interview)
Multiplicities of information spheres and resulting complicated terrain of privacy breaches
A CI analysis of privacy within Africa's plural-legal contexts soon confronts a complicated terrain of information spheres and privacy breaches, as statutorily defined and customarily informed norms co-mingle and intersect in complex ways. Privacy and intrusion have often been conceptualized in terms of respecting the boundaries between the private and public spheres. Yet, the public–private dichotomy is of limited help in understanding privacy. In particular, when investigating online spaces (e.g. forums, social media, etc.), the inability of such a public–private lens to capture the complexities of privacy dynamics becomes clear (Mackenzie, 2017: 294). The more nuanced understanding of privacy breaches that CI offers can thus be understood as a move away from oversimplifying dichotomies (Table 1).
Complicated terrains of public–private spheres and privacy breaches in Ghana's plural-legal contexts
We begin by slightly complicating the public–private binary of information spheres. A nuanced analysis of statutory regulations and customary-informed norms/practices around the public–private information spheres in Ghana illuminates four archetypes of information spheres and what may constitute privacy breaches within these spheres (see Figure 1). Privacy here comprises two dimensions: the data type (private/sensitive or public data) and whether the data is privately or publicly accessed, used, transferred, or communicated. The public access, use, or communication of public data (public–public sphere) poses a low risk of privacy breach under both statutory regulations and customary-informed norms/practices in Ghana. Conversely, the public access, use, or communication of private/sensitive data (private–public sphere) poses a significant risk of privacy breach under both statutory regulations and customary-informed norms/practices in Ghana. Most often, discourses and regulations around data privacy are geared towards minimizing the risks associated with the private–public sphere, exemplified in this incident narrated by a respondent: “I took a government student loan for my college education, both tuition and living allowances. When I had my job at the bank, the government sent an email to my employer about the loan repayment plan. My employer shared this information with everyone at the workplace […]” (Respondent 12, field interview).
In addition to the often-limited public–private dichotomy, several other spheres exist within Ghanaian Akan customary-informed norms and practices, all of which are governed by certain informational norms. From the field interviews we identified multiple binary spheres, including (a) parents–children, (b) adults–children, (c) couple–family, (d) wife–husband, (e) male–female, (f) state–citizen, and (g) nuclear–extended family. These binary spheres are not exclusive to Ghana or even the African context. For some of the respondents interviewed, certain information can (and cannot) be shared across these binary spheres, often around money, reproductive health, wealth and property, travel plans, health, and death. Safeguarding against privacy breaches across these binary spheres is also about protecting individuals from harm. For instance, some respondents believe that parents should not share information about wealth with children, not only to avoid privacy breaches but also to protect the children from having access to sensitive information that may expose them to harm when shared with others. Relatedly, a respondent highlighted the cultural importance of protecting the adult sphere from children's intrusion: “[…] when a child gets to the age when they understand or can communicate, they will always be asked to excuse themselves when adults are having conversations […] the child may be in danger or endanger others if they repeat a sensitive information they heard from the adults” (Respondent 14, field interview).
Individuality, mutuality, and collectivity of privacy harms and remedies thereof
A crucial third component that emerged from our empirical observations is the harms associated with infringing contextual informational norms. Generally, the harms incurred by breaching informational norms pertain to the person whose information is discussed. In particular, the risks of societal stigmatization and curses were mentioned repeatedly in the interviews. For example, a respondent described the following: “So maybe I’ll be going abroad, maybe in the next two days, I have to keep it a secret. It's good for me to keep it a secret because I have the thought that, if someone hears it, my family witches will do something to me, so it's good for me to keep it” (Respondent 11, field interview).
Within the semi-autonomous field of privacy, these harms do not seem to be accounted for by statutory privacy laws, exemplified by the DPA of 2012. The DPA prohibits the processing of “special personal data,” defined as “religious or philosophical beliefs, ethnic origin, race, trade union membership, political opinions, health, sexual life or criminal behavior of an individual” (DPA, 37, 1, b). More generally, the data subject has the right to stop the processing of their “personal data which causes or is likely to cause unwarranted damage or distress to the individual” (DPA, 39, 1). However, the sort of ‘special personal data’ captured in the DPA is not well aligned with the categories rooted in the customary-informed norms discussed by our respondents. Consequently, the harms associated with breaching contextual informational norms described in the interviews do not necessarily translate into the special personal data addressed in the DPA. This mismatch further complicates any enforcement of data privacy regulation. Judging the proportionality of a harm arising from infringed contextual informational norms thus involves considerable uncertainty and, consequently, requires substantial value judgments. Within the life cycle of data, there are several points at which value judgments about the proportionality of harms, and decisions to prevent or remedy them, could be made.
Forbidden information transfer seems to occur even within the boundaries formalized by legislation. Under “security measures,” the DPA states that: “A data controller shall take the necessary steps to secure the integrity of personal data in the possession or control of a person through the adoption of appropriate, reasonable, technical and organisational measures to prevent (a) loss of, damage to, or unauthorised destruction, and (b) unlawful access to or unauthorised processing of personal data” (28, 1, a & b). Yet respondents' accounts suggest that such safeguards are routinely breached in practice: “How those people get our names and contacts needs to be investigated. Let me use [company] because that is my network provider. If I register my SIM and give you the card, then later someone has access to my information, it means you leaked it. I think the staff at [the company] exposes our details to the fraudsters. How can a third party who is not present have access to my details?” (Respondent 09, field interview).
It remains unclear, yet highly unlikely, whether any of these breaches and subsequent harms have been reported to the data protection authority. This is further complicated by the fact that the responsibility lies with the individual who has been harmed, an issue that has also been raised within the context of the EU's GDPR (Bieker, 2022: 171, 189–190). The standard outlined by the DPA thus seems to be in stark contrast to the lived experience of mobile money users and practices within the fintech sector, where, somewhat ironically, the prevention of fraud has also been put forward as a primary driver behind the collection of more personal information.
Remedies for harms incurred from the breach of contextual informational norms may be sought in alternative, co-existing realms of authority rather than in the courts of statutory law. Several respondents mentioned that chiefs can provide remedies: “In most cases, it ends up in the palace, it sometimes turns to a curse, and through that it ends in death. It can also end up in prison” (Respondent 16, field interview). Or: “In the Akan setting, you get trouble when you gossip and it gets to the Chief's place. When it happens like that, there have been structures put in place by Chiefs to find the truth and punish the guilty” (Respondent 05, field interview).
Discussion and conclusion
Earlier, we put forward the argument that privacy is best understood as a semi-autonomous field within Africa's plural-legal context. Several of our empirical observations confirm the usefulness of this analytical lens in understanding some core issues in the African privacy discourse. In what follows, we connect our theoretical assumptions with the empirical moments emerging from the case study on fintech in Ghana.
First, in line with CI, we posited that rather than just privacy regulation, contexts of privacy consist of statutorily defined and culturally informed information norms. Shifting the focus away from top-down legislative initiatives allows us to uncover privacy “from below.” We illustrate that several data anonymization practices, such as data pseudonymization (replacing identifiers with pseudonyms), data swapping (substituting fictitious data), or data masking (de-identifying or encrypting), though often codified in statutory privacy regulations, are also accepted culturally informed norms and practices around privacy. Here, certain cultural norms mimic statutory regulations around privacy, including relying on the day-naming system to generate pseudonyms for individuals, deliberately withholding or giving false information to conceal sensitive data, and using proverbs to encrypt or obscure information or entire contexts, thereby allowing information sharing in a protected and appropriate manner to “insiders” only. These findings suggest the need to shift away from the lopsided, state-heavy conversations on privacy, including the emphasis on European-imported statutory regulations on privacy in African countries (Birnhack, 2008; Makulilo, 2021, 2017), towards recent calls for careful on-the-ground empirical work within African communities to understand the particularities and socio-cultural and institutional contexts of privacy (Makulilo, 2016).
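The three anonymization practices named above can be illustrated with a minimal, hypothetical sketch. The function names, the pseudonym pool, and the hash-based mapping below are our illustrative assumptions, not the scheme used by any respondent or fintech firm; the pool of Akan day names merely echoes the paper's point that the day-naming system draws pseudonyms from a small, culturally shared set.

```python
import hashlib

# Illustrative pseudonym pool (a few Akan day names); in the cultural
# practice described above, names come from a small shared repertoire.
PSEUDONYM_POOL = ["Akosua", "Kwadwo", "Abenaa", "Kwaku", "Yaa", "Kofi", "Ama"]

def pseudonymize(name: str, salt: str = "study-salt") -> str:
    """Data pseudonymization: map a real name to a stable pseudonym.
    A salted hash keeps the mapping consistent within one dataset
    while concealing the original identity."""
    digest = hashlib.sha256((salt + name).encode()).hexdigest()
    return PSEUDONYM_POOL[int(digest, 16) % len(PSEUDONYM_POOL)]

def swap_field(record: dict, field: str, fictitious_value: str) -> dict:
    """Data swapping: replace a sensitive field with fictitious data,
    leaving the rest of the record intact."""
    out = dict(record)
    out[field] = fictitious_value
    return out

def mask_msisdn(msisdn: str) -> str:
    """Data masking: de-identify a phone number, keeping only the
    network prefix and the last two digits for operational use."""
    return msisdn[:3] + "*" * (len(msisdn) - 5) + msisdn[-2:]
```

For example, `mask_msisdn("0241234567")` yields `"024*****67"`, and repeated calls to `pseudonymize` with the same name and salt return the same pseudonym, which is what distinguishes pseudonymization (re-linkable under a key) from outright anonymization.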
Second, our findings suggest that privacy is inherently processual and regenerative. It evolves through multiple spheres and breaches as statutorily defined and customary-informed norms and practices co-mingle to structure individuals’ daily encounters and renegotiations around privacy. This fluid nature of privacy and informational norms could be seen in the description of day-to-day encounters within the different spheres of life. The four archetypes of public–private spheres highlight the ambiguities and associated risks of privacy breaches as statutory regulations and customary-informed norms/practices complexly intersect. Beyond the public/private dichotomy often evoked in privacy discussions, the empirical instances also highlight the delineation of informational spheres along various lines, such as age, family roles, or even spatial and temporal dimensions, further complicating the ambiguous terrain of privacy in the everyday lived experiences of residents. Factors such as whose space one is in, whether a conversation has begun before one's presence, and whether one has been explicitly invited into the conversation need to be considered. What follows is a nuanced, continuous process of gauging informational norms to respect contextual privacy. Accounting for the multiplicity of spheres and risks of privacy breaches within plural-legal contexts in African societies raises questions about the appropriate regulatory scope of statutory privacy laws and the promise and perils of emerging digital technologies in reinforcing or mitigating the risks of privacy breaches within and across multiple information spheres. Further, from a plural-legal context, policy and intellectual discourses around Africa's weak legal culture in data privacy laws (Abdulrauf and Fombad, 2017; Alunge, 2019) inadvertently underestimate the difficulties and complications of governing privacy as a processual and regenerative semi-autonomous field. 
At the very least, such policy and intellectual discourses must contend with the tensions in enforcing statutory privacy rules from above when customary-informed rules from below remain tenacious and evolving. Similarly, novel practices influenced by statutory and customary-informed rules and technological change ought to be judged by whether they serve the “values, ends, purposes, or goals” (Nissenbaum, 2010: 166) of the different social spheres.
Third, all three empirical moments make clear that the idea of Africa as a privacy-myopic place does not tell the whole story. The first (data anonymization norms) and second (multiple boundaries of information spheres and privacy breaches) empirical moments illustrate that informational norms have deep cultural roots and are an integral part of cultural practices. It is therefore not surprising that “gossip” (“konkonsa”) is disapproved of, with possible individual and collective harms, and that remedies for harms resulting from a breach of informational norms are regularly sought in traditional realms of authority such as the chief's palace. Perhaps the fact that some interview respondents were less critical about giving their personal data to fintech operators and the Government of Ghana may seem to support Bakibinga's (2004) privacy myopia conclusion: the tendency among Africans to underestimate the value of their personal information and the need to protect such information. Undeniably, the roll-out of multiple digital platforms in Africa tests the boundaries of information norms as the virtual and real spheres converge and diverge across multiple scales. At the heart of such convergences and divergences may not be so much privacy myopia as an ongoing process of technology-aided rescaling of the boundaries of information spheres and privacy breaches thereof. In this process, Africans are emerging as privacy-conscious while constantly renegotiating the rescaling boundaries of contextually rooted information norms within often oppositional plural-legal contexts.
Finally, this study aims to initiate further interrogation into the extent to which Africans’ privacy consciousness is discounted, limiting a multiscale, context-informed, and evolutionary articulation of privacy concerns in policy and scholarly debates. As African societies encounter statutory laws and other externally driven regulatory regimes on privacy, there are limited in-depth inquiries into how the co-mingling of customary-informed norms, statutory and supranational laws, and other external forces, such as digital technologies (e.g. fintech instruments), may generate “rules shopping” (Benton, 1994: 237) opportunities. The missing piece, then, is a careful understanding of how such rule-shopping opportunities may allow citizens to explore, negotiate, modify, and evolve the rules/norms deployed to justify privacy decisions and/or modify their sense of privacy in ways that may transgress or comply with statutory or supranational privacy laws. Within this field of privacy transgressions and compliances, Africans may not be privacy myopic, as some suggest, but may instead be evolving their privacy sensibilities as they strategically shop for information norms/rules and leverage the tools and logics of emerging digital platforms to transgress or comply with privacy norms. The conceptual and empirical concern here is to determine how Africans’ lived and negotiated privacy experiences embody their context-informed and evolutionary privacy consciousness as they navigate the semi-autonomous field of privacy within their distinct local cultures.
Given these points, we hope this study will initiate further research. The study's two main contributions are, first, much-needed empirical research on data privacy in Africa and, second, an analysis of data privacy expectations outside the dominant North American and European contexts. Although primarily drawing on empirical observations from Ghana, we believe the insights on navigating informational norms are relevant for the broader African context and its multiplicities.
Acknowledgments
The authors thank all the residents and stakeholders who took the time to speak with them about the topic, and the peer-reviewers and editors for their thoughtful feedback.
Ethics approval
This work has received approval under the ethics code at the University at Buffalo. The study has also been granted exemption from further ethics approval under the following IRB ID: STUDY00004273.
Consent to participate
Informed consent was obtained from all individual participants included in the study.
Author contributions
Both authors have contributed equally to the manuscript.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was funded by Digital Africa, the Gold Coast Research Institute, and the Just Institutions Lab at the University at Buffalo.
Declaration of conflicting interests
Corresponding author Aisha P. L. Kadiri was employed by the funding institution Digital Africa until December 2022. The authors have no other relevant financial or non-financial interests to disclose.
Data availability
Data is available upon request.
