Abstract
Data sharing practices between governments and the private sector are characterized by a lack of transparency, which has potential implications for human rights. Minimal scholarship exists investigating how companies address human rights risks stemming from government requests for user data. Understanding corporate response processes to government requests is central to advancing human rights research on technology company conduct. This becomes even more pressing as emerging technologies gather increasing amounts of data. Scholarship demonstrates that transparency reporting cannot assist in analyzing data sharing practices between the private and the public sectors due to a variety of constraints. Using semi-structured interviews with senior staff at technology companies, this paper presents an empirical analysis of how technology company representatives and external advisors seek to align their processes when responding to government requests for user data. It describes, in human rights terminology, a set of six themes that company representatives aim to employ when responding to requests.
Introduction
The information disclosed by the US-headquartered telecommunications provider Yahoo! about the Chinese journalist and political dissident Shi Tao to the Chinese government in 2005 led to his arrest and imprisonment (MacKinnon, 2007). This is only one of many examples illustrating how the information that technology companies hold about their users can be used against them. It also illuminates the core dilemma for companies around government requests for user data: whether to comply with local governments’ orders or protect their users’ interests and rights, including those related to privacy. The Shi Tao case was a watershed in internet governance and illustrated to the broader public the problems of public–private data sharing relationships. Equally, the case emphasized the importance of analyzing the human rights implications of data sharing requests—had the company considered the potential consequences for the rightsholder, it might have decided against complying with the request. Alongside policy developments in recent years, this incident triggered a reaction within the technology industry at large. To a certain extent, this moment of reckoning also sparked a change in the conduct of many companies. Yet there remains a great deal of ambiguity around public–private data sharing relationships, especially with regard to how companies respond, which ranges from complicity in mass surveillance to individual companies pushing back against government overreach (Deibert, 2019; Deibert, 2015a; Flyverbom et al., 2019; O'Neil, 2017; Zuboff, 2019).
In this context, government requests for user data present companies with a dilemma (Soghoian, 2011). Millions of consumers conduct their private communications on the products and services provided by technology companies, and despite public debate about privacy, most companies fail to communicate meaningfully about the threat of unusual requests from government agencies and how these may adversely impact their users, nor do they speak about the degree to which they assist or resist such access. This level of opacity hinders the ability of the public and users to differentiate between companies in terms of the degree of assistance they provide to government agencies (Soghoian, 2011). The opacity of these data sharing relationships can have particularly severe impacts on rightsholders in situations where the rule of law is limited (Ebert, 2021). As a result, individual corporate practices heavily influence responses to government data disclosure requests, particularly in fraught political settings (Ebert, 2021).
Broadly, we can distinguish between two types of government request for communication records (Bernal, 2016): intercepts and user data. Intercepts are requests made by governments or other public authorities, including requisition orders and administrative requests, which require the disclosure of call content. In cyber policy, this is commonly referred to as “traditional surveillance” (Bernal, 2016: 247). In the case of a requisition order, a legal directive is issued by a government agency or authority, typically to an organization or individual, demanding the provision of specific goods, services, or information. Requisition orders are used in a variety of contexts, such as in emergencies, law enforcement, or government operations. They can require the delivery of goods or services, the production of documents or information, or compliance with certain directives. A requisition order might be issued to a company to provide certain financial records as part of an investigation. Requisition orders are legally binding and are usually backed by relevant laws or regulations that grant the requesting authority the power to make such demands.
User data contains information such as call details (i.e. traffic data, which includes sender, destination, frequency, and duration), customer identification data (e.g. first and last name, address, and date of birth), and geolocation, billing, and payment information. In cyber policy, this is commonly referred to as “new surveillance” (Bernal, 2016: 247), and it is the type of data this paper mainly focuses on. There are no statistics available on the compliance rate, and companies are unwilling or unable (due to legal restrictions) to disclose such numbers (Soghoian, 2011). Most government requests for user data come from government entities in the law enforcement/national security context (ibid.). However, in certain political contexts, other government entities will try to access user data from companies. In what follows, I describe a hypothetical example of the types of user data a social media company might hold:
- user name, with disclosure of the person's full name;
- traffic data, including information about audio or video calls made within the social media platform: sender and destination (the person's user ID and the IDs of other users the person communicates with), the frequency with which the person engages in audio or video calls, and the duration of each call;
- customer identification data, such as date of birth;
- geolocation information, such as where the person is when they use the social media platform, which helps personalize content and features based on the person's location;
- billing details, if the company offers premium features or services, including the type of subscription, its start and end dates, the cost, and details of the payment method the person uses (such as credit card information or PayPal account details).
Governments can issue a range of requests with different degrees of legal compulsion. Often the threshold for intercepts (e.g. targeted disclosure of content) is higher, so authorities opt for data requests instead, which may be just as intrusive in a datafied environment, as Bernal (2016) points out. The focus of this paper is government requests for user data that companies would refer to as “unusual,” “unexpected,” or “surprising.” Such instances often occur in a cross-border context with conflicting regulatory frameworks and organizational norms. One example could be that a local government (host State) requests data that is located in the headquarters of the company (home State) while the political environment in the host State is volatile (e.g. large-scale protests against the incumbent government). In many such instances, the regulatory environment of overlapping jurisdictions is politically charged (Ebert, 2021). Hence, there is no clear-cut distinction as to what would constitute an “unusual” request and when the company would label it as such. However, most of these requests have three factors in common: (1) the requests are surprising in nature (have rarely or never occurred before); (2) they challenge the established ways of dealing with government requests; and (3) they initiate a corporate response process that might differ from a routine one.
This paper conducts a qualitative study of actual management practices in order to contribute to the debate on transparency and accountability by providing insight into corporate management processes, specifically in the technology sector (Albu and Flyverbom, 2019; Felzmann et al., 2019; Heimstädt and Dobusch, 2020; Goethals, 2019; Obara, 2017; Obara and Peattie, 2018). This paper additionally aims to increase the information base specifically around the handling of government requests for user data (Parsons, 2019; Rahman, 2016; Soghoian, 2011) and to draw out the themes that companies rely on for structuring their response processes when data disclosure requests come in. Furthermore, it demonstrates how a human rights lens has been used by technology companies to anchor responsible business conduct when dealing with unusual government requests for user data (Schrempf-Stirling and Van Buren, 2017; Wettstein et al., 2019).
Situating the topic of data disclosure requests in internet governance debates on transparency and accountability
Information on government–private data sharing—its scope and scale and the number of countries where it is happening—barely exists. This means that gaining a perspective on government requests for user data through company procedures can enhance the transparency and accountability debate. Alongside mounting critique of ever-increasing state surveillance, the private sector has come under intense scrutiny for its role in facilitating state surveillance, monetizing user and non-user data for advertising revenues, and maximizing business profits in other ways which risk eroding the foundations of democratic rule (O'Neil, 2017; Zuboff, 2019).
To give three examples of what such government requests can entail, it is helpful to look at the most recent report published by members of the Global Network Initiative (GNI) (see Table 1). The GNI is a business-led multi-stakeholder platform with civil society and academic members. GNI members regularly undergo company assessments on their compliance with the GNI principles, which are based on human rights. The public reports of these assessments illustrate the complex interplay of interests around disclosure requests (GNI, 2020) and show they occur not only in authoritarian regimes but also in notionally liberal democratic states (through measures justified, for example, as counter-terrorism). The debate around transparency and secrecy in internet governance also includes human rights concerns, in particular around privacy and defamation, freedom of expression, freedom of thought and opinion, and other adverse impacts on human rights (Deibert, 2015a, 2015b; Flyverbom et al., 2019; Soghoian, 2011). Yet, besides critiquing the immense power technology companies wield, there is a lack of research exploring how companies make sense of these demands.
Excerpt of three case studies from the GNI assessment report 2018/2019 (GNI, 2020).
This research will address this gap by documenting how companies are responding to government requests. The paper contributes to the transparency and accountability literature in critical data studies by bringing a Business and Human Rights dimension into the equation and demonstrating how analyzing corporate behavior can provide additional insights into debates on transparency and surveillance. Scholars have strongly criticized the opacity around handling government data disclosure requests to companies (Parsons, 2019). This paper takes a constructive perspective on transparency and accountability in the sense that transparency and accountability in company processes for responding to government requests for data are the topic of inquiry. The research therefore aims to depict how employees in companies use transparency and accountability in a human rights framing as analytic resources in their everyday work, and what consequences flow from compliance or non-compliance with data requests from governments (Heimstädt and Dobusch, 2020: 3).
At the same time, there are also elements that might echo the ongoing debate around transparency and accountability in AI governance. For example, a parallel might be found with transparency debates about automated decision-making, where “retrospective transparency includes the notion of inspectability and explainability” (Felzmann et al., 2019: 2). Here, inspecting the “internal ongoings” in a company around the handling of a data request allows for “decompos[ing] a decision” (ibid), namely, grasping the structure and weighing the decision-making factors within the system—though not necessarily in a numerical sense, but rather in depicting why a certain company decided to disclose data, how it reached its decision, and what type of data it disclosed.
With regard to enacting transparency, Albu and Flyverbom (2019) distinguish two broad approaches in the literature on organizational transparency: a) transparency as verifiability and b) transparency as performativity. The first approach conceptualizes transparency as the disclosure of information: organizations are transparent when they provide information about their internal practices, such as around data collection and analysis. The second approach focuses on struggles and discourses within projects related to transparency and on unintended consequences and downsides. This includes a more holistic notion of transparency, taking into account the socio-material and ritualistic practices of organizations when they “perform” transparency (Albu and Flyverbom, 2019; Felzmann et al., 2019). As a result, transparency practices are conceived as social and organizational phenomena whose meaning extends beyond the information conveyed.
Transparency and accountability also play key roles in the debate regarding the role of business in society, namely corporate social responsibility (Scherer and Palazzo, 2011; Scherer et al., 2016) and Business & Human Rights (Baumann-Pauly and Nolan, 2016; Wettstein, 2009, 2022). Yet, when it comes to the conduct of technology companies, there is a lack of empirical research on the integration of human rights in company processes and the implications for transparency and accountability (Schrempf-Stirling and Van Buren, 2017; Wettstein et al., 2019). Creating transparency and accountability to address and mitigate human rights impacts stemming from, or linked to, business activities (including in the technology sector) is an important field of action for business (Ebert et al., 2021). Hence, by showing how technology companies aim to create transparency and accountability by aligning conduct with international human rights norms, this paper makes an important contribution to the field. It does this by using a specific dilemma scenario around unusual data disclosure requests as a case study and providing empirical insights into the phenomenon.
Scholarship has demonstrated that transparency reporting can scarcely assist in addressing the opacity in data sharing practices between the private and the public sector due to various constraints, such as legal restrictions and/or fragmented regulatory regimes (Ebert, 2019; Parsons, 2019; Soghoian, 2011). So, while publicly accessible data can shed light on the subject only to a limited extent, it is possible to turn to research focusing on the private sector's role in data sharing between the State and technology companies. Yet, there is very little research available. Despite data handling between the State and the private sector being a key area of concern, this vast gap makes it necessary to find innovative ways to provide further insights (Soghoian, 2011). In particular, there is a lack of empirical data on how companies deal with data disclosure requests in complex contexts. This gap is difficult to address from a quantitative point of view due to the demonstrated lack of reliable and comparable corporate reporting (Parsons, 2019). At the same time, there is also a lack of qualitative data on how human rights are managed in companies in general (Obara, 2017; Obara and Peattie, 2018), and even more so for complex settings that can amplify crisis moments, such as when responding to government requests for user data. This being the case, how are companies responding to unusual requests for user data?
More empirical data is required to better understand the dynamics around data requests by governments to technology companies in complex contexts. Vast and disproportionate access to user or metadata in complex contexts can enable a government to preempt political mobilization and silence dissent. It is crucial to learn more about the scale of the issue and how to address it. One way to contribute to solving this puzzle is by looking into corporate management practices when dealing with unusual governmental requests. Specifically, dissecting corporate processes promises greater insight than analyzing specific cases, as a post-mortem analysis of specific data request cases is likely to be severely restricted by laws in a variety of jurisdictions.
Unusual government requests for user data are a suitable case for at least two reasons. (1) Data flows govern processes in the public and private domains to a large degree, on the national level as well as across borders. The information tied to such data has become a powerful asset; hence, companies will not take decisions in this domain lightly. Rather, they will employ a high level of caution as government requests for user data that target the private sector can be used by States to uphold national security in specific instances of well-justified suspicions of (potential) crime, as well as for nefarious purposes, such as silencing political opposition or dissent. The majority of government requests have traditionally been related to the interception or blocking of communication traffic (Tuppen, 2016). These requests can be targeted at the level of an individual, a group of individuals, a website, or even extend to mass action across an entire network, such as a shutdown (Tuppen, 2016). (2) Government requests for user data are nothing new. However, what is new is the public demand that human rights are taken into consideration when dealing with such requests. Due to a reckoning with the adverse impacts on human rights in incidents in the early 2000s (e.g. the Shi Tao case (MacKinnon, 2007)), the level of awareness in the industry of such requests is already high, which increases the chance that companies might already have in place certain processes to manage the phenomenon, which can be studied.
The topic of data governance between State actors and technology companies has received a lot of scholarly attention in the past decade (Deibert, 2019; Deibert, 2008; Deibert, 2015a; Deibert, 2015b; Cohen, 2016; Zuboff, 2015, 2019). Yet the field contains little scholarly empirical research, partly because access to information is severely restricted by factors such as State secrecy laws and national emergency restrictions (Soghoian, 2011). The lack of government disclosure leads scholars to construct their own large-scale datasets, for example on internet shutdowns across the African continent (Freyburg and Garbe, 2018). Scholars have raised concerns regarding the abuse of such implied secrecy about the actual data handling between the State and the private sector and the potential chilling effects on democracy (Penney, 2017; Schneier, 2015). Such concerns make it necessary to find creative ways to dig deeper into how companies actually aim to manage this important and highly sensitive data handling relationship.
To date, there is no effective globally binding framework to govern these types of data disclosure requests, and the increasing plurality of regional frameworks has been rendered almost obsolete by growing datafication in data-driven business practices and public services. If a government decides to request access to data originally collected for non-governmental purposes, it may in theory seek access to information about millions of internet users around the world. Demands for disclosure of atypical data have the potential to disrupt the separation of the public and private spheres and can present risks to rightsholders, ranging from adverse impacts on privacy to many other connected rights (Ebert, 2021). Often, human rights seem to be sidelined in governmental rationales, thereby facilitating intrusions into the privacy of an individual or group of individuals. Conversely, the framing of data protection as a right appears to have imposed much greater obligations on private actors than most other human rights. Governments often circumvent such obligations through narratives of crisis or national security (McDermott, 2017). Atypical requests, in theory, occur at times of crisis or create a crisis in organizations. These requests have both legal and ethical implications, in terms of the law applied and the ethical considerations taken into account in managerial decision-making (Ebert, 2021). For example, such requests can increase obscurity for affected stakeholders and raise questions about what is really going on.
Methodology
Corporate management responses to data disclosure can shed light on the delicate balance that internet, communications and technology companies aim for when dealing with government requests (Maitlis and Sonenshein, 2010; Weick, 2010, 1988; Weick and Sutcliffe, 2001). In line with a constructive perspective on the transparency and accountability debate, the focus of this paper is on “analyzing how people in and around organizations—practitioner-analysts—mobilize these topics in their everyday activities to create social order” (Heimstädt and Dobusch, 2020: 3). Government requests for user data can be understood as sites of ethical contestation (ibid). Here, new and ambiguous matters of transparency and accountability are claimed, contested, and configured (ibid). They can reveal how much leeway companies have in responding to data disclosure requests and how their response models differ when dealing with authoritarian regimes or complex political leadership more broadly.
As already pointed out, analyzing the corporate response process has the advantage that it does not rely on publicly available data. Scholars have repeatedly highlighted that corporate transparency reports do not provide sufficient information in terms of size and granularity of data and other types of qualitative contextual information (Parsons, 2019). Also, companies do not apply the same metrics to report on data requests, which makes meaningfully comparing information across companies impossible (ibid.). National legislation in several countries prohibits companies from reporting on data access requests (Soghoian, 2011). In some situations, legal requirements or licensing agreements restrict the leeway companies may have in reacting to requests. In other instances, companies might have considerable room for maneuver in how they respond to unusual data requests and which strategies they choose.
This thematic analysis uses qualitative data collected through semi-structured interviews with 30 executives and senior experts working in, or closely with, technology companies (most of them telecommunications and social media companies), and related policy documents collected between autumn 2017 and spring 2020. Hence this analysis relies on a document analysis as well as expert interviews (see Table 2). The document analysis consists of a qualitative review of publicly available policies of GNI member companies on cooperation with law enforcement agencies and data sharing more broadly. GNI member companies were selected due to their publicly stated commitment to improve transparency. The document analysis sample is not congruent with the sample of interviewed companies—this is to avoid making the interview partners easily identifiable. The interview material was coded with broad thematic coding and triangulated with the insights from the document analysis. The sample consists of 25 public policy documents, transparency reports, and blogs that discuss government requests for user data. These were used to a) inform the interview questions and b) triangulate some of the statements made by experts. Interview partners were selected based on a combination of sampling strategies (Miles and Huberman, 1994: 28): (a) politically important cases that attract attention due to their market size and the information available in the public domain (press/media), (b) critical company cases for logical generalization and maximization of the application of information to other cases, and (c) extreme and deviant cases (as identified by the interviewed expert) for information richness and learning from highly unusual manifestations of the phenomenon under scrutiny. The interviews touched upon (a) governance structures for dealing with disclosure requests, (b) how management processes an incoming request, with a focus on (c) the involvement of external stakeholders in the response process, and, finally, (d) communications and the relationship with the requesting authority.
Interviewed experts: Anonymized overview (own illustration).
The analysis of the interview data, combined with the document analysis, depicts differences and similarities in how companies respond to government requests for user data. This leads to a more nuanced picture of companies’ actual practices and of the extent to which, and under what circumstances, they try to resist requests. Data disclosure requests are an extremely sensitive topic for most companies to discuss. Many of the staff directly handling such requests, or overseeing the staff who do, need security clearance. As a result, it was very challenging to find corporate representatives willing to be interviewed, and it required a significant level of trust building over the course of more than two years to reach a sufficient number of interview partners.
The interview transcripts were coded with broad thematic coding, and themes emerged inductively from the material, which enabled the retracing of corporate processes in dealing with government data disclosure requests (Miles and Huberman, 1994; Strauss, 1987; Gioia et al., 2013). These codes were used to retrieve information from the interviews and organize it. The entire interview material was coded twice. The first round consisted of “informant-centric coding” (Gioia et al., 2013), closely related to the interview material. For the second round, the level of abstraction was increased to group the informant-centric codes into “aggregate dimensions.” These aggregate dimensions are broader in scope and ensure an internally consistent way of clustering content through similar codes (Gioia et al., 2013).
Findings
General process
Overall, the assessment of the sample of publicly available policy documents confirmed the claims made by Soghoian (2011) and Rahman (2016) on the insufficiently informative nature of transparency reporting with regard to numbers. Based on a range of publicly available policy documents from technology companies and the reconstructed expert interviews, the process inside a company for responding to requests for data can be structured as follows (see Figure 1): (1) At the company level the staff assesses the formal requirements of the request: is the request (a) made in writing, (b) issued by a legitimate authority, (c) referring to an applicable law, and (d) received by a designated contact? Then (2) the company assesses the scope: (a) is the request legally appropriate to achieve the aim, (b) what are potential adverse impacts on the ground, and (c) what is the past record of the requesting authority? Based on these two stages, the company decides whether to comply with, delay, narrow (interpretation and/or technical scope), or reject the request. The response team usually consists of employees from different divisions, including law and policy employees who deal with these requests, as well as security experts.

Responding to a data request: general process model (own illustration).
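As a rough illustration, the two-stage process above could be expressed as a hypothetical decision flow. The sketch below is purely expository: the class, field names, and thresholds are the author of this edit's assumptions mapped onto the stages described in the text, not any company's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class DataRequest:
    """Hypothetical representation of an incoming government request."""
    in_writing: bool                      # stage 1(a)
    issuing_authority_legitimate: bool    # stage 1(b)
    cites_applicable_law: bool            # stage 1(c)
    received_by_designated_contact: bool  # stage 1(d)
    scope_appropriate_to_aim: bool        # stage 2(a)
    adverse_impact_risk: str              # stage 2(b): "low" | "medium" | "high"
    authority_past_record: str            # stage 2(c): "clean" | "mixed" | "poor"

def respond(request: DataRequest) -> str:
    """Sketch of the two-stage assessment; outcomes mirror the text:
    comply, delay, narrow, or reject."""
    # Stage 1: a request failing any formal requirement is rejected.
    formal_ok = all([
        request.in_writing,
        request.issuing_authority_legitimate,
        request.cites_applicable_law,
        request.received_by_designated_contact,
    ])
    if not formal_ok:
        return "reject"

    # Stage 2: scope and contextual assessment (thresholds illustrative).
    if not request.scope_appropriate_to_aim:
        return "narrow"  # push back on interpretation and/or technical scope
    if request.adverse_impact_risk == "high" or request.authority_past_record == "poor":
        return "delay"   # escalate internally before responding
    return "comply"
```

In practice, as the findings below show, these checks are neither binary nor automated; the sketch only makes visible how the formal-requirements stage gates the scope assessment, and how the four response options relate to the assessment criteria.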
Differences between telecommunication companies and social media companies
It became clear that the situation of telecommunication companies—which for the most part also act as internet service providers—and social media companies differs substantially. The insights from the interviews showed such strong differences that two distinct models became necessary regarding the role of (1) local staff and infrastructure and (2) data location (Ebert, 2021). Telecommunication companies are more vulnerable to government pressure via danger to local staff and infrastructure as, in most cases, they have personnel and extensive equipment located in operating markets. In some cases, telecommunication companies operate through third-party services in a country and hence do not possess their own equipment on the ground. Additionally, the data of telecommunication companies is mostly stored locally. Here, two models of assessment by telecommunication companies can be described: one option rests on local accountability, where the local staff review the request and decide whether to escalate it to the headquarters or group level if it is labeled as highly critical. The other option focuses on ensuring very close communication channels between the local level and the headquarters/group level. When the request is received at the local level, the local staff will review it with a standardized assessment sheet or similar management tool but jointly evaluate it with the expert staff located at the headquarters/group level. Conversely, social media companies can operate in a market without having their own staff or infrastructure on the ground (if a local representative is not required by local law). Consequently, pressure via personnel or confiscation of infrastructure does not matter as much to social media companies as it does to telecommunication companies.
As a result, if a government requests data that is stored outside the jurisdiction from which the request originates, and the company's headquarters are also located outside that jurisdiction, the local government's leverage is not very strong.
Themes used in companies when responding to data disclosure requests
These themes describe the broad spectrum of considerations that are used in companies when responding to government data requests (see Figure 2). The wide range of themes reflects the multifaceted nature of the information gathered in the expert interviews, as well as the highly contextual nature of each data request. Unusual data requests make a broad contextual analysis necessary; they also reflect the spectrum of possibilities in a human rights-based situational analysis. Hence, the breadth of the themes below is a consequence of the nature of the subject. At the same time, these themes are modular and combinatory in nature, as not every request relates to each theme equally, but rather with different degrees of intensity.
External environment
It became evident when speaking to experts about data disclosure requests that the political context of a request, and its embeddedness in the external environment of the operating market, is pertinent. While the overall process for assessing a data disclosure request seems at first glance relatively vague and generalized, there is ample room for external environmental factors to shape the response process. An interviewee highlighted that "the tools that people can use to understand the context, might make it impossible/possible to have (…) conversations with local stakeholders. You have to make a judgement in every situation as to whether the environment is such that you can have that kind of constructive situation around human rights." The socio-economic environment and the "healthiness" of the rule of law influence the degree to which the assessment of a request is marked by suspicion toward the local government, making in-depth case-by-case analysis necessary. Interviewed experts emphasized the need for a thorough assessment of the political and social ecosystems in which an unusual request is made. Such measures must be undertaken to enhance understanding of the potential impacts on rightsholders, and on local stakeholders more broadly, as well as to identify avenues for mitigating impacts when a disclosure of data is unavoidable.
Organizational factors and structures
The interviewed experts perceived the corporate set-up of data storage, location, security, and data access as key factors determining a company's vulnerability to unusual government requests for user data, and as key characteristics shaping a company's governance processes. In addition, interviewees highlighted meaningful reporting and the streamlining of processes related to data disclosure requests as important organizational factors, not only for building internal knowledge but also for communicating externally about these issues. Equally, experts emphasized streamlining across organizational units as an important element in reducing the likelihood of staff being exposed to "surprise requests," which might result in "untrained" staff disclosing sensitive information. For example, some companies designate a specific staff member as a government contact to ensure "that the authorities do not send the request to any other part of the company than this unit." Experts also highlighted the need for companies to employ a multi-departmental response that allows for drawing on different sets of expertise. This would "increase the scrutiny of the request and ensure that the legal staff are not the only ones treating the data requests but rather include a public policy team lead" as well. This will make sure:
that staff are appropriately trained on accessing data files' potential impacts on marginalized and vulnerable communities in the country of the request. (Company representative)
Shared understanding
Interviewed experts disagreed in their reflections on whether companies have succeeded in creating a shared understanding for dealing with data disclosure requests. Companies put considerable effort into improving international communication strategies in order to mainstream conduct around unusual governmental requests across geographies, with trainings, manuals, and guidelines. One company representative highlighted how the executive management aims at ensuring:
a top-down approach, with a human rights policy or a digital rights policy that is passed by senior leadership and there's board level involvement and executive leadership for overseeing this response process and the interaction with law enforcement and give feedback within that company from these intake-level up to the higher level is important. (Company representative)
Another company representative described regular internal meetings as a means of maintaining this shared understanding:
We have formal meetings each year which occur at least once and have occurred two to three times in the last years due to a number of factors, one being staff turn-over so we are doing a re-introduction to the topic for new staff that are coming onboard. (…) we wanted to make sure that everyone is in the loop in terms of our responsibilities. (Company representative)
Emergence of routine procedures for handling unusual requests
While experts to a certain extent emphasized the uniqueness and element of surprise inherent in each unusual request, the interviews demonstrated that the occurrence of unusual requests has become a considerable issue. As a result, despite the diversity of requests, experts pointed out that the formality requirements mentioned earlier regarding data access requests (in writing/legitimate authority/applicable law/designated contact) have become routinely enacted. Experts reflected on how the frequency of unusual requests is so high that, to a certain extent, the "unusual" becomes normalized as companies incorporate response trajectories for unusual requests. For example, experts described how interdisciplinary teams routinely meet to deal with the continuous pattern of national emergency narratives. Another example mentioned in an interview was the establishment of rapid response teams to avoid moments of crisis:
we have a specific sort of process and training program for our more problematic countries (…). So, they follow the same policy guidelines (…) but we also had to allow for the very sort of severe security situations that some of these countries can find themselves in. (Company representative)
Early actions taken in a moment of crisis
While structural aspects matter (see organizational factors and structures above), experts also highlighted throughout the interviews a strong temporal dimension in dealing with government requests. Company representatives across different types of companies, as well as independent expert advisers, agreed that the early actions taken in a moment of crisis matter, and that companies aim to respond swiftly when detecting politically motivated elements in a request, in order to ensure staff safety. For example, some company representatives described an executive-level escalation process triggered by certain flags:
if we find it an issue we would escalate even further up, kind of C-suite level (…) potentially the CEO. (…) unique about the process and through the whole process, the CEO at local level will also be involved in the process. So, they are (…) immediately notified when there is sort of an unusual or challenging request. So, the CEO at local level and (…) us at group level, are informed and then depending on the seriousness of it and the complexity of it, it's escalated further up the chain. (Company representative)
Government relationships
A complex theme to capture was the relationship between the company and the government more broadly. The government was often perceived by interviewees as an antagonist, in particular due to the perceived dichotomy between the need to uphold privacy and the increasing securitization of human rights. One interviewee expressed it in the following way: "Double standard: Busy pointing fingers at Global South, but countries are mirroring Western laws." Companies perceived governments as the root cause of dilemma situations stemming from overbroad laws and/or direct access regimes justified by a national emergency narrative. Also notable was the complexity of legal interpretations across jurisdictions, such that Mutual Legal Assistance Treaties (MLATs), for example, were not always perceived by interviewees as helpful. MLATs allow States to cooperate in information-sharing to support legal proceedings relating to criminal or public law across national borders, e.g. when a suspect from one nation is likely to be located in another nation's territory or when data relating to that individual is stored in another nation's territory. In other words, if suspect A is a national of nation A but their data is located in nation B, nation A needs to submit a request for suspect A's data to the relevant company based in nation B, via nation B's government. Nevertheless, some company representatives said that such treaties shielded them from abusive state behavior to some extent:
At least with MLAT it's dealt at State-level and then referred down to us, so we at least know that it's under the rule of law, that theoretically it had a State evaluation before it comes to us. Whereas if there is a direct request from a State that we do not think that we should be the ones who have to do that evaluation process – it must significantly increase the level of scrutiny that it requires. It takes away that insurance frame and also increases the complexity. (Company representative)
Relationships between headquarters and subsidiaries
From a practice-oriented management perspective, the theme emerging around the relationship between the headquarters and subsidiaries of the affected company was powerful. In order to shift pressure away from the people on the ground, such as affected rightsholders and staff, companies applied a range of strategies. For example, headquarters were depicted as the center of decision-making, with the power to make the subsidiary appear uninvolved in actions or to determine the jurisdiction of a "fishy" data request in a manner that protected rightsholders. Generally speaking, social media companies have more leeway to push back, as they have less infrastructure and fewer people on the ground. As a consequence, they employ tactics that shift responsibility for certain measures to the headquarters and claim that activities implemented at the subsidiary level were mandated by the group:
But what they can do is to say "we have a policy which obliges us to escalate the issue to group", that they can do and eh the policy in our procedures makes clear that decisions on such unconventional request need to be approved at group level. (Company representative)
In fact, we have the escalation process as soon as it's anything that's not a normal standard request that doesn't meet the requirements. We have a whole escalation process when the request would go to a what we call "group authority request team" which is here at group level. (Company representative)
So the process is local, so there is that is a very national issue. There is no, so to speak, no centralized here in [Western European capital] but in all countries, it is done locally and in all countries there are again specific teams for this with corresponding mechanisms then also, what they do, what they don't do, there are trainings and then also escalation mechanisms if things happen that should not happen. (Company representative)
We speak to them on a daily basis. And that for me is the real key here: Building your trust and your relationships with the guys on the ground on a day-to-day basis (…) So I think the relationship between head office and local operations is absolutely paramount to being able to a) deal with these situations in a rights-respecting but also safety-conscious manner and b) provide the kind of transparency and disclosure that external parties are seeking in regards to this area. (Company representative)
Another representative called for:
smarter ways to interact between subsidiary and group level. Whoever is receiving those law enforcement process request, should try not to allow security forces to deputize and otherwise get control over employees is important. (Company representative)
Human rights mindset
As this research was conducted using a human rights framing, it is not surprising that many experts referred to human rights in their responses. This ranged from companies stating that they actively set "the default" to a rights-respecting one when responding to data disclosure requests (for instance, by labeling metadata as content, which enjoys higher standards of protection against government access), to having a general policy that takes a stance against government interference, actively pushes back, and/or builds on active resistance, while also constructing and strategically cooperating with multi-stakeholder alliances:
[I]n all of these cases we are trying to resist. We are using our merger into the civil society, to human rights defenders and to have a civil society organization to help us in some cases where the demand is completely off the record and in which the publication from NGOs or any kind of civil society organization to point out the fact that this is not the right way to do, this is also something that we are trying to use. This is not governance, but it is definitely a way to defend our side when it is not too difficult or too dangerous. (Company representative)

Figure 2. How companies make sense of human rights when responding to data requests: a thematic analysis (own illustration).
Discussion
Recent scholarly debates in the human rights literature pertaining to business conduct have focused on technology company conduct (Ebert, 2019; Ebert et al., 2020, 2021). This paper adds to the existing, albeit scant, body of empirical studies on how companies internalize human rights considerations (Wettstein et al., 2019; Schrempf-Stirling and Van Buren, 2017), for example in their governance structures and day-to-day decision-making in the specific context of data requests in the State–business nexus. The research shows that human rights are being internalized as a vocabulary for acting on unusual data disclosure requests from governments to technology companies. The results also indicate that most large companies in the sample have adopted processes to cope with the phenomenon. The approaches the companies adopt differ between a local accountability structure and a more headquarters-oriented one, both with the same aim of protecting rightsholders.
The local accountability structure emphasizes contextual knowledge and regional legitimacy, with local company representatives making key decisions when unusual government data requests are received. The headquarters-oriented approach aims to shield local operations from pressure on the ground by elevating the issue to the group level, thereby protecting local management. Furthermore, the insights clearly demonstrate that early actions taken in moments of crisis matter significantly; for example, companies with an escalation process felt better placed to respond to unusual requests.
This research also has implications for scholarship in critical data studies, as it sheds light on a "black box" (Brevini and Pasquale, 2020; Pasquale, 2015) that has so far been insufficiently investigated due to a lack of empirical data on how companies respond to unusual data requests from governments. The research affirms the immense pressure that governments put on companies to disclose corporate data. Such insights add to emerging debates about freedom of thought and the impact that current trends in contemporary internet governance might have, e.g. on the freedom of thought of political actors who leave data traces in corporate realms while conducting their day-to-day activities (Alegre, 2022). Such granular data traces might become accessible to governments, leading to an Orwellian sense of State surveillance.
The study underscores the complex relationship between transparency and accountability within corporations. It challenges the notion of a one-size-fits-all approach and emphasizes the need for context-specific analysis of how transparency practices are implemented and how they affect accountability. This paper contributes to a better understanding of how transparency practices take place in the corporate sphere, taking into account the specific cultural and organizational settings in which such practices are embedded (Felzmann et al., 2020). The analysis demonstrates how broad the spectrum of contextual analysis applied by company staff is, and how transparency practices do not take place in a social vacuum but are specific to their cultural and organizational settings. At the same time, revisiting the debate around the verifiability versus performativity of transparency practices provides insight in this context (Albu and Flyverbom, 2019). The question arises: to what extent do the results shed light on whether these transparency practices are verifiable or performative in nature? While the documents analyzed and triangulated with the expert statements discuss government requests for user data, and while some companies provide indicative charts, it appears that the practices are mostly performative in the sense that they were exercised with certain strategic intentions (e.g. minimizing the amount of data handed over in order to shield rightsholders from harm). Certain parallels can be drawn with discussions in the aftermath of the Snowden revelations, where scholars argued that the interactions between companies and governments might have been "complex and dynamic communication processes rather than simple and straightforward transmissions of information" (Albu and Flyverbom, 2019: 283).
At the same time, it is important to consider that while technology companies put forward strong narratives of respect for human rights and transparency, some areas remain obfuscated by generalizations, which prevents an understanding of how certain thresholds for compliance or non-compliance are enacted internally.
Transparency remains "a fuzzy concept that defies precise linearity" (Felzmann et al., 2020: 3353). The appropriate reporting format should be implemented with the audience's needs and capabilities in mind. The constructive perspective on transparency and accountability, in turn, allowed for an understanding of how the construction of the themes used to respond to data disclosure requests is entwined with relations of power in contextual settings, and confirmed that the relation between transparency and accountability is complex and cannot adequately be described through a single theory or method (Heimstädt and Dobusch, 2020: 5). Rather, the results demonstrate the highly contextual nature of procedures for handling requests for user data, as well as the combinatory logic of the factors that shape the decision-making process, allowing, at least in some cases, for "retrospective transparency" (Felzmann et al., 2019). Much also depends on the type of transparency and on the level of granularity that is meaningful in the context of handling a government request for user data. Certain parallels can be drawn here with the debate on transparency in AI and algorithmic decision-making more broadly (Zerilli et al., 2019), for example on what constitutes meaningful transparency for AI in terms of "explainability," especially with regard to the intended audience and their level of data literacy (Larsson and Heintz, 2020).
The research also connects with the discourse on the standardization of transparency reporting (Parsons, 2019). The results demonstrate that companies are willing to actively manage difficult questions around data disclosures but generally have to work on a case-by-case basis. Therefore, aggregated results as currently displayed in transparency reports might only provide an indicative picture of actual conduct. Greater public scrutiny of government conduct could assist in preventing the most egregious abuses of national emergency narratives, for example for the surveillance of political opposition. Equally, such public scrutiny could enable stakeholders to differentiate companies that fail to actively manage data handover dilemmas from those that employ active measures to push back against certain unusual requests, or to minimize the risks for rightsholders potentially associated with them. This kind of accountability would open additional avenues for empowering human rights teams in technology companies by differentiating laggards from leaders in responsible business conduct when it comes to handling user data. Hence, this paper brings an additional element to the debate on how to establish better transparency around data disclosure requests from governments (Flyverbom et al., 2019; Parsons, 2019; Soghoian, 2011). As scholarship has called for more reliable figures on data disclosure requests, this research shows that corporate response processes and the ways in which governments request data matter significantly. So, while there may be national security barriers to disclosing quantitative information on requests, standardization and/or obligatory measures could mandate disclosure of the precise processes for dealing with data disclosures, on the side of both the requesting government and the request-receiving company.
Finally, in terms of practical implications, companies can use this research to enhance their accountability mechanisms and their compliance with human rights standards. It highlights the importance of proactive measures to shield user data and protect rightsholders. Equally, with regard to standardization, the insights from the analysis can guide policymakers and industry associations in developing guidelines and regulations for companies to follow. This could lead to more consistent and reliable reporting practices across the corporate landscape. Additionally, expanding the sample beyond the platform and telecommunications companies involved here to other types of technology companies could deliver insightful results.
Acknowledgments
The author would like to thank Florian Wettstein, Gina Neff and Isabelle Wildhaber for their continued support, as well as Felix Simon and Maggie Mustaklem for insightful debates.
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
