Abstract
The intervention attempts to engage critically with the Smittestopp app as a specifically Norwegian technofix. Culturally and politically, much of the Covid-19 response and the success of social distancing rules have been organized around the widespread trust in the government and public health authorities, and a focus on the citizens’ duty to contribute to the dugnaðr. The intervention argues that Smittestopp has been co-created by the mobilization of trust and dugnaðr, resulting in the launch of an incomplete and poorly defined data-hoarding product with significant vulnerabilities.
This article is part of the special theme on Viral Data. A full list of all articles in this special theme is available at: https://journals.sagepub.com/page/bds/collections/viraldata
Introduction
This intervention examines the creation of the “Smittestopp” (“Stop transmission”) app as a central public health tool for the Norwegian response to Covid-19. So far, Norway has been a Covid-19 success story. The government closed schools and businesses and halted international travel in mid-March, and significantly restricted freedom of movement throughout the country. Despite being “unprepared” for a pandemic threat which in 2019 was ranked by the Norwegian Directorate for Civil Protection as the greatest threat facing Norway (DSB, 2019), the health sector has coped well. By the first week of April, amid concerns about the lockdown’s skewed impact on children, people in institutions, and immigrant populations (Erdal et al., 2020), the outbreak was declared to be “under control” (Fouche, 2020). The lockdown was gradually eased from late April. By early June, in a population of around 5.4 million, there had been 8,504 confirmed cases, 238 deaths, and 257,303 individuals tested (FHI, 2020a). While the vigorous government response had immediate and severe implications for the Norwegian economy, its impact is expected to be partly mitigated by the country’s sovereign wealth fund (Government.no, 2020a).
What follows is an initial attempt to engage critically with the Smittestopp app as a specifically Norwegian technofix. The intervention—by its very nature, a preliminary and incomplete exercise—covers the period March, April, and May 2020. My aim is not to unpack the technical and legal aspects of Smittestopp, but to provide a first scoping of Smittestopp as a socio-technical system. I draw on Norwegian language media reports, social media entries, statements from experts, and government-issued correspondence, reports, contracts, and public health communications. English-language sources have been used where possible but otherwise all translations are my own (I am a native Norwegian speaker). The analysis also draws on insights from the national “Covid-19 and the rule of law” [Rettsstaten og Korona] webinar series I organized in spring 2020 with Malcolm Langford at the University of Oslo, and my ongoing research on ethical humanitarian innovation and the pitfalls of experimentation in emergencies (Sandvik, 2019, 2020a, 2020b, 2020d; Sandvik et al., 2017).
As observed by the sociologist Craig Calhoun (2010), during a crisis, the focus on “urgency” shapes how we understand what is happening in the world. Once an emergency is declared, the declaration determines both who is supposed to act and what is supposed to be done. In the case of Norway and Covid-19, the who is the government, and the what is (more) governance across the care-control continuum. Culturally and politically, much of the response to the pandemic, including the successful social distancing rules, has been based on a widespread trust in the government and public health authorities, particularly the Norwegian Institute for Public Health (FHI). This trust intersects with a popular belief in technological progress as inevitable, apolitical, and problem-free. As has been noted with respect to the country’s advanced digital economy: “Norwegians are quick to adopt new technology and have a high level of trust in digital services… Norway also has a well-developed infrastructure that makes it possible for most of the population…to participate in the digital transformation” (MoNE, 2017). Another central element is the deeply ingrained cultural concept of “dugnad,” derived from the Old Norse dugnaðr, meaning unpaid and voluntary work carried out collectively. In the following, I will argue that Smittestopp is based on a problem framing drawing on technology-oriented solutionism, culturally specific forms of trust, and the dugnaðr concept.
On 16 April, after a few weeks of research and development, the Norwegian government launched the Smittestopp app (The Local, 2020). The app is designed to “see who has been in the proximity of infected persons, thus helping to curb transmission” (NRK, 2020a). Smittestopp is presented as “an app that helps the health authorities to limit the transmission of Coronavirus” by producing “anonymized data about movement patterns in society” which are “used to develop effective infection control measures” (Helsenorge, 2020a). The emphasis is on the utility of the app for users, the government, and other citizens. It is a tool which will help you if you have been in close contact with another Smittestopp user who has been infected by the coronavirus. Having the Smittestopp app on your smartphone will help the health authorities to provide appropriate advice on infection control measures. If enough people use the Smittestopp app, it will help to reduce the number of people who become ill from the coronavirus. (Helsenorge, 2020b)
To contribute to comparative analysis of Smittestopp with Covid-19 tracing apps in other national contexts, the intervention considers five aspects of Smittestopp: (1) its origins, (2) its attributes, (3) criticisms and controversies, (4) issues of effectiveness and effects, and (5) the calibration and cooption of resistance in public discourse. A conclusion problematizing the mobilization of trust and dugnaðr follows.
The origins of Smittestopp
The Smittestopp app has been developed by the governmental research company Simula on behalf of the FHI. In recent years, Simula has been highly successful in attracting research funding and has also sought to carve out an increasingly visible public profile. There is no reason to believe that Simula’s initial engagement was motivated by anything but good faith, but at the time of writing, the genealogy and timeline of the app are unclear. I have reconstructed them as follows: On 11 March, the Simula CEO Aslak Tveito writes to the FHI Director General Camilla Stoltenberg to offer services with respect to data analysis and management (Copy of FOIA request, 2020a), adding that “it will not cost anything, we would like to contribute.” Elsewhere, Simula explains that they reached out to be “nice consultants” on 11 March, but then it got a lot bigger (Kibar et al., 2020b). In a statement from 27 March, Simula clarifies that development of Smittestopp started 13 March, with a team of 20 researchers and developers, and “has gone on every day since then, with very long workdays.” At this date, no agreements on financing had been signed, and Simula was covering its own costs (Simula, 2020a). In a document called “protocol for procurement,” the FHI describes this phase as a “dialogue about an idea for an innovation” in response to Simula’s initial pitch, and notes that Simula was linked up with UK partners (see below) and other government entities from 14 March (Copy of FOIA request, 2020b). A development contract between FHI and Simula—affirming that FHI had approached Simula with a view to produce a tracing app—was only signed 8 April (Simula, 2020b). The contract for operation and maintenance was signed 16 April. This contract is explicitly described by FHI as a collaboration and not a contractual relationship (Copy of FOIA request, 2020b).
This timeline sits awkwardly with other publicly available information which connects Smittestopp directly to the Science paper by Ferretti et al. (2020) and the work happening at Oxford by Professor Christopher Fraser. In a press statement from 2 April, Fraser explains that this work was “enabling” several international partners, including the FHI, “to assess the feasibility of developing mobile apps for instant contact tracing in record time” (University of Oxford, 2020). According to the project leader for Smittestopp, Fraser introduced the concept to John-Arne Røttingen, the Director of the Research Council of Norway and a previous FHI employee, with whom he was acquainted. Røttingen subsequently presented the idea of a Norwegian-based app to FHI Director General Stoltenberg. The Ministry of Health and Care Services, under whose authority FHI is located, gave permission for work on the app on 8 March. The project leader for Smittestopp was brought onboard full time only from 25 March. The project leader explains that FHI was given the go-ahead to develop the app only when the regulations for the app were issued on 27 March (Nordal, 2020).
Smittestopp was never submitted to any tender process. Simula works on scientific computing, software engineering, machine learning, communication systems, and cryptography, but appears to have had limited experience with, and competence in, app development (the making of the app was outsourced to two companies, Scienta and Shortcut), data protection and privacy issues, and epidemiology. The app has a total budget of 45 million Norwegian kroner. Simula’s share of the contract is 16 million kroner (or 1.55 million Euros in total). This includes 5 million for developing the app (sub-contracts for 2 million) and 11 million for operation and maintenance (involving affiliated Simula companies: Simula School of Research and Innovation, Simula Metropolitan, Simula UiB, and Simula Consulting). Figuring out exactly what happened here, why, when, and which networks were involved will be important not only to account fully for Smittestopp but also for developing a critical understanding of how the ecosystem of Covid-19 tracing apps works as a form of global data governance.
The attributes of Smittestopp
The legal basis for the app is “the Regulations on digital infection detection and epidemic control in connection with an outbreak of Covid-19” of 27 March (based on the 1994 Act relating to control of communicable diseases), which will remain in force until 1 December 2020. The regulations give a detailed list of the types of personal data that will be registered in the Smittestopp tracking system (mobile phone numbers, age, location data, proximity to/contact with infected individuals). Personal data from this system may be linked with a range of governmental databases. The regulations also include measures on purpose limitation, safeguards against mission and function creep, and restrictions on the commercial use of data and on sharing it with the police or the judicial system.
In an English language explainer, the way it functions is described as follows: When you are in close contact with another person who has the Smittestopp app on their smartphone, the apps will send signals to each other. If the other person has coronavirus disease, the app will remember that your phone has been in close contact with them. You will then receive a text message explaining what you should do to ensure that you do not pass on the infection and advice to help you look after your health. You will not be told who was ill with coronavirus disease, only the date on which you were in close contact with them. (Helsenorge, 2020b)
Criticism: Hasty development, poor security, experimentation, and unclear purpose
The app is a mix of infection tracing and research on population movement, supported by the government’s interest in measuring changes in behavior that have followed new government rules. There is a lack of clarity about what the app does and what it will do as its functionalities, implementation, and popular use continue to evolve. The app has been subjected to a barrage of criticism from the technology and data protection community in Norway. These criticisms can be grouped as follows:
The first group concerns quality. The app has, quite literally, been developed in “record time.” Immediately after the launch, there were much-publicized problems with user-friendliness, functionality, downloading failures, and high battery use. The process of ironing out these technical glitches is still ongoing. The GPS/Bluetooth combination is also controversial. As we know from other countries, the Norwegian government did not have to opt for this solution. Problems of efficiency, the generation of false positives, intrusiveness, and insecurity have been endemic to the fleet of Covid-19 tracking apps emerging in the spring of 2020. Smittestopp is affected by many of the same challenges (Howell et al., 2020; Ryder et al., 2020; Taylor, 2020).
Considerable concern has also been voiced about security, in particular the use of SMS as a means of notification of infection and its potential for spoofing. This involves falsification of the sender address, and the manipulation of content for purposes of fraud—tricking recipients into giving out sensitive personal data, including financial data—or harassment. According to the developers, SMS was chosen because it is a “simple solution” which everyone “knows how to respond to.” The idea was that the app was going to be simple and quick to develop (Gundersen and Skille, 2020). Long before the launch, the Norwegian tech and data protection community was voicing intense dissatisfaction with the closed source code approach, which was perceived to sideline white hat hacker assistance—i.e. a share of the dugnaðr—from the Norwegian tech community (Valen, 2020). In early April, a national group of experts from academia and industry was tasked with monitoring and evaluating the app.
A noteworthy characteristic of the app is how openly experimental Smittestopp has been throughout its lifespan (see also Gynnild, 2020). Simula and FHI have been criticized for poor risk assessment and for suggesting that security testing could happen in “a controlled manner” once the app had launched. In their first report from 9 April, the expert group warned against launching the app without appropriate testing. Despite these warnings, FHI launched the app for the entire population (Rise and Venæs, 2020). Seven weeks after the launch, the SMS function was still being tested in three urban locations, calibrating, among other factors, the number of minutes of proximity that should be required to trigger a warning. The leader of the expert committee evaluating the security and personal data aspects of the app expressed concern that limitations on the use of personal data may mean that the app will “find too few infected people.” Also problematic is that developers will not be able to use data from test municipalities (Kalveland, 2020a) due to the restrictions on using “real” personal data in software development. Notwithstanding these issues, the SMS function will be activated for the whole country when the testing phase is completed. The combination of experimentation and intrusiveness interplays with concerns related to deploying the app in Norway, a country with only 5.4 million inhabitants. Even if you remove information about phone numbers and individual users’ IDs, in a sparsely populated country certain location data will be sufficient to identify individuals. The FHI explained that, in places where there is little interpersonal contact, “we cannot rule out the possibility that the recipients of SMS notifications will know the identity of the infected individual.” Moreover, weeks into the deployment, the solutions for anonymizing data for research purposes had not been fully developed (Gundersen and Grut, 2020).
The final report from the expert group issued 20 May concluded that the security and privacy provisions of the app were inadequate (Kalveland, 2020b).
In this context, the nature of the beast matters. Simula and FHI—and the regulatory framework—all emphasize that data will not and cannot be shared, and that data will not be used by government bodies for other purposes than those originally envisioned. They also underscore that “Simula will not commercially benefit” (Schei and Trædal, 2020). Yet, both institutions are situated within larger international technology networks that share code, models, and data. An attempt by a Norwegian data security activist to use the Freedom of Information Act to obtain the source code was rejected because of possible commercial considerations regarding international interest in the code, which could potentially be made available through a license system (Schei and Trædal, 2020). Simula has later issued a statement explaining that the Smittestopp solution is already being used in international tenders with commercial partners but that Simula will not profit from this, as the app has been developed with public funding: any revenue will be “dedicated in full to the further development of the app and for research related to digital contact tracing” (Simula, 2020b). I propose that careful attention must be given to the notion of “profit” as Smittestopp—with its attendant inventory of digital bodies—travels to international markets and becomes part of the global data economy.
Will it work? If so, how?
There are several ways of thinking about the app’s effectiveness, one being to ask what is required for it to be effective and what its effects will be. A quantitative approach might accept the idea of the Smittestopp app while being highly critical of this particular execution of the concept, whether because of the technical composition of the app, the approach to testing, or the strategies for communicating with the Norwegian public. Much of the Norwegian tech community, with their iterative proposals for “fixing” the app, takes such an approach. A social engineering approach, by contrast, will question whether this kind of app can work at all. To great effect, security expert Bruce Schneier appeared on the front page of a major Norwegian tabloid to declare that “the app is meaningless” [meningsløs], asking what one would actually do with an SMS alert, since contact—defined as a proximity of 2 meters or less—is not the same as transmission (Kibar and Oterholm, 2020). Some Norwegian commentators have taken a similar stance (Aalen, 2020a).
Yet another approach is to consider not effectiveness but effects: how will the app work? Among the issues to be addressed is the question of what the digital suspicion produced by the app will do to social relations—or to the digital and physical bodies of those who are alerted that they have been in compromising proximity to an infected individual. What will be the consequence of false positives or negatives? Whereas the original story about the making of the app is framed around accepting difficult but necessary tradeoffs between speed and protection, the continuing emphasis on the emergency rationale gives pause for thought. This type of futureproofing—download the app and walk free tomorrow—is at odds with the research component of the app, which allows for storage of data beyond the 30-day limit for personal data, as well as with the public message that transmission rates may well spike again in late fall and that the worst is ahead. At the same time, with the current reproduction rate, it is likely that very few SMSs about infection will be sent out: population movement data, however, will arrive in industrial quantities. Indeed, one month after the 16 April launch, the app had helped discover 0 infected individuals (Fransson, 2020), but had registered nine billion GPS positions (Gundersen and Grut, 2020).
Three weeks after the launch, initial government tech optimism was replaced with very public concern about whether the app would work. The success of the app has always been heavily dependent on the enrollment of a large number of active users, preferably more than half the adult population. By late April, 1.5 million Norwegians had downloaded the app (NRK, 2020c). However, only just under 900,000 had activated it (Brombach, 2020). As of mid-May, the app had been downloaded 1.52 million times, but the number of active users—those submitting GPS or Bluetooth data through the app over the previous seven days—had started to plummet. In the first week of May the app lost 150,000 users. By the second week another 60,000 users were gone (Veberg and Hager-Thoresen, 2020). By early June, the app had only a little over half a million active users. In response, FHI said that this was “very sad” and explained the decline as being caused by people turning off the tracking function by answering “no” when reminded to share data or deleting the app due to the drain on batteries. FHI emphasized that it was important that using the app was voluntary and hoped that carrying a charger around and charging more frequently could be a part of the dugnaðr (Jørgenrud, 2020). Importantly, this type of statement also raises a controversial issue that has not been aired in the public debate but that can fairly be said to be a not uncommonly held view: that it would be better if downloading Smittestopp were mandatory. In early June, the Minister of Health emphasized that even if few people use the app, the data collected is very important to analyze the spread of the pandemic and the effects of public health interventions, and to optimize infection tracking (NTB, 2020). For its part, FHI explained that due to complicated data processing issues generated by the enormity of the collected data, it had so far analyzed no data (BT, 2020).
The developers and owners of Smittestopp have a great deal of wiggle room in producing a social reality of usefulness. With reproduction rates under 1, we need to know more about “production targets” once the testing phase is completed and the full functionality of the app is put in place. The developers have argued that the testing phase is a strength of the Norwegian approach, i.e. a rapidly developed work-in-progress tracing tool. This capability makes it possible to respond to changed messages from the public health authorities, for example through the technical ability to increase or decrease time frames (from 15 to 5, or from 15 to 20 minutes of proximity, within a time range made less or more than 24 hours) and to vary physical distance (from 2 meters to 1 or 5, and so forth). Technical decisions such as shortening the time frame from 15 to 5 minutes, and/or expanding the distance from 2 to 4 meters, would drastically increase the pool of people to notify of potential infection, thus helping raise the app’s profile. At the same time, it would collect and combine more data. Similarly, one could imagine a policy change of graded backdating, where individuals could be notified after the fact of contact with an infected individual who only later tested positive for Covid-19.
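The threshold tuning described here can be sketched as a simple filter over recorded contacts. This is a hypothetical illustration under assumed data structures—not Smittestopp’s actual implementation; all names and values are invented for the example:

```python
# Hypothetical sketch of how adjustable proximity thresholds determine
# who qualifies for an infection alert. Not Smittestopp's actual code.
from dataclasses import dataclass

@dataclass
class Contact:
    distance_m: float    # closest recorded proximity, in meters
    duration_min: float  # cumulative minutes spent within that proximity

def contacts_to_notify(contacts, max_distance_m=2.0, min_duration_min=15.0):
    """Return the contacts that qualify for a notification under the
    given (tunable) distance and duration thresholds."""
    return [c for c in contacts
            if c.distance_m <= max_distance_m
            and c.duration_min >= min_duration_min]

log = [Contact(1.5, 20), Contact(1.8, 7), Contact(3.5, 40)]

# Default thresholds (2 m / 15 min): only the first contact qualifies.
strict = contacts_to_notify(log)

# Loosened thresholds (4 m / 5 min): all three contacts now qualify,
# drastically expanding the pool of people to notify.
loose = contacts_to_notify(log, max_distance_m=4.0, min_duration_min=5.0)
```

The point of the sketch is that the notified population is a product of two adjustable parameters, so small technical recalibrations translate directly into larger or smaller pools of alerted citizens.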
There are also more concrete concerns that the app will have side effects and do things we do not want it to do through secondary effects on society, and on the freedom of the press and access to justice in particular. The constant government assurance that use of the app is voluntary, and that the user can delete his or her data at any time, needs to be qualified through better technical understanding of how the data of various users co-constitute each other and of how various professions and social groups relate to the app. Importantly, the Association of Norwegian Editors has expressed concern about source protection (Jensen and Nybø, 2020; Nored, 2020). With reference to protection of journalists’ sources, Prime Minister Solberg has said that while some people might have special reasons—for example, a journalist meeting a source—they can always turn off the location tracker. I have respect for people’s feelings but the fewer the people participating, the more intrusive will be the interventions in other areas. [my italics] (Kibar et al., 2020a)
The public discourse: Calibrating and coopting resistance
In the final part of this intervention, I reflect on the public discourse around Smittestopp and whether we are witnessing a cooption of resistance and the potential for critical engagement. I am interested in three aspects intimately connected to the domestic cultural context of Smittestopp:
The first concerns the provision of critique for public consumption as part of the dugnaðr. The launch has been followed by vigorous and very public debate—with legal experts and experts on domestic and IT security weighing in—often with conspicuously STS-inspired arguments focusing on unpacking the functionalities of the app and its techno-politics (Aalen, 2020b; Kaufmann, 2020). Similarly, the slippery slope argument has been routinely used, and this argument has officially been met by the assurance that Simula and FHI welcome critical scrutiny. The media coverage of global expert mobilization against these types of apps (Skille and Gundersen, 2020) has also served to situate Norway and its domestic debates as being of world-wide significance and to provide reassuring confirmation for a self-conscious nation that critical debate is going on. As researchers begin their empirical investigations of Smittestopp, it will be interesting to learn more about the extent to which the significant amount of rapid-fire academically grounded criticism (including, perhaps, blogs by this author, see Sandvik, 2020c) has been incorporated into the socio-technical work of making the app politically and culturally appropriate and acceptable—that is, the digital dugnaðr—or the opposite.
The second issue concerns the direction of criticism into bureaucratic channels and legal complaint processes, which is a key governance strategy of the heavily regulated Norwegian welfare state. Widespread unhappiness with the R&D for Smittestopp in the domestic tech community follows a familiar cultural trajectory: the app has already been the subject of a formal complaint to KOFA, the appeals board for public procurement, because the project was not put out for tender by the FHI. The point at issue is whether the criterion for exemption from the tender regulations—that the app is “essential” (livsnødvendig)—was fulfilled. Already by 16 March, Simula had obtained a legal opinion on whether exemption from a regular tender process would be legal (Copy of FOIA request, 2020c). The FHI bases its case on the provisions in the regulations on public procurement from 2016 (Hovland, 2020) for force majeure and the need for rapid delivery due to concern for the population’s life and health. KOFA’s decision is not expected any time soon. In early June, Simula and FHI began testing a new version of Smittestopp based on the Apple/Google solution (Digi, 2020)—generating yet new questions about the lack of a tender process in a context where it is very difficult to argue that Smittestopp is deployed to deal with an emergency.
The data protection authorities have carried out several investigations into the controversial technical make-up of the app itself, and have issued a set of criticisms of FHI regarding, among other things, function creep and the lack of risk and vulnerability impact analysis (Thon, 2020). As I am finalizing this case-study in early June 2020, on their website, the data protection authorities explain that “We now have to assume [my italics] that adequate considerations according to the requirements of the law have been carried out and that the FHI has adopted safe solutions for the data” (Datatilsynet, 2020). As is well known from the social movement literature, legal and bureaucratic mobilization requires energy, financial resources, symbolic capital, and time. Whether it will matter if aspects of the app and R&D process are declared to be “illegal” is unclear.
The final issue concerns the pro-tech, pro-dugnaðr position of actors enrolled to monitor Smittestopp, including those tasked with being human rights watchdogs, in a situation where Smittestopp has been mired in public controversy from the very beginning. The small epistemic communities involved in decision making in Norway have very tight networks and depend heavily on each other (grant making, revolving doors). While Norway gains much of its foreign policy clout from being a humanitarian and human rights “superpower,” it has been obvious during the Covid-19 outbreak that the Norwegian human rights community considers human rights problems to be something that happens elsewhere. Civil society has generally made very limited attempts to use human rights language to articulate criticism of the government’s Covid-19 response. In fact, with respect to Smittestopp, they have done the opposite: the National Institution for Human Rights declared its qualified support for the app, framing this as a tradeoff between tech adoption and intrusive interventions that curb human rights, in particular the human rights of the most vulnerable. They stated that legal proportionality and necessity considerations suggested the app represented a middle way between “pest and cholera” (between two evils): “Smittestopp has problematic aspects, but the alternative is probably worse” (Mestad and Skre, 2020; NHRI, 2020). The head of Amnesty International Norway commented that Smittestopp is “a mild breeze compared to the kind of surveillance that is being rolled out across the world” (Egenæs, 2020). The Director of the Norwegian Helsinki Committee proclaimed: “A Human Rights yes to the Smittestopp app” (Engesland, 2020). Notably, leading human rights academics have offered no comprehensive critiques of Smittestopp, as if the domestic context were comparatively unimportant (or already too crowded with critics).
My general concern here is that from such a position of “governance human rightism” it will be difficult to identify, articulate, and sustain a position of critical distance from the Norwegian government’s Covid-19 response, also beyond the politics of the tracing app.
On the other hand, on 19 May activists and commentators from the Norwegian tech community issued an unprecedented joint statement on the app on Medium encouraging the Norwegian Institute of Public Health “to migrate to and only rely on a system that is privacy preserving by design… and that is subject to public scrutiny, as a means to ensure that the citizen’s data protection rights are upheld.” 1 The individuals behind the joint statement have continued to self-organize through a Slack channel called “Smittestoppwatch” where they monitor the app, exchange information obtained through Freedom of Information Act requests, and provide a supportive community for discussions on Smittestopp. This group of concerned and capable citizens—made up of activists and commentators from the public and the private sector—has the potential to become a new and dynamic force in Norwegian civil society. In my conclusion, I will explain why I think this type of engagement in critical conversations is vital for any pandemic response and central to the democratic legitimacy of the dugnaðr.
Conclusion: Dugnaðr, fear and the rule of law
Regardless of whether or how Smittestopp succeeds, Norwegians—along with everyone else—need to engage in critical debates about whether apps like this can work for infectious diseases and, if so, how we want them to be added to the public health toolbox. In the case of Smittestopp, from a lessons-learned perspective, questions about the app’s effectiveness with respect to Covid-19 are less interesting than questions about its effects on politics and interpersonal relations.
With the caveat that this intervention is a preliminary commentary, and not an analysis based on empirical research, the crux of my critique is as follows: Smittestopp is premised on trust and dugnaðr and on being framed as a national digitization project. I am concerned that the problematic lifecycle of the app seems to threaten an emergent rupture between trust and dugnaðr, with potential long-term implications for national digitization. This pertains in particular to the relation between trust and the extreme haste with which the government opted to launch a highly invasive and technically unfinished app on its population—seemingly without appropriate risk assessments and with vague (and sometimes weirdly out of touch) public communication about its multiple functions, purposes, and problems. It also concerns the kinds of tradeoffs and binaries Smittestopp actors have deployed to convince citizens to “do dugnaðr.” These are partly technical, referring to specifications of the app—for example, when the FHI deputy director says that bringing a charger is a “small price to pay” if it “helps to such an important degree to stop transmission” (Zondag, 2020). However, as noted with reference to the prime minister’s equation of downloading Smittestopp with freedom, the tradeoffs are also framed as deeply moral. On his web page, the CEO of Simula claims that refusing to do digital dugnaðr because of personal data and privacy concerns is selfish: Everyone must play their part. We should all follow the recommendations of our health authorities, and one such piece of advice is to download Smittestopp. It is a compassionate act to download and use it, and a selfish act not to. (Tveito, 2020)
Moreover, whatever its legality with respect to procurement or data protection, the app—and the official rhetoric surrounding it—represents an unprecedented governmental willingness to engage in surveillance of the population. While Simula is fronting the app, the FHI is a government agency, and the Minister of Health is actively involved in supporting Smittestopp. There is always a risk that the combination of fear and consensus will result in a failure of judgement. What kind of freedom does Smittestopp bring, and what happens when this type of surveillance becomes normalized—and expected as part of Norwegian citizenship duties? Norwegians already happily let their digital bodies be produced, invaded, and controlled by international tech giants. This type of invasion is now spearheaded by the government, through the mobilization of trust and dugnaðr, which has resulted in the launch of an unfinished and poorly defined data-hoarding product. We need to ask whether this makes Smittestopp an innovation which is also disruptive as regards the rule of law.
Acknowledgements
The author is grateful to Maximilian Mayer, Bruno Martins, Antoine de Bengy Puyvallée, Katrini Storeng, Mareile Kaufmann, Hans Petter Graver, Maria Gabrielsen Jumbert, Kjersti Lohne, Helene OI Gundhus, Tatanya Valland, Jon Wessel Aass, Astrid Gynnild, Linnet Taylor, Heidi Beate Bentzen, Trond Arve Wasskog, Anne Kjersti Befring, Simen Sommerfeldt, Jarle Pahr, Malcolm Langford, the Smittestoppwatch crowd, and the students in the Sociology of law master’s program at UiO for the inspiration (and provocation) to write this and for their comments.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The writing time for this intervention was in part funded through the Research Council of Norway grant “Do no harm: Ethical Humanitarian Innovation and digital bodies.”
