Abstract
The features of the coronavirus disease 2019 (COVID-19) pandemic demand a new form of contact tracing performed by digital means: digital contact tracing. However, this mechanism raises several issues in light of relevant European regulations, particularly in terms of personal data protection and privacy. The challenge is to have a digital contact tracing model that efficiently and speedily alerts people to potential infection, so they can get tested and isolated as necessary, but that also complies with European legal standards. This paper will address two main issues. First, it will analyse digital contact tracing and its different features from the perspective of data protection and privacy, seemingly the main concern in this domain. Second, the paper will analyse the loopholes and benefits of digital contact tracing in the European context, focusing on the tension between privacy, individual liberties and public health to address its legitimacy.
Introduction
On 11 March 2020, the World Health Organization (WHO) declared the coronavirus disease 2019 (COVID-19) a pandemic. 1
This highly contagious disease, caused by a new coronavirus, poses various challenges for health authorities. First, ‘[t]he virus can spread from an infected person's mouth or nose in small liquid particles when they cough, sneeze, speak, sing or breathe heavily. These liquid particles are of different sizes, ranging from larger “respiratory droplets” to smaller “aerosols”’. 2
Second, symptoms may appear some time after infection takes place, 3 allowing infected people to pass the virus on without even knowing that they are infected. 4 Therefore, the best way to restrict the progression of the virus is by avoiding close personal contact, especially with infected people (although some infected people may be asymptomatic). As soon as someone has tested positive, they should inform everyone with whom they have recently been in contact, allowing them to swiftly seek a test and isolate if necessary. This can break the chain of infection. 5
The use of contact tracing to combat serious infectious diseases is not new. However, it has usually been done manually through interviews with infected people to identify others with whom they have been in contact. This method has some limitations, especially with infectious diseases that can be transmitted in the pre-symptomatic stage. 6 Patients may not remember or be able to identify everyone with whom they have been in contact, especially as some are likely to be complete strangers (for example, people who took the same bus or visited the same shop). 7 Moreover, the number of trained officials required to perform contact tracing for COVID-19 would be prohibitively high given the increasing number of infected individuals. 8 Even with enough staff, the process would simply be too slow to be efficient. 9 These obstacles can be surpassed using digital contact tracing (DCT); 10 that is, contact tracing performed using digital tools. 11
The paper will analyse DCT from the perspective of data protection and privacy, considering the European regulations in this domain, mostly the General Data Protection Regulation (GDPR), 12 the most important data protection regulation in the European Union (EU). 13 The discussion will be supported by references to the apps developed by several European countries. To date – July 2022 – of the group of 27 Member States, 22 states have adopted a contact tracing app, even though many (Austria, Cyprus, Czechia, Denmark, Finland, Hungary, Malta, Poland) have since discontinued it. Two of them (Greece and Slovakia) are in the process of doing so, and another (Romania) is exploring this possibility. Only three countries (Bulgaria, Luxembourg and Sweden) have no plans in this regard. 14 The majority of these apps present several common features: they work on both operating systems – Android and iOS – using Bluetooth technology with low energy consumption (Bluetooth Low Energy, BLE), based on the Application Programming Interface (API) developed by the Google-Apple partnership (Google-Apple Exposure Notification – GAEN). 15
Simultaneously, the paper will analyse the challenges posed by DCT in light of other legal norms (apart from those on data protection) applicable in the EU. It will focus on the tension between privacy, individual liberties and public health, aiming to clarify whether any of these poles has enough weight to subdue the others.
The paper has two main aims, the second being a consequence of the first: (1) to analyse the different features of DCT in light of EU law; (2) based on that analysis, to recommend the most suitable features for contact tracing apps in light of EU law.
Different models of DCT in light of European law
A. What kind of data to use: Personal data or aggregated data?
Aggregated data are compiled or summarized individual data, typically used to identify trends, report statistics or conduct comparative studies. ‘Aggregation implies that variation at the individual level is lost’, 16 that is, that data are anonymized, 17 as the individuals concerned cannot be identified. A DCT app using purely anonymized data would fall outside the scope of the GDPR, which protects only personal data, as stated in Article 2 of the GDPR. However, if these originally anonymized data were to become identifiable (which might happen), 18 the protection provided by the GDPR would be reinstated. 19
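To illustrate how aggregation discards individual-level variation, consider a minimal sketch (the user identifiers and district names are purely hypothetical): once the records are reduced to per-district counts, no individual can be singled out from the result.

```python
from collections import Counter

# Hypothetical individual-level records: (user_id, district) check-ins.
# Taken individually, each record is personal data under Article 4(1) GDPR.
records = [
    ("u1", "Baixa"), ("u2", "Baixa"), ("u3", "Alfama"),
    ("u4", "Baixa"), ("u5", "Alfama"), ("u6", "Belem"),
]

# Aggregation: keep only per-district counts ('cartography'),
# discarding user identifiers so individuals cannot be re-identified.
density = Counter(district for _, district in records)

print(density)  # Counter({'Baixa': 3, 'Alfama': 2, 'Belem': 1})
```

The aggregated output answers the public health question (where are devices concentrated?) while the identifiers that made the input personal data never appear in it.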
The European Data Protection Board (EDPB) 20 has a clear preference for the use of aggregated data: ‘[p]ublic authorities should first seek to process location data in an anonymous way (i.e., processing data aggregated in a way that individuals cannot be re-identified), which could enable the generation of reports on the concentration of mobile devices at a certain location (“cartography”)’. 21
Nonetheless, it is nearly impossible to perform DCT only with aggregated data, that is, anonymous data. 22 Moreover, there is no guarantee that anonymization is irreversible. Artificial intelligence might offer a way to identify the data subject. 23 Thus, not even the use of aggregated data would solve all data protection issues.
Whenever personal data are involved, data protection regulations must intervene. DCT raises issues in light of the GDPR 24 because it processes personal data; that is, ‘information relating to an identified or identifiable natural person’ (Article 4(1) of the GDPR). DCT may involve the processing of location data, which can be considered personal data according to the definition provided in Article 4(1) of the GDPR, which expressly refers to location. 25
B. Should the app be mandatory or voluntary?
A legal ground for DCT under the GDPR
Under European law, the decision on whether the app should be mandatory or voluntary depends on the selected legal ground for data processing. The GDPR allows for the processing of personal data in certain circumstances, set out in Article 6(1). First of all, when ‘the data subject has given consent to the processing of his or her personal data for one or more specific purposes’ (Article 6(1)(a) of the GDPR), 26 based on the information to be provided under Articles 13 and 14 of the GDPR. Another legal ground can be found in Article 6(1)(c) (‘processing is necessary for compliance with a legal obligation to which the controller is subject’), based on the understanding that Member States are obliged to prevent and combat cross-border health threats in light of the Health Threats Decision, 27 a milestone in the EU for preventing and combatting public health emergencies. 28 Article 6(1)(d) (‘processing is necessary in order to protect the vital interests of the data subject or of another natural person’) has also been invoked as a possible legal ground on the assumption that DCT is necessary to protect infected individuals and provide them with speedy care and information. 29 A different legal ground for data processing can be found in Article 6(1)(e) of the GDPR: when data processing is ‘necessary for the performance of a task carried out in the public interest’. If this is the legal ground for DCT, the basis for the processing must be established either by EU law or by the national law of the Member States (Article 6(3) of the GDPR).
If the legal ground is found in Articles 6(1)(c) or 6(1)(e) of the GDPR, an additional requirement is demanded: an EU law or national law to act as a legal base for the data processing. The law in question must provide for adequate safeguards of privacy, taking into account the impositions of the GDPR, such as the duty to provide all relevant information to the data subject, limitations on the number and type of data collected and stored, and definitions of responsibilities.
If Article 6(1)(a) of the GDPR is not used as the legal basis to implement DCT, consent is not legally required. Nevertheless, obtaining consent might still be advisable. 30 Both the European Commission 31 and the EDPB 32 argued that consent should be sought, even if it is not the selected legal ground.
Even more problematically, DCT involves data concerning health 33 (as defined in Article 4(15) of the GDPR), a particularly sensitive type of personal data, whose processing is generally forbidden under Article 9(1) of the GDPR. This ban can be waived under certain circumstances, but special protection for data concerning health is still required (Recital 53 of the GDPR). The processing of health-related data is permitted with the specific consent of the data subject (Article 9(2)(a) of the GDPR). It is also allowed when ‘processing is necessary for reasons of substantial public interest’ (Article 9(2)(g) of the GDPR), ‘necessary for the purposes of preventive or occupational medicine’ (Article 9(2)(h) of the GDPR) or ‘necessary for reasons of public interest in the area of public health’ (Article 9(2)(i) of the GDPR). 34 The COVID-19 pandemic clearly fulfils at least one of these conditions, especially the last one, which seems the most suitable legal ground for a pandemic scenario. 35
The position of EU institutions
Both the guidelines developed by the EDPB 36 and the communications from the European Commission 37 state that DCT should be voluntary. 38 The Commission clarifies that not only the downloading of the app but also the use of its different functionalities should depend on the data subject's consent: ‘[t]he installation of an app on a user's device should be voluntary; a user should be able to give their consent to each functionality of an app separately’. 39 The action of downloading the app must be distinguished from the action of having the data processed. 40 Whereas downloading the app requires previous consent, the subsequent processing of data might be based on personal consent or on any of the remaining legal grounds applicable to the situation, as established in Article 6 of the GDPR.
Even if data processing ends up being grounded on a different legal basis from Article 6(1)(a) of the GDPR, this does not mean that the data subject has no say in the process. Consent can still be required, even if this is not the legal ground for data processing. In addition, the exclusion of consent does not exclude other rights, such as the right to be provided with information (Article 13 of the GDPR), the right to access the data (Article 15 of the GDPR), the right to rectification (Article 16 of the GDPR), the right to be forgotten once the data processed are no longer necessary (Article 17 of the GDPR), the right to restriction of processing (Article 18 of the GDPR) and other rights provided by the GDPR.
Effectiveness of voluntary apps
Basing DCT on consent raises two issues: one concerning effectiveness (a strict requirement of voluntariness might jeopardize the purpose of the app) and the other concerning the relative importance of different values (should data protection and privacy prevail over public health?). 41
A decisive criterion to consider when deciding if the app should be mandatory or voluntary is its effectiveness, which, in turn, depends on the threshold of effectiveness, that is, how many people have to use the app to achieve positive results. The first approach to this question claimed that effective results could only be achieved with a substantial number of users. A study by the University of Oxford pointed to 60% population coverage, 42 which so far has not happened in Europe 43 (in fact, nowhere in the world). Nowadays, however, this objection is greatly weakened. First, it became clear that different studies came to different conclusions. For instance, Abueg et al. concluded that DCT apps present positive results even with low percentages of adherence: it is enough for 15% of the population to use the app to reduce mortality by 6% and infections by 8%. 44 This conclusion is reinforced by examples of positive outcomes reached in countries with lower usage percentages. 45 Secondly, the authors of the Oxford study eventually clarified their conclusions, stating that even a small percentage of users allows beneficial results: ‘Even with lower numbers of app users, we still estimate a reduction in the number of coronavirus cases and deaths […] If you have 10% of people using the app, then the chance of contact between two people being detected is 10% of 10%, which is 1% – a tiny fraction. What we found in the simulation was that that actually isn't the case. We've been working to understand why we actually see benefits of usage accruing’. 46
A possible solution would be to keep DCT voluntary but make the use of the app a precondition to benefitting from several services in everyday life. 47 For instance, only people using the app could be allowed to access cinemas, restaurants or taxi services. Ultimately, using the app could even be a condition to receive medical care. However, such false voluntariness is not a desirable solution, and the European Commission states that refusing to participate in DCT cannot carry any negative consequences for the individual. 48
The opt-out model
A way to ensure that DCT complies with consent requirements but effectively achieves public health purposes is to use an opt-out model 49 (whereby the app is downloaded by default and those who oppose it must express their opposition) 50 instead of an opt-in model (whereby the user has to download the app, as proposed by the Commission).
The opt-out model would increase efficiency by ‘encouraging’ people to use the app, as this does not come naturally, and people tend to go along with what already exists by default. Mello and Wang offer the example of Singapore: 51 although its population meets all the requirements for widespread use of the app (a digitally literate population whose members are accustomed to public intrusions into private life), only 20% of Singaporean smartphone users have downloaded the app. 52
The problem is that the opt-out model is excluded by the GDPR whenever consent is the legal basis for data processing (Articles 6(1)(a) and 9(2)(a) of the GDPR). When that is the case, the subject's consent must comply with very strict requirements. Recital 32 of the GDPR clearly states that consent must be unambiguous; that is, an affirmative action on the part of the data subject is required. As expressly stated by this recital, ‘[s]ilence, pre-ticked boxes or inactivity should not therefore constitute consent’.
Therefore, the legal basis for this specific data processing should not be consent, in line with what has been suggested by the EU. Assuming a different legal basis is selected, consent can (and should) be sought, but it will not be subject to such demanding requirements. In particular, this will allow the opt-out model and a more flexible solution in case of withdrawal, excluding the requirements of Articles 7(3) and 17(1)(b) of the GDPR. 53
C. How should the app operate?
The operating mode of digital contact tracing apps can vary considerably with regard to the type of technology used (GPS or Bluetooth), the data collected (location or proximity) and the roles of central sources and individual devices (centralized or decentralized). The choice between these features must comply with the general data protection principles listed in Article 5 of the GDPR. 54
The principles of the GDPR
The first set of principles comprises lawfulness, fairness and transparency (Article 5(1)(a) of the GDPR). Lawfulness requires data processing to be performed based on one of the legal grounds specified in Article 6 of the GDPR. Fairness requires that data are collected in an unbiased way, not by tricking the data subject (that is, the app's user). From the moment data are collected, the data subject must be fully aware of the purposes for which data will be used.
Transparency entails the provision of information to both the data subject and the public. In some situations, however, the data controller may be excepted from informing the data subject, as established in Article 14(5) of the GDPR. One such case is when it ‘proves impossible or would involve a disproportionate effort’ (Article 14(5)(b) of the GDPR) to provide such information. This exception might offer a seductive way to avoid complying with some of the requirements imposed by the GDPR. However, the former Article 29 Data Protection Working Party (Article 29 Working Party) clarified that the situation where it ‘proves impossible’ under Article 14 (5) (b) to provide the information is an all or nothing situation because something is either impossible or it is not; there are no degrees of impossibility. Thus, if a data controller seeks to rely on this exemption, it must demonstrate the factors that actually prevent it from providing the information in question to data subjects. 55
Moreover, data processing must be done in a manner open to public scrutiny. For instance, the algorithms used by the app must be made public and frequently supervised to ensure accuracy. 56 The developers of the Portuguese app StayAway Covid tried to be as transparent as possible: the source code and the data protection impact assessment were made public, the National Cybersecurity Centre participated in the code audit and a prior consultation with the supervisory authority was carried out. 57
The principle of purpose limitation (Article 5(1)(b) of the GDPR) requires that data are used for a specific and clear purpose and prevented from being used for different purposes. The data controller must guarantee that the data will not be used (either during the pandemic or afterwards) by employers, insurance companies, immigration services, law enforcement or any other public or private entity to take actions harmful to the data subject. Ultimately, DCT could become a tool for mass surveillance in the hands of governments. A more commendable purpose for the use of data collected by DCT is scientific investigation, for example to develop effective vaccines 58 or a way to successfully treat COVID-19. Yet no matter how laudable this purpose may be, scientific investigation is not the aim of DCT. Researchers should only use the data for medical investigation under the requirements outlined in the GDPR. 59
The principle of data minimization (Article 5(1)(c) of the GDPR) involves several conditions. First, when possible, anonymized or pseudonymized data should be used. 60 These two concepts must not be confused: it is impossible to identify an individual from anonymized data, but this is not true for pseudonymized data. Therefore, pseudonymized data are still ‘personal data’ in light of Article 4(1) of the GDPR, and thus remain under the rules of the GDPR (Recital 28 of the GDPR). Second, only data considered necessary for the envisaged goal can be used. Unnecessary information (for example, a person's identity, usernames, passwords, marital status, full medical history and financial records) cannot be collected. Moreover, effective DCT does not require the locations of the app's users, but merely proximity information. Third, measures should be taken to prevent a user from being identified by those who are notified. It has been argued that the only information communicated to an app user should be where and for how long the suspected contact took place, 61 but even this might not be necessary; instead, we could have a model in which solely the existence of a suspected contact is communicated. In addition, the information should be kept on the user's device (the decentralized model), and only shared with health authorities if the user tests positive for COVID-19. 62 The information should not be shared with the app's host or with any other third party.
The principle of accuracy (Article 5(1)(d) of the GDPR) is linked to the reliability of the data. To ensure that the data are correct (and thus that they lead to accurate public health decisions), control mechanisms must be put in place, not only to identify erroneous data but also to amend observed errors. Moreover, the principle of accuracy rests on the precision of the algorithm used, an aim that can only be achieved through constant monitoring by experts to avoid false positives or false negatives. Public trust, which is crucial to the success of DCT, relies on accuracy. 63
The principle of storage limitation (Article 5(1)(e) of the GDPR) requires that data are kept only for a certain period. Ultimately, the deadline to cease DCT and destroy the data that have been stored is established by the principle of necessity: when the pandemic is deemed controlled, DCT will no longer be necessary. 64 The need for the DCT app to be exceptional is emphasized by both the EDPB 65 and the European Commission. 66 When the pandemic is deemed controlled, the current restrictions on privacy must be withdrawn, the DCT app must be deactivated (which the European Commission suggests should happen automatically, with no need for the user to uninstall the app) 67 and the data collected by DCT must be destroyed 68 without further use, or at least anonymized. The scenario in which the pandemic will be considered ‘controlled’ should be defined in advance (a threshold of 60% for herd immunity, for example), 69 a decision that must be taken in light of existing epidemiological knowledge.
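The storage-limitation logic can be sketched as follows. The record structure and the 14-day retention window are assumptions for illustration, not features of any specific national app: proximity records older than the window are simply dropped.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=14)  # assumed epidemiological relevance window

def purge_expired(contact_log, now):
    """Drop proximity records older than the retention period
    (storage limitation, Article 5(1)(e) of the GDPR)."""
    return [rec for rec in contact_log if now - rec["seen_at"] <= RETENTION]

now = datetime(2022, 7, 1)
log = [
    {"ephid": "a1", "seen_at": now - timedelta(days=3)},
    {"ephid": "b2", "seen_at": now - timedelta(days=20)},  # past retention
]
print([r["ephid"] for r in purge_expired(log, now)])  # ['a1']
```

Running such a purge automatically on the device, rather than relying on the user to delete data, is what makes the retention limit enforceable by design.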
However, there might be valid reasons to keep the data. Parker et al. argue that these data may offer a valuable resource to protect future generations from harm 70 in the event of, for instance, future pandemics. Indeed, Article 5(1)(b) of the GDPR states that some data may be stored for specific purposes, that is, ‘archiving purposes in the public interest, scientific or historical research purposes or statistical purposes’, following the safeguards stated in Article 89 of the GDPR, namely anonymization or pseudonymization. Public health is not among these purposes, but scientific research could be used as a justifying purpose.
The principles of integrity and confidentiality (Article 5(1)(f) of the GDPR, in connection with Article 32(1) of the GDPR) require appropriate security measures to be put in place, namely cryptographic techniques to prevent undue access (by employers, insurance companies or pharmaceutical companies, for example). Appropriate security measures must be put in place to guarantee security by default (continuous compliance with the highest security standards) and security by design (implementation of security measures from the very first moment of data processing) 71 (Article 25 of the GDPR). The data anonymization process is not easy, and even small details can reveal a person's identity. 72 If the level of cybersecurity is too low, the app becomes a danger in itself. 73 For instance, in April 2021, there was a data breach involving the Dutch app: the app source code was published online, which led to the exposure of 200 names, email addresses and passwords. 74
The last of the data protection principles of Article 5 of the GDPR is the principle of accountability (Article 5(2) of the GDPR), according to which the data controller (public or private) and its role and responsibilities should be clearly defined. In this regard, the European Commission recommends that a public authority serves as the DCT data controller. 75 Moreover, affected individuals should be provided with efficient judicial mechanisms to enforce their rights.
The features of this particular form of data processing lead to the conclusion that it might pose a high risk to the rights and freedoms of natural persons. Features such as its implementation through new technologies, whose possible consequences and loopholes have not yet been properly identified, its use of especially sensitive data (health-related data) and its large-scale implementation raise concerns. Thus, it is necessary to perform a data protection impact assessment according to Article 35 of the GDPR.
Tracking proximity or location
Location data 76 identify a target's latitude and longitude at a specific time. Location information can be obtained from various sources, such as telecommunication companies, other companies that offer digital services (social networking sites, search engines) 77 and specific apps that collect relevant data. Every time we use a mobile phone, we release location data (otherwise the phone would not work): the phone connects to a cell tower, mapping our movements for the telephone company, with an accuracy level of a few hundred metres. Location can also be tracked using GPS, with a few metres’ accuracy. This method is very common among apps, both those that require location data to operate and those that do not require location information but still ask for it. 78
The collection of proximity data or location data depends on the specific technology used by the app: GPS or Bluetooth. Bluetooth is a wireless technology for short-distance data exchange. An app performing DCT via Bluetooth collects not location data, but only data on users’ proximity to other app users and the time and duration of their contact. Users are identified by anonymized codes that change periodically. If a person is identified as being infected, users who have been in close physical contact with him/her are tracked and receive a risk alert, so they can self-quarantine, get tested and, if necessary, isolate.
In contrast, DCT using a satellite – that is, the Global Positioning System (GPS) – tracks the location of every user, and therefore collects information on people's movements. A privacy breach would thus have much more serious consequences, as location can reveal a lot about a person.
Furthermore, Bluetooth and GPS contact tracing also differ in their accuracy.
Realistically, GPS accuracy ranges from 5 to 20 m in the open sky (whereas COVID-19 can spread within a 1-m distance). 79 However, such accuracy is achieved only in the absence of obstacles, such as high buildings or thunderstorms, and GPS is barely functional indoors. Moreover, GPS contact tracing does not indicate whether, for example, two people are sitting on the same sofa or whether they are in different rooms, separated by a wall.
Bluetooth is the more precise technology, 80 but even this model can lead to mistakes. For instance, it cannot determine whether a user is wearing personal protective equipment, which, if used, would reduce the risk of infection. Experts point out that relying implicitly on the strength and duration of a Bluetooth signal can easily lead to false negatives 81 or false positives. 82 Note, however, that a mere notification does not necessarily mean that the person is infected, so the user should receive additional information on how to behave.
The main problem is that location data are never really anonymous. 83 They can reveal a user's religion (is the user in a mosque?), sexual orientation (is the user in a gay bar?) and political ideology (is the user in the headquarters of a right- or left-wing political party?). 84 Thus, the EDPB recommends using an app that offers proximity alerts (for infected users) via Bluetooth 85 but does not reveal the location. 86 Proximity data merely reveals that the devices were close to each other, but not where they were.
Centralized or decentralized model?
In a centralized DCT system, 87 information uploaded to the app is kept on a remote server (a central source operating as a single server) that receives information from the app's users, matches this information and notifies the users of other devices in cases of infection. The central database gathers a great deal of information that may be misused, even by authorities, and may also, therefore, become an appealing target for hackers. ‘If the central server is compromised, the entire network is’. 88 Thus, privacy is a relevant concern under this model.
In contrast, in the decentralized system, all information is kept on the user's smartphone, which also notifies other devices in the event of infection without the intervention of a central server. The most common decentralized version of DCT relies on Bluetooth Low Energy technology. Consider the case of the Google-Apple API (application programming interface), which operates using a decentralized model. 89 When two devices equipped with this technology come into proximity, they both emit and receive (and register) the so-called ephemeral identifiers (EphIDs). Thus, if an app user tests positive for COVID-19, his/her device will be able to show recently registered (within 14 days) EphIDs from other devices. This information will allow the identification of nearby devices and analysis of the potential level of infection risk. 90 No central device receives the information; all devices play a similar role. The difference between the centralized and the decentralized model is not so much the existence of a central server, which exists also in some decentralized models, but where the main tasks of contact tracing (the generation of unique identifiers, the calculation of the epidemiological risk) are executed. These tasks can take place either on the central server or on each device. 91
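As a rough illustration of this decentralized matching, the following sketch keeps all observations on each device and publishes only the identifiers of a user who tests positive. It is a deliberate simplification: real protocols such as DP-3T or GAEN derive EphIDs cryptographically from a secret key rather than generating them at random, and the names used here are invented for the example.

```python
import secrets

class Device:
    """Deliberately simplified sketch of a decentralized DCT device."""

    def __init__(self):
        self.own_ephids = []          # identifiers this device has broadcast
        self.observed_ephids = set()  # identifiers heard from nearby devices

    def broadcast(self):
        # Random tokens stand in for cryptographically derived EphIDs.
        ephid = secrets.token_hex(8)
        self.own_ephids.append(ephid)
        return ephid

    def observe(self, ephid):
        # Observations stay on the device (data minimization).
        self.observed_ephids.add(ephid)

    def check_exposure(self, published_ephids):
        # Risk matching happens locally; observed data never leave the device.
        return bool(self.observed_ephids & set(published_ephids))

alice, bob, carol = Device(), Device(), Device()
bob.observe(alice.broadcast())  # Alice and Bob come into proximity
# Alice tests positive: only her own broadcast identifiers are published.
print(bob.check_exposure(alice.own_ephids))    # True
print(carol.check_exposure(alice.own_ephids))  # False
```

Note where the computation happens: the published list contains only the infected user's own identifiers, and each device decides locally whether it was exposed, which is precisely the design feature that keeps observation data off any central server.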
These features have important implications for data anonymity. The definition of data as anonymous depends on the understanding of this concept. The former Article 29 Working Party stated that for data to be considered anonymous, it must be impossible to reidentify the data subject. 92 According to this definition, if there is a way, no matter how costly and difficult, to reverse the anonymization procedure, the data are not anonymous. However, this theory is not compatible with the wording of the GDPR 93 or the case law of the CJEU, 94 which do not require it to be absolutely impossible to reverse the anonymization; it is sufficient to demonstrate that this could not be done without making an excessive effort and/or using illegal means. 95
In the centralized system described above, the end user may be identifiable. 96 As information about an end user accumulates, it becomes easier to identify him/her. 97 In other words, under the centralized system, there exists, at least in theory, the possibility of identifying and tracking a user. Data are not truly anonymous, merely pseudonymous. 98
In contrast, in a decentralized system, all information is kept on users’ smartphones. Thus, ‘[t]he only way to link a device across multiple broadcast identifiers is by using information held on that individual's device. This would require access to that device or obtaining recordings from the device. Doing so without permission would represent a computer misuse offence in most jurisdictions’. 99 This makes the data ‘nearly anonymous’. 100
For these reasons, the decentralized model presents less risk in terms of security breaches. 101 The EphIDs of each app user are generated by his/her personal secret key, not by an external source. In addition, the calculation of risk takes place solely on the end user's device; therefore, risk-related information never leaves the device, 102 which allows for better compliance with the principle of data minimization. Moreover, the entire technological design of the decentralized systems is based on privacy concerns (privacy by design), 103 which allows better compliance with the principles of data minimization and purpose limitation. 104
The guidelines of the EDPB 105 and the European Commission 106 do not express a clear preference for either model, leaving the decision up to national authorities, but the European Parliament recommended decentralized data storage. 107 In the EU, most countries adopted the decentralized model. The notable exception is France, which insists on the centralized model, a feature that probably explains its very low adoption rate. 108
A recommended DCT model for Europe
A recommended model of DCT for Europe, that is, a model that complies with EU law requirements, is a voluntary app based on the decentralized system, 109 operating via Bluetooth and providing solely proximity data. The opt-out model is recommended to encourage uptake, but it can only be implemented assuming that the legal ground for data processing is not the data subject's consent (Articles 6(1)(a) and 9(2)(a) of the GDPR). Compliance with the GDPR and with other EU regulations is the best way to reinforce citizens’ trust and thus encourage the use of the app in the event that it is voluntary (as it should be). 110
The API launched by Apple and Google fulfils many of the privacy requirements that have been pointed out in the EU 111 and abroad: 112 (a) the app is voluntary, based on the opt-in model; (b) it is prohibited to collect or use GPS location data from the phone; (c) Bluetooth beacons and keys cannot reveal the user's identity or location; (d) users are prevented from identifying the person who has tested positive; (e) the apps based on this protocol ping at least once every 5 min as long as Bluetooth is enabled; each ping contains the device's current Rolling Proximity Identifier, 113 but this will change every 10–20 min to reduce the risk of pings being used by others to track people's location; (f) the apps will be used solely for notification of exposure to a suspicious contact in the context of the fight against the pandemic carried out by public health authorities; (g) no other use of user data is permitted, including targeted advertising. 114 Such features seem to meet the criteria of privacy by design.
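The rotation mechanism in point (e) can be sketched as follows. The HMAC-based derivation below is a simplified stand-in for the AES-based scheme the GAEN specification actually uses, but it captures the key property: the identifier is a pure function of a temporary key and the current 10-minute window, so it rotates automatically and cannot be linked across windows without the key.

```python
import hashlib
import hmac

EN_INTERVAL_SECONDS = 600  # the GAEN specification uses 10-minute intervals

def interval_number(unix_time: int) -> int:
    """Index of the 10-minute window containing unix_time."""
    return unix_time // EN_INTERVAL_SECONDS

def rolling_proximity_id(temp_key: bytes, unix_time: int) -> str:
    # Simplified stand-in for the real AES-based derivation: the
    # identifier depends only on the key and the current window.
    msg = interval_number(unix_time).to_bytes(4, "big")
    return hmac.new(temp_key, msg, hashlib.sha256).hexdigest()[:32]

key = b"hypothetical-temporary-key"
t = 600_000_000  # an arbitrary moment, aligned to a window for the example
same = rolling_proximity_id(key, t) == rolling_proximity_id(key, t + 300)
rotated = rolling_proximity_id(key, t) != rolling_proximity_id(key, t + 1200)
print(same, rotated)  # True True
```

Within one window the identifier is stable, so repeated pings can be matched by a nearby device; across windows it changes, which is what limits third parties' ability to track a person's movements through their pings.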
These technical specifications made the Google-Apple API very attractive in the EU, and most European countries use it as the standard for their national apps. Although all these apps share the main protocol, each country has developed its own specific features for the aspects not covered by the API. 115 The features that are part of the API, however, must be accepted as they come and cannot be changed by national authorities. Notably, two high-tech companies (with a dubious past in terms of data protection) are the ones paving the way for a secure model (as secure as it can be) of digital contact tracing. 116
In the end, DCT apps were far from successful, mainly due to very low adoption rates, 117 which had several causes, from battery drain to fears of privacy threats. Even though positive results can be achieved with low adoption rates, this is one of those cases in which the more, the better. Had the apps been better received by the community, the results would certainly have been different.
Footnotes
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the Universidade de Macau (grant number MYRG2019-00035-FLL).
