Abstract
This article analyzes the interaction of EU competition, consumer, and data protection law in the digital economy. We compare the objectives, rules, and enforcement structures of these legal regimes, and we discuss the market failures that justify regulatory intervention in digital markets. In particular, the Facebook investigations in Germany and Italy are selected as a case study. The Bundeskartellamt’s investigations are remarkable as the first in which an exploitative abuse of dominance involving a digital platform has been decided under competition law. We compare them with their Italian counterpart, in which the AGCM recently sanctioned Facebook for behavior similar to that investigated in Germany. Yet the Italian case was decided under consumer, rather than competition, law. This illustrates the regulatory dilemma faced by European antitrust authorities, which are currently struggling to find a solution to the market failures arising in digital markets.
I. Introduction
The digital economy is characterized by rapid technological developments and the combination of economic and digital power. The corresponding unprecedented magnitude of data collection and the indispensability of online platforms for markets and citizens raise challenges for both society and legislators. Currently, the law is struggling to find appropriate answers.
As further discussed in Section II, digital markets are characterized by a number of “market failures.” The lack of informed consent by platform users, in combination with the so-called “privacy paradox,” leads to opaque terms of use on online platforms and to the inability of online markets to cater to the privacy preferences users have. These are two examples of the market failures that characterize digital markets. In this article, we will, from a European perspective, analyze and discuss the available regulatory tools to tackle these market failures, and we will try to identify the most suitable one.
Following the entry into force of the General Data Protection Regulation (GDPR), 1 the degree and uniformity of privacy protection have further increased in Europe. Several provisions of the GDPR aim at tackling market failures in digital markets, such as those requiring the data subject’s “informed” consent. In addition, by sanctioning misleading and aggressive commercial practices, consumer law also safeguards the final consumer’s “informed” choice. The provisions of the Unfair Commercial Practices Directive, as transposed by the EU Member States at the national level, are thus applicable to online transactions involving final consumers. 2 Last but not least, in the European Union, market failures in digital markets can also be tackled via antitrust law. Contrary to the situation in the United States, 3 Art. 102(a) of the Treaty on the Functioning of the European Union (TFEU) 4 sanctions “unfair trading conditions” imposed by dominant firms on their customers. Unfair contractual clauses imposed by dominant online platforms on their users could therefore, in principle, also be sanctioned as an exploitative abuse of a dominant position.
In this article, we discuss the interaction of EU competition, consumer, and data protection law in digital markets by looking at the Facebook “odyssey.” In particular, after a discussion of the main market failures that characterize digital markets (Section II) and an overview of the objectives, rules, and enforcement structures of these three areas of EU law (Section III), we discuss the investigations by the German Competition Authority (Bundeskartellamt) in the Facebook case (Section IV). The 2019 Facebook decision is interesting because it is the first case in which an exploitative abuse of dominance involving a digital platform has been sanctioned under competition law. The case is compared with the recent Italian Facebook case, in which the Italian Competition Authority (Autorità Garante della Concorrenza e del Mercato) sanctioned Facebook for behavior similar to that investigated by the German Competition Authority. Nevertheless, the Italian case was decided under consumer, rather than competition, law. The Facebook odyssey thus represents a good example of the “regulatory dilemma” currently faced by a number of National Competition Authorities (NCAs) in Europe, as they struggle to find a solution to the market failures arising in digital markets. In Section V, we conclude by advancing some ideas on how enforcers could solve this regulatory dilemma.
II. Market Failures in the Digital Economy
A. The Economics of Privacy: An Adjusted Definition of “Market Failure”
In this section, we analyze the market failures that characterize the data economy. The focus lies on those business models that trigger the applicability of data protection law and thus have implications for privacy matters. Social networks are an obvious example, but so are many other online platforms. In order to further define our scope of analysis, we first need to take a look at the economics of privacy. Privacy has traditionally been defined as “the claim of individuals, groups, or institutions to determine for themselves when, how and to what extent information about them is communicated to others.” 5 Under Art. 2(1), the GDPR is applicable (only) when personal data relating to data subjects are processed. 6 When this is the case, data controllers are responsible for ensuring compliance with the provisions of the GDPR and the rights afforded to data subjects. 7
Economic analyses of privacy have shown that it is not possible to give a clear, uniform answer to the question whether the disclosure of personal data is beneficial for data subjects (here: final consumers, users) or data controllers (here: market-dominant companies, undertakings), respectively. 8 Economic efficiency in privacy matters depends on many diverse factors, such as the respective market, the individual preferences of those concerned, and the specific situation. As a rule of thumb, data controllers benefit from an increasing disclosure of personal data on the user side, which is why they tend to collect as much user data as possible. 9 Yet this beneficial effect does not materialize in every case. 10 Furthermore, in many situations, data subjects benefit from data disclosure, too, since they can enjoy personalized and better services. Nevertheless, these effects are context-sensitive, and therefore no general statements can be made.
One of the reasons for these ambiguous results is that the economic value of privacy is twofold in nature. 11 Privacy can serve as an “intermediate good,” which is the case when the nondisclosure of information is beneficial for data subjects or when they can choose to disclose personal data in lieu of a monetary payment. 12 This first dimension, in theory, allows the assignment of a certain monetary value to privacy rights. Yet, privacy also serves as a “final good.” 13 This dimension has a purely normative origin and character. Its legal protection has its origin in constitutional documents, most notably Art. 8 of the EU Charter of Fundamental Rights (“Protection of personal data”). Privacy preferences are personal and diverse by nature. 14 What one person considers to be highly sensitive information might be readily and willingly disclosed by someone else. 15
Against the backdrop of this normative layer, we have chosen to look at the question of whether or not there is a market failure by using a normative approach. Instead of solely looking at economic efficiency, we borrow from Alessandro Acquisti, asking: “will market forces be able to maintain a desirable balance between privacy and disclosure, in a world where most of our personal and professional lives unfold trails of electronic data, and where powerful economic interests favor information availability over information protection?” 16 We start by analyzing the role of consent under the GDPR, since this legal framework significantly predetermines and shapes the behavior of market players.
B. The Regulation and Notion of Consent under the GDPR
The GDPR relies on the traditional principle under EU data protection law that the processing of personal data is prohibited unless a legal basis for it can be invoked by the data controller. 17 Art. 6(1) GDPR provides an exhaustive list of legal bases, such as contractual necessity and legitimate interests pursued by the data controller. 18 One widely relied-on legal basis is consent given by the data subject: Art. 6(1)(a) GDPR. Art. 4(11) GDPR states that consent “means any freely given, specific, informed and unambiguous 19 indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.” Art. 7 GDPR further specifies the conditions for valid consent. For instance, the data subject is free to withdraw consent at any time (Art. 7(3) GDPR). In the Recitals of the GDPR, more detailed explanations are given. 20 When it comes to the processing of special categories of personal data, such as those concerning the data subject’s health or ethnic origin, Art. 9 GDPR is applicable. Processing is prohibited in these cases, unless one of the exceptions given in Art. 9(2) GDPR applies, such as when the data subject has given “explicit consent” 21 to the processing (Art. 9(2)(a) GDPR).
The idea that consent serves as a legal basis for the processing of personal data reflects the notion of “informational self-determination,” 22 one of the leading principles of European data protection law. Correspondingly, Art. 6(1)(a) GDPR aims at allowing data subjects to decide autonomously whether and to what extent personal data relating to them can be processed. 23
These requirements make clear that the GDPR follows the idea that data subjects should have full knowledge and control over what is happening to “their” personal data when granting consent. Data subjects are deemed to act autonomously and fully informed of all necessary facts: who has access to the data, how they are processed, and what the corresponding future implications might be. Both the willingness and the actual ability to decide autonomously to whom to give consent form the basis for the function of consent under the GDPR, as expressed by the terms “freely given” and “informed.”
In practice, this idealized picture of effective informational self-determination and autonomy does not, in many situations, live up to its goals. The user side oftentimes does not act as envisaged by the GDPR’s drafters, and data holders know this and act accordingly.
C. The “Privacy Paradox”
To better understand why markets may fail in the data economy, it is necessary to take a closer look at the so-called “privacy paradox.” This expression refers to the phenomenon that a broad majority of users claim to care about their privacy and the need for data protection, while in reality they do not act according to these expressed preferences and desires. 24 Internet users oftentimes disclose personal data freely and give consent to the processing of their personal data by simply “agreeing” with online “terms and conditions” and privacy policies without properly reading, let alone understanding them. 25
The privacy paradox can lead to two problems that can turn into market failures. Both are related to the role of consent. Firstly, users do care about data protection, but they are not willing to act accordingly and take measures to actually protect their data, such as carefully reading privacy policies or using privacy-enhancing technologies on their computers. Users know that it might be reasonable to be more careful and deliberate, but out of a mixture of “willful data negligence” and laziness they do not act upon their own standards. Secondly, due to a lack of transparency of data-related processes and intelligibility of declarations of consent, users do not know what happens to their data. The “root” of this problem is not unwillingness or laziness, but rather a lack of ability: Even if users put effort into making an informed choice, this is not (or barely) possible for them. Oftentimes, the problems mix, since most privacy policies on websites are so long that hardly any user would be willing to read or able to understand them.
D. Markets Do Not Cater for Users’ Privacy Preferences
From an economics perspective, the following situation emerges. Internet users have clear preferences for a specific amount of privacy protection when using certain web services, such as search engines and social networks. Yet, markets usually do not provide as many privacy options as would be necessary to cater to these preferences. 26 When it comes to social networks, direct network effects can ultimately lead to market concentration. 27 This, in turn, can leave new and existing users facing a “take it or leave it” situation: they must either consent to the terms given or abstain from using the service. 28 A well-functioning competitive market would be able to satisfy these demands and offer different options for different privacy preferences. For example, it would be feasible for a social network to offer the option to restrict its collection and processing of personal data to the minimum necessary and to refrain from using the data for marketing purposes in exchange for a monthly payment. 29 This would allow users who value their privacy more highly than the monthly payment to satisfy their preferences. In practice, privacy-friendly options are often not offered, and users are not adamant in demanding them. It is no surprise that companies rarely compete on the privacy quality of their services. 30
The background to this unsatisfactory situation is twofold: Direct network effects (and other factors) lead to market concentration and dominance, so that users have no or only a limited choice of which service to use. At the same time, user behavior is irrational insofar as the professed high demand for privacy and data protection does not translate into corresponding market conduct, as most people still tend to use the “free” services. Eventually, these factors induce users to agree to privacy policies even though they neither have a real choice nor really act according to their preferences. It is highly doubtful whether consent given under these circumstances is “freely given” and “informed.” Instead, its granting turns out to be no more than a fiction and an act of formalism, aimed at ensuring legal compliance. 31 This being the case, a market failure within the meaning of the abovementioned definition is given: The balance between privacy and data disclosure in these situations runs counter to the clear preferences of the users (and to the intention the lawmakers had when drafting the GDPR), and markets are not able to satisfy the (privacy) demands users have.
E. Lack of Transparency as a Market Failure
The second market failure is the lack of transparency users face when giving consent online. 32 Oftentimes, users do not know to what extent their personal data are collected, processed, and passed on to third parties. As a result of information asymmetry, they are not always able to make well-informed rational decisions. 33 This market failure can also be traced back to the problematic role of consent, but it is different in nature. Here, the problem is that Internet users regularly consent to the processing of their personal data, even though they are not (or barely) able to foresee what happens to these data. Not being able to make a choice in an informed manner, even if one wanted to, is the main problem here.
Again, online privacy policies are part of the problem. It has been found that users do not actually read them, but often rather blindly accept them. 34 For example, in a 2015 survey commissioned by the European Commission, 21,707 people were asked to what extent they read online privacy policies. Only 18% of the respondents reported reading them fully, while 31% do not read them at all, and 49% read them only partially. 35 Furthermore, these policies are often very long and drafted in a manner that most users do not understand, using broad formulations that are open to interpretation. 36 Thus, users “agree” with policies at large but are not aware of what exactly they consent to.
III. The Interaction of Competition, Consumer, and Data Protection Law in the EU
The market failures in the data economy discussed in Section II raise the issue of the interaction of competition, consumer, and data protection law. This is a general problem that affects the enforcement of EU competition law in the data economy and is particularly relevant as regards the application of EU competition law to unfair contractual clauses in online environments. A number of authors have argued that competition law is not the most suitable legal instrument to sanction unfair clauses imposed by online platforms on final users. According to them, these clauses should instead be sanctioned via consumer or data protection law. 37 In this section, we discuss these concerns by comparing the objectives, scopes of application, and systems of enforcement of EU competition, consumer, and data protection law.
Competition, consumer, and data protection law share the overarching aim of protecting the welfare of individuals in the modern market economy. 38 In particular, these legal regimes are concerned with the power asymmetry between individuals and undertakings. 39 In spite of these “family ties,” the objectives, scopes of application, and enforcement regimes of each policy are rather different. 40 In terms of goals, EU competition law aims at safeguarding undistorted competition within the EU internal market. 41 In particular, by sanctioning the anticompetitive behavior of undertakings, competition policy indirectly safeguards the aggregate welfare of consumers. 42 The scope of application of EU competition law is horizontal: This field of law is applicable to private and state-owned undertakings as well as to public entities when they operate in the market. 43 The core provisions have been primary law since the Treaty of Rome. 44 Since the decentralization of EU competition law, the European Commission and NCAs have enforced Arts. 101 and 102 TFEU in parallel. 45 In addition, national courts have a growing role when it comes to private enforcement of competition law. 46
Data protection law safeguards the “fundamental rights and freedoms” of data subjects (i.e., individuals) and “in particular their right to the protection of personal data,” without restricting or prohibiting the “free movement of personal data” (cf. Art. 1(2) and (3) GDPR). Data subjects can be consumers when data protection affects the processing of personal data by private firms, but they can also be citizens who interact with the public administration. 47 In terms of application, data protection law follows a different approach than competition law: It is only applicable when “personal data” are processed, that is, data that contain information relating to the identity of a data subject (cf. Art. 2(1) GDPR). Anonymized data fall outside the scope of data protection regulation, as do data that were not “personal” from the outset (such as weather data or many kinds of machine data). 48 Similar to competition policy, the core provisions of data protection law are primary law within the EU legal system. In particular, since the Treaty of Lisbon, data protection has been recognized as a fundamental right in the EU Charter of Fundamental Rights. 49 Similar to competition law, the system of enforcement of this policy has been decentralized: National supervisory authorities are the main enforcers of the GDPR. 50 On the other hand, unlike competition policy, there is no EU-wide data protection authority. 51 Private enforcement also takes place, but to a lesser extent than in competition law matters.
The objective of consumer law is to safeguard the informed and free choice of consumers. Unlike competition law, consumer law protects the welfare of individual consumers rather than aggregate consumer welfare in the economy. 52 Instead of sanctioning anticompetitive behavior that has an indirect negative impact on the welfare of final consumers, consumer law sanctions unfair contractual terms that could mislead consumers and thus harm their free choice. 53 As regards its scope of application, consumer law covers the contractual relationship between undertakings and final consumers, while business-to-business relationships fall outside the scope of this policy. 54 Similar to data protection, a right to a high standard of consumer protection is included in the EU Charter of Fundamental Rights (cf. Art. 38). Nevertheless, in terms of secondary legislation, the consumer law acquis is less harmonized at the EU level than competition and data protection law. During the past decades, the EU has adopted a number of Directives to harmonize national consumer law. 55 However, differences still persist at the national level, in particular in relation to enforcement: 56 While some Member States have established an administrative authority in charge of enforcing the EU consumer law acquis, 57 other Member States rely on a judicial system of redress. 58
This brief overview of the objectives, scopes of application, and enforcement regimes shows that these three policies share a number of common features, which could be described as “family ties.” At the same time, the differences discussed show that these policies cannot replace each other. They coexist because they pursue different goals via different tools and have different scopes of application. Consequently, data protection and consumer law cannot a priori preclude the enforcement of EU competition law in the data economy. The same view was expressed in 2016 by the German Bundeskartellamt and the French Autorité de la Concurrence in their joint report on competition law enforcement in the data economy: “[T]he fact that some specific legal instruments serve to resolve sensitive issues on personal data does not entail that competition law is irrelevant to personal data. Generally speaking, statutory requirements stemming from other bodies of law may be taken into account, if only as an element of context, when conducting a legal assessment under competition law.” 59
To sum up, competition, consumer, and data protection law share a number of “family ties.” These ties are particularly evident in the data economy. Although these legal regimes share common aims, they have different objectives, scopes of application, and enforcement regimes. As confirmed by CJEU case law, the legality of conduct under another legal regime does not prevent the enforcement of EU competition law. While EU competition law should not pursue data protection goals, competition law enforcers should have the discretion to intervene in the case of market failures in the data economy, even where data protection and consumer law are also applicable.
IV. The Facebook Odyssey
A. The German Facebook Case
1. The Facebook Proceedings of the Bundeskartellamt
In the following section, we will analyze the abuse-of-dominance proceedings conducted by the German Bundeskartellamt against Facebook Inc. (USA), its Irish subsidiary, and Facebook Germany GmbH (based in Hamburg). 63
In March 2016, the NCA, acting on the basis of German competition law, formally initiated proceedings against Facebook on the suspicion that the social network was abusing its market power by violating data protection rules. In December 2017, the authority published a detailed preliminary assessment and background information on the proceedings. 64 Based on the assumption that Facebook is a dominant company on the market for social networks in Germany, the Bundeskartellamt held “that Facebook is abusing this dominant position by making the use of its social network conditional on its being allowed to limitlessly amass every kind of data generated by using third-party websites and merge it with the user’s Facebook account.” 65 In line with this reasoning, the authority issued a final administrative decision against Facebook in February 2019, prohibiting the social network “from combining user data from different sources.” The decision was accompanied by a press release, a background paper, and a case summary. 66 At the moment of writing (i.e., March 2019), the actual decision, including the detailed reasoning, has not yet been published. Facebook has already appealed the decision to the competent Düsseldorf Higher Regional Court (Court of Appeal).
In its decision, the Bundeskartellamt draws a clear line between the collection and use of data on the network itself (“on Facebook”) and from third-party websites (“off Facebook”). 67 Only the latter was the subject of the investigations, 68 and it refers to those websites and apps that have an embedded API (Application Programming Interface) with Facebook that allows for data sharing. The Bundeskartellamt further distinguishes between services owned by Facebook (most notably WhatsApp and Instagram) and other third-party websites that, from a user’s point of view, are not always prima facie connected to the social network at all. 69 All of these websites and apps transfer personal data relating to users to Facebook, regardless of whether they, for instance, visibly embed Facebook’s “Like” button or otherwise openly engage in data sharing. In this context, the “Facebook Business Tools,” provided by the company for free, play an important role.
In essence, the authority has prohibited the social network from using terms of service that force users to consent to Facebook collecting personal data from third-party websites and apps (including Facebook-owned services) and assigning them to individual user accounts. 70 To the extent necessary, the Bundeskartellamt prohibited the terms of service as well as the explanatory data and cookie policies. Furthermore, it also prohibited the actual corresponding data processing itself. Facebook has twelve months to implement the decision.
After implementation of the decision, the merging of data from these different sources will only be possible when users have given what the authority coined “voluntary consent.” Facebook is obliged to change its terms of service and the corresponding internal processing of user data. If users do not consent, the data sharing must be substantially restricted, for example, by lowering the amount of data transferred or by implementing additional control options for users. 71 As a rule of thumb, the data sets must be kept separate if no consent is given. Accordingly, the President of the Bundeskartellamt is quoted in the press release as saying that this approach “can be seen as an internal divestiture of Facebook’s data.” 72
The terms of service were clearly at the center of the investigations and key to the competitive assessment. The Bundeskartellamt’s accusations follow a two-step logic. Firstly, Facebook confronts its users with a “take it or leave it” offer: Users must accept the unlimited collection of data, including from third-party websites, or abstain from using the service at all. Secondly, the authority refers to infringements of the rules on data protection.
The violation of data protection law is a key part of the Bundeskartellamt’s decision to find Facebook’s conduct abusive. With regard to jurisdiction and competence, the Bundeskartellamt states that in those situations where access to personal data of users of a social network is a significant factor for its market position, not only data protection but also competition authorities are competent when it comes to investigating how personal data are processed by the undertaking. 73
Judging from the documents at hand, the key focus of the data protection law assessment lies on Art. 6 GDPR (lawfulness of processing). The authority states that Art. 6(1)(b) GDPR (contractual necessity) does not suffice to justify the excessive collection of “off Facebook” data. 74 It is acknowledged that the processing of personal data is necessary to a certain extent to run the network itself, personalize it for its users, and sell advertising space to third parties in order to monetize it. Yet, according to the case summary, there clearly is no necessity to merge data from different outside sources to the extent Facebook does. Furthermore, the authority does not see a legal basis for the processing of these data in Art. 6(1)(f) GDPR (legitimate interests clause). This is the result of an extensive assessment and balancing of the interests of Facebook, third parties, and the users. 75 The authority considered “the consequences for the affected users, taking into account the data type and the way in which it is processed, reasonable expectations of users and the respective positions of Facebook and its users.” 76 Facebook’s market power and the corresponding power to unilaterally impose the terms of service on its users also played a significant role in this outcome. In the end, the authority concludes, only “voluntary consent” can serve as a legitimate legal basis (cf. Art. 6(1)(a) GDPR). Yet, the terms of service used by Facebook (including the data and cookie policies) do not live up to this standard, since users must either accept them as a whole or abstain from using the social network at all. 77 Users are de facto forced to accept, which is also a result of direct network effects: Users oftentimes will not find their friends, colleagues, and so on, on other social networks, so they lack proper alternatives. The term “voluntary consent” seems a bit odd, given that the GDPR does not use this terminology.
Instead, consent must be “freely given,” and it must be specific, informed, and unambiguous as well: Art. 4(11) GDPR. By accusing Facebook of demanding far-reaching consent and making it an unconditional prerequisite for registration, the Bundeskartellamt seems to refer to Art. 7(4) GDPR. According to this provision, “utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract” when assessing whether consent was freely given. Based on the circumstances under which consent is “granted” to Facebook, it indeed seems convincing to find that it is not effective. 78 The Bundeskartellamt argues that these data protection law infringements must be seen as a “manifestation of market power.” 79 The authority finds that Facebook’s dominant position at least facilitated these violations (normative causality). Moreover, Facebook’s market power in connection with the data protection violations allowed the company to gain a competitive edge over its competitors and to increase barriers to market entry. 80
It is not surprising that the Bundeskartellamt takes into consideration the legality of Facebook’s conduct under data protection law as part of its competitive assessment, and thus makes it an integral part of its abuse-of-dominance investigations. In May 2016, the authority, in a joint paper with the French Autorité de la Concurrence, expressed the opinion that even though data protection and competition law pursue different goals, the use of privacy policies and the corresponding processing of personal data can be taken into consideration if they affect competition. 81
It is noteworthy that in its publications regarding this case, the Bundeskartellamt makes reference to both of the privacy-related market failures described in Section II. The NCA maintains that Facebook makes users lose “control” as they “cannot perceive which data from which sources are combined for which purposes with data from Facebook accounts.” 82 This makes clear that the authority sees a “lack of transparency” with respect to Facebook’s business model, in particular as the transfer of data takes place even when users choose to disable web tracking in the settings of their browsers or devices. 83 The authority also states that “[t]he investigations have shown that users in Germany generally consider the terms and conditions for processing data to be important and that they are aware of the implications of data transfer. However, because of Facebook’s market power users have no option to avoid the combination of their data.” 84 Thus, privacy preferences of many users are not catered for by Facebook.
2. Legal and Factual Background
The authority’s approach to the case is unorthodox on two levels: Firstly, the Bundeskartellamt chose to rely on competition law to investigate Facebook’s conduct, as opposed to consumer or data protection law. Secondly, it relies on Art. 19(1) of the Gesetz gegen Wettbewerbsbeschränkungen (GWB, i.e., Act Against Restraints of Competition 85 ). The NCA has thus sanctioned Facebook under German, rather than European competition law. Both choices can be partially explained by the legal and factual backdrop of the case.
As regards the policy choice, several authors have criticized the Bundeskartellamt’s decision to prosecute the case under competition law. They argue that the case should rather be solved under consumer or data protection law. 86 Indeed, relying on competition law rules means that the NCA has to define the relevant market and prove the online platform’s dominance. Hence, the burden of proof for sanctioning unfair contractual clauses is significantly higher than it would be under consumer law. However, the most apparent reason the Bundeskartellamt took this (obviously more difficult) route is that otherwise it would simply lack competence. 87 In Germany, rules against unfair commercial practices that affect consumers can only be enforced in civil courts by qualified institutions, associations, and chambers of industry and commerce. 88 As opposed to several other European competition authorities, 89 the Bundeskartellamt does not have the competence to adopt administrative decisions sanctioning unfair commercial practices under German consumer law. Germany has no general public enforcement authority, but only specialized authorities for specific sectors, such as the Bundesnetzagentur (Federal Network Agency) and the Bundesanstalt für Finanzdienstleistungsaufsicht (Federal Financial Supervisory Authority). Yet, the Bundeskartellamt’s limited competence in the consumer law field might be expanding: As a result of the 2017 9th amendment to the GWB, 90 it was given the competence to launch sector inquiries (in case of alleged serious consumer law violations relating to a significant number of consumers) and to act as amicus curiae in corresponding civil court proceedings. Even though these competences are only analytical and advisory in nature, they represent a first step toward more actual power for the authority in the consumer law field.
A 2018 study commissioned by the Federal Ministry for Economic Affairs and Energy recommends extending the Bundeskartellamt’s competence to consumer law matters by, for instance, granting powers to intervene (such as to issue cease-and-desist orders). 91 Also, Andreas Mundt, the Bundeskartellamt’s President, and other officials publicly argue along these lines. 92
The choice of Art. 19(1) GWB 93 (i.e., the national equivalent to Art. 102 TFEU) might also have been due to rather pragmatic reasons. By relying on national law, the Bundeskartellamt does not run the risk that the case will be referred to the CJEU, as it will necessarily remain within the national jurisdiction. This explains why the authority bases its assessment on, inter alia, case law of the German Federal Supreme Court, namely the VBL Gegenwert II and Pechstein cases. 94 These cases deal with the general clause of Art. 19(1) GWB, its applicability in the context of exploitative business terms, and the role constitutional law plays when assessing corresponding abuse-of-dominance allegations in the context of unbalanced negotiation positions. In the Facebook case, the authority might therefore expect a more predictable outcome of subsequent court proceedings. Apart from that, as it could be anticipated that the novel approach taken in the Facebook proceedings would be considered a “borderline case” in a “grey area,” the Bundeskartellamt might be prepared to argue that Facebook’s conduct, even if not punishable under EU competition law, is at least punishable under national competition law.
In spite of these pragmatic reasons, choosing Art. 19(1) GWB as the legal basis remains problematic. Under Art. 3(1) Reg. 1/2003, NCAs are required to apply Arts. 101 and 102 TFEU when the anticompetitive conduct has an impact on “intracommunity trade.” This provision, broadly interpreted by CJEU case law, 95 would probably be applicable in the present case. Facebook operates in Europe via its Irish subsidiary: The cross-border aspect in this case is self-evident. Under Art. 3(2) Reg. 1/2003, EU Member States can rely on “stricter” national competition law than Arts. 101 and 102 TFEU. 96 This is the case, for instance, with Art. 20 GWB. The latter provision sanctions the abuse of “relative” market power of the supplier vis-à-vis its customers. Thus, Art. 20 GWB has a broader scope of application than Art. 102 TFEU, which, by contrast, sanctions “only” the abuse of a dominant position within the relevant market. In its assessment of the Facebook case, the Bundeskartellamt relies on Art. 19(1) GWB, rather than Art. 20 GWB. By sanctioning any “abuse of a dominant position,” Art. 19(1) GWB reflects the language and the scope of application of Art. 102 TFEU. Thus, the exception under Art. 3(2) Reg. 1/2003 is not applicable in this case. As mentioned above, the German NCA probably opted for national competition law as the legal basis in order not to create a precedent at the EU level and to be able to rely on the case law of the German Federal Supreme Court, rather than CJEU case law. However, these do not seem to be convincing reasons for the choice of legal basis: In its case law on Art. 102(a) TFEU, the CJEU has already sanctioned unfair trading practices imposed by dominant firms on their customers. 97 This case law could also be applied in the Facebook case.
3. Possible Remedies
The Bundeskartellamt conducted the investigations as administrative proceedings, in contrast to fine proceedings. 98 This means that it was clear from the outset that the imposition of a fine could not be expected. Yet, the present case serves as an interesting scenario to hypothesize how the fine would have been determined.
Had the authority decided to impose a fine on Facebook, the competition law approach taken would have significantly increased the possible amount of the penalty. The German Act Against Restraints of Competition determines a fine of up to 10% of the total turnover of the undertaking (or association of undertakings) achieved in the business year preceding the decision of the authority: Art. 81(4) GWB. This equals the maximum fine under European competition law: Art. 23(2) Reg. 1/2003. It should be noted that when the Facebook proceedings were initiated, the GDPR was not yet applicable, 99 but the Bundesdatenschutzgesetz (BDSG, i.e., German Federal Data Protection Act) was in effect. 100 As a result, the maximum administrative fine that could have been imposed by the competent Data Protection Authority for data protection law violations at the time was €300,000 (Art. 43(3) BDSG). Thus, in terms of deterrence, the fine to be imposed upon Facebook under national data protection law would not have come close to the administrative fines now contained in the GDPR (up to 4% of the total worldwide annual turnover of the preceding financial year). 101 Yet, in the end, the entry into force of the GDPR would not have played a role fine-wise, since opting for the “competition law route” means that fines would have been calculated under the GWB anyway.
B. The Italian Facebook Case
1. The Facebook Decision of the Autorità Garante della Concorrenza e del Mercato
While the German Facebook case was decided based on national competition law, the Italian NCA has recently adopted a decision sanctioning Facebook for a breach of Italian consumer law. On November 29, 2018, the Autorità Garante della Concorrenza e del Mercato (AGCM) imposed a fine on Facebook for the following violations: 102
– Misleading consumers: According to the Italian NCA, Facebook misled consumers by promising at the moment of registration a “free” service, while the service is actually “paid” for with the personal data that users transfer to Facebook. 103 The latter are used for targeted advertising.
– Aggressive commercial practices: Facebook engaged in a number of practices which aimed at discouraging users from blocking the transfer of their personal data to third-party websites. 104 In particular, while users were able to actively opt out of the transfer of personal data to third parties, the default option in the Facebook settings was to allow this transfer. Furthermore, Facebook warned its users that by modifying the default settings, their “social experience” could be affected and they would not be able to access certain contents and services provided by Facebook. Facebook thus misled consumers by encouraging them not to change the default settings.
2. Legal and Factual Background of the Italian Facebook Case
As already mentioned, the AGCM decided the Facebook case under the Italian Codice del Consumo (Consumer Code). 105 The Code implements the Unfair Commercial Practices Directive. The Directive defines a commercial practice as “unfair” when it is “contrary to the requirements of professional diligence” and “materially distorts…the economic behaviour with regard to the product of the average consumer whom it reaches or to whom it is addressed”—that is, the practice misleads the consumers and thus affects their ability to make an informed choice when purchasing the product. 106 In particular, the Directive includes a subcategory of unfair practices that “mislead” consumers by providing “false” or “untruthful” information about the product. 107 In Annex I, the Directive includes a list of commercial practices “which are in all circumstances considered unfair.” In particular, “describing a product as ‘gratis’, ‘free’, ‘without charge’ or similar if the consumer has to pay anything other than the unavoidable cost of responding to the commercial practice and collecting or paying for delivery of the item” is considered “unfair.” 108 Secondly, the Directive defines “aggressive commercial practices” as practices where the seller limits the “consumer’s freedom of choice or conduct” by “harassment, coercion…or undue influence.” 109 Annex I also includes a list of practices that are considered aggressive per se.
In its decision, the AGCM briefly discusses the “unfairness” of the Facebook registration system. 110 The AGCM acknowledges that the users’ data were the counterperformance in the contract concluded with Facebook. 111 However, Facebook did not adequately inform consumers about the nature and the extent of the counterperformance required to access the social network. By relying on the per se prohibition in Annex I of the Unfair Commercial Practices Directive, the AGCM “easily” reaches the conclusion that Facebook has misled its users. The AGCM’s analysis of the “aggressive” nature of Facebook’s practices, by contrast, is quite detailed, since it could not rely on any per se prohibition in Annex I. 112
The recent Italian Facebook decision is quite interesting when compared to the decision by the Bundeskartellamt. The cases concern similar but not identical conduct. In particular, the aggressive commercial practice the Italian NCA takes issue with in relation to the transfer of personal data to third-party websites is similar to the “off Facebook” practices investigated by the Bundeskartellamt. Both authorities investigate the automated flow of personal data between Facebook and third parties. Nevertheless, the two cases have been investigated on different legal bases. The AGCM, with its competence in both the competition and the consumer law field, opted for the “easier” route in the present case. In particular, by sanctioning the case under consumer law, rather than under Art. 102(a) TFEU, the AGCM has avoided both the complex assessment of the relevant market and the question whether Facebook has market power. As shown in Section IV.A.2, the Bundeskartellamt did not have such a choice. Due to its lack of competence in the field of consumer law, the authority was forced to investigate the case under competition law. In particular, the Bundeskartellamt relied on national competition law in order to draw on national (and arguably more favorable) case law. The question remains whether such a “strategic choice” of legal basis is a legitimate way to tackle the market failures identified in Section II.
3. The Fine Imposed by the AGCM in the Facebook Case
The AGCM imposed a fine of €10 million on Facebook. 113 This sum was reached by imposing the maximum applicable fine under Italian consumer law (i.e., €5 million) for each of the two unfair commercial practices found. 114
As discussed in Section III, in contrast to EU competition law, the enforcement system of consumer law is different in each Member State. The sanctions also differ considerably. However, consumer law fines are, in general, substantially lower than competition law fines, where the maximum sanction is generally 10% of the annual turnover of the sanctioned undertaking. 115 Generally speaking, consumer law aims at sanctioning commercial practices that harm “small” groups of consumers, rather than tackling structural issues affecting the degree of competition in the market. The fine imposed by the Italian NCA in the Facebook case is a good example of the limited deterrent effect of consumer law sanctions when they are imposed on multinational corporations like Facebook. It is doubtful whether the fine will have any deterrent effect, or whether it will increase the degree of transparency offered by online platforms to their users. Finally, it is worth noting that the Italian NCA has simply imposed a fine, rather than negotiating a behavioral commitment aimed at modifying Facebook’s terms of use.
In conclusion, the relatively “small” fine imposed by the AGCM seems to have a rather limited deterrent effect and is unlikely to change Facebook’s future market behavior. Even the “doubling” of the fine from €5 million to €10 million (by finding two different kinds of violation) seems like a mere “drop in the ocean.”
V. How to Solve the “Regulatory Dilemma”?
The comparison of the German and Italian Facebook cases is a good example of the “regulatory dilemma” currently faced by public enforcers in Europe. As already seen, the digital economy is characterized by a number of market failures that could be tackled either via EU competition, consumer, or data protection law. As discussed in Section III, these areas of EU law share a number of “family ties,” but they also have different objectives, scopes of application, and enforcement structures. Therefore, the three policies coexist in the digital economy. Compliance with one legal regime does not ensure compliance with the other legal regimes. This principle was openly recognized by the AGCM, which rejected Facebook’s arguments that the terms of its online registration were not “unfair,” as they had been approved by the Irish Data Protection Authority. 116
Coexistence of different legal regimes, however, may generate legal uncertainty for firms. In particular, it opens the possibility that firms will be sanctioned multiple times for the same type of conduct under different legal regimes. In order to avoid such a scenario, in 2014, the European Data Protection Supervisor (EDPS) proposed the establishment of a “Digital Clearinghouse.” Under this proposal, national competition, data protection, and consumer law enforcers would meet periodically to establish forms of cooperation in the digital economy. 117 The proposal was well received, and since 2017, a number of meetings have taken place. 118 However, the degree of cooperation still leaves considerable room for improvement. At the national level, regulators of the three policy areas could cooperate in a number of ways, such as exchanging information in the context of investigations, organizing joint sector inquiries, or engaging in discussions regarding remedies. Such cooperation should also “cross” national boundaries and lead to the establishment of a transnational network of regulators active in different policy areas. In the Facebook odyssey, the Bundeskartellamt has closely cooperated with a number of data protection authorities, 119 while the AGCM heard the opinion of the Italian Regulator for Electronic Communications before adopting its decision. 120 However, no cooperation has taken place between the German and the Italian NCA. The fact that the two authorities—both members of the European Competition Network (ECN) 121 —have not coordinated their enforcement action in this situation is a sign that the Digital Clearinghouse has, to this day, not been institutionalized in an effective manner.
This “regulatory dilemma” affects, in particular, those public enforcers with competence in two policy areas. This is the case for the Italian NCA, which decided the Facebook case under consumer, rather than competition law. The Facebook decision is not isolated. The AGCM has decided a number of cases under consumer law involving unfair commercial practices in digital markets without considering the possibility of assessing the case under Art. 102(a) TFEU. 122 As discussed in Section IV.B.3, consumer law generally provides for more limited and less deterrent remedies than competition law. Public enforcers should not always opt for the “easiest route” (i.e., consumer law), under which they can prove a breach of the law without going through the complex exercise of defining the relevant market and establishing market power. Public enforcers with double competence should opt for the legal regime that provides the most deterrent remedy in the given case. Besides fines that are higher than those under data protection and consumer law, competition law offers the possibility to tackle issues via behavioral commitments. The latter are flexible, since they can be negotiated with the parties, and they can be reviewed if market conditions change. Finally, competition law remedies can create a regulatory asymmetry that could foster competition in the market: Imposing tougher conditions on dominant firms would favor the entry of new players into the market. In contrast, omnibus legislation, such as data protection and consumer law, creates equal regulatory burdens for dominant firms already active in the market and new entrants alike. As a result, it oftentimes fails to tackle the market failures discussed in Section II.
These are preliminary ideas on how public enforcers could solve the regulatory dilemma in digital markets. The debate on which “regulatory road” to take is still far from being settled. It will most likely be one of many important points of debate in the context of the regulation of digital markets in Europe in the years ahead.
Footnotes
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
