Abstract
Anti-competitive notions, it seems, are increasingly informing the critical debate on a data-driven economy organised into scalable digital platforms. Issues of market definitions, how to value personal data on multisided platforms, and how to detect and regulate misuses of dominant positions have become key nomenclature on the battlefield of addressing fairness in our contemporary digital societies. This article looks at the central themes for this special issue on governing trust in European platform societies through the lens of contemporary developments in the field of competition law. Three main questions are addressed: (1) To what extent are the platforms’ own abilities to govern their infrastructures, that is, to be de facto regulators over both human behaviour and market circumstances, a challenge for contemporary competition regulation? (2) In what way is the collection, aggregation, or handling of consumers’ data of relevance for competition? (3) How can the particular European challenges of governing US-based digital platforms more broadly be understood in terms of the relationship between transparency and public trust? Of particular relevance – and challenge – here are the platforms’ abilities to govern their infrastructures, be it through automated moderation, pricing or scalable data handling. It is argued that this aspect of coded, and possibly autonomously adapting, intra-platform governance poses significant anti-competitive challenges for supervisory authorities, with possible negative implications for consumer autonomy and wellbeing, as well as for other, platform-dependent companies.
Platform governance: From privacy to competition law
Much critical thought on data-extractive digital platforms has been based in a privacy-informed frame, but less has been based in the field of competition, which is the focus here. However, significant contributions in recent years’ scholarship are valuable for understanding current developments in European competition law – on socio-technical arrangements in platforms (Van Dijck, 2020), on the meeting with infrastructural studies (Plantin et al., 2018), and on the notion of platform or data capitalism (Sadowski, 2019; Srnicek, 2017) in a platform society (Van Dijck et al., 2018).
The overarching purpose of this article is to look at the central themes for this special issue on governing trust in European platform societies through the lens of contemporary developments in the field of competition law. The following three main questions are addressed:
To what extent are the platforms’ own abilities to govern their infrastructures, that is, to be de facto regulators over both human behaviour and market circumstances, a challenge for contemporary competition regulation?
In what way is the collection, aggregation or handling of consumers’ data of relevance for competition?
How can the particular European challenges of governing US-based digital platforms more broadly be understood in terms of the relationship between transparency and public trust?
Competition regulation is increasingly seen as an important juridical field for counterbalancing an unfair dominance of large digital platforms, to the benefit of both consumer welfare and innovative markets. This is of interest for a wide array of markets beyond, for example, the media and the distribution of news: it includes a range of important fields, such as app distribution, retail, search, marketing and the underlying data cloud infrastructures, and more. In Europe, all of these markets are largely dominated by very large US platforms like Google, Apple, Facebook and Amazon. The markets are also interconnected in the sense that, for example, a retail company may depend on both cloud and market infrastructure for its operations, on search and ads to be competitive, and on app stores to reach its customers. All of these large-scale platforms have a direct relationship with billions of users and millions of companies, both as market enablers and, occasionally, as competitors on the same market. The balancing and governing of these relationships are therefore of key relevance for competitiveness in a range of markets, as well as for the closely interlinked concerns of consumer autonomy and privacy.
Platforms and the law
As seen in both investigations and regulatory proposals, the anti-competitive aspects of very large and powerful digital platforms have entered the regulatory centre stage in European policy. The notion of competition, and the concern over tech platforms’ misuse of their dominant positions, plays an increasing part in the critical debate on a data-driven economy organised into scalable digital platforms. Most notably, the European Commission published a draft of the Digital Markets Act (DMA) in December 2020, largely proposing the introduction of a new competition tool with implications for what are called gatekeepers of core platform services. This is in line with regulation introduced in 2019, with compliance required from 12 July 2020, to promote transparency for companies that use and depend on digital platforms (the so-called P2B Regulation, 2019/1150). These regulatory movements are also seen in juridical cases and merger investigations: for example, the European Commission’s preliminary investigation found Amazon to be distorting competition in online retail markets, and the Commission opened a second investigation into its e-commerce business practices. A similar investigation into Apple’s App Store rules is open. Furthermore, the European Commission has investigated, and approved, Google’s purchase of Fitbit, by making Google commit to safeguard the interoperability between competing wearables and Android, and not to use health and wellness data collected from wrist-worn wearables and other Fitbit devices of users in the EEA for Google Ads (European Commission, 2020). The online advertising markets have also increasingly been getting attention from a competition point of view, especially around the handling of data.
This is perhaps most recently seen in the European Commission’s opening of an antitrust investigation of Google in June 2021, where the Commission will investigate whether the company is distorting competition by restricting access by third parties to user data for advertising purposes, while reserving such data for its own use (European Commission, 2021).
Antitrust as a regulatory concern for large technological platforms is of increasing significance also in the United States, where the CEOs of Apple, Google, Facebook and Amazon testified before the House Committee on the Judiciary on 29 July 2020, on questions of anti-competitive monopolies. In parallel, a consortium of state attorneys general and the Justice Department investigated whether Google is abusing its dominance in the digital advertising market, which resulted in a complaint of great significance being filed on 19 October 2020. In December 2020, the US Federal Trade Commission launched an antitrust lawsuit against Facebook regarding its earlier acquisitions of two competitors – Instagram and WhatsApp. In parallel to this lawsuit, 48 states and districts filed a lawsuit accusing Facebook of abusing its market power to quash smaller competitors. Regardless of the outcome of these cases, they indicate a newfound emphasis on using antitrust as a legal means for addressing the power dynamics of tech-driven platform societies, which is why pushes for legal reform can be expected also in the US.
The core issues of competition law, such as the definition of markets and the misuse of dominance, will thereby be reassessed in light of digital and data-driven extraction: How to value personal data on multisided platforms? How to better understand the implications of digital conglomerates (Lim, 2020)? And how to detect and regulate misuses of dominant positions? These questions, it seems, relate not only to digital markets in a narrow sense, but have become central for the balancing of interests in contemporary digital societies at large, and, by extension, of key relevance to the focus of this special issue on public trust.
Article structure
Scholarship in the competition law domain has observed some of the conceptual challenges linked to digital platforms, and the majority of the analysis in the subsequent ‘Platform code as law’ section of this article concerns how the dominant platforms in fact govern their infrastructures and implement policies in a coded, semi-autonomous fashion, with direct implications for other, dependent companies as well as for consumers. They not only create markets, but also shape and control them. Given that both the multisided business models of ‘digital conglomerates’ and their analytical and algorithmic capabilities may depend on the collection, aggregation and processing of large amounts of consumer-related data, the argument here is that the value of data may have to be reconceptualised for the purpose of competition concerns. Thus, the ‘Reconceptualising the value of data’ section of the article investigates the notions of data capitalism and attention brokerage under that rationale. As consumers become data sources, it is not at all clear how contemporary consumer protection and competition regulation relate to the lack of actual direct monetary transactions for services and products. The agency in data-driven services may at best provide individually relevant features, and at worst lead to manipulative data collecting practices, as argued in the ‘Transparency and consumer trust’ section. One major challenge relates to ensuring sufficient transparency, and to how transparency relates to public trust in scalable automation and data collection on complex data-driven markets, with implications for both consumers and competing businesses that depend on digital platforms.
Platform code as law
In 2019, the status of European competition protection in a digital age was evaluated by a group of researchers led by economist Jacques Crémer. As advisers to Margrethe Vestager, EU Commissioner for Competition, they considered that the basic framework of competition law, enshrined in Articles 101 and 102 of the Treaty on the Functioning of the European Union (TFEU), continues to provide a sound and sufficiently flexible basis for protecting competition in a digital age (Crémer et al., 2019). However, they noted that the specific characteristics of platforms, digital ecosystems and the data-driven economy require the development of a range of relevant concepts, doctrines and methods, and stated that the actual implementation of competition protection needs to be adapted and refined for the digital context (Crémer et al., 2019: 39). They thereby echo arguments put forward by both legal researchers and economists that traditional concepts in competition may have shortcomings in their application to the new generation of technology companies. Still, it is evident that some of the conceptual questions – of direct relevance for competition scholarship – can benefit from the critical scholarship on digital platforms mentioned earlier. As argued by Van Dijck (2020), there is a growing need to understand how platformisation works and to conceptualise platformisation in such a way that it cannot escape governance due to unfit or obsolete regulation (c.f. Van Dijck et al., 2019). Media sociologist Andersson Schwarz (2017) has pointed to a ‘platform logic’ that ‘solidify markets’ into infrastructure, particularly at a large macro-level where a dominant few ultimately become utilities. From a competition perspective, the proprietary lack of transparency (c.f. Andersson Schwarz, 2017; Pasquale, 2015), and the way that intermediary platforms are increasingly moving towards becoming infrastructures for users, are problematic – a process Plantin et al. (2018) have called the ‘infrastructuralization of platforms’ (p. 306). This infrastructuralisation is part of what Van Dijck (2020: 10) refers to as dynamics that boost platformisation, next to vertical integration and cross-sectorisation – all of which bring competitive challenges.
The focus here is first on what can be called a de facto regulatory function of these players, and its link to a lack of transparency. Crémer et al., in the extension of the platforms’ regulatory functionality, argue for the need for greater supervisory transparency for competition authorities. Market complexity, large-scale automation and non-transparent proprietary approaches all contribute to the difficulty of assessing from the outside how individuals’ information is used (Larsson, 2019). The position of being a ‘private regulator’ is a way to conceptualise gatekeepers with ‘intermediation power’, as emphasised by the Expert Group for the EU Observatory on the Online Platform Economy (2021: see section 2.1).
Crémer and his co-authors also highlight a consumer-oriented point of competition relevance, which has to do with the importance of monitoring how consumers’ decision-making takes place in data-driven multisided markets. Concerns have been raised about how the digital data-driven environment can be designed to steer or nudge individuals away from truly conscious choices – which at best can mean individualised relevance in the interests of consumers, and at worst pure manipulation of both consumers and competitors (c.f. Yeung, 2017). Arguably, a new level of granularity and adaptive automation has become feasible, with an impact on questions of market control. This is further explained by Crémer et al. (2019): As platforms act as regulators, they gain an impact on individuals, firms and society that reaches beyond ‘pure’ market power. While respecting business secrets, public authorities should arguably find ways to ensure a sufficient understanding of how platforms work, i.e. the ways in which they fulfil their ‘regulatory’ function. The information needed for this endeavour might need to reach beyond the already existing possibilities to get full access to data and algorithms in the context of competition law cases. (p. 71)
The platforms’ governance over their own data handling and automated policy implementation echoes Lawrence Lessig’s (2003, 2006) notion of ‘code is law’, that is, that the digital architecture controls other actors – both dependent companies and consumers – and their market conditions. This is supported by recent commentary on the DMA: ‘[g]atekeepers often find themselves in a position to act as rule-setters, and sometimes it seems that their code is more important in the market than legal obligations set by the democratic institutions’ (Podszun et al., 2021: 6). There are recent and considerable European cases on the coded version of misuse, resulting in the European Commission fining, for example, Google 4.34 billion EUR in 2018 for illegal practices regarding Android mobile devices, and 2.42 billion EUR in 2017 for abusing its dominance as a search engine by giving an illegal advantage to Google’s own comparison shopping service in the Google Shopping case. There are ongoing investigations of Apple and its App Store, and the aforementioned extended investigation of Amazon. Code, it seems, is here a way to automate market control.
With the advent of machine learning, and the adaptability and goal orientation of learning technologies, the ‘code’ may not be as fixed as Lessig once envisioned, but rather an algorithmically dependent automation that may engage in tacit and informal collusion, which can arise through the use of pricing algorithms that monitor the pricing and market behaviour of others (Ezrachi and Stucke, 2017). Amazon has been criticised for using automated pricing in a way that misleads consumers (ProPublica, 2016), which could be called automated leverage behaviour.
The platforms’ abilities to govern their infrastructures, that is, to be de facto regulators over both human behaviour and market circumstances – be it through automated moderation, pricing and data handling – seem to be a key challenge for contemporary competition regulation. A particular challenge is then found in platforms simultaneously creating markets, governing them and competing on them. Many of these challenges relate to the collection, aggregation and control of user data. As the distinction between infrastructural, intermediary and sectoral platforms is ‘increasingly fluid, allowing data flows to move across the connective system’, this also poses a fundamental challenge for traditional competition regulation. For example, what Van Dijck (2020: 9–10) refers to as ‘cross-sectorization’ – exemplified by Amazon moving into the medical, transportation and insurance sectors (in addition to retail market infrastructure, targeted ads and cloud infrastructure) – challenges traditional definitions of markets in the assessment of misuse of dominance (c.f. Crémer et al., 2019).
Reconceptualising the value of data
Already over 20 years ago, the value of personal data was discussed, expressed, among other places, in a Wired article about playlists (Sullivan, 1999). The US law professor Paul Schwartz (2003) stated that ‘the monetary value of personal data is high and still growing, and US companies are advancing rapidly to take advantage of this trend’. More recently, Spiekermann et al. (2015) describe what they call ‘personal data markets’, in light of the fact that personal data have increasingly been perceived as a marketable asset. While there are many attempts to calculate the value of personal data in relation to various services, it is sufficient here to state that personal data may indeed have a distinct value and therefore be of interest for competition regulation, particularly for the assessment of mergers and market dominance. There is a complex data collecting ecosystem of ‘companies collecting extensive amounts of data on consumers’ (Christl, 2017) with different roles, including data brokerage and the selling of data. The larger platforms, however, tend to be described in terms of how they aggregate data in order to find patterns, or ‘unearth hidden correlations buried in large amounts of data’ (Srnicek, 2017: 57), for example, in order to create predictive tools for match-making. As put by computer scientist and legal researcher Michael Veale (2020), ‘firms do not intrinsically care about data, but their ability to optimize’ (p. 7). This is, on the one hand, a way to state the importance of data for the development of machine learning and artificial intelligence (AI), also clearly pointed to in the European Commission’s White Paper on AI (c.f. Larsson et al., 2020a). On the other hand, it is a way to point to the fact that analytical products may need more thought from a competition perspective, where pattern recognition services and machine-learning processes may be the valuable asset (c.f. Srnicek, 2017: 62), possibly adding a question of what ought to be pooled as lawmakers attempt to level the playing field, or what can be misused when assessing mergers.
The differentiated business models and ‘cross-sectorization’ of megaplatforms, however, mean that they do not necessarily have to be profitable within the boundaries of a specific market if they have a business model that can make a profit in another market, or on another side of a multisided platform (c.f. Srnicek, 2017). The competitive assessment thus appears in a new light for these digital conglomerates (Lim, 2020). In a discussion of ‘lean platforms’, Srnicek (2017) states that data mining has become a key way of competing. According to Srnicek, the platform-driven commodification of individuals’ data does not lead to an end to competition or the struggle for market dominance, but it does affect the form of competition. In particular, it means a move away from competition on pricing, and an incentive to collect, extract, analyse and control data.
There is a normative importance to what Sadowski (2019) frames as ‘data extraction’ (p. 2). By framing data as a form of capital, Sadowski casts a different light than much competition scholarship on the imperatives motivating contemporary organisations, also beyond the digital platforms. Furthermore, West (2019) connects surveillance and markets in a ‘data capitalism’, and does so retrospectively for the period between the mid-1990s and the mid-2000s. She stresses that data capitalism is not just about monitoring but also about how the market links data with new types of control, exploits this advantage and, at the same time, hides it behind misleading descriptions of openness and consumer efficiency. In other words, there is a critical eye here that links competition and privacy issues as aspects of control, and of abilities to govern, as inherent qualities of data-driven platforms.
While it may be a relatively new and under-researched topic in competition law, the notion of data sharing and data pooling arrangements for pro-competitive purposes is part of the competition discourse with regard to digital platforms (Crémer et al., 2019: 92–98). While forcing firms to collaborate is regarded as hard when it comes to data (Lundqvist, 2021), the notion of data pooling has also been criticised from a privacy and consumer perspective, given that much data extraction is done in a non-transparent way without informed consent (c.f. Larsson et al., 2021; Libert and Binns, 2019). Consequently, several studies argue for the need to include data protection considerations in certain competition law assessments regarding digital markets (Binns and Bietti, 2020; Wasastjerna, 2020), and the European Commission (2021) has also explicitly stated that it will take into account the need to protect user privacy in the above-mentioned antitrust investigation of Google’s behaviour on ad tech markets.
The vertical integration and cross-sectorisation of infrastructuralised platforms force the conceptual framework of competition law to address the ‘multisidedness’ of digital markets, including the value of personal data and the inherent challenges of relying on a clear market definition to be able to address alleged misuses of dominance. The reframing of ‘personal data as means for payment’ has been useful for addressing aspects of so-called free (or zero-priced) services that have passed relatively ungoverned in traditional consumer protection (Larsson, 2018). This framing has served as a way to point out that, while consumers may know what they are buying, the uncertainty about which parties are collecting their data, for what reasons and where the data travel ultimately means that the consumers do not know the price. This conceptualisation has informed changes to the scope of the Consumer Rights Directive to cover contracts where the consumer provides personal data in exchange for a digital content product or a digital service. This extension means that consumers who ‘pay’ with their personal data have specific information rights. While this reframing has its merits, it still needs to be developed for competitive concerns on digital markets in terms of how data are collected and how they become valuable.
A different way to understand the value of personal data is to return to the conceptual framework of how markets may be defined. As mentioned, the market definition is central to competition issues, but also remarkably difficult in a digital context (Crémer et al., 2019). When it comes to multisided platforms, the interdependence of the different sides becomes absolutely central to the analysis. Proposals have been developed to capture or conceptualise the data-driven economy in terms of how it is designed to capture human attention, expressed in terminologies such as the attention economy (Wu, 2017) and specific stakeholder concepts such as attention platforms (Evans, 2019) and attention brokers. These actors represent all the types of services that aim to attract users’ interest and then monetise the consumers’ presence, perhaps most clearly expressed in targeted advertising. The argument for attention as the basis for the market definition is thus that it is human attention (Wu, 2017) or time (Evans, 2019), as a scarce commodity, that competition is very much about. This makes particular sense for so-called multisided ‘conglomerate’ platforms that offer zero-priced services on one side in order to monetise consumers’ attention on another side. Consider digital marketing, where zero-priced search engines, media outlets and social media platforms all compete for consumer attention that, on another side of the platform, is sold to bidders who want to show advertisements to relevant consumers. This setting can be highly complex, with, for example, automated real-time bidding involving a multitude of buyers and sellers in programmatic ad auctions. One may thereby better understand the motives of various types of ‘attention brokers’ – including media and retail platforms – in how they reorganise into scalable digital platforms. This can be seen in how news sites are more reliant than others on third-party web tracking, as shown, for example, in studies of the US news markets (Libert and Binns, 2019) and comparatively across five different sectors in Sweden (Larsson et al., 2021). Attention can be sold, and more engagement means more attention from which to create revenue.
Transparency and consumer trust
As commercial data collection has become ubiquitous in digital and app-based life, research interest has also grown in how consumers, citizens and users experience, understand and trust it. Much of what a user sees on the Internet, from ads to search results, is directed or customised by algorithms that have drawn analytical and automated conclusions about that user. Previous studies have documented that users find this type of automated individualisation both useful and disturbing (Ur et al., 2012). In some research, the manipulative influences are described as ‘dark patterns’ (Narayanan et al., 2020), which can be found on some e-commerce sites (Mathur et al., 2019), and where the incentive is to ‘get’ the clicked consent from visitors in order to serve profiling analytics that largely feed into the ad market. Studies on consent requests for third-party tracking have shown that both nudging methods and consumers’ ‘consent fatigue’ are widespread (Grassl et al., 2021). A recent study that mapped third-party cookies on the Swedish web and interviewed consumers about them showed that consumers were largely unable to make an informed choice in the cookie consent questions, and that most third-party cookies were found on retail and e-commerce sites (Larsson et al., 2021). A key question here is how transparency affects consumer trust in data-driven platforms. The expert group of the EU Observatory on the Online Platform Economy (2021) argues that platform transparency would benefit from further conceptual work, for example, on the trade-offs between transparency and legitimate business interests.
An inherent challenge lies in that these systems may be ‘virtually impermeable’ (Van Dijck, 2020: 12) from the outside – including for other companies that struggle to create well-informed business strategies, for governments that seek to supervise and create fair markets, and for consumers who want to understand how their data are used or what the automated decisions affecting them are based on.
Consumer choice is also directly relevant to competition (Crémer et al., 2019: 63–65). As mentioned, there are risks of manipulation of individuals, in addition to an often very large information asymmetry between the data management platform and the individual consumer, which thus also risks undermining competition (Jin and Wagman, 2020). Data-driven and individualised price discrimination, so-called ‘personalised pricing’, can also bring anti-competitive advantages to digital platforms (Botta and Wiedemann, 2020). The issue of transparency is relevant not least in the sense that, without insight into how their data are collected, traded or used, there is no reasonable way for consumers to choose or change service, because the collection and algorithmic processing are largely hidden from them (c.f. Kemp, 2019; Larsson et al., 2021). Combined with the complexity of real-time data-driven markets (Christl, 2017) and proprietary claims (Pasquale, 2015), these steering capabilities are likely to lead to risks of misuse of dominant positions. If the digital data-driven environment can be designed to steer consumers away from truly conscious choices, and to guide them to avoid certain types of decisions – both as a conscious design and as an outcome of algorithm-driven approaches (c.f. Yeung, 2017) – consumer autonomy can be undermined. At worst, consumers in part become oblivious and manipulated data sources that produce commodifiable information fed into analytical products sold on other sides of digital conglomerate structures (c.f. Crain, 2018).
There are, however, studies that indicate a ‘cooling effect’ following from a lack of trust in digital markets with widespread data collection. This applies to several digital uses, including e-commerce and search engines (Datatilsynet, 2020: 16f). This can be compared with studies of what makes consumers avoid or resist so-called smart services, namely privacy concerns and fear of unauthorised re-use of their information (c.f. Mani and Chouk, 2019).
In European policy, there is currently a strong push for transparency as a means of creating more trusted markets and business practices, seen in the DMA, but also in the Digital Services Act, the so-called P2B Regulation, recent AI policies as well as the proposed AI regulation, and, of course, the General Data Protection Regulation (GDPR). Transparency is, however, a multifaceted concept with inherently contradicting interests (Larsson and Heintz, 2020). Consumer-oriented studies have indicated that consumers, too, may be resigned in relation to commercialised data collection (Draper and Turow, 2019), and prefer more structural solutions ensuring fairer practices, such as market-wide agreements or more active involvement from supervisory authorities (Larsson et al., 2020b). The desirable balance for regulation to strike would then be to ensure sufficient supervision on a more structural level, providing ‘scrutinability’ over opaque automation. This would be in line with calls for methodological development in supervisory authorities in order to empower consumers on digital markets (Larsson, 2018), or to develop a ‘computational antitrust’ (Schrepel, 2021). In relation to the proposed DMA, some advocate a stronger integration of national authorities and private parties in enforcement (Podszun et al., 2021).
Conclusion
Taken together, the field of competition engages a number of concepts of central relevance to competition protection in a digital context: platforms, digital ecosystems, the role and value of data, and the need for and importance of transparency. These are all concepts examined in the economic and legal disciplines, but also more generally in the social sciences and the humanities. This article collects parts of this literature in order to contribute to a more comprehensive or complementary understanding of key concepts in the digital economy, particularly focusing on the platforms’ abilities to govern their own infrastructures, ultimately becoming a ‘corporatocracy’ (Andersson Schwarz, 2017: 388) with implications for other companies as well as for consumers and public values at large (Van Dijck et al., 2018).
Looking to public trust in contemporary digital Europe, the dependence on US platforms for important and popular consumer services and commercial markets is striking. This is evident not only for social media, the distribution of news and the digital ads markets, but also for anything relating to search, retail, cloud storage, app markets, mobile operating systems, digital maps and more. Many of the services on the consumer side are so-called zero-priced services, which pose challenges for traditional competition regulation in assessing their value and even in defining what markets they operate on. This means not only that European consumers are in fact data sources for many multisided American digital platform conglomerates – that is, collected, aggregated, profiled and monetised on other sides of the same conglomerates – but also that European companies depend on how these digital platform markets operate and are internally governed. Oftentimes, the platforms also compete on the very same markets that they create, leading to incentives for unfair practices. These incentives have led the European Commission to open investigations of, for example, Amazon, Apple and Google in recent years. This has also spurred the European Commission to propose new tools for competition regulation through a draft Digital Markets Act, which seems more akin to the regulation of telecommunications than to traditional competition law (Lundqvist, 2021).
The area of competition, it seems, is preparing for major battles that have to do with the societal balances of digitalisation and the governing of public trust – clearly so for European public governance over US-based firms. It entails the relations between large and small companies, deals with issues of dominance and its abuse, and regards conceptual developments of how digital conglomerates operate through a combination of scalable datafication and algorithmic processing over cross-sectorised and multisided platforms. Related to issues of competition is the question of human decision-making in the digitised interface of increasingly autonomous and personalised prediction systems – with data-driven consumer personalisation at best, and individual citizen manipulation at scale at worst. As the infrastructuralised corporate platforms are by no means contained by antitrust frameworks alone, it should be stated that they are indeed to be structured by data and consumer protection too. Recent EU policy on trustworthy AI will likely also provide normative guidance (Larsson, 2021; Larsson et al., 2020a), in both informal and formal law.
In sum, this means that the regulatory powers of European public institutions will be busy in the foreseeable years trying to mitigate the unfair impact of non-European but very large platforms’ abilities to govern their own computational and highly non-transparent infrastructures. Although this paints a complex techno-legal imagery in flux, at stake is the public trust in digital markets that, through datafication and platform logic, include an increasingly large share of all corporate and human activities – with a key battle coming under the competition umbrella.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship and/or publication of this article: This work was partially supported by the Swedish Competition Authority, as well as the Wallenberg AI, Autonomous Systems and Software Program – Humanities and Society (WASP-HS) funded by the Marianne and Marcus Wallenberg Foundation and the Marcus and Amalia Wallenberg Foundation.
