Abstract
Algorithmic management has been a prominent focus of platform labour studies and has lately been receiving increasing academic and policymaking interest. Conceptualisations of algorithmic management are mainly premised on research into ridehailing, crowdwork and food delivery platforms, while literature on the algorithmic management of housecleaning and domestic work platforms is limited. This article draws on research on housecleaning platform labour in Denmark, demonstrating how algorithmic management unfolds in these platforms, which institutional devices support algorithms in practice, and how platform workers experience, make sense of and resist this type of management. The analysis of the findings highlights two factors that greatly influence the algorithmic management process. The first is the role of customer support departments in underpinning – and sometimes substituting for – the functions widely considered to be carried out by algorithms. The second is that the intersectional subjectivities of the – predominantly migrant and female – workers of Danish housecleaning platforms affect the ways in which workers experience and respond to their algorithmic management. The article concludes by proposing
Introduction
Studies on platform-mediated work, just like digital labour platforms themselves, have been expanding rapidly in recent years to investigate and analyse the controversies generated and prolonged by this novel type of work. Platform labour studies (Grohmann and Qiu, 2020; van Doorn and Shapiro, 2023) have engaged extensively with various aspects of platform work. Among these aspects, the algorithmic management of platform workers has been a prominent focus of this literature, both conceptual and empirical. Since Lee et al.'s (2015) conceptualisation of algorithmic management, based on their research on Uber drivers in the USA, scholars from various disciplines have introduced diverse definitions of algorithmic management, focusing on specific traits of such management which they claim to be generally present in platform work. A common feature of these definitions is that they originate from research on food delivery, ridehailing and crowdwork platforms (e.g. Jarrahi et al., 2021; Lee et al., 2015; Möhlmann and Zalmanson, 2017; Rosenblat and Stark, 2016). Not surprisingly, case studies of the algorithmic management of workers in these specific platformised sectors prevail in the literature (e.g. Griesbach et al., 2019; Kusk and Bossen, 2022; Wood et al., 2019). By contrast, accounts of algorithmic management in domestic work platforms are rather limited (e.g. Gerold et al., 2022; Ticona et al., 2018).
This article investigates how algorithmic management unfolds in the most popular housecleaning platforms in Denmark and what this entails for platform housecleaners. My analysis stresses the importance of two major factors influencing this process. The first factor is that the intersectional subjectivities of the predominantly migrant and female workers of Danish housecleaning platforms (Floros and Jørgensen, 2023) affect the ways in which they experience and respond to their algorithmic management. The second factor is the role of customer support departments in underpinning – and sometimes substituting for – the functions widely considered to be carried out by algorithmic management. The theoretical point of departure for the analysis is the definition by Lee et al. (2015: 1603), who claim that what constitutes algorithmic management are ‘software algorithms assuming managerial functions and surrounding institutional devices that support algorithms in practice’. Adopting a broad approach to what an institutional device is, this study engages with this definition to unveil the multiple institutional devices at play in the Danish context of platform housecleaning.
The analysis of the empirical findings challenges depictions of algorithmic management as omnipotent or despotic (e.g. Griesbach et al., 2019; Veen et al., 2020). The article argues that such depictions – despite their relevance for complex algorithmic software managing delivery couriers or taxi drivers in real-time – are not illustrative when it comes to the lived realities of platform housecleaners. Following the analysis of my findings, I employ feminist geographer Cindi Katz's (1996, 2017) notion of minor theory in the discussion. The combination of algorithmic management and minor theory affords a more nuanced description of the entanglement of social, human, and algorithmic components, and foregrounds the subtleties which co-produce what I define in this article as
The empirical material for this article comprises interviews with platform housecleaners, managers, and employees, as well as extensive digital ethnography of management practices in a closed Facebook group of a popular Danish housecleaning platform. The article contributes to the rapidly expanding body of qualitative research on domestic work and housecleaning platforms (e.g. Floros and Jørgensen, 2023; Gerold et al., 2022; Orth, 2024; Ticona and Mateescu, 2018; van Doorn, 2020). Moreover, it contributes to critical approaches which on the one hand draw attention to the limits of algorithmic management and on the other hand stress the central role of workers’ agency in shaping platform labour (e.g. Bonini and Treré, 2024; van Doorn and Shapiro, 2023; Woodcock, 2020, 2021). Finally, this article adds to literature on the often-ignored yet crucial roles of humans in the governance of digital labour platforms (e.g. Gray and Suri, 2019; Irani, 2015; Kusk et al., 2022) from a location-based platform perspective.
The paper is structured as follows: First, I briefly present how algorithmic management is theorised and employed in platform labour literature and showcase how the concept has been used to discuss domestic and housecleaning platform work. After that I elaborate on the methodological choices underpinning this study. In the analysis, I demonstrate what constitutes algorithmic management in the context of Danish housecleaning platforms, which institutional devices support algorithms in practice, and how platform workers experience, make sense of and resist algorithmic management. Following this, in the discussion I analyse the findings through the prism of minor theory to challenge the techno-centric narrative of algorithmic management utilised by platform companies to their own benefit, before providing some concluding remarks.
Theorising algorithmic management
Algorithmic management is a concept first coined by Min Kyung Lee and her colleagues to describe the data-driven managerial functions assumed by algorithmic software in the context of the ridehailing services Uber and Lyft (Lee et al., 2015). Lee et al. define algorithmic management as ‘software algorithms assuming managerial functions and surrounding institutional devices that support algorithms in practice’ (2015: 1603). In their definition the authors do not specify what they mean by institutional devices. Their backgrounds are in Computer Supported Cooperative Work (CSCW) and Human-Computer Interaction (HCI), and their research orientation points towards facilitating satisfactory and meaningful ways of optimal cooperation between humans/workers and technology, rather than simply proposing efficient solutions and techno-fixes (Lee et al., 2015: 1611). According to the arguments presented in their paper, institutional devices could take the form of corporate transparency protocols, online drivers’ communities promoting collaborative sensemaking, and humans-in-the-loop enhancing a humanistic approach to management. Other scholars who draw on this definition study institutional devices in more literal terms, including geolocation systems, sensors and surveillance software as assisting algorithmic management in practice (e.g. Kellogg et al., 2020; Newlands, 2021). In this article, I choose a broader perspective, interpreting ‘institutional devices’ to include customer support departments, state agencies, policymaking documents, as well as platforms’ non-algorithmic technological artefacts and terms and conditions agreements. This perspective aligns with previous literature from scholars arguing that algorithms have always been ‘more than’ rather than ‘just’ technical protocols (Daston, 2018; Pasquinelli, 2023; Seaver, 2019).
Following this initial definition, the concept received ever-increasing attention in scholarly work (Noponen et al., 2023) and was quickly adopted in institutional discourse (e.g. European Commission [EC], 2021; International Labour Organization [ILO], 2021). Food delivery and ridehailing platforms (Griesbach et al., 2019; Kusk and Bossen, 2022; Lee et al., 2015; Rosenblat and Stark, 2016; Veen et al., 2020), as well as crowdwork platforms (Jarrahi et al., 2020, 2021; Möhlmann et al., 2021; Wood et al., 2019) have been the primary focus of literature on how algorithmic software and automated solutions assign, evaluate, optimise and remunerate work in the gig economy. Managerial functions assumed by algorithmic systems are not exclusive to the gig economy (e.g. Delfanti and Frey, 2021). Scholars argue that such management constitutes a form of digital Taylorism expanding to other employment sectors, including logistics (Altenried, 2019) and on-demand manufacturing (Andrijasevic, 2021). Nevertheless, the concept of algorithmic management has predominantly been investigated in relation to platform work (Noponen et al., 2023).
Beyond Lee et al. (2015), more scholars have provided definitions of algorithmic management, based on the particularities of the cases they were investigating and their respective research fields. Indicatively, Rosenblat and Stark argue that the concept should be extended to include the automated ways in which platforms impose corporate policies on workers’ behaviour (2016: 3759). According to them, the performative role of algorithmic management builds on the information and power asymmetries between workers and platform companies. Möhlmann and Zalmanson (2017) take a more technocentric approach to algorithmic management and define it through five attributes. According to their Information Systems approach, algorithmic management comprises (a) automated decision-making procedures, (b) tracking workers’ behaviour, (c) evaluating workers’ performance, (d) interacting with a ‘system’ rather than humans, and (e) low levels of transparency (Möhlmann and Zalmanson, 2017: 4–5). Management scholars define algorithmic management by focusing on the limitation of human involvement in management, through the delegation and execution of decision-making responsibilities by self-learning algorithms (Duggan et al., 2020: 119). Jarrahi et al. take a step back from technocentric approaches to define algorithmic management as a ‘sociotechnical phenomenon shaped by both social and organizational forces’ (2021: 3). Their definition, drawing on Science and Technology Studies (STS), highlights how technological infrastructure, organisational choices, and relations between workers and managers mutually constitute algorithmic management. Lately, attempts at establishing an all-encompassing definition to promote conceptual clarity introduce broader readings of algorithmic management, such as ‘the use of computer-programmed procedures for the coordination of labour input in an organization’ (Baiocco et al., 2022: 5). 1
Most definitions of algorithmic management are rather broad, so as to include the multiple technological – or sociotechnical – characteristics of such management. What is common is that all definitions are premised on the existence of some sort of algorithmic system automatically assuming and executing managerial work. Although the definition of an algorithm suggests that it is a well-defined computational procedure transforming specific input to output (Cormen et al., 2009, as cited in Seaver, 2019), Seaver (2019) claims that algorithms do not operate in a vacuum but are part of algorithmic systems, that is ‘dynamic arrangements of people and code […] allowing us to construct algorithms as heterogeneous sociotechnical systems, influenced by cultural meanings and social structures’ (Seaver, 2019: 419). According to his reasoning, algorithmic management should be viewed as a dynamic sociotechnical phenomenon featuring messy implications beyond the algorithm itself.
Algorithmic management: Omnipotence or illusion of control?
Literature reviews of algorithmic management conclude that the majority of studies adopt a critical stance towards algorithmic management's effects on workers’ autonomy and job quality (Baiocco et al., 2022; Noponen et al., 2023). Despite the variety of conclusions produced by research conducted in various contexts, there is an overall convergence in stressing the extensive power and information asymmetry between platform companies and workers. Beyond power and information asymmetry, Noponen and colleagues (2023) discern further concepts which are pivotal in the discussion of algorithmic management, such as the autonomy paradox, income dependence, and argumentation on the existence of digital Taylorism. A common thread connecting all these concepts is the claim to extensive levels of control afforded by algorithmic management. Except for very few case studies documenting lenient forms of algorithmic management (e.g. Kusk and Bossen, 2022), most researchers agree that digital labour platforms exert broad control over the ways workers are motivated, instructed, surveilled, disciplined and evaluated through algorithmic management (Kellogg et al., 2020; Wood, 2021). Some studies report such totalising algorithmic control that they go as far as to describe it as
Contrary to its depiction as a potent technological tool in the hands of platform managers and owners, other researchers have argued that algorithmic management is overestimated both theoretically and in practice (Lamers et al., 2024; Woodcock, 2020). Focusing excessively on the technological aspects or the black boxing of algorithmic management impedes delving deeper into workers’ agency and resistance in relation to such management practices. Studies on workers’ resistance practices contest omnipotent accounts of algorithmic management and challenge monolithic forms of platform power (Bonini and Treré, 2024; Floros, 2024; Woodcock, 2021). Bonini and Treré (2024) demonstrate how gig workers resist the entanglement of both algorithmic and human managerial labour, by responding to power through algorithmic tactics and devices, what they coin as
Algorithmic management and domestic work platforms
In their seminal article, which was one of the first globally to investigate carework platforms, Ticona and Mateescu (2018) comment upon the disproportionate evolution of literature on ridehailing platforms when compared to carework and domestic work platforms. They attribute this fact both to the journalistic hype around companies like Uber and to the gendered – and here one can add racialised (cf. van Doorn, 2017) – dimension of domestic work. Following the same trend, algorithmic management studies have largely neglected domestic platform labour. There are two reasons beyond journalistic hype and gender that seem to explain this. The first is that domestic work platforms do not employ complex algorithms for optimising the distribution of work or surge pricing in real time and are therefore less attractive to researchers and funding bodies. The second relates to the difficulty of conducting fieldwork with workers who usually belong to sociolegally vulnerable groups and are largely invisible in the public sphere and therefore hard to access (Floros and Jørgensen, 2023).
Ticona et al. (2018) are the first to provide a thorough juxtaposition of algorithmic management in domestic work and ridehailing platforms in the USA. According to their investigation, ratings, rankings, and account deactivation as a means of penalising platform workers are some of the common algorithmic management tools used across all types of digital labour platforms. Ticona and colleagues provide a taxonomy for distinguishing between on-demand platforms – which include real-time distribution of work – and marketplace platforms – where work is scheduled and the platform mainly acts as a matching intermediary (Ticona et al., 2018: 21). They emphasise that although the marketplace business model of domestic work platforms is less reliant on complex algorithmic software, this does not mean that researchers should disregard the role of algorithmic management in shaping the way domestic work is experienced and organised (Ticona et al., 2018). In Europe, researchers have investigated how algorithmic management takes place in Helpling, a German housecleaning platform (Gerold et al., 2022). In their study, Gerold and her colleagues demonstrate how Helpling's digital infrastructure acts as a site of evaluation, ranking, assessment and comparison of the platform workers, in turn shaping users’ interaction.
Finally, without specifically investigating algorithmic management, several research articles provide valuable insight into algorithmic features of domestic platform labour. Indicatively, Fetterolf (2022) assesses factors that boost or impede the profile visibility of platform workers at Care.com, where algorithmic amplification results in higher ranking in online searches. Altenried (2024) argues that algorithmic management provides the necessary preconditions for the inclusion – on unequal terms – of a heterogeneous migrant labour force into the labour market. Combining algorithmic management with the particularities of local labour markets in his analysis, Altenried subscribes to a sociotechnical reading of the concept. Recently, Orth (2024) challenged Altenried's claims on migrant labour force heterogeneity to suggest that – contrary to such assumptions based on the traditionally migrant character of domestic work – algorithmically managed platform housecleaning in Berlin, in contrast to food delivery, requires a certain level of digital literacy and soft skills on behalf of platform housecleaners.
Minor theory: Doing theory differently
Investigating algorithmic management beyond the algorithm (van Doorn and Shapiro, 2023) indicates the need to revisit theoretical concepts and do theory differently. Doing theory differently in the case of domestic work platforms also heeds the calls made by feminist scholars to transcend technocentric approaches, which fail to address the gendered – and often racialised – traits of commodified reproductive labour, which traditionally constitutes a largely informal industry (e.g. Huws, 2019; Kampouri, 2022; Ticona and Mateescu, 2018). Feminist geographer Cindi Katz (1996, 2017), inspired by the work of Deleuze and Guattari, advocates the production of knowledge in a minor key, that is ‘producing theory that is […] interstitial with empirical research and social location’ (Katz, 1996: 487). Minor theory aims to broaden the understandings of established concepts and to elaborate on the role of social relations in co-constructing them. Katz claims that doing theory differently means interpolating established conceptualisations and production of knowledge with minor stories ‘vibrating within the claims and arguments of major theory’ (2017: 598). Expanding this approach to platform urbanism, Leszczynski (2020) advocates for a minor theory which foregrounds the political potentials of everyday digital practices. Leszczynski argues against the ‘totalising analytics of masculinist critiques’ (p. 14), which present platformisation as the inescapable future of neoliberal capitalist societies, and platform workers as its hopeless victims. Adopting a minor gaze in the case of algorithmic management means refusing the exclusive mastery of algorithmic software in shaping the way platform work is performed, and taking into account how the major definitions of algorithmic management intersect with categories such as gender, race, class and citizenship status.
Combining Lee et al.'s (2015) established definition of algorithmic management with empirical insight – through a minor gaze – to describe how work in Danish housecleaning platforms is distributed, experienced and managed, I wish to contribute to the development of emancipatory, socially transformative theory, to unsettle ‘received narratives and material social practices of power’ (Katz, 2017: 597).
Methodology
This article builds on data originating from PhD research on housecleaning platform work in Denmark (Floros, 2024). Twenty-three in-depth interviews with housecleaners working for three different platforms form the basis for the analysis. Including workers’ voices in domestic platform labour research is challenging due to access restrictions deriving from the invisibility of platform workers in everyday public life (cf. Ticona and Mateescu, 2018). During my fieldwork, I used three different strategies to approach platform housecleaners. First, I posted messages on Facebook groups acting as marketplaces for contracting housecleaning, looking for cleaners who contracted work both through platforms and through such groups. Second, an interviewed manager of a popular Danish platform granted me access to the platform's closed Facebook group, which provided me with an abundance of cleaners’ Facebook profiles, whom I could contact without the platform's mediation. Third, I signed up as a customer to an international platform and sent interview requests to workers’ profiles in Copenhagen. In the messages I clearly stated my research purpose, apologised for my cold-calling approach, and instructed cleaners to respond via email and not through the platform's infrastructure, to avoid possible sanctioning. Initially, I worried that informants responding to these cold-calling strategies might be biased against platforms, given my stated commitment in the messages that I would not share data or collaborate with platform companies in any way. Nevertheless, many of the respondents evaluated the operation of platforms positively, given the lifeline they provided in raising income, and all of them provided nuanced accounts of the pros and cons of platform work.
All interviews were conducted between November 2020 and October 2022, were carried out both in person and online, and lasted between 30 and 100 min. The interviewees were guaranteed anonymity and that the platform companies would not have access to their interview data. Interviews were conducted in English – and some parts in Spanish – as eighteen out of twenty-three cleaners were migrants (from Latin America, Eastern and Southern Europe and Asia). Their age ranged between eighteen and forty years; twenty-one cleaners were female and two male. Two out of five Danish cleaners identified as non-ethnic Danes. All interviewees provided informed consent for their participation, and interviews were audio-recorded, translated by the author when needed, and transcribed.
Moreover, the article draws on nine expert interviews with stakeholders (a housecleaning platform manager, a customer support employee and a platform lobbyist, two labour union representatives, a public official, a Danish MP and two academics) and digital ethnography on the closed Facebook group of the popular Danish housecleaning platform that provided me access. Despite identifying myself and the purpose of my research when I first joined the group, I abstain from using identifiable data or quotes from the Facebook group, as informed consent of all participants is practically impossible (cf. Willis, 2019). My data collection method can be described as a ‘fly on the Facebook wall’ (Floros, 2024), non-participatory observation approach, where I kept a personal log of the daily interactions in the group. 2 As the group is both a site where cleaners share everyday experiences and a site for direct and indirect management practices, I am cautious to avoid mentioning traceable events from this digital ethnography, which could jeopardise participants’ anonymity, privacy or confidentiality (Thompson et al., 2021). Moreover, when analysing data, I was reflexively aware that the vocality of specific cleaners in the group was an outlier and, likewise, that many cleaners chose to refrain from posting.
Premised on my epistemological choice to foreground the situated knowledges of platform housecleaners, which through their partial perspectives challenge and nuance all-encompassing theories and conceptualisations (Haraway, 1988), I employed a qualitative-interpretivist research design (Schwartz-Shea and Yanow, 2013: 6). To analyse algorithmic management, I registered every reference in my fieldwork data relating to platform cleaners’ management and the factors that enabled it, and grouped them according to the different components of Lee et al.'s (2015) definition. My analysis builds on the following grouping of data. First, I manually coded data relating to algorithmic managerial functions and then turned to institutional devices (broadly defined) which support these functions, before assembling all information on whether and how these functions are experienced – and potentially contested – by cleaners. Overall, I followed an abductive epistemological approach, combining previous theory and knowledge with my own field observations, in an iterative-recursive fashion, aiming both at deeper sense-making of the phenomenon and at revising or extending theorisation on algorithmic management of domestic work platforms (Schwartz-Shea and Yanow, 2013).
Brief introduction to the Danish context
The Danish gig economy grew rapidly after 2016, as it was actively promoted by governmental policies and national strategies. The Danish labour market is regulated by an industrial relations system based on collective agreements between employers’ organisations and labour unions. Despite initial optimism created by a few platforms entering into negotiation with unions, these initiatives remained marginal. The lack of a coherent presumption of employment rule and contradictory decisions by Danish public agencies further impeded regulation of the gig economy. The vast majority of platform workers are self-employed, lacking specific labour rights and – especially migrant gig workers – practically excluded from welfare provisions (e.g. sick-pay, holiday pay, entitlement to student or unemployment benefits). Housecleaning platforms are mainly Danish companies but there are also Danish branches of international domestic work platforms. Most platforms operate on a commission-based model, while platforms charging subscription fees to unlock special features are an exception. 3
What is ‘algorithmic’ in the management of Danish housecleaning platforms?
Algorithmic ranking of profiles
The most influential and controversial algorithmic system governing housecleaning platforms is the ranking algorithm. This algorithmic system decides which cleaner profiles are prioritised when a customer conducts a search. The importance of each criterion fed into the algorithm to decide the prioritisation is hard to define, given the opaqueness of these systems. Indicatively, literature on housecleaning platforms assumes as a rule of thumb that bad ratings from customers are very damaging to cleaners’ profile prioritisation (e.g. Gruszka et al., 2022); nevertheless, other scholars demonstrate that, conversely, five-star ratings do not guarantee high profile prioritisation (Fetterolf, 2022). The platform manager I interviewed shared some details on how their algorithm works:

It's the proximity to the booker, and then price has an effect, but I think it is third in the party. Then, the second is the rating, yeah, and then it's the price. We need to iterate on the formula as well, because sometimes it is difficult for new (cleaners) to get bookings, since they don't have any ratings. It is like the hen and the egg issue, so we are trying to solve that. (Platform manager) 4
In this quote, we see how the manager confirms the importance of ratings, but also raises the question of securing bookings for new cleaners. This is crucial, since the on-demand model of housecleaning platforms requires a large availability of cleaners to satisfy the diverse needs of its clientele. Almost all the cleaners I interviewed said that when they initially signed up to the platform, they charged a low hourly fee to make it easier for themselves to find work. Given the manager's admission that price is the third most important input to the algorithm, one would expect ‘cheaper’ cleaners to appear higher in search results. During my project, I conducted several searches myself and realised that very cheap cleaners were rarely visible in the first pages of results. The fact that low prices are not sufficient on their own to prioritise profiles, and that newcomers do not easily get ratings, suggests that there are other inputs fed into the algorithm to promote newcomers’ visibility.
Another manager from this platform revealed some of these values to the platform's labour force in the Facebook group. They claimed that housecleaners who (a) replied fast to messages from clients, (b) instantly – or at least very fast – accepted offered bookings, and (c) kept their platform availability calendar regularly updated would be prioritised by the ranking algorithm. These three parameters indicate that the platform algorithmically rewards housecleaners who display speed and consistency in their communication with clients and in fulfilling their tasks on the platform's interface. One can assume that there are more values feeding into the platform's algorithm, ‘a very busy formula’ as the manager described it, which they shared neither during the interview nor in the Facebook group with the cleaners. Other platforms operating in Denmark prioritise profiles in similar ways and take into consideration factors that they display on cleaners’ profiles through small banners or notes. Depending on the platform, these include recent activity on the platform, number of performed cleanings, verified identity documentation, years of cleaning experience, etc. The sum of the factors prioritised by the different ranking algorithms (i.e. ratings, proximity, trustworthiness, price, reply speed, experience, platform activity, acceptance rate) reveals the values that designers have inscribed into their algorithmic systems and the roles that platform management expects housecleaners to perform.
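To make the logic described above concrete, the factors the managers named can be sketched as a simple weighted score. This is purely an illustrative sketch under stated assumptions: the platforms' actual formulas are undisclosed, and every weight, threshold and field name below is a hypothetical stand-in, not the ‘very busy formula’ the manager referred to.

```python
# Illustrative sketch only: the platform's real ranking formula is undisclosed.
# All weights and field names are hypothetical assumptions for exposition.
from dataclasses import dataclass

@dataclass
class CleanerProfile:
    distance_km: float      # proximity to the booker
    avg_rating: float       # customer rating, 0-5 (0.0 = no ratings yet)
    hourly_price: float     # hourly fee in DKK
    reply_speed: float      # 0-1, faster replies to clients -> higher
    acceptance_rate: float  # share of offered bookings accepted
    calendar_fresh: bool    # availability calendar recently updated

def rank_score(p: CleanerProfile) -> float:
    """Toy weighted score mirroring the ordering the manager described:
    proximity first, then rating, then price, plus activity signals."""
    score = 0.0
    score += 5.0 / (1.0 + p.distance_km)          # proximity weighted highest
    score += 1.0 * p.avg_rating                   # rating second
    score += 2.0 / (1.0 + p.hourly_price / 100)   # lower price -> higher score
    score += 1.0 * p.reply_speed                  # rewards fast communication
    score += 1.0 * p.acceptance_rate              # rewards accepting bookings
    score += 0.5 if p.calendar_fresh else 0.0     # rewards an updated calendar
    return score

profiles = [
    CleanerProfile(2.0, 4.8, 180, 0.9, 0.95, True),   # established cleaner
    CleanerProfile(0.5, 0.0, 120, 1.0, 1.00, True),   # newcomer: no ratings yet
]
ranked = sorted(profiles, key=rank_score, reverse=True)
```

Even in this toy version, the ‘hen and the egg issue’ appears: the newcomer is closer, cheaper and maximally responsive, yet still ranks below the established cleaner because the missing ratings contribute nothing to the score.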
The performative nature of algorithmic management
The script of the ranking algorithm can be broken down into three broad categories of desired outcomes. First, management aspires to build housecleaners’ loyalty to the platform, to achieve the volume of workers needed for the on-demand model to run smoothly. This is achieved through balancing the prioritisation of successful profiles of highly rated cleaners with the prioritisation of energetic profiles of newcomers who display speed in negotiating with clients, so that more cleaners with desired profiles (successful, active, consistent) receive booking offers. Second, outsourcing the evaluation of cleaners’ job performance to clients through rating and reviewing serves a double purpose. On the one hand, clients determine themselves which cleaners deserve more work, according to who pleases the customers to a greater extent. On the other hand, customers’ rating power makes cleaners I interviewed consider ratings to be a ‘source of worry’, ‘pressure’ or ‘threat’, leading them to overperform and accept demeaning behaviours. The performative and disciplining effect of ratings on cleaners is intended to maximise their compliance with clients’ wishes. The third aim of the ranking algorithm is to ensure the smooth completion of as many successful tasks as possible. When clients’ expectations are met, this solidifies their attachment to the platform and generates more profit. Fulfilment of expectations includes swift replies, acceptance of bookings, punctuality, and meticulousness on behalf of the cleaner, as well as consistency, should the client require recurrent cleanings.
These inscriptions into the algorithm are obvious to platform housecleaners and are intended to lead to their anticipatory compliance (cf. Bucher et al., 2021). Indeed, many of the platform housecleaners I interviewed were trying to ‘read’ the inner workings of the ranking algorithm to figure out what they could do to enhance their platform visibility.

I guess that maybe the people that get positive reviews would appear first and I also think that you need to set your postal code, so maybe this is your first filter, because I used this when trying to find myself … (Cleaner 14)
Here, we see a common practice employed by cleaners, who try to reverse engineer the algorithmic output by searching for their own profiles on the platform, changing the criteria every time they perform the search. Nevertheless, the most explicit display of anticipatory compliance is cleaners’ submissive behaviour in the face of customers’ unprofessional demands.

I had bookings […] where they asked me to wash the floor on my knees and all kinds of crazy things […]. There can be very strange behaviours but there are some who are really desperate for their ratings and would accept anything (Cleaner 5, as cited in Floros and Jørgensen, 2023)
The above quote showcases how the ranking algorithm dictates platform housecleaners’ choices of conduct. Even when cleaners do not succumb to these demands, they may never report foul behaviour to the platform, afraid of the negative effect a conflict – and a vengeful review – might entail for their profile visibility. My interviews highlight how insecurities augmented by cleaners’ intersectional identities are more likely to dictate silent tolerance of foul behaviour. For example, female migrant teenage cleaners among my informants tolerated much worse client behaviours than male non-ethnic Danes or female Danish teenagers.
Automated disciplining and sanctioning procedures are one more component of Danish housecleaning platforms’ algorithmic management. These target cleaners who reject or cancel bookings, as well as cleaners who are late to respond to clients’ messages. Many of my interviewees reported that their accounts get deactivated after one or two booking rejections. Moreover, every time cleaners cancel a scheduled booking – even in cases where this happens at the request of the client so that a more convenient timeslot can be discussed at a later date – they receive multiple aggressive automated SMSs and emails from the platforms, urging them to rebook and threatening them with profile deactivation or deletion. We see here how these automated messages, combined with the ranking algorithm that penalises response delays and rejections, work together in coercing the cleaners to perform the developers’ script.
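The sanctioning automation the interviewees describe amounts to simple rule-based triggers. The sketch below is a hypothetical reconstruction from their accounts; the threshold, message wording and function names are assumptions, not the platforms' code.

```python
# Illustrative sketch of the automated sanctioning rules cleaners report:
# repeated rejections trigger deactivation, cancellations trigger warnings.
# The threshold and messages are assumptions based on interview accounts.

REJECTION_LIMIT = 2  # interviewees report deactivation after one or two rejections

def apply_sanctions(cleaner: dict, event: str) -> list[str]:
    """Return the automated actions fired by a rejection or cancellation."""
    actions = []
    if event == "rejection":
        cleaner["rejections"] = cleaner.get("rejections", 0) + 1
        if cleaner["rejections"] >= REJECTION_LIMIT:
            cleaner["active"] = False
            actions.append("deactivate_account")
    elif event == "cancellation":
        # Fired even when the client asked for the cancellation.
        actions.append("send_sms: rebook now or your profile may be deactivated")
        actions.append("send_email: rebook now or your profile may be deactivated")
    return actions
```

The point of the sketch is that no human judgement mediates these triggers: a cancellation requested by the client fires the same warnings as one initiated by the cleaner.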
Institutional devices supporting algorithmic software
However, technological scripts of algorithmic management such as the ranking algorithm say more about the intentions of the developers than about the actual practices of the users (Pelizza and Van Rossem, 2024). Platform housecleaners in Denmark have found ways to work around ranking algorithms and their fear of receiving negative ratings through various practices. Indicatively, they have reported deleting their own accounts after very bad ratings and creating new ones on the same platforms, sending substitute cleaners to bookings to avoid deprioritisation of their profiles due to late cancellations, and blacklisting rude and demeaning customers in cleaners’ WhatsApp groups to avoid bookings with them. 5 In order for algorithmic management to achieve its desired outcomes it needs to be backed up by what Lee et al. (2015) refer to as institutional devices.
Customer support departments
In the Danish context, the principal institutional devices supporting algorithmic management are the customer support departments of housecleaning platforms. Each platform features such a department, as clearly stated on their websites, in their terms and conditions documents, and in their app interfaces. Customer support of platforms whose profits are based on commissions from each cleaning, rather than on members’ subscriptions, interferes extensively in the booking process.

This guy just sent text messages to my phone, saying that I had bookings waiting […] and it was like bad behavior if I didn’t answer, so they would give you a 24 hour window to respond and […], then I would get a message from this guy […] saying, “Hi (cleaner's name) we need you to check out your […] account because people are trying to book you and you are not answering”, and I feel like these were not standardised messages. They were, like, written at the time, so I felt that this was very pressuring and a bit too much actually. (Cleaner 22) 6
This quote demonstrates that housecleaning platforms do not rely solely on algorithmic systems and automated messages to nudge, warn or discipline cleaners. On the contrary, they employ phone calls, personal emails, and text messaging to communicate in person with cleaners, overstepping the boundary of a neutral intermediary assisting self-employed partners. Personalised communication is highly effective: younger housecleaners have a hard time talking back to customer service, and female migrants who are highly dependent on their platform income report feeling intimidated by such personal communication.
A former customer support employee in one of the most popular Danish platforms admitted that their department was keeping track of customers’ complaints. Once a cleaner reached three complaints per month, customer support decided whether to delete their profile or not. In the employee's own words, the cleaners that did not ‘live up to the expectations’ were simply ‘kicked out’ (Platform employee). Evidently, the ranking algorithm, online payment systems, and automated messages are not sufficient on their own to provide an overall smooth experience of platform housecleaning. Resolving disagreements between clients and cleaners, and managing property damage, no-shows and late cancellations, requires human intervention. Therefore, in the case of housecleaning platforms, customer support is the most vital institutional device supporting algorithmic management.
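The complaint-tracking routine the former employee describes is itself trivially simple: the algorithmic part ends at a counter, and the consequential decision is human. A minimal sketch, assuming hypothetical names, of the point where software hands over to customer support:

```python
# Sketch of the complaint-tracking routine the former support employee
# described: three client complaints in a month flag a cleaner's profile
# for a human deletion decision. Names and structure are hypothetical.

COMPLAINT_THRESHOLD = 3  # per month, per the employee's account

def flag_for_review(monthly_complaints: dict[str, int]) -> list[str]:
    """Return cleaner IDs whose profiles customer support must decide on."""
    return [cleaner_id for cleaner_id, count in monthly_complaints.items()
            if count >= COMPLAINT_THRESHOLD]
```

Everything past this flagging step – whether a cleaner is ‘kicked out’ – is discretionary human work, which is the argument of this subsection.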
One of the investigated platforms offers clients the option of having the platform pick a cleaner for them, after the client chooses the price and time for the task. Customer support then contacts all cleaners in a closed Facebook group, assigning the cleaning to one of the workers who apply for the task. Although many interviewed cleaners believed that this worked along the lines of first-come first-served, there were specific criteria in play. The platform employee explained how customer support assigned low-priced bookings (meaning less commission for the platform) to newcomers in whom they ‘saw potential’ and high-priced ones to cleaners they ‘trusted a bit more’ (Platform employee). Given that Danish clients do not engage much with rating housecleaners, customer support assigned such bookings to cleaners whom they knew to be consistent and trustworthy but who had not yet succeeded in convincing their previous clients to rate them. This shows how customer support complements or substitutes the three broad functions inscribed into algorithmic management (loyalizing cleaners to the platform, nudging the rating of cleaners, successfully completing tasks/fulfilling clients’ expectations).
Other institutional devices assisting algorithmic management
Beyond customer support, there are several diverse institutional devices assisting algorithmic management. One device is a minimal online tool embedded in the search function of the platform interface to instruct clients performing searches on the duration of the cleaning. Clients insert the size of the house and receive an automated suggestion based on the square meters they report. Nevertheless, several customers intentionally misreport the size, thinking that they can save some money, since the cleaner will work more intensively to fulfil the task in the – shorter – suggested time.

Once clients have entered the size, they assume that the assigned time is sufficient since “the system said so”, which makes negotiations very difficult. […] The algorithm does not take into account that larger houses usually have huge kitchens and several bathrooms, all of which need more time for cleaning than plain floor area. It's as if we’re destined to fail. (Cleaner 1, as cited in Floros, 2024)
What is interesting in this quote is the use of the word algorithm. The cleaner did not use it in reply to an algorithm-related question; on the contrary, it was a point that she raised on her own. This device is a simple converter rather than a complex algorithmic tool. However, the cleaner understands the amalgam of the technological structure of the platform to pertain to some form of an algorithmic imaginary (cf. Bucher, 2017). This conflation supports Lee et al.'s (2015) conceptualisation of algorithmic management, which argues for the inseparability of algorithmic software and institutional devices. Moreover, the quote indicates how the introduction of this simple tool by the platform – allegedly a neutral intermediary between clients and cleaners – contributes to labour intensification. Cleaners who end up working based on fraudulently calculated durations feel pressured to fulfil the clients’ expectations to avoid bad ratings and subsequent profile deprioritisation. Once again, it is mainly newcomers to the platform who are less experienced and less able to handle such situations. In the Facebook group cleaners advise their peers to check the size of the houses against the Danish Building and Housing Registry, an open access online database containing information such as size and layout for all private homes in Denmark. However, there are still cleaners who accept these bookings out of necessity, even when they know that the duration is miscalculated.
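The ‘algorithm’ the cleaner refers to can be rendered as a one-line conversion. The rate below is a hypothetical assumption; the sketch only illustrates why under-reporting the floor area mechanically shortens the suggested duration, which is the lever the misreporting clients exploit.

```python
# Minimal sketch of the duration-suggestion tool: square metres in,
# suggested hours out. The per-sqm rate is an assumption. Note that the
# suggestion scales only with reported floor area and, as Cleaner 1
# points out, ignores kitchens, bathrooms and layout entirely.

MINUTES_PER_SQM = 2  # hypothetical flat rate

def suggested_hours(reported_sqm: int) -> float:
    """Return the cleaning duration the interface suggests to the client."""
    return round(reported_sqm * MINUTES_PER_SQM / 60, 1)
```

A client who reports 90 m² for a 120 m² home thus receives a shorter suggested duration, and the cleaner absorbs the difference as intensified work.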
Another device supporting the desired outcomes inscribed into algorithmic management is the check-in/check-out function used by cleaners performing a task. This is embedded in the app but is not connected to documenting duration of work nor does it have an impact on the payment of the cleaning. The role of this device is to discipline workers into being punctual.

Once I failed to check in because I was one minute late and then I had to call support so that they could put me in the (system). […] It's not possible to say that if you miss your check-in for one minute then you're not allowed to report that you started working. […] it so happened once that I was a bit late to get to the place that I was cleaning and because people are waiting for you, you cannot check in before the customers see you … (Cleaner 7)
Cleaners who fail to check in before the beginning of the scheduled cleaning must contact customer support to register that the task is carried out. This device not only provides extra surveillance potential for the platform, to be used in the assessment of workers by customer support, but also governs workers’ behaviour into fulfilling clients’ expectations of seamless and punctual service provision.
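The rule Cleaner 7 ran into can be sketched as a hard time gate, again with hypothetical names: the app accepts no check-in after the scheduled start, however small the delay, and routes the cleaner to a human instead.

```python
# Sketch of the check-in rule described by Cleaner 7: the app refuses a
# check-in after the scheduled start, forcing a call to customer support.
# Function and field names are hypothetical.

from datetime import datetime

def check_in(scheduled_start: datetime, now: datetime) -> str:
    """Accept a check-in only at or before the scheduled start time."""
    if now <= scheduled_start:
        return "checked_in"
    # Even one minute late, the cleaner must route through a human.
    return "contact_customer_support"
```

The design choice worth noting is that the fallback is customer support, not a grace period: every late arrival becomes a logged interaction available for worker assessment.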
Finally, corporate institutional devices assisting algorithmic management also include the various online forums and social media groups used by housecleaning platforms for both direct and indirect management practices. The closed Facebook group that I observed is used by managers of the platform to – among other things – communicate alterations to terms and conditions, announce the imposition of new fees on cleaners, and warn about the zero-tolerance policy of the platform towards cleaners who attempt to take clients off the platform and into informal housecleaning. Managerial discourse in the Facebook group is often contested, especially by cleaners who are well-established on the platform and have a lot of clients. However, the eloquence of these cleaners does not make up for the silence of the vast majority of cleaners in the group chat.

No, actually if I write in the Facebook group […] maybe it can turn bad because they will think that “oh, she is advising” … I mean I don't want to hurt them, it is not good to ask in Facebook in public, so it can go bad, so I never tell them. (Cleaner 10)
In this quote, a young migrant student states that she never posts messages in the group because she is afraid of being targeted by management. Her dependency on platform income impedes her from publicly sharing her opinions and problems. Despite platform management stating that ‘in the group we allow almost everything, we haven't kicked out anybody yet and […] (some) refer to us as modern slave owners, but we don't delete stuff like that’ (Platform manager), many of the cleaners I interviewed preferred to abstain from this forum. Indeed, in my research there is no evidence of management targeting cleaners who openly criticise the platform. However, the varying extents to which cleaners with different sociolegal status or work experience engage in the group are further proof that management practices do not have a uniform impact on all cleaners.
Beyond these more traditional management practices, there are also indirect practices taking place. Two indicative examples are the staging of a lottery with small prizes, where only cleaners who regularly updated their availability on their platform calendar were eligible to participate, and the half-knowledge shared by managers as to which factors have an impact on the ranking algorithm. In both cases the aim is to give an extra push to platform housecleaners to perform the script of algorithmic management, in order to optimise the service experience of clients.
Danish policies, strategies and public agencies’ decisions reinforcing algorithmic management
Notwithstanding the support of algorithmic software by the aforementioned devices, algorithmic management in housecleaning platforms would not be as effective if Danish labour market regulation presumed cleaners to be employees and secured certain rights for them. However, there is no presumption-of-employment rule for platform workers in Denmark to date. On the contrary, in two decisions fiercely criticised by Danish labour unions (Union representative A; Union representative B) and politicians (Danish MP), the Danish Competition and Consumer Authority concluded that cleaners are not subject to extensive sanctioning, supervision or management by platforms, and should be institutionally considered self-employed (Konkurrence- og Forbrugerstyrelsen, 2020a, 2020b). In this case, both the public authority and its policy documents act as institutional devices supporting algorithmic management. The decisions endorse algorithmic management and provide immunity to housecleaning platforms, which can thus prolong their operational model.
Along the same lines, policymaking documents which promote the operation of housecleaning platforms in Denmark, despite their vague terms of employment and opaque management systems, equally co-constitute algorithmic management. According to my framework, two indicative policy documents acting as institutional devices supporting algorithmic management in practice are the following. First, the Strategy for Growth through the Sharing Economy, which promoted platforms as – among other things – a ‘labor market that provides better opportunities for a flexible working life’ (Regeringen, 2017: 16). Second, the final report of the Danish Disruption Council, which endorsed labour platforms as part of an inescapable labour market future that should be fostered and promoted (Danish Government, 2019). The lack of regulatory frameworks reinforces algorithmic management practices, as became evident in the tensions over the – recently agreed upon and significantly watered-down – final version of the EU directive on platform workers. The directive highlights the controversial operational models of platforms, especially in relation to their black-boxed algorithmic management (EC, 2021).
Beyond the policy documents promoting gig work, other institutional devices include exclusionary migration and welfare policies, which often dictate specific behaviours to migrant platform housecleaners. As I have demonstrated in my analysis, cleaners who are more dependent on platform income and less likely to find another job due to their family obligations or sociolegal status are more prone to comply with behavioural norms imposed by algorithmic management. In this sense, welfare exclusions, such as that of non-EU students from state-supported student benefits, or visa regulations, which limit the amount of time that specific migrants can work, create intersecting vulnerabilities for the primarily young and migrant labour force that these platforms attract (Floros, 2024). Munkholm (2020) describes extensively the welfare exclusions suffered by platform workers in Denmark and refers to them as systemic setbacks. During my fieldwork, I came across cleaners who had to leave the country due to work-related chronic injuries and their ineligibility for welfare support, and cleaners who struggled during the pandemic due to platform workers’ practical inability to meet the criteria for receiving COVID income supplements for the self-employed. My argument here is that algorithmic management does not play out in a social vacuum, nor are the workers simply complying with the technological affordances of platforms’ algorithmic configuration. The lack of a regulatory framework for platform work and the restricted access to welfare provisions for migrant platform workers augment the insecurity they experience. This insecurity complements platforms’ algorithmic management practices in shaping platform housecleaners’ working behaviour.
Resisting or complying with algorithmic management?
Nevertheless, the amalgam of algorithmic software and the diverse institutional devices that support it does not guarantee the full compliance of platform housecleaners with management's intended outcomes. On the contrary, many cleaners try to work their way around the disciplining features inscribed in algorithmic management. Even the workers who, as I demonstrated above, are more intimidated by deactivation threats, bad ratings, and aggressive communication on behalf of customer support departments tend to adopt various everyday resistance practices (Floros, 2024) as they gain experience or become more established on the platforms. The performative effect of algorithmic management on cleaners’ behaviour is unstable and dependent on intersecting factors, and cleaners adopt varying strategies of compliance and resistance accordingly.
Although a detailed analysis of cleaners’ resistance practices is beyond the scope of this article, 7 in this paragraph I briefly refer to some ways in which they refuse to comply with algorithmic management's prescriptions. First, the most common form of non-compliance is the attempt by numerous cleaners to get their clients off the platforms. This does not happen simply for reasons of tax evasion and splitting the platform's commission between client and cleaner. It is also a way for cleaners to avoid the threat of ratings and the pressure exercised by the platform whenever they want to reschedule or cancel a booking. Second, on some platforms it is possible for workers to drop their previous profile, if it has poor reviews and bad ratings, and set up a new one from scratch. This is a way of cancelling their algorithmic ranking. Third, cleaners reject bookings where the house owners report a fake size, to avoid working for people who are trying to scam them into overperforming or overworking by using the threat of a bad rating as a pressure lever. Fourth, a few cleaners use filters in their email accounts to avoid being spammed by the multiple automated emails they receive from platforms, nudging and warning them about every little detail regarding their profile, such as cancellations and rejections. Finally, on the collective level, members of the large community of Latin American platform housecleaners in Denmark communicate through a WhatsApp group, where they collaborate in their grappling with algorithmic management. In the group they blacklist clients who are aggressive, abusive, demanding, or rude (cf. Altenried and Niebler, 2024), they exchange advice on what to upload to their profiles, what price to choose, and what the algorithm might prioritise, and they also find last-minute substitutes to work for them. Cleaners prefer the informal exchange of bookings to avoid being flagged or penalised by customer support for cancelling.
Likewise, if they cancel, they risk a bad review or, if the booking is recurring, the loss of a stable source of income. Summing up, these non-compliant practices challenge all three broad categories of desired outcomes inscribed into algorithmic management.
Discussing findings and conceptualising minor algorithmic management
Woodcock (2020) argues that the supposed omnipresence and alleged efficiency of algorithmic management is an illusion, intended to control workers’ behaviour in a cost-efficient way. My findings, from a markedly different sector and setting than food delivery in the UK, support this conclusion. Housecleaning platforms in Denmark deploy algorithmic software and automated tools, which nevertheless need to be supported by humans-in-the-loop to enhance the efficiency of platforms’ operating models. Moreover, restrictive labour and migration policies are a constitutive part of this amalgam.
Drawing on my findings and ‘thinking in a minor key’ (Katz, 2017: 597), I argue that what occurs in Danish housecleaning platforms can better be described as minor algorithmic management.
First, without the extensive managerial work carried out by humans, the algorithmic management of platform housecleaners would be less efficient (cf. Irani, 2015; Kusk et al., 2022). Here the focus is on how indispensable customer support departments are for the functionality of housecleaning platforms. These departments are not there simply for handling complaints and mediating disputes between cleaners and clients. They also actively support and complement – and in some cases substitute – platforms’ algorithmic systems by sanctioning, assisting, nudging and evaluating cleaners to streamline the provided service.
Second, minor algorithmic management is intended to stress the role of platform housecleaners’ intersecting subjectivities, heeding minor theory's purpose to unsettle established concepts by foregrounding the social circumstances and relations that co-construct them (Katz, 2017). A prerequisite for the operation and efficient management of housecleaning platforms is the entanglement of restrictive and exclusionary migration, welfare and labour regulations. It is precisely this entanglement that accentuates platform housecleaning as an appealing undertaking for particular categories of migrants (cf. Orth, 2024). Adding the adjective minor within this context acts as a constant reminder of how global social inequalities and neoliberal policymaking frameworks underpin the operation of algorithmic management. My argument here is that initiatives for the regulation and transparency of algorithmic management cannot drastically improve the working lives of migrant platform housecleaners as long as exclusionary migration, welfare and labour regulations remain intact.
Third, minor algorithmic management argues against accounts that portray platform housecleaners as being intensely controlled by algorithmic software. Although algorithmic software and the institutional devices that support it exert varying degrees of control over migrant cleaners – according to their experience, sociolegal status and dependence on platform income – this control is often balanced by cleaners’ everyday resistance practices, their solidarity networks and their individual resourcefulness. Minor algorithmic management is thus a concept that simultaneously acknowledges the major transformative role of novel technologies in shaping the future of work, while challenging the inescapability of this future by foregrounding the lived experiences of platform housecleaners, who narrate their own minor stories of resistance.
Although specific to the context of Danish housecleaning platforms, the everyday resistance practices of the workers I studied are part of a broader repertoire of workarounds, reappropriations and outright defiance of algorithmic management observed by scholars in other platformised sectors. Whether it is platform housecleaners in Denmark, Filipino digital workers sharing coping strategies (Soriano and Cabañes, 2020), Indonesian ride-hailing drivers using fake accounts on modified apps to cancel rides without sanctions (Ferrari and Graham, 2021) or food delivery couriers in Italy and the USA coordinating collective order refusals (Bonini and Treré, 2024), the fact that algorithmic management is challenged in practice, both at the individual and the collective level, stresses the need to think in a minor key.
Conclusion
In this article, I presented how algorithmic management of housecleaning platforms occurs in Denmark. Drawing on the definition introduced by Lee et al. (2015), I argued for the need to adopt a broad perspective in identifying the institutional devices supporting algorithms in practice, taking into consideration technological, human, and social factors. On the one hand, human and technological factors supporting algorithmic systems are more decisive regarding how the actual contracting and execution of the gig is managed. On the other hand, by considering public policy, legal and policymaking documents, and public agencies as institutional devices supporting and co-constructing the overall algorithmic management of the platform workforce, this article highlights the multiple connections between platform governance and public institutions. In this sense, platform governance of – primarily female, migrant – cleaners in Denmark is portrayed more as a component of neoliberal statecraft (cf. Peck and Phillips, 2020) and less as a technological affordance of trailblazing algorithmic systems.
Moreover, in this article I demonstrated how the ranking algorithm and the rest of the platforms’ digital tools are inscribed with visions of platform housecleaners’ desired behaviour, and how these visions are promoted by very active customer support departments. I showed how these visions hold explanatory value for the intentions of developers rather than for the ways actual users engage with them in practice (cf. Pelizza and Van Rossem, 2024). In a nutshell, platform housecleaners in Denmark adopt everyday work practices which overall differ markedly from the desired practices inscribed into the process of their algorithmic management. The reasons for this also relate to the intersecting subjectivities of – predominantly – migrant housecleaners and the insecurities deriving therefrom, as well as to their different individual needs and dependencies on platform work. Finally, I proposed the concept of minor algorithmic management as better befitting the contingent (non)compliance of migrant cleaners with the amalgam of human, sociopolitical and algorithmic factors comprising housecleaning platforms’ algorithmic management.
Most analyses of algorithmic management to date focus on what van Doorn and Shapiro (2023) refer to as the platform-mediated point of production. By rethinking algorithmic management as minor, this article equally draws attention to ‘platform-adjacent’ topics (van Doorn and Shapiro, 2023). These include approaching political support for the platformisation of low-wage service work as an ongoing feminisation of labour; paying attention to the intersectional subjectivities of platform housecleaners and their livelihoods, and how these influence their work behaviour; and engaging with the informal social infrastructures (such as WhatsApp groups) which help cleaners navigate and resist the precarious elements of platform labour. The minor algorithmic management of housecleaning platforms in Denmark diverges from the uniform description of algorithmic management in policy documents (e.g. EC, 2021) as highly sophisticated and featuring extensive surveillance capabilities. Applying algorithmic management as a blanket term for all work environments where algorithms assume managerial tasks weakens its rigour as an analytical tool. Moreover, uniform descriptions of algorithmic management often draw attention to algorithmic complexities and their black-boxing by corporate stakeholders, presenting transparency as a panacea for workers’ rights and well-being. Nevertheless, in this article I demonstrated how the disclosure of the data feeding the ranking algorithm can have adverse effects, overwhelming the cleaners and guiding them to align their behaviour with what the algorithm allegedly prioritises. Even if platforms provide full transparency of their algorithmic software in a comprehensible way, they can easily apply small adjustments afterwards and ensure that the final output safeguards their corporate interests.
Literature on algorithmic management problematises how algorithmic management can become beneficial to both workers and platforms (Baiocco et al., 2022; Noponen et al., 2023). This article agrees with previous research on housecleaning platforms (e.g. Floros and Jørgensen, 2023; Orth, 2024; van Doorn, 2023) in claiming that platform-based solutions are not sufficient on their own, without parallel changes in local restrictive and exclusionary migration, welfare and labour regimes, which co-produce the multiple vulnerabilities of migrant platform workers. Finally, by approaching algorithmic management through a minor lens, this article challenges the corporate technocentric narrative which underlines the role of complex software in the streamlining and optimisation of platform housecleaning. This narrative serves to obscure the primacy of human agents in the management and sanctioning of cleaners, helping platforms repel policymakers’ and stakeholders’ claims about the misclassification of platform work.
Footnotes
Acknowledgements
I am very grateful to Andrea Pollio, the editorial collective of Platforms & Society, and the two anonymous reviewers for their insightful comments and suggestions on the article. Moreover, I am deeply indebted to the platform housecleaners who shared their valuable time and experiences with me.
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
