Abstract
The use of modern data processing in the workplace, such as algorithms, big data and artificial intelligence (AI), raises numerous legal questions, particularly in relation to labour law, non-discrimination law, and data protection regulations. A key issue is whether, and under what conditions, algorithms and AI systems can be lawfully employed in compliance with data protection laws. This article aims to delve deeper into the issue by outlining the legal framework, exploring current legal challenges through two recent examples, and offering further considerations based on these findings.
Introduction
Digitalisation has reshaped the modern workplace. Employees now have global access to information and can be reached by their employers around the clock. Remote and mobile work have become integral parts of professional life. Businesses increasingly rely on advanced communication tools, cloud-based management systems, and data-driven solutions to visualise, optimise, and integrate processes. Emerging technologies like GPS trackers, wearables, exoskeletons, and the growing presence of robots and artificial intelligence (AI) are transforming how work is performed. The COVID-19 pandemic further accelerated this digital transformation.
However, the integration of modern tools and systems makes it possible to generate unprecedented amounts of information about ‘users’ and thus about employees. Every activity and every step, whether carried out in the factory, in retail or hospitality, in an office setting or remotely, leaves a digital trail. These digital trails – and those who leave them – can now be captured, analysed, and integrated by algorithms or AI with minimal effort and at minimal cost. 1
From a legal perspective, the increasing use of such modern data processing in the workplace raises numerous issues. Given that the use of algorithms, big data and AI systems in the workplace necessarily involves the processing of workers’ or applicants’ personal data, data protection regulations are regularly applicable. When considering the data protection framework in the workplace, several legal sources come into play. The most important of these is the General Data Protection Regulation (GDPR). 2 At the level of the European Union, Chapter III of the Platform Work Directive (PFWD) 3 must be noted. These are classified as data protection provisions as well. 4 While other regulatory frameworks, such as the recently adopted AI Act, 5 primarily address the providers of AI systems, data protection regulations specifically focus on the relationship between the affected individuals and those who implement these systems and determine the means and purposes of the processing – typically the employers. 6 While the AI Act and Chapter III of the PFWD are also formulated in a technology-neutral manner, their scope of application is limited to specific types of systems – namely, AI systems or automated monitoring and decision-making tools. In contrast, the GDPR provides for genuinely technology-neutral protection of natural persons. 7 Whether employers use ‘traditional’ algorithms, AI or other forms of technological processing is irrelevant in the context of the material scope of the GDPR.
The principle of lawfulness: Legitimate interests of employers
At the beginning of Chapter II of the GDPR, Article 5(1) sets out the principles governing the processing of personal data. According to these principles, personal data must be processed lawfully. 8 Article 6(1)(1) GDPR specifies this principle of lawfulness, providing an exhaustive and restrictive list of the cases in which the processing of personal data can be regarded as lawful. Thus, in order to be capable of being regarded as lawful, processing must fall within one of the cases provided for in that provision. 9
As the consent of data subjects must regularly be excluded as a legal basis in the application process and in the employment relationship, as it is unlikely to be freely given, 10 and modern data processing in the workplace is often not necessary for the performance of the employment contract, 11 Article 6(1)(f) GDPR is of particular importance in practice. 12 It provides that processing is lawful if it is ‘necessary for the purposes of the legitimate interests […], except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, […].’
According to the case law of the CJEU, the processing of personal data is lawful if three cumulative conditions are met:
The processing must aim at the ‘purpose of the legitimate interests’; it must be ‘necessary’ to achieving such purposes; and the legitimate interests must not be ‘overridden by the interests or fundamental rights and freedoms of the data subject’. 13
This assessment is aligned with the three standard steps of proportionality assessments: first, the adequacy to achieve a legitimate objective; second, the necessity for that objective; and third, the balance between the achievement of the objective and the interference with the right. 14 The Article 29 Data Protection Working Party (Article 29 Working Party) has given this assessment concrete shape and provided further (non-binding) guidance for data controllers. 15 The European Data Protection Board (EDPB) 16 issued (updated) guidelines in 2024. 17
Legitimate interests
The first step in the proportionality assessment lato sensu of Article 6(1)(f) GDPR is the pursuit of a legitimate interest by the controller or a third party. The concept of legitimate interest is generally very broad. To be legitimate, an interest must be specific and non-speculative and need only respect the law; any interest that is not forbidden may qualify. 18 In addition to legal interests, economic or ideational interests can also be invoked. 19 The decisive factor in practice is that the interest is articulated clearly enough for the balancing of interests to be examined. 20
In the context of work, there are numerous specific (legitimate) interests of employers. For example, the processing of personal data of employees using algorithms or AI systems to ensure network and information security (recital 49) is just as conceivable as processing for internal administrative purposes (recital 48). The prevention of fraud (recital 47) may constitute a legitimate interest, as may the monitoring and control of employees (control is a characteristic criterion of the employment contract), 21 the increase in efficiency and optimisation of work processes, and the right of employers to obtain information in the recruitment process or in the employment relationship. 22
However, the mere fact that data controllers have a legitimate interest in processing data does not mean that Article 6(1)(f) GDPR can be invoked as a condition of lawfulness. The processing must also be necessary and proportionate to achieve the objective. 23
Necessity
Necessity requires the processing to be ‘necessary’ to achieve a certain objective. Determining what is ‘necessary’ requires, aligning with the general theory of proportionality, 24 an assessment of whether the legitimate interests could reasonably be achieved by other just as effective but less intrusive means that better protect the fundamental rights and freedoms of data subjects concerned. 25 In this context, the CJEU clarified that the necessity of processing must be assessed alongside the principle of data minimisation, as set out in Article 5(1)(c) of the GDPR. This principle requires personal data to be adequate, relevant and limited to what is necessary for the purposes intended. 26
Proportionality (stricto sensu)
The core of the legality requirement of Article 6(1)(f) GDPR is the balancing test in the third criterion (proportionality stricto sensu). In the balancing test, the legitimate interests must be weighed against the ‘interests, fundamental rights and freedoms’ of the data subjects. This means that the balancing test assesses the trade-off between the positive impact of the processing on the legitimate interests pursued and its negative impact on the interests, freedoms and rights of the data subjects. 27 The employer's legitimate interests, if insignificant and not very compelling, can generally override the interests and rights of employees only in cases where the impact on those rights and interests is even more insignificant. On the other hand, important and compelling legitimate interests may in some cases, when they are, for example, subject to safeguards, justify even significant intrusion into privacy or other significant impact on employees’ interests or rights. 28 In all cases, the analysis must be objective, case-specific and independent of the individual feelings of those affected. 29
The Article 29 Working Party 30 and the EDPB 31 have further developed the balancing test in their opinions by providing a more structured framework. However, this structure has not yet been clearly reflected in the case law of the CJEU.
The balancing test in the Amazon cases
While the structure of the test formula described above seems comprehensible and practicable, the decision to be made in the context of the balancing of interests (proportionality stricto sensu), in particular, causes great difficulties in practice. 32 The GDPR does not provide substantive guidance on how to balance the competing interests. 33 This approach corresponds to the guidance provided by the Article 29 Working Party and the EDPB, which have argued not only against any ‘exhaustive list of interests that may be considered as being legitimate’, but also against ‘defining cases where the interest or right of one party should as a principle or as a presumption override the interest of or right of the other party’. Such a regulation of this legal basis would ‘risk being both misleading and unnecessarily prescriptive’. 34 For this reason, there is a high degree of legal uncertainty.
Legal uncertainty also applies to the question of the legitimacy of modern data processing in the workplace. Although there are now at least some decisions on Article 6(1)(f) GDPR, there is still a lack of CJEU case law on the proportionality test in the context of employment relationships in general, and in relation to the use of algorithms, big data and AI in the workplace in particular. In some countries, the first (lower-court) decisions on the lawfulness of the use of algorithms and AI in the workplace based on the legitimate interests of employers have now been handed down. Two recent decisions regarding Amazon will be presented in more detail below, as they illustrate the existing legal problems.
Facts of the cases
The subject matter of the cases was (among other things) the use of modern data processing in the dispatch and logistics centres of Amazon in Germany 35 and in France. 36 The facts of the cases were very similar and can be summarised as follows. The company used hand scanners in their logistics centres. The scans completed by the employees were not only used to track the progress of the products through the various stages of processing and distribution; they also made it possible to measure employee activity. The (raw) data collected in this way were analysed by specific software applications and grouped into so-called ‘indicators’, which can be roughly divided into three categories: quality indicators, productivity indicators, and inactivity indicators. 37
Each employee's indicators were available to their manager in real time. In addition, weekly performance reports were produced for all employees based on the continuously collected indicators. These included weekly, daily and hourly statistics on quality compliance and productivity, and raw data on quality defects. 38
These processing activities were carried out for three different purposes: management of logistical processes, management of skills and creation of a basis for individual feedback, and personnel decisions (such as promotions and dismissals). 39
Legitimate interests and necessity
Under the assessment scheme explained above, both the German Administrative Court and the French Data Protection Authority (CNIL) considered that the processing purposes pursued were legitimate interests within the meaning of the GDPR. The processing was also necessary to achieve the objective pursued. The Administrative Court of Hanover admitted that data can be collected for the purpose of controlling business processes, either through random sampling or in concrete situations, should the need arise. In fact, according to the Court, no personal data would be required to calculate the likelihood of delivery guarantees being met. However, such a control system would result in a significant loss of efficiency, as aggregated data does not provide any information about the shift-specific performance of employees. Nevertheless, the determination of the current and individual performance is crucial for the efficient control of logistics centres, as there is a need to respond to deviations from the planned situation before the start of the shift. 40
According to the German Administrative Court and the French Data Protection Authority, data processing is also necessary for the management of skills and the creation of a basis for individual feedback and personnel decisions. Ongoing feedback discussions are considered to be ‘established principles of sound personnel management’ and are also in the interest of employees, as they can continuously improve themselves in this way. 41 Feedback based on aggregated averages would benefit neither employees nor the company, as it is less meaningful. Performance data is also objective. 42
Proportionality (stricto sensu)
The reasoning on proportionality is particularly interesting from a legal point of view, and here the two decisions differ significantly. The German Administrative Court concluded that all data processing was proportionate. In addition to the familiar parameters, such as the type and scope of the data, the duration of the data processing, the purposes of the data processing and the limitation of the right of access, the court also stated that the data did not come from the private or intimate sphere, but rather from the ‘work sphere’ of the employees. 43 Furthermore, the data did not provide any information on the ‘personality structure’ of the employee. The employees themselves would not become ‘transparent’ (‘gläsern’) 44 as a result; only their work performance would. In the view of the Court, however, there was nothing to be said against this. 45
With remarkable reasoning, the Administrative Court rejected the argument of the first instance (the State Commissioner for Data Protection in Lower Saxony) that the extent and continuous nature of the data collection would result in constant pressure to adapt and perform:
According to the Administrative Court, it cannot be assumed that employees are constantly afraid of losing their jobs, as there is a growing labour market in Germany, which in turn significantly reduces any pressure to adapt and perform. 46
There were only a few negative responses to the extensive data collection; negative feedback mainly concerned the way in which feedback was given. According to the Administrative Court, the absence of negative feedback indicated that there was little or no pressure to adapt and perform. 47
The collection and analysis of performance data is also common in other areas of work, such as the judiciary, where figures on completed cases are continuously processed both for the purposes of work organisation and personnel decisions. However, the Court was not aware of any permanent pressure to adapt and perform as a result. 48
Finally, the pressure to adapt and perform is not unreasonable because the stress caused by permanent monitoring is reduced if the monitoring is disclosed to the employees. 49
The French Data Protection Authority also did not question the proportionality of the data processing for the management of the logistics processes in principle, given the complexity of the services involved (daily preparation and dispatch of millions of different items combined with a large number of employees). Nevertheless, it concluded that some processing was not proportionate.
This included, for example, an indicator of the speed at which tasks were performed (Stow Machine Gun Indicator): an error-monitoring tool that sent a message in real time if several items were stowed within an interval of 1.25 seconds, as this would most likely indicate a quality defect. 50 While ensuring the quality and safety of work processes is a legitimate interest in the view of CNIL, in practice such monitoring would lead to the automatic recording of the speed at which each action is carried out. Monitoring with such precision is disproportionate, according to the French Data Protection Authority, as it is likely to have a negative (psychological) impact on employees. 51
In addition, the French Data Protection Authority found that the indicators used for periods of inactivity of more than ten minutes (idle time), or periods of inactivity of less than ten minutes at the start and end of work and before and after breaks, were disproportionate. Although a legitimate interest was found (efficient management of the logistics centre, staff coaching, staff appraisals), this would unreasonably result in staff having to justify every short interruption, which would be tantamount to constant computerised monitoring of all interruptions throughout the working day. 52
Moreover, CNIL did not question the use of personal data for regular individual assessments and feedback. However, it considered both the volume and the granularity (raw data as well as aggregated data over even short periods of time) to be disproportionate. In this respect, aggregated weekly quality and productivity statistics would have been sufficient. 53
Finally, the CNIL considered it excessive to keep all the data collected by the system, as well as the resulting statistical indicators, for all employees for a period of 31 days, as the storage of such a volume of data on warehouse employees was particularly intrusive. This constituted excessive computer surveillance that outweighed the company's legitimate economic and commercial interests, in particular with regard to the protection of employees’ private and family life and their right to a working environment that respected their health and safety. 54
While the Administrative Court of Hanover upheld the lawfulness of all the processing operations, the French Data Protection Authority found numerous processing operations to be unnecessary and disproportionate. The Administrative Court ‘with full conviction’ saw no data protection issues, while the first instance even considered continuous data collection as such to be unlawful, and the CNIL imposed a fine of EUR 32 million pursuant to Article 83 GDPR for various breaches of the GDPR (e.g., Articles 5, 6, 12, 13 and 32 GDPR). 55
Critique, synthesis and further considerations
Critique of the jurisprudence
The two decisions described above clearly illustrate both the relevance of Article 6(1)(f) of the GDPR and the existing legal uncertainty. The striking divergence highlights the significant discrepancies in how decision-making bodies interpret and enforce data protection regulations. While the German Administrative Court found all processing to be lawful, the French Data Protection Authority took a more differentiated approach: it did not make a blanket assessment based on dubious arguments (see below), but instead tried to identify and prevent problematic data processing without questioning fundamental data processing in warehouse logistics.
Particularly questionable is the German Administrative Court's reasoning on whether there is constant pressure to adapt and perform. The argument that employees are unlikely to live in constant fear of losing their jobs because Germany's growing labour market eases that pressure overlooks the fact that any job change entails (significant) transaction costs and uncertainty for employees. It also ignores legal and practical ties (outstanding wages, bonuses, personal ties, etc.) to the previous employer. Moreover, taking the labour market situation into account would make the legal situation unpredictable, since that situation is subject to strong fluctuations not only over time, but also regionally and sectorally.
The argument that only a few workers have complained about the data collection neglects the real balance of power in employment relationships, which are characterised by a high degree of power asymmetry. The historical experience of labour law shows that workers are often afraid to assert basic rights themselves, for fear of consequences and sanctions.
Finally, it is almost cynical to equate the position of an independent, irremovable judge with lifetime tenure 56 with that of a warehouse worker in a billion-dollar global corporation. The same applies to the assumption that the pressure to adapt and perform would be reduced if the monitoring were disclosed. In fact, transparency is a precondition for the emergence of pressure to adapt and perform, because without knowledge of the monitoring, employees would hardly adapt their behaviour.
Balancing the interests and fundamental rights of employees
Looking more closely at the arguments for the protection of data subjects presented in both cases, it is striking that, despite the explicit mention of ‘fundamental rights’ in Article 6(1)(f) GDPR, references to fundamental rights are largely absent. The French Data Protection Authority at least speaks of the right to a working environment that respects the safety, health and dignity of workers, 57 but without any further reference to the Charter of Fundamental Rights of the European Union (CFR). In my opinion, this neglect is problematic, especially since, starting with human dignity (Article 1 CFR), there is an abundance of rights that could be considered:
Article 3 CFR: Right to physical and mental integrity,
Article 7 CFR: Respect for private and family life,
Article 10 CFR: Freedom of thought, conscience and religion,
Article 12 CFR: Freedom of assembly and association,
Article 21 CFR: Non-discrimination,
Article 23 CFR: Equality between men and women,
Article 27 CFR: Workers’ right to information and consultation within the undertaking,
Article 28 CFR: Right of collective bargaining and action,
Article 30 CFR: Protection against unjustified dismissal,
Article 31 CFR: Fair and just working conditions, or
Article 32 CFR: Prohibition of child labour and protection of young people at work. 58
As a consequence, processing that interferes with workers’ rights is not only problematic from an employment law perspective, but may also be unlawful from a data protection point of view. For example, if an algorithm is programmed in such a way that strict time limits significantly increase the likelihood of work accidents among employees, this raises a data protection issue in view of the right to physical and mental integrity (Article 3 CFR) and the right to fair and just working conditions (Article 31 CFR). If an algorithm discriminates on the basis of gender and this processing is linked to an automated individual decision-making process, this is an issue from the perspective of non-discrimination (Article 21 CFR) and gender equality (Article 23 CFR). AI-based automated monitoring and decision-making systems that put undue pressure on workers or endanger their physical and mental health likewise raise data protection concerns with regard to the right to physical and mental integrity (Article 3 CFR) and the right to fair and just working conditions (Article 31 CFR). 59
This means, of course, that regardless of any employment law consequences, the data protection sanctions and liability provisions (in particular, Articles 82 and 83 GDPR) apply. Under Article 83(5)(a) of the GDPR, breaches of the processing principles are subject to administrative fines up to EUR 20 million or, in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher.
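Purely for illustration, the ‘whichever is higher’ mechanism of Article 83(5) GDPR can be expressed as a simple calculation. This is a sketch only: the function name and the example turnover figures are assumptions, and the actual fine within this ceiling is set case by case by the supervisory authority.

```python
# Illustrative sketch of the Article 83(5) GDPR fine ceiling:
# the higher of EUR 20 million and 4% of total worldwide annual turnover
# of the preceding financial year. The function name is hypothetical.
def article_83_5_fine_cap(annual_turnover_eur: float) -> float:
    """Return the maximum administrative fine under Article 83(5) GDPR."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# With EUR 100 million turnover, 4% is EUR 4 million, so the EUR 20 million
# floor forms the ceiling; with EUR 1 billion, 4% (EUR 40 million) does.
print(article_83_5_fine_cap(100_000_000))    # 20000000.0
print(article_83_5_fine_cap(1_000_000_000))  # 40000000.0
```

The EUR 32 million fine imposed by CNIL in the case discussed above thus sits well within this ceiling for a company of Amazon's turnover.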
It can be countered that the above-mentioned rights are not absolute. Rather, they are subject to a balancing of interests, since the rights of employers are also protected by Article 16 CFR. 60 The CJEU has already pointed out that the right to protection of personal data is not an absolute right but, as recital 4 of the GDPR states, must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality. 61 The same applies to fundamental economic rights, which do not enjoy absolute protection under EU law but must be viewed in relation to their function in society, 62 which regularly leads to a conflict of fundamental rights.
Nevertheless, I believe – and both decisions clearly show this – that the fundamental (social) rights of data subjects should be given more consideration in the balancing of interests under data protection law. There are three main reasons for this:
First, in practice it is much more difficult to identify adverse effects on individuals than it is to identify legitimate interests. 63 This is where the CFR can be helpful, especially as there is a rich source of relevant rights in employment relations. Second, it is only through reference to the CFR that the interests of the persons concerned are given appropriate weight (linguistically and dogmatically), since it becomes clear that these are positions guaranteed by fundamental rights and not arbitrary interests. This applies all the more where controllers expressly base their processing of employees’ personal data on Article 16 CFR. Third, the CJEU has already found that some fundamental rights of employees are ‘rules of EU social law of particular importance’ 64 or ‘particularly important principles’ 65 of EU social law, compliance with which may not be subordinated to purely economic considerations. 66 This must, of course, also be taken into account in the balancing of interests under data protection law.
No balancing of interests due to minimum level of protection
Moreover, it must be borne in mind that in cases in which fundamental rights provide for a concrete minimum level of protection, a balancing of interests is not permitted at all, since it is not lawful to go below this minimum level of protection. 67 This is particularly true where secondary legislation gives concrete expression to a fundamental right. Examples include the right to four weeks of annual paid leave, 68 the right to a daily rest period of 11 consecutive hours 69 or the right to a maximum working time of 48 hours in a given legally defined period. 70 So, if an algorithm or an AI system used for deployment or shift planning purposes, without any further human involvement or other possibility of intervention, systematically violates the provisions on compliance with the daily rest period of 11 hours (code is law!), this (also) constitutes a data protection infringement due to the violation of a fundamental right (specified in the Working Time Directive 2003/88/EC).
Balancing the interests and the European Pillar of Social Rights
Furthermore, I would argue that the balancing of interests should take into account not only fundamental rights, but also the principles enshrined in the European Pillar of Social Rights. This is certainly the case for the data protection provisions of the PFWD, which explicitly refer to the European Pillar of Social Rights. 71 Even for the general provisions of the GDPR, which predate the European Pillar of Social Rights, there is no reason to doubt that this incorporation is dogmatically permissible. Since the GDPR presupposes a certain understanding of existing Union law and, in particular, of existing social rights (especially the CFR), the European Pillar of Social Rights can be used as a source of legal interpretation, for example regarding the scope of the fundamental rights enshrined in the CFR. In this regard it should be borne in mind that the CJEU used the CFR in the Viking case to confirm its understanding of fundamental rights, although the CFR had at that time only been solemnly proclaimed and had not yet been made binding by the Treaty of Lisbon. 72
Finally, it should also be noted that Article 6(1)(f) GDPR speaks not only of fundamental rights and freedoms, but also of the mere interests of the data subjects concerned. That means that the position of the data subject is protected extensively. 73 In view of the wording and the appropriate broad interpretation of the ‘interests’, historical arguments cannot be of significance in this context.
Data protection as a ‘capacitor’ for fundamental rights
These considerations shed light on the very core of data protection law: on its legal nature and the scope of the right to data protection. The above list of different rights to be considered shows that the right to data protection goes beyond the mere protection of privacy. 74 Although Article 7 CFR (protection of privacy) continues to play a central role, other rights may also have to be considered in the context of the proportionality testing in employment relationships.
However, the thesis that the core of data protection law is the right of the data subject to informational self-determination must also be rejected. 75 In today's workplace reality, which is permeated by the processing of personal data, there is no right to informational self-determination. In the area of employment data protection, this is illustrated by the fact that consent is regularly rejected as a justification because it is not freely given and the processing is regularly not necessary for the performance of the employment contract; almost all processing in the context of employment bypasses the self-determination of employees.
Having discarded not only the privacy-centric focus of data protection law but also the concept of informational self-determination, one might ask: what then is the ‘essence’ of data protection law? In my opinion, data protection law ensures that data processing does not take place without the knowledge of the data subject and without taking into account his or her interests. I therefore argue that the right to data protection is – in the context of reviewing the lawfulness of data processing – above all a ‘capacitor’ for interests, fundamental rights, and freedoms. 76 This means that the right to data protection has no content of its own. Rather, it derives its content from other sources, first and foremost from the fundamental rights and freedoms of the data subjects concerned. 77
Addressing legal uncertainty
The need for a comprehensive balancing of interests, together with a lack of case law, leads to great legal uncertainty in practice. To deal with this appropriately, it is worth looking beyond data protection law. Labour law, for example, offers a wealth of issues not involving automated processing of personal data, but where fundamental rights conflicts have been resolved and specific balancing concepts developed. I would argue that in many cases the results and concepts can also be applied to the balancing of interests in data protection law.
These results and approaches can be found, for example, in
the employer's right to information, 78
the introduction and content of personnel questionnaires, 79
access to personnel files by employee representatives, 80
the system of gradual tightening of controls, 81 and
the standard of the depth of intervention. 82
For example, based on the previous case law of the Austrian Supreme Court as well as the existing literature on the right to information, data controllers in Austria may conclude that there is no proportionate legitimate interest on the part of employers for the processing of personal data relating to existing or planned pregnancies and/or family planning, the existence of a non-marital relationship, the circle of relatives or acquaintances, sexual orientation or (sexual) preferences, leisure behaviour or other private preferences and habits (e.g., drinking or smoking habits, club memberships), etc. 83
From the principle of a gradual tightening of controls, controllers can deduce, for example in the case of algorithmic monitoring of employees’ Internet use, that a proportionate legitimate interest may exist if, in a first step, only automated monitoring is used to ensure that the system is functioning properly. Only in a second step, where there is a concrete suspicion, are personal data analysed and the user investigated. In a third step, the employee's name is passed on to the manager only if the IT department's contact with the employee concerned has failed to resolve the issue. 84
Legal certainty can also be increased by applying the prohibitions on processing in the context of employment set out in the PFWD to cases that fall outside the scope of the Directive. These prohibitions refer not only to specific purposes of processing, but also to specific categories of data and to specific periods of time. For example, neither personal data on an employee's emotional or psychological state nor personal data on private conversations may be processed. Data may not be collected during periods in which the employee is not performing or offering work, nor may data be used to predict the exercise of fundamental rights under the CFR, etc. 85
However, if the possibility exists in the Member State, it is particularly advantageous to concretise the balancing of interests in accordance with Article 6(1)(f) of the GDPR by means of a company-level collective agreement. This can increase legal certainty and minimise risks for both employers and employees. 86 The conclusion of an employer/works council agreement indicates that the interests, rights and freedoms of the employees are not overridden. 87 This argument is in line with the labour law jurisprudence according to which collective agreements per se enjoy a ‘presumption of correctness’ or ‘presumption of reasonableness’. 88 In order to counter the risk of workers’ representatives concluding agreements that allow dubious forms of data processing in exchange for other benefits, Article 88(2) GDPR stipulates that those rules shall include suitable and specific measures to safeguard the data subject's human dignity, legitimate interests and fundamental rights. According to the CJEU, provisions that do not comply with these conditions and limits must be disregarded. 89 Therefore, these rules cannot have the purpose or effect of circumventing the obligations of the GDPR. 90 Nor may the parties to a collective agreement, within their margin of discretion, introduce more specific rules that would result in the requirement of necessity being applied less strictly or even disregarded. 91 The added value of an agreement at company level therefore lies primarily in the third element of the proportionality test (proportionality stricto sensu), even though a collective agreement only has indicative character and is subject to judicial review. 92
Conclusion
In conclusion, the integration of data protection law within the workplace is essential, since it ultimately guarantees not only the protection of privacy but also the protection of all workers’ rights in the face of advancing technology. This article highlights the importance of balancing employer interests with the interests and fundamental rights of workers, as required under Article 6(1)(f) GDPR. It argues that a nuanced approach – one that explicitly takes into account social rights under the CFR and draws on the European Pillar of Social Rights – should guide the interpretation of this balancing test.
As the CJEU has affirmed the importance of social rights, their consideration should serve as a fundamental aspect of any assessment of legitimate interests in data processing. Importantly, where fundamental rights provide clear protections, the balancing process must respect these boundaries. The legal uncertainty inherent in a balancing exercise can be addressed by transferring case law and concepts from labour law to data protection law, by applying the prohibitions set out in the PFWD and, where possible, by agreements between employers and works councils.
Declaration of conflicting interest
Not applicable
Writing assistance and third party submissions
The author used AI-based technologies for translation assistance.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
Ethical approval and informed consent statements
Not applicable
