Abstract
The recruitment process has largely moved online. Job advertisements which used to be bound to newspapers and other print media have become an online service as part of a growing trend towards a more digitalised hiring process. Alongside increased flexibility and cost-cutting, this trend brings some previously unseen challenges. The technology behind online job portals and social media allows job ads to be shown to targeted groups of people using machine learning techniques to filter through the available data and search for the most suitable audience. The correlations that are inferred by algorithms between content and audience, however, can lead to biased outcomes. This is a serious problem since the specific risk with online job ads is that jobseekers who are considered less suitable by the algorithm will not see the ad at all. Such a result effectively hinders access to the labour market and poses the risk of perpetuating existing biases and discrimination. Those discrimination risks raise questions about the legal framework of online job advertisements. This article examines the requirements of the new EU initiatives to regulate artificial intelligence and the digital market and EU non-discrimination law regarding online job advertisements. It also proposes a low-tech solution to the high-tech problems associated with online job advertisements by introducing a legal requirement to publicly tender job ads on an online noticeboard, thus ensuring transparency and effective access to employment.
Introduction
You are looking for a new job. What will you do? Will you look at job advertisements in a newspaper? Probably not, because it is the 21st century and the job ad market has changed significantly in the last 30 years. The days when jobseekers learned about job openings mainly through job advertisements in newspapers are long gone: The recruitment process has largely moved online. Job advertisements which used to be bound to newspapers and other print media have become an online service as part of a growing trend towards a more digitalised hiring process. 1 Nowadays, companies publish job offerings mainly through their company websites, through online job portals like Indeed or Monster, and through social media like Twitter, Facebook, or LinkedIn. 2
A UK-based survey conducted in May 2021 found that the top three most effective ways of recruiting and attracting new employees were corporate websites (54%), professional networking sites such as LinkedIn (46%), and recruitment or search consultants (41%). 3 The appeal of newer ways to recruit seems evident. Online recruitment is often less costly; it has a bigger reach and is thus likely to attract more applicants. LinkedIn, for instance, boasts 660 million users in 200 countries worldwide, with more than 206 million users in Europe. Over 30 million companies are represented on the platform with about 20 million job openings. 4 This sheer volume renders one question increasingly significant: How can advertisers (i.e., potential employers) ensure that a job ad is seen by the people considered to be best suited for, and most interested in, the job? With online advertising, the answer is targeting. The mass utilisation of the internet has not only moved the recruitment process online; it has also revolutionised its delivery method. The technology used by online job portals and social media allows advertisers to target their ads to users with specific (personal) attributes using machine learning techniques to filter through the available data and search for the most suitable audience.
What might sound like a win-win situation, where jobseekers are offered the most suitable jobs and companies hire the best possible employees, unfortunately poses significant risks to the principles of equality and non-discrimination. The correlations that are inferred by algorithms when matching ads and jobseekers frequently lead to biased outcomes. Studies have shown that women are less likely than men to receive targeted ads for high-paying jobs and are less likely to see ads promoting employment in the STEM fields (science, technology, engineering and mathematics). 5 Another experiment found that job ads largely target their audience along stereotypical lines, with ads in the lumber industry reaching a 72% white and 90% male audience, whereas ads for supermarket cashiers reached an 85% female audience and ads from taxi companies targeted a 75% Black population. 6
The automated online targeting of job advertisements bears the risk of excluding whole populations of jobseekers from seeing certain job offerings. Users who are not considered suitable audience will not be shown the job ad and thus have hardly any possibility of learning about a particular vacancy, especially if it is only advertised online. This effectively hinders access to the labour market and—if the audience selection is based on personal characteristics of jobseekers—poses the risk of perpetuating existing biases and discrimination. Advertising jobs online, however, does not take place in a legal vacuum. The risks posed by the digitalisation of more and more areas of life have resulted in the introduction of several EU initiatives to regulate artificial intelligence and new digital technologies, including the Digital Markets Act (DMA), the Digital Services Act (DSA) and the Artificial Intelligence Act (AIA Proposal). 7 In addition, existing EU non-discrimination law forbids discrimination on grounds of sex, ethnic origin, religion or belief, age, sexual orientation, and disability in relation to access to employment. Although non-discrimination law was originally designed for the offline world, its scope equally extends to online applications related to employment and occupation.
This article examines the requirements of EU (non-discrimination) law regarding online job advertisements. It first gives an overview and examination of EU's most recent initiatives to regulate the digital market and artificial intelligence and analyses, whether and to what extent these are applicable to online job advertisements (II.A.). It then outlines the legal requirements for job ads under EU non-discrimination law (II.B.). The following part highlights the discriminatory risks associated with advertising jobs online. It gives a brief explanation of the technical specificities of advertising jobs online and analyses the risks accordingly (III.). The final part of the article proposes a low-tech solution to the high-tech problems associated with online job advertisements by introducing a legal requirement to publicly tender job ads on an online noticeboard, thus ensuring transparency and effective access to employment (IV.).
EU legal framework
(Upcoming) EU Regulations
Artificial intelligence (AI) promises to improve our lives by providing neutral and objective services that are easily and constantly available at low cost. However, studies show that AI systems have the potential to ‘collide with and exert pressure on fundamental rights’. 8 The right to equality, the principle of non-discrimination and the right to privacy, which are frequently challenged by the digitalisation of more and more areas of life, are core obligations under international human rights laws and are also enshrined in EU law. 9 With the General Data Protection Regulation (GDPR) 10 the EU has already established a legal framework to address (parts of) the challenges to the right to privacy 11 and the need for further regulation has not gone unnoticed. With its recent initiatives, namely, the Digital Markets Act, the Digital Services Act and the Artificial Intelligence Act, the EU is striving to set the global standard in the regulation of AI and new digital technologies. The three regulations follow a risk-based approach. The more a specific technology is a source of potential threat to safety, livelihoods, the internal market and the rights of people, the more caution and regulatory steps will be required. 12
The Digital Markets Act (DMA)
The DMA, adopted on 14 September 2022, aims to achieve three objectives: first, to ensure contestability of digital markets; second, to ensure fairness in the relationship between digital ‘gatekeepers’ and their business users; and third, to strengthen the internal market by providing harmonised rules across the EU. 13 These goals intend to foster a fairer business environment where end users have more choice and new businesses can grow. The scope of the DMA covers a closed list of ‘core platform services’ susceptible to ex ante regulation, including online intermediation services (e.g., Amazon Marketplace, Apple App Store), online search engines (e.g., Google search), online social networks (e.g., Facebook), operating systems (e.g., Microsoft Windows), and advertising services offered by a provider of any such core platform services (e.g., Google AdSense). 14 The obligations provided for in the DMA only apply to those providers of platform services that are considered gatekeepers. Gatekeepers are defined as large companies with a market capitalisation of at least EUR75 billion or an annual turnover of EUR7.5 billion, which have at least 45 million monthly end users in the EU and 10,000 annual business users. 15 According to estimates, only about 10 to 15 companies qualify as gatekeepers and are as such affected by the DMA. 16 While the DMA will ‘prevent Facebook from harvesting personal data from Instagram and exporting the same data to Facebook, where it could target new advertising to the user in question on the same data’, 17 it does not address the problem of access to online job advertisements in particular. The DMA is primarily concerned with avoiding more intrusive and potentially harmful ad targeting, but not with showing end users ads that have been obscured from them.
The Digital Services Act (DSA)
The DSA, adopted on 19 October 2022, applies to all intermediary platforms and sets out a horizontal framework for transparency, accountability and regulatory oversight. 18 The DSA pays special attention to targeted advertising, requiring platforms to inform the recipients of an advertisement that they are seeing an advertisement, on whose behalf the advertisement is displayed and why (i.e., because of which parameters) the recipient is being targeted (Art. 26 DSA). As such, the DSA requires both source and parameter transparency. 19 While a complete ban on targeted advertising was not agreed upon, the adopted regulation includes a ban on targeted advertising aimed at minors and on targeted advertising based on very sensitive personal data, including religion or sexual orientation. 20 In addition, very large online platforms must make publicly available a repository containing information, inter alia, on the content of the advertisement, on whose behalf and during which period the advertisement was displayed, and whether the advertisement was intended to be displayed specifically to one or more groups of recipients. The information in the repository must be kept for one year following the last date of display of the advertisement on the online interfaces (Art. 30 DSA). If recommender systems 21 are used by very large platforms, the system must permit a tracking-free application that is not based on profiling (Art. 38 DSA). Platforms are considered to be very large if they ‘have a number of average monthly active recipients of the service in the [EU] equal to or higher than 45 million’ (Art. 33(1) DSA). This corresponds to about 10% of the EU population. The EU Commission will publish a list of such very large online platforms and keep it up-to-date. 22 Sites like Facebook and LinkedIn are prime candidates for this list, each easily reaching the 10% benchmark. 23
The DSA aims at reducing negative side effects of targeted advertising by strengthening the consumer's position through transparency requirements and information about computational advertising mechanisms. Thus, the DSA's transparency requirements will enable the end user to understand why an online ad is shown to them. It does not, however, solve the problem that certain groups of users are not shown certain (job) advertisements in the first place because of specific (personal) parameters and biased targeting. Even under the DSA, users do not learn about the ads that they cannot see because of targeting.
The Artificial Intelligence Act (AIA proposal)
The AIA Proposal classifies AI in four categories depending on the risk associated with each application: ‘AI with unacceptable risk’, ‘AI with high risk’, ‘AI with limited risk’ and ‘AI with minimal risk’. For each of the categories the AIA Proposal establishes a set of complementary, proportionate and flexible rules to address the risks. Annex III considers, for example, critical infrastructure, education, and law enforcement as areas where the use of AI systems would qualify as high-risk, given that AI systems are used to make decisions that directly impact humans and their livelihood. 24 Likewise, systems ‘intended to be used for recruitment or selection of natural persons, notably for advertising vacancies, screening or filtering applications, evaluating candidates in the course of interviews or tests’ are deemed high-risk (Annex III 4(a) AIA Proposal). Job advertisements are not specifically mentioned. Given that recital 36, on the one hand, indicates that high-risk AI systems in the context of recruitment are primarily applications that may influence the selection of persons and the concrete decision-making about a job assignment, it is questionable whether online job advertisements, or rather online job portals, would qualify as high-risk within the meaning of the AIA Proposal. On the other hand, the Court of Justice of the European Union (CJEU) attaches particular importance to access to the labour market. 25 Being informed about job offers in a non-discriminatory manner plays an essential role in this access. In this context, the question of who can see a specific job advertisement and who cannot can certainly be understood as an initial decision on the selection of persons. It is therefore not impossible that the CJEU will classify at least certain forms of algorithmically enhanced tools displaying online job advertisements as high-risk applications.
However, the AIA Proposal applies to the high-risk AI systems listed in Annex III only if they are placed on the market or ‘put into service’ after the application of the Act or, in case they have been placed on the market or ‘put into service’ before, if those systems are subject to significant changes in their design or intended purpose (Art. 83 (2) AIA Proposal). The AIA Proposal has not entered into force yet, but could be passed within the year 2023. 26
This brief overview shows that recent EU initiatives to regulate AI and digital technologies do not adequately address the risks associated with advertising jobs online, namely, the problem of (current) bias in targeted job advertisements. However, all three initiatives stress the importance of protecting fundamental rights and the principle of non-discrimination in their recitals. 27 The DMA, DSA, and AIA Proposal are only the latest examples indicating a growing trend in the EU to curtail the free market against the benchmark of fundamental rights and the principle of non-discrimination. 28 The EU commits itself to the values of non-discrimination and equality (Art. 2 TEU). Its non-discrimination law is amongst the most comprehensive and elaborated bodies of non-discrimination laws globally. Accordingly, the following section clarifies the scope of EU non-discrimination law with regard to access to employment.
Foundations of EU non-discrimination law
EU non-discrimination law has evolved from the confines of sex equality in employment to comprise a comprehensive body of law propelled mostly by the adoption of various Directives and case law. Three Directives are particularly relevant in respect of the prohibition of discrimination in employment and occupation: the Recast Gender Directive prohibiting sex discrimination, the Race Directive prohibiting discrimination on grounds of racial and ethnic origin, and the Framework Directive extending the scope of protected characteristics to include religion or belief, disability, age, and sexual orientation. 29
Art. 14 Recast Gender Directive, Art. 3 Race Directive and Art. 3 Framework Directive prohibit discrimination in the public and the private sectors regarding ‘conditions for access to employment, including the selection and recruitment conditions.’ The CJEU has repeatedly held that ‘access to employment’ is to be interpreted widely. 30 Job applications, job ads and the entire recruitment process are included in the scope of application of the respective Directives. 31 Regulations relating to employment and occupation thus also apply to job advertisements—regardless of whether they are published offline or online.
It is important to note that EU non-discrimination law only applies to the (future) employer, but not to companies that publish job advertisements offline (e.g., newspapers) or online (e.g., job platforms and social media). 32 Arguably, however, publishing a job advertisement in a newspaper or online could be considered a service with regard to jobseekers, and thus falls under the prohibition to discriminate in the area of publicly offered goods and services (Art. 3(1)(h) Race Directive and Art. 1 Directive 2004/113/EC 33 ). According to the dominant interpretation, EU non-discrimination law only covers services that are offered for remuneration (Art. 57 TFEU). This requirement is easily fulfilled in the case of newspaper ads, where jobseekers must buy the newspaper in order to be able to see ads, and in the case of online job platforms charging jobseekers a fee to be able to see the (full) content of the platform. But even the use of social media is not ‘free’. Users pay with their personal data. As more and more areas covered by non-discrimination law become digitalised and the modern digital economy is based in large parts on the exchange of services for data, there is reasonable ground to consider such business models as services in the sense of EU law. 34 However, even if the CJEU were to accept this argument, discrimination in the area of publicly offered services is currently only prohibited on grounds of racial or ethnic origin and sex.
EU non-discrimination law prohibits direct and indirect discrimination. 35 Both forms of discrimination extend to the analogue world and to job advertisements on company websites, online job portals and social media. The discrimination risks associated with job ads on company websites are comparable to those related to job ads in newspapers. They essentially concern the wording of the ad, which can be directly or indirectly discriminatory. By contrast, job ads posted on job portals and social media bear risks of discrimination not only regarding their wording but also their targeting setup. The techniques used to deliver job ads to the most suitable applicants make them particularly prone to indirect discrimination as well as direct and indirect ‘discrimination by association’. The result is not only that applicants might feel discouraged from applying for a job because of the ad’s discriminatory wording, but that they do not learn about a job offering in the first place, since the ad in question is simply not shown to them.
Prohibited forms of discrimination
The following section explains the different forms of discrimination before going into more detail on the specific discrimination risks of online job advertisement. Direct discrimination occurs ‘where one person is treated less favourably on grounds of [a protected characteristic] than another is, has been or would be treated in a comparable situation’. 36 There are two exceptions to the prohibition of direct discrimination: first, if a characteristic related to a protected ground (not the ground itself) constitutes a genuine and determining occupational requirement for the job at hand (e.g., when a male actor aged between 45 and 55 is sought for a specific role in a play); and second, positive action measures by which specific advantage is given to persons with a protected characteristic, if underrepresented, in order to compensate for existing disadvantages in the work environment. 37
Indirect discrimination occurs ‘where an apparently neutral provision, criterion or practice would put persons [with a protected characteristic] at a particular disadvantage compared with other persons, unless that provision, criterion or practice is objectively justified by a legitimate aim, and the means of achieving that aim are appropriate and necessary’. 38 It is sufficient that the provision in question may have a disadvantageous effect. 39 Indirect discrimination can be justified if the action in question is in pursuit of a legitimate aim, and the means of achieving that aim are appropriate and necessary. Financial and economic objectives alone are generally not considered appropriate to justify differential treatment. 40
Discrimination by association refers to the treatment of an individual based on the characteristics of a group or person with which they are associated. As the CJEU has clarified, EU non-discrimination law covers both direct and indirect discrimination by association. 41 This means that the prohibition of discrimination on a protected ground does not only apply to people who possess the protected characteristic, but also to people who suffer discrimination because they are associated with people who possess the protected characteristic (e.g., an employee in receipt of unfavourable treatment because of her child's disability).
After having established the existing protective framework against discrimination regarding access to employment, the next part will be dedicated to understanding the discriminatory risks and legal implications associated specifically with online job advertisements.
Discrimination risks in the context of advertising jobs online
To understand the legal implications accompanying online job advertisements, it is important to consider the technical side. All three online recruitment tools mentioned in the first part of the article—company websites (A.), online job portals and social media (B.)—use the internet as a means of promulgation. However, each tool has its specificities both regarding the technical execution as well as the targeted groups, thereby creating different discriminatory risks and associated legal implications.
Job ads on company websites
The most technically low-key way to make job advertisements public online is to publish them on a company website. It takes nothing more than editing software to turn a Word document into a job ad on a website. Job ads on company websites work in quite a similar way to traditional recruitment methods, though there are also some key differences. With job advertisements on their website, companies can reach a larger number of jobseekers from around the world and do not incur any further costs. Advertisements can be published at any time, there are no restrictions regarding the length of the job advertisement, and the online format allows for a variety of display options (e.g., text, graphic images, audio files or interactive links). 42
Company career websites are one of the most important tools in recruiting, allowing not only the posting of job vacancies, but also the hosting of the application process through the integration of an e-recruitment system. 43 Apart from the elements described above there are hardly any differences between a job offering on a company website and a job advertisement in print media. During the whole advertising process, companies determine the content of their job offerings and jobseekers retain control over their search process. The requirements of non-discrimination law for job advertisements on company websites do not differ from those in print media. In both cases, the discrimination risks mainly refer to the wording of the ad. Therefore, both offline and online job advertisements must not use wording that discriminates directly or indirectly on any of the protected grounds. The same care should be taken with the visual design of advertisements (e.g., images depicting desired applicants). If the wording of a job ad indicates that a particular job is only open - or is not open - to applicants of a specific sex or ethnic origin or age group, or is restricted to any other of the protected characteristics, absent exemptions, this is a case of direct discrimination. Likewise, indirect discrimination can stem from the wording of a job advertisement. Making enrolment in Greek police training colleges, for example, conditional on a height requirement of at least 1.70 m indirectly discriminates against women, who are generally shorter than men. 44 Indirect discrimination might arise from job ads looking for ‘persons with military experience’, which could put women in a disadvantaged position if men are overrepresented in the military; or for ‘recent graduates’, which could disadvantage those who are older. 45
Job ads on online job platforms and social media sites
Compared to job ads on company websites, the publication of job advertisements on job portals and via social media represents a much more significant change in the recruitment process. Both tools share the advantages of job offerings on company websites. Job ads can be changed easily and information can be communicated in various forms through various channels. The most significant difference between company websites on the one hand and job portals and social media on the other is that job ads on the latter platforms can be targeted to a specific audience. 46 While job offerings on company websites will most likely be visited by jobseekers who are already familiar with the relevant organisation, job ads on online job portals and social media try to find their way to the suitable audience through targeted advertising. Targeting offers different ways to narrow the target audience of job advertisements: first, via deliberate inclusion or exclusion of a person or group (audience selection), and second, by a cost-benefit calculation to optimise the ad delivery (user-ad-matching).
Audience selection
Attribute-based targeting
When publishing a job ad on a job portal or a social media platform, advertisers can choose for their ads to be shown only to users with certain characteristics. Facebook's advertiser interface, for example, offers advertisers a list of targeting attributes based on users’ demographics, their interests, their ‘likes’ or online behaviour to choose from. 47 Ads on job portals and social media platforms, however, are not only bound to the non-discrimination requirements regarding the wording of the ad. The audience selection must also not be based on criteria that are either directly or indirectly discriminating. If (prospective) employers use Facebook's targeting options, for example, to address only users aged 25 to 36, this qualifies as direct discrimination on grounds of age: only users of a certain age group can see the ad, while for all the others the ad simply does not exist. 48 Discriminatory audience selection, and particularly the possibility of excluding people from seeing ads based on their gender, age, or ethnicity, has led to harsh criticism of Facebook and to a change in Facebook's selection options for targeted audiences. 49 Other platforms announced similar restrictions. 50
Attribute-based targeting does not only pose discrimination risks when advertisers select their audience based on protected characteristics. Targeting based on user interests, like preferences for, or interest in, certain news media sites or magazines, or specific hobbies, can also have discriminatory effects if those apparently neutral attributes disproportionally target or exclude users of a protected group. 51 If the respective attribute cannot be justified objectively, it is a case of indirect discrimination.
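The proxy effect described above can be made concrete with a short, purely hypothetical sketch. It measures how strongly an apparently neutral targeting attribute correlates with a protected group by comparing the share of each group that the attribute reaches; all group labels and figures below are invented for illustration and do not reflect any real platform data.

```python
# Hypothetical sketch: does an apparently neutral targeting attribute
# (e.g., an interest category) disproportionally exclude a protected group?
# All profiles and numbers are invented for illustration.

def selection_rate(audience, group):
    """Share of a demographic group that the targeting attribute reaches."""
    members = [u for u in audience if u["group"] == group]
    reached = [u for u in members if u["has_attribute"]]
    return len(reached) / len(members) if members else 0.0

def disparate_impact_ratio(audience, group_a, group_b):
    """Ratio of selection rates; values well below 1.0 indicate that the
    'neutral' attribute puts group_a at a particular disadvantage."""
    rate_a = selection_rate(audience, group_a)
    rate_b = selection_rate(audience, group_b)
    return rate_a / rate_b if rate_b else float("inf")

# Toy audience of 200 users in which the interest attribute happens to
# correlate strongly with gender: 10% of women but 60% of men carry it.
audience = (
    [{"group": "female", "has_attribute": True}] * 10
    + [{"group": "female", "has_attribute": False}] * 90
    + [{"group": "male", "has_attribute": True}] * 60
    + [{"group": "male", "has_attribute": False}] * 40
)

ratio = disparate_impact_ratio(audience, "female", "male")
# A ratio of roughly 0.17 means women are about six times less likely
# than men to fall within the targeted audience.
```

In such a constellation the attribute itself mentions no protected characteristic, yet its effect is to screen out one group almost entirely, which is exactly the pattern that triggers the indirect-discrimination analysis unless the attribute can be justified objectively.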
PII-based (customer audience) targeting
In addition to attribute-based targeting, online platforms offer advertisers the possibility of personally identifying information (PII)-based targeting. PII-based audiences can be built in two different ways. Advertisers can choose their targeted audience by uploading PII such as the name, phone number, email address, date of birth or post code that identifies the users who will later see the advertisement; 52 or advertisers can choose to create their audience based on users who interact with the advertiser's Facebook applications or (external) website. 53 With the first variant, an advertiser could, for example, choose to target only users whose date of birth is not before a certain year to exclude older people, or only users with a male first name to exclude female jobseekers. While targeting people through PII which corresponds to a protected characteristic will in most cases constitute direct discrimination, indirect discrimination can occur if the PII itself does not contain any sensitive attributes but correlates with a protected characteristic. This could be the case if, for instance, an advertiser uses post codes as PII and those post codes correspond to areas whose inhabitants are predominantly of a certain ethnic origin. Although the post code itself is an apparently neutral criterion, it might put users with a certain ethnic origin at a particular disadvantage as they are prevented from seeing a certain job ad. 54 The second variant poses less risk of direct discrimination. Nevertheless, the use of PII can amount to indirect discrimination if it turns out that the advertisers’ applications or websites are predominantly visited by persons with a protected characteristic, and relying on this criterion cannot be objectively justified.
Look-alike targeting
With look-alike targeting, advertisers aim at reaching an audience which has similar attributes to their already existing audience (the so-called source audience). For the selection of look-alike audiences, advertisers first must provide platforms with information about the source audience. This can be done in different ways, for example by providing the platform with their source audience's PII or specifying them to be ‘friends’ or ‘followers’ of their social media account. 55 Look-alike audience targeting will lead to discrimination if advertisers use an already (directly or indirectly) discriminatory source audience. If an advertiser chooses a source audience that includes only users of a certain age group (e.g., defined by date of birth) or a certain sex (e.g., defined by male or female first names), the look-alike audience will very likely show the same bias. If the look-alike audience is made up of people assumed to share interests or traits with the source audience, there is a special risk of direct or indirect discrimination by association. 56
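How a look-alike expansion can inherit the bias of its source audience can be illustrated with a minimal sketch. It assumes, hypothetically, that the platform ranks candidate users by the overlap between their interests and the aggregated interest profile of the source audience; the actual similarity models used by platforms are proprietary, and all profiles below are invented.

```python
# Minimal sketch of look-alike expansion: rank candidates by similarity to
# the source audience's aggregated interest profile. If the source audience
# is homogeneous, the expanded audience tends to reproduce that homogeneity.

def lookalike(source, candidates, k):
    # Aggregate the source audience's interests into a frequency profile.
    centroid = {}
    for user in source:
        for interest in user["interests"]:
            centroid[interest] = centroid.get(interest, 0) + 1

    def similarity(user):
        # Overlap score between a candidate's interests and the profile.
        return sum(centroid.get(i, 0) for i in user["interests"])

    # Return the k candidates most similar to the source audience.
    return sorted(candidates, key=similarity, reverse=True)[:k]

# Invented data: an all-male source audience with stereotyped interests.
source = [{"sex": "male", "interests": ["football", "cars"]}] * 5
candidates = (
    [{"sex": "male", "interests": ["football", "cars"]}] * 3
    + [{"sex": "female", "interests": ["football"]}]
    + [{"sex": "female", "interests": ["yoga", "books"]}] * 3
)

selected = lookalike(source, candidates, k=3)
# The three selected look-alikes are all male: the gender skew of the
# source audience carries over without sex ever being used as a criterion.
```

The point of the sketch is that sex never appears as an input to the ranking; the bias is transported entirely through the correlated interest attributes, which is why look-alike audiences built from a skewed source carry a particular risk of discrimination by association.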
Regardless of which form of targeting advertisers finally choose, they must not select their target audience in a discriminatory manner, target advertisements exclusively to persons with a protected characteristic, or select the target audience along an apparently neutral characteristic, if this results in disadvantaging users belonging to a protected group and the exclusion cannot be justified objectively.
User-ad matching
The selection of the audience the advertiser wishes to target is only the first step in the ad delivery process. Based on the parameters the advertiser chooses to build the audience, job platforms and social media sites use automated algorithms to decide which concrete subset of the targeted audience is shown which ad. 57 Targeted ads use historical data to deliver ads to those users who are most likely interested in them and to users with similar characteristics. 58 This is done in a mostly automated way, often with AI tools employing machine learning techniques. The algorithms behind those techniques make multiple correlations based on the raw data available, creating predictive profiles of each user. This allows for a prioritisation of the ad in relation to other ads. For users this is reflected in the ordering of the ads. The employment website Indeed, for example, describes this process as follows: ‘Indeed displays Job Ads based on a combination of compensation paid by employers to Indeed and relevance, such as search terms, and other information provided and activities conducted on Indeed.’ 59 When advertisers specify the parameters for their target audience, they also make a bid. Thus, they indicate how much a click on the ad from their target audience is worth to them. 60
Whenever a user visits a website or app, the platform conducts a background auction among all advertisers whose targeting criteria the user matches. All the ad campaigns placed by different advertisers in a particular time interval, together with their bids, are examined. In this background auction, not all target groups have the same price: in general, advertising to women is considered more expensive than advertising to men. Facebook, for example, suggests bidding about 5 cents more to advertise to women than to men. A similar price difference exists between age groups, the most expensive being people aged 25 to 44. 61
The respective bid is not the only parameter considered during the auction: the ad quality score is another decisive factor. The quality score is based on the ad's predicted engagement level and is calculated with the help of algorithms; it indicates the likelihood of a user clicking on a particular ad. 62 When calculating this probability, not only the wording of the ad is taken into account but also any visual material. Images in job ads can introduce significant bias into the ad delivery process. Studies show that whether the content of an image is classified as ‘male’ or ‘female’ has a significant impact on ad delivery, even when the images are rendered invisible to the human eye, so that the bias in delivery is due solely to the platform's automated estimate of relevance. 63
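The interaction of bid and quality score described above can be made concrete with a minimal, hypothetical sketch of such an auction: each ad is ranked by its bid multiplied by the platform's predicted probability that this particular user will click it. The names, bids and click probabilities are illustrative and do not reproduce any platform's actual formula.

```python
# Hypothetical sketch of an ad-delivery auction: rank = bid * quality score,
# where the quality score is the user-specific predicted click probability.

def run_auction(ads, user):
    """Return the ad with the highest bid * predicted engagement for `user`."""
    def rank(ad):
        # The quality score is user-specific: the platform's model may
        # predict lower engagement for some user groups, so the same ad
        # can win the auction for one user and lose it for another.
        return ad["bid"] * ad["predicted_ctr"][user]

    return max(ads, key=rank)

ads = [
    {"name": "job_ad",   "bid": 0.50,
     "predicted_ctr": {"user_x": 0.10, "user_y": 0.02}},
    {"name": "other_ad", "bid": 0.40,
     "predicted_ctr": {"user_x": 0.05, "user_y": 0.08}},
]

# Despite its higher bid, the job ad wins only for user_x;
# user_y never sees the job ad at all.
print(run_auction(ads, "user_x")["name"])  # → job_ad
print(run_auction(ads, "user_y")["name"])  # → other_ad
```

The sketch illustrates why advertisers have so little control at this stage: even with an identical bid and identical wording, the platform's relevance prediction alone determines which users are ever shown the job ad.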
In contrast to the wording of the ad and the audience selection, advertisers themselves have only very limited influence over the user-ad matching: ‘The actual calculation of the quality score and the bids of other advertisers that the advertising auction algorithm uses to allocate advertising is a black box to the advertiser’. 64 Even the most well-meaning advertiser has hardly any chance of avoiding the discrimination risks resulting from this way of advertising jobs online.
A (temporary) solution: a tender obligation
Any discrimination in the delivery of online job advertisements is a serious obstacle to access to the labour market. Jobseekers might never actually get to see the full range of job opportunities on the market. Discriminatory ad delivery does not merely discourage applicants from applying for a job because they believe they have less of a chance; it prevents them from seeing the job advertisement and learning about the opportunity at all. And since jobseekers do not know that a job opportunity has been hidden from them, (probably) because of discriminatory audience selection or discriminatory ad delivery, it also prevents them from taking legal action against possible discrimination.
A great deal of research in both the technical and legal disciplines deals with the problem of discrimination in online targeted advertising. 65 Until a technical solution is developed, however, a legal requirement to publish all job advertisements on one specific website could offer a cost-efficient and viable remedy that could easily be put into practice. The proposed solution would subject all job advertisements to a tender obligation. Tender obligations are not unknown in the field of employment: many countries already have them for jobs in the public sector. The obligation would now have to be extended to all job offerings, whether in the public or the private sector. Such an obligation would not impose an undue burden on employers, as most of them already post job offerings on the internet. Even with the tender obligation, employers would remain free to publish job offerings on job portals and social media sites and to use the benefits of targeted advertising, as long as they make sure that each ad is also published on the designated website.
Such a website could be run either by national governments or at EU level. For EU-wide reach, the existing European Employment Services (EURES) website, managed by the European Labour Authority, would be a suitable choice. 66 In order to eliminate all discrimination risks, this website must not collect any data on its users. Instead, it should function only as a database that publishes job offerings in chronological, geographic or thematic order and can be searched by parameters such as field of employment, job title or place of work. To ensure that employers do not treat the tender obligation as a mere alibi, any unfavourable treatment of applicants belonging to a protected group who learned of the job offer only through the official website should be considered prima facie evidence of discrimination. It would then be for the employer to prove that there has been no breach of the principle of equal treatment.
Conclusion
The digitalisation of the recruitment process has led to a substantial shift towards job advertisements being placed online, with the most important recruitment tools now being company websites, online job portals, and social media sites. Job advertising is part of the recruitment process and is therefore subject to EU non-discrimination law in the field of employment. Thus, any form of direct or indirect discrimination on grounds of sex, ethnic origin, religion or belief, age, sexual orientation or disability is prohibited. This also includes discrimination by association.
Depending on which online recruitment tool is used, there are different discrimination risks and different ways for advertisers to avoid them. Job ads on company websites are the closest equivalent to analogue job advertisements; the discrimination risks lie primarily in the wording of the ad and can therefore be avoided by using non-discriminatory wording (and visual design). With online job portals and job advertisements on social media sites, the discrimination risks concern not only the wording of an ad but also the targeting of the audience, namely, the processes of audience selection and user-ad matching. Of these two processes, only the audience selection is within the advertisers' control: they must not select their target audience in a discriminatory way. Yet even if advertisers do everything possible to avoid discrimination risks, user-ad matching can still lead to discriminatory results.
Regardless of whether discrimination results from word choice, audience selection or user-ad matching, the outcome is always the same for those affected: they cannot see the ad in question. That job ads can be hidden from certain users is a significant hindrance to access to employment. EU non-discrimination law prohibits this in the same way as it prohibits other forms of discrimination in access to employment.
In the case of discrimination in connection with online job advertisements, however, enforcing the law is very difficult. For individuals who never see the advertisement, and who therefore neither know nor can know about the discrimination, it is almost impossible. The EU's new initiatives to regulate the digital market and artificial intelligence are not sufficiently equipped to deal with this problem. Until the non-discriminatory delivery of advertisements can be ensured at both a technical and a legal level, an (online) tender obligation, as proposed in this article, could provide a remedy that could easily be implemented.
Acknowledgements
The work received financial support through grant LIT-2020-9-SEE-113.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
