Abstract
Anti-discrimination is a crucial focus for researchers, policymakers and designers of technical systems. Leveraging insights from AI fairness and inclusion studies, we examine algorithmic bias and its tendency to perpetuate discrimination (Selbst et al., 2019) through two case studies: one in a gaming environment and another on a dating platform. Our gaming study explores the manifestation of toxicity, particularly its impact on gender identities; we examine how the normalisation of toxic behaviours within gaming spaces negatively shapes gendered perceptions and experiences. In the second study, we conduct an experiment on the Bumble dating platform to uncover and address discrimination in the matching algorithms that determine which profiles are shown to users as potential dates. These user-discovery interactions pose significant risks, including sexual violence and other harms, especially for diverse gender communities on dating platforms. By documenting the experiences of non-binary individuals on dating platforms, we contribute to a largely underexplored area of research. This study not only highlights the discrepancies and biases in dating algorithms but also aims to inspire more inclusive design principles that recognise and accommodate the diverse needs of genders and communities.
