Abstract
Autocomplete is a popular search feature that predicts queries based on user input and guides users to a set of potentially relevant suggestions. In this study, we examine what YouTube autocompletes suggest to users seeking information about race on the platform. Specifically, we perform an algorithm output audit of autocomplete suggestions for input queries about four racial groups and examine the stereotypes they embody. Using critical discourse analysis, we identify five major sociocultural contexts in which racial information appears: Appearance, Ability, Culture, Social Equity, and Manner. We find that the participatory nature of YouTube produces a multifaceted representation of race-related content in its search outputs, characterized by enduring historical biases, aggregated discrimination, and interracial tensions, while simultaneously depicting minority resistance and aspirations of a post-racial society. We call for innovations in content moderation policy design and enforcement to address existing racial harms in YouTube search outputs.
