Abstract
Collegiate esports are a key contributor to the North American esports field’s fledgling talent pipeline, and varsity student-athletes identify the streaming platform Twitch as a major component of it. Using Twitch as its exemplar, this article theorizes the role of platform algorithms as border objects—an analytical concept which frames the shared use of classification systems when a powerful party’s practices naturalize its interpretation over others’. Twitch’s recommendation and moderation algorithms are classifiers used by both competitive game-content creators and platform owners. These algorithms are fundamental to allocating visibility among users, which, as collegiate esports players suggest, informs professional progress. However, algorithms have proven to perpetuate and exacerbate the exclusion of marginalized persons from platforms. Drawing on ethnographic interviews, participant observation, and existing scholarship, this article argues that the inherent biases of platform architecture in esports’ talent pipeline uphold patriarchal structures and reinforce inequality—reducing opportunities for diversity and equality in esports.
Introduction
On 2 February 2020, I entered the conference room of the (then) Crowne Plaza Hotel in downtown Detroit, Michigan. Earlier that day, I crossed the international border from Canada to observe a research participant’s contribution to a grassroots cultural phenomenon in the entertainment hub of this historic city. This participant was an esports student-athlete in Canada’s first collegiate varsity esports program, at St. Clair College in Windsor, Ontario. On that day, they played Super Smash Bros. Ultimate at Frostbite 2020, a supermajor tournament (rated among the most prestigious) in the Smash community, with over 1600 participants signed up to compete—some from varsity esports programs and many others not. Unlike many contemporary esports tournaments, the Smash community is largely organized around local networks without the support of game developers or investment capital.
Situated in the heart of the evolving “motor city,” this event felt somehow symbolic of a significant technological shift. Just as mail became email, esports mark the movement of sports logic to the electronic realm. At the time of writing, there was no clear path to entering the North American esports field and, despite the diversity of players who engage with video games, the field remained largely dominated by white men (Kauweloa, 2021). This is significant because esports present an opportunity to reconsider and challenge our collective understanding of sports and athletic competition. Where traditional sports have struggled with gender equity, representation, and labor exploitation, esports may present an opportunity to reset the playing field.
Still, the field showed early signs of recreating the same old institutions. As Scholz (2021) explains, professional leagues and their “feeder leagues,” which funnel upcoming talent into professional programs, have been tumultuous and often collapse. Partin (2019a) refers to this as the field’s talent pipeline problem. If we are to rethink sports and athleticism through esports, surely the talent pipeline should reflect some of the efforts being made. In response, secondary schools, collegiate varsity programs, and esports-related diplomas are being developed. However, researchers have shown mixed evidence of racial and gender divisions (Black and Gray, 2022; Kauweloa, 2021; Taylor and Stout, 2020), despite historically Black colleges and universities being identified as esports leaders in the United States.
During my time with St. Clair College’s “Saints Gaming” team, I found that much of their work necessitates encounters with Twitch.tv (Twitch)—a broadcasting platform owned by Amazon.com Inc. and widely known as the livestreaming platform for video games. Game scholars (Black and Gray, 2022) have argued that exclusion and racism in esports and gaming are embedded in, and made visible through, the cultural practices of technologies of mass visual production. Participant observation clarified the centrality of the platform in players’ varsity experiences, leading me to question the platform’s role within the esports talent pipeline and its relationship to diversity.
Twitch provides users with the infrastructure to support video livestreaming and a platform from which to realize greater achievements. These affordances make Twitch vital to esports professionals because it curates a market of broadcasters, audiences, advertisers, and employers. Twitch is a growing focus of study among scholars (Anderson, 2017; Brock, 2021; Cullen and Ruberg, 2019; Gray, 2016; Johnson, 2019; Partin, 2019b; Renner and Taylor, 2021; Ruberg et al., 2019; Taylor, 2018; Woodcock and Johnson, 2019; Wulf et al., 2020). In many ways, it is representative of both contemporary esports culture and the field, illustrating the intrigues and issues of both.
As a platform, Twitch makes players more visible. However, in this article, I will raise critical attention to the platform’s least visible elements—those that have the power to make people invisible: recommendation and moderation algorithms. In doing so, I introduce a conceptual extension of the Science and Technology Studies (STS) notion of boundary objects, which accounts for power differentials. I will exemplify how platform algorithms can be viewed as “border objects.”
Aligning with Star’s (1989, 2010) intentions when conceptualizing boundary objects, border objects are tools for framing the analysis of classification systems. Despite rarely coming to consensus with platform owners on how to interpret them, cultural content producers and consumers—such as esports athletes—use algorithmic classification systems on platforms like Twitch. Furthermore, they are made the subjects of these systems, ultimately affecting their visibility and opportunities for professional development. Algorithms seem—at a glance—to work like boundary objects. However, this article reconsiders the concept’s application on digital platforms. I conceptualize platform algorithms as border objects to better understand how the shared use of a classification system, when one party is in a position of power, naturalizes the dominant user’s epistemic approach over others without needing consensus.
For all types of content creators, livestreaming involves performing best practices (institutions) to the satisfaction of recommendation and moderation algorithms, thus boosting their visibility, and avoiding moderation or content-suppression. However, their subjectification to the algorithmic governance of platforms upholds patriarchal structures that reinforce inequality. This undermines opportunities for a reimagined, diverse, and egalitarian sporting field. By questioning the role of Twitch in the esports talent pipeline, this research explores how algorithmic platform governance may negatively affect diversity in the field. I have framed Twitch’s platform algorithms as border objects to understand the power of platform ownership over stakeholders and their resulting institutions. This power distinguished algorithms as border objects and illustrated their marginalizing role in the esports field.
Research context and background
This article brings together an array of topics, including platforms, algorithms, varsity esports, and content creation, all while reconceptualizing a popular STS tool. Thus, it is necessary to first explain the project which brought it all together. This research stems from an ethnography which sought to understand the experiences of Canadian varsity esports athletes and their role in shaping new esports institutions within those of the existing academy. Findings were gathered during fieldwork conducted with eight participants at St. Clair College between 2019 and 2020.
The research participants involved were aged from 19 to 23. We engaged each other through participant observation (playing Super Smash Bros. Ultimate and Fortnite) and semi-structured interviews. While they all competed for the same program, the participants specialized in different games, including Overwatch, Fortnite, Counter-Strike: Global Offensive, Super Smash Bros. Ultimate, Hearthstone, and League of Legends. The group’s diversity was limited, although one athlete self-identified as a trans woman and another as Middle Eastern and Icelandic; the rest identified as Caucasian men.
Student-athletes’ role in the field of esports can be described as competitive game-content creators, making them similar to those colloquially known as ‘content creators’: producers of livestreamed or on-demand content for platforms like Twitch and YouTube. Likewise, many video game content creators who make competitive gaming content are engaged in formalized competitive computer gaming, which fits the definition of engaging in esports. Notably, Kauweloa and Winter (2019) have found that esports skill development is a technological process, which includes maintaining the technical proficiency to gain visibility on digital platforms. Aside from competition, participants shared that they were contracted to participate in three hours of livestreamed practice each week.
Shaped inductively through ethnographic findings, this article insists upon the significance of platforms and platform algorithms in the esports talent pipeline. This is corroborated by research describing the importance of streaming to professional esports careers and the barriers faced by marginalized persons in streaming communities (Fletcher, 2020; Gray, 2020; Taylor, 2016a, 2018). When considering whether student-athletes had the freedom to shape varsity institutions from within the existing structures of traditional varsity sports, I questioned the role played by Twitch and whether it maintained or reinforced hegemonic gaming, sports, or media broadcasting institutions. Recognizing the existence of algorithmic bias in other platforms and artificial intelligence (Noble, 2018; O’Neil, 2016), I later decided to focus on the relationship between esports athletes and platform algorithms.
While esports student-athletes are not necessarily experts in computer science, and do not understand the nuanced functions of platform algorithms, it is important to respect that their experiences can be illustrative—as humanities scholars (Bishop, 2019; Draude et al., 2019; Trammell and Cullen, 2021) have suggested. The exploration of platform algorithms does not solely necessitate an understanding of their code but can also be achieved through ethnographic approaches. These scholars insist that it is not necessary to pry open the algorithmic “black box”—that is, to become fluent in computer science or gain access to the designer’s original intent—to understand algorithms’ embeddedness in the assemblages of social life. Algorithmic systems have been approached from critical feminist perspectives, which have long addressed bias and structural inequality (Draude et al., 2019). Creative industry scholar, Sophie Bishop (2019), utilizes an empirical approach (including both online and offline ethnographies), highlighting that when users lack information, they perceive risk and conduct collaborative negotiation to de-mystify algorithms.
Considering players’ relationship to platform algorithms led to one of this article’s principal contributions—the border object. This reconceptualization stemmed from a critical reading of Bowker and Star’s (2008) definition of boundary objects. The lens of the boundary object has been used by game scholars to describe game development tools and platforms, including Twitch (Partin, 2019b; Taylor, 2018; Whitson, 2018). However, I argue that boundary objects fail to capture power relations, and I reconceptualize platform algorithms as border objects. This lens, applied throughout the article, lends a new analytical tool to media and STS scholars engaging with privileged or disadvantaged platform users. As the impacts of algorithms are not restricted to the field of computer science, it is important that other disciplines develop and use these tools to understand them and communicate their own findings. For this reason, my suggestion that the presence of a border object significantly hinders the diversity of the esports talent pipeline has become inseparable from my larger observations about varsity esports.
Notably, my position in relation to this project, as a White cis-male researcher, is one of the utmost privilege. While I intend for this article to contribute to further efforts toward equity and diversity in the field of cultural production surrounding esports, I recognize that the experiences of marginalized communities are not mine to tell. As my participant pool did not represent the diversity I hoped to see in varsity esports, I have instead outlined the ways algorithmic systems faced by esports players privilege people such as myself, through the conceptual lens of the border object. When discussing the risks and barriers posed by algorithmic border objects, I engage with existing research that corroborates my argument through marginalized voices.
Varsity esports student-athletes stand at the intersection of algorithmic governance, digital platforms, digital broadcasting, and significant cultural shifts. Here lies an opportunity to diversify and rethink sports and athletics. However, the field has a pipeline problem which is punctuated by its lack of diversity. Guided by the insights of these emergent experts, this article studies Twitch and esports through the conceptual lens of border objects to highlight the platform’s effects on an emerging professional’s development. But what is a border object and how does it add to our understanding of Twitch?
Conceptualizing border objects
In a 2010 article entitled “This is Not a Boundary Object,” Susan Leigh Star addressed the popular use of the concept following her original contribution and its subsequent development (Bowker and Star, 2008; Star, 1989, 2010; Star and Griesemer, 1989). Since these publications, the lens has been used to frame analyses across several fields including social science, medicine, organizational theory, history, feminist theory, and information science (Star, 2010). While Star notes that she had never attempted to adjudicate what is and is not a boundary object, the article is intended to address the question of “Couldn’t anything be a boundary object?” In this section, I will summarize Star’s frame to provide a clear understanding of where platform algorithms stand apart, warranting a conceptual extension.
What is a boundary object?
Bowker and Star (2008) convey that boundary objects are shared by several communities of practice; they satisfy the informational needs of each; they are plastic enough to adapt to individual communities’ needs while maintaining a coherent identity across intersecting communities; they may be concrete or abstract; and they resolve issues of commonly naturalizing categorization without imposing the naturalization of categories on a community (Bowker and Star, 2008: 297). Star’s three aspects of a boundary object’s architecture are: interpretive flexibility, material/organizational structure or arrangement, and scale/granularity (Star, 2010). She notes that interpretive flexibility has become a cornerstone of the “constructivist approach” to sociology, that is, the meaning of an object may differ between communities and is informed by one’s social and cultural background (Pinch and Bijker, 2012).
This means that any number of parties may use a boundary object while retaining their own interpretation of the object’s meaning. Thus, a boundary object’s material or organizational arrangement is structured by a given group’s informational and work needs (Star and Griesemer, 1989)—determining the object’s form and function. Star (2010) provides an illustrative example:

a road map may point the way to a campground for one group, a place for recreation. For another group, this “same” map may follow a series of geological sites of importance, or animal habitats, for scientists. Such maps may resemble each other, overlap, and even seem indistinguishable to an outsider’s eye. Their difference depends on the use and interpretation of the object. One group’s pleasant camping spot is another’s source of data about speciation. (p. 602)
An object is “used” in the sense that people and other objects or programs act toward and with it. It is derived from action, not a sense of “thing”-ness (Star, 2010: 603). Both platforms and algorithms fit well within Star’s understanding of objects despite their lack of material “thing”-ness, a qualification which Star (2010) has accounted for.
Game scholars Taylor (2018) and Partin (2019b) have used boundary objects as a lens through which to analyze Twitch as a platform. TL Taylor (2018) acknowledges the strong fit between Twitch and Bowker and Star’s definition, using the concept to understand how the assemblage of actors on the platform can use it simultaneously despite the tensions between them. In essence, multiple parties can commonly use Twitch to meet their needs despite their divergent interests. Furthermore, Partin (2019b) explains the surveillant affordances of the platform to viewers, creators, and platform owners by engaging with surveillance capitalism. While Partin’s concerns are mirrored here, neither he nor TL Taylor identifies the inability of the boundary object lens to capture or pinpoint the source of tension: the platform algorithms. Where the neologism “border object” clearly departs is with the framing of a boundary.
What is a border object?
When providing an operative definition of the term boundary, Star (2010) explains that it is often seen as “something that implies an edge or periphery, as in the boundary of a state” (pp. 602–603) but clarifies that she uses it to refer to shared spaces wherein the sense of here and there are confounded. Bowker and Star (2008) note that boundary objects “resolve anomalies of naturalization without imposing a naturalization of categories from one community or from an outside source of standardization” (p. 297). Furthermore, they explain that boundary objects are best applied to an analysis of relatively equal situations and that instances of an imperialist imposition of standards, force, and deception have a different structure. Platform algorithms differ from boundary objects in that they violate the sovereignty of disadvantaged communities by imposing the naturalization of categories from a dominant community or outside source of standardization. This trait is what identifies them as border objects. This article will go on to analyze their use on Twitch, which is illustrative: their violent nature is consistent with platform imperialism (Jin, 2015).
To be clear, recall Star’s example of a road map, but consider if it were centered on the Detroit–Windsor border between Canada and the United States. How the map is interpreted depends on the community using it, and without consensus it satisfies everyone’s informational needs. However, when there is a power differential between communities (take, for example, a member of the Three Fires Confederacy: the Ojibwe, Potawatomi, and Odawa Indigenous communities, and an agent of the Canada Border Services Agency), the way one community interprets and uses the map may come to affect the others. When a colonial state uses the “same” map by its own interpretation, it naturalizes that interpretation over the other communities which share in use of the object, and the lack of consensus becomes significant.
Due to the lack of consensus, when a border object naturalizes the more powerful group’s interpretation over another, it unleashes the potential for conflict and exploitation. Thus, it is important to identify the existence of flexibly interpreted objects and classifiers and to analyze how power affects their interpretation by groups who work together without consensus. Identifying and analyzing border objects will help researchers remain cognizant that borders too are confounded workspaces which risk conflict and exploitation.
As exemplified by the shared use of road maps, one community sometimes enjoys a significant power differential and (although there is room for dissent) their classifiers are naturalized over others who share in the use of a border object. I argue that border objects may be concrete or abstract and are shared by several communities of practice; they satisfy the informational needs of each, and while they are plastic enough to adapt to individual communities’ needs, they naturalize a dominant categorization across intersecting communities. Recommendation and moderation algorithms illustrate the differentiation of “border objects” since they naturalize a classification system, as interpreted by dominant parties, over others. Acknowledging the difference between the platform as boundary object and algorithms as border objects necessitates a more nuanced critical analysis of their role in the field of cultural production surrounding esports.
Finally, it is noteworthy that border object is a term first used by Ahmed (2005). In her use, they are affective objects which arise from abjection: a feeling of sickness caused by an object’s proximity, which makes the subject feel fragile and no longer guaranteed of the integrity of the self. In this process, borders are transformed into objects which are “disgusting,” as they engender disgust. Ahmed states that “borders need to be threatened to be maintained, and part of the process of ‘maintenance-through-transgression’ involves the very appearance of border objects” (Ahmed, 2005: 102). The commonality between Ahmed’s use of the term and my own is the notion that border objects arise through abjection and are maintained through transgression.
How Twitch algorithms behave as border objects
Now using border objects as a lens, we can analyze the relationship between esports athletes and the platform at the center of much of their activity. First, it is important to take note of the significant power differential between the communities commonly using Twitch: the platform owners, content creators, and viewers. At a broad level, Amazon has the power to alter the platform’s architecture, including its recommendation and moderation algorithms, to meet their informational needs and goals.
Algorithms act as border objects on Twitch by functioning as a classification system which is used by several parties. Each of these communities satisfies an informational need through institutions or interactions that bring them in contact with platform algorithms. The Twitch recommendation and content moderation algorithms are abstract objects in users’/creators’ communities of practice. Their abstraction is a matter of their lack of thing-ness, to borrow from Star (2010). However, this abstraction also speaks to their black-boxed nature, leading parties that border on the algorithm to approach them by forming algorithmic imaginaries through “playing the visibility game.” Successfully playing this game grants users the information they need to succeed on the platform by granting insight into the platform owners’ epistemology. Similarly, due to their interpretive flexibility, audiences may imagine recommendation algorithms as a tool for finding the content they want, while content creators, broadcasters, and advertisers imagine them as tools for enhancing visibility. But what is an algorithmic imaginary, how does one play the visibility game, and how exactly does this have a naturalizing effect over the other communities?
The relationship between platform algorithms and platform users is taken up by information and media scholar, Kelley Cotter (2019). Through a textual analysis of Instagram influencers’ discourse, she posits content creators’ game-like pursuit of online influence as “the visibility game,” which is largely structured by rules embedded in algorithms that regulate visibility on social platforms (Cotter, 2019: 896). Her work contributes to larger academic discourses concerned with algorithmic culture. This growing dialogue explores the convergence of data, algorithms, and social platforms, which form regimes of visibility (Bucher, 2012); are potentially fraught with oppressive logics (Chun, 2021; Noble, 2018); and co-construct culture (Brock, 2020; Bucher, 2017; Striphas, 2015).
When playing the visibility game, one enters a relationship with the platform’s algorithmic border objects, constructing what media researcher Taina Bucher (2017) refers to as an algorithmic imaginary—the ways that people imagine, perceive, and experience algorithms and their ability to affect or be affected by social behavior and structures. On Twitch, as with most platforms, few people understand the coded function of an algorithm, and as such, algorithmic imaginaries are found across the cultural field. However, it is the platform owner’s epistemological use of the algorithm which ultimately influences the other parties’ algorithmic imaginaries. Cotter (2019) explains that platform owners establish community guidelines and terms of service, which inform how algorithms categorize, enforce, and encourage behavioral norms. Since algorithms are fundamental to the allocation of visibility among stakeholders, the social institutions pertaining to satisfying the rules of algorithms—that is, playing the visibility game—become solidified on the platform as wholly normative. Since Twitch is central to its talent pipeline, the same can be said of the esports field.
Like most social platforms, Twitch illustrates affect economies, where streamers’ requisite need to build an audience or tap into audience members’ positive feelings about their channel is driven by the architecture of the platform, including financial components such as paid subscriptions, donations, and tips (Taylor, 2016a, 2018: 95). Bucher (2012) originally described this affective work through Facebook’s “regime of visibility”; an inversion of Foucault’s (1995) panoptic model, in which subjects are made ubiquitously visible. Bucher (2012) argues that social media platforms are spaces designed to house algorithmic regimes of visibility in which visibility is the reward, not the punishment.
A content creator’s algorithmic imaginary is the result of affective work and immaterial labor conducted to optimize their algorithmic visibility while building their community on platforms such as Twitch (Taylor, 2018: 89). Visibility was important to esports student-athletes at Saints Gaming, who saw Twitch as a platform of opportunity for esports skill development, which is echoed by game scholars (Fletcher, 2020; Taylor, 2016a, 2016b). It follows that maintaining visibility in algorithmically governed spaces informs income and opportunities for content creators (Bishop, 2019).
On Twitch, the recommendation algorithm serves to reward visibility, whereas moderation algorithms may take it away. At their most simplistic, they do so by classifying content as approved or disapproved. Aspiring esports professionals who enter the Twitch platform seeking visibility are subject to this regime of visibility. They use the platform’s recommendation and moderation algorithms as a source of information about how to succeed on the platform. But, as they use the algorithm, it naturalizes the platform owner’s interpretation of the classification system over them. They are categorized and proceed to categorize themselves. They perform practices of algorithmic recognition based on a decoded algorithmic imaginary, becoming the principle of their own subjection. Playing the visibility game with algorithmic border objects is part and parcel of identity work, and the algorithm threatens both the subject and its borders by naturalizing the classification system of the dominant.
In interviews, student-athletes expanded on the importance of livestreams. A Hearthstone player said that “It’s similar to watching professional sports. So, if you watch hockey and you see a play you might want to try that with your team.” A Counter-Strike: Global Offensive player elaborated, “I’ve always looked up to—in sports, especially playing football—the players that I’ve always liked, and I think the same goes for competitive esports. […] In a way you are like, replicating how they play in the game.” Their teammate remarked, “I believe that you have to watch the game you play so: watch a lot of Counter-Strike, play a lot of Counter-Strike. Practice, school—obviously you gotta keep up in school—and then work, that’s it really.” And, finally, a League of Legends player explained that “The opportunity to compete on stream in front of a large audience was difficult to come by and may lead to recruitment from professional teams.”
While my participants were not directly involved in developing a channel or community, they made it clear that livestreaming played an important role in their varsity esports experience, as a tool for training, inspiration, aspiration, and recruitment. What remains to be explained is how the presence of algorithmic border objects may affect the diversity of a platform and bordering fields. In the following section, my findings, alongside the work of scholars who amplify marginalized voices, suggest that some folks in the esports field are playing on hard.
Playing on hard
Before my arrival in 2020, the Saints Gaming players had negotiated for the right to compete and practice on stream from home. A Counter-Strike player told me their argument had been that “we can’t stream our practices, ‘cause we’re coming up with strats [strategies], and then it’s out to the public, and it’s not a strat anymore.” Despite this, I found that players did stream their practice, and I initially reached out for contact with the program through their live chat. While the players were less focused on building the channel or community by building a relationship with the audience, they still performed the best practices of the visibility game in a passive streaming posture (Walker, 2014). While practicing or competing, student-athletes made their own bodies visible on camera; altered their behavior, language, and appearance; and submitted to moderation under the platform’s terms of service and community guidelines. For example, a League of Legends player told me that one student-athlete had been removed from the team after using inappropriate language on stream.
Even when streaming passively, normative pressure is placed on aspiring esports professionals by platform algorithms. This is observable when analyzing their algorithmic imaginaries and the institutions through which they play the visibility game with border objects. Yet, what works for some also shapes barriers for others. The following section explores how Twitch’s recommendation and moderation algorithms disadvantage minority communities. Herein, I must speak less to my own findings, which lacked diverse perspectives; however, the Saints Gaming players nonetheless brought this issue to my attention.
Existing scholarship augments my findings to argue that algorithmic border objects on Twitch privilege traits of Whiteness, hetero-masculinity, and financial security, while disadvantaging traits of Otherness. Algorithmic border objects between the platform and competitive gaming-content creators impose and naturalize a categorization of acceptable interactions (relating to content standards) and reinforce patriarchal class structures. Playing the visibility game is unachievable for many outside of a privileged White, cis-heterosexual male demographic, thus serving to gatekeep success on the platform and discourage minority participation. The platform architecture upholds a culture which is inhospitable to minority participants, and this extends to the field of esports.
How border objects normalize privileged institutions
Of the participants in my study, six shared that they had directly attempted to build a channel on Twitch in the past but had several reasons for why they did not currently create their own content. One player on the Overwatch team told me that “One, I don’t find it fun, unless you truly enjoy what you’re doing or you have a big enough revenue stream. There’s really no point. And [two], I don’t have the hardware capable of streaming well.” Their comment illustrates an institutional norm within the esports field which is reinforced through engagement with border objects and hinges on an unacknowledged privilege: the notion of what is considered fun.
Fun, like play, is a concept that has largely been structured through whiteness within gaming culture (Trammell, 2023). Through Black phenomenology, Trammell has explained how play is often experienced as painful or torturous for people of color due to its subjectivizing nature. In gaming discourse, arguments pertaining to what is “actual fun” are often raised surrounding games which depart from the hegemonic norm: fast, action-packed AAA video game titles. Slow, serious, and queer games are often derided as walking simulators and deemed “no-fun” for breaking these norms (Ruberg, 2019). Similarly, the gamified experience of building a community on Twitch is most often a slow and arduous task—one which is laborious and perhaps akin to the mundanity of a “walking simulator.”
The ability to opt out of the labor of streaming on Twitch, and of playing the visibility game to advance a career in esports, because it is not considered fun is a luxury only some folks have. Engaging only in the career development opportunities that align with a privileged notion of fun signals that some may cut corners and catch breaks that others come across only painfully and with much work. Furthermore, it suggests that having fun on Twitch is tied to success in the visibility game. As this necessitates a relationship with platform algorithms, we may understand how engaging a border object might be painful and question how this affects diversity in the esports field.
Playing the visibility game generates normative institutions that are not accessible to all streamers equally and, since creators largely aim to appease or game platform algorithms to gain visibility, the resulting platform spaces are prone to bifurcation and discrimination. For aspiring esports professionals in marginalized communities, this system is restrictive and reinforced by algorithmic bias.
Esports pathways to competition, opportunity, and employment are markedly White and male, with barriers to participation like toxic behavior and online harassment (Law, 2021). The most famous illustration of this is Gamergate, when women in the gaming industry were sexually harassed online and in real life (receiving threats of sexual violence and death) for voicing their concerns about gender inequity in video games and the industry. Law (2021) and Gray (2020) have explained that Gamergate was not an isolated instance. Rather, it is indicative of a toxicity in gamer culture that, while not quite universal, is systemic in many communities and an everyday practice.
Toxicity in communication channels—such as game chats, stream chats, and forums—is commonly moderated by community administrators. However, as platform communities like Twitch have grown more populous, companies have turned to algorithms. The need for algorithmic moderation introduced border objects which naturalize the categorizations of what platforms deem acceptable or unacceptable conduct. In the video games and technology industries, these categorizations are based largely on White cultural norms. Notably, harassment has not ceased in response to this solution.
While content creators may interpret algorithms as tools to be gamed for their ability to ensure financial consistency and community growth, platform owners may interpret them as tools for enforcing community guidelines and managing the platform's brand-safe public image. These uses of the algorithm are at odds, forming a border which is frequently maintained through transgression. Discipline for transgression on platforms has often been inconsistently applied. For example, the top-earning streamers on Twitch have generally all been men, with the rare exception of Imane "Pokimane" Anys. The most viewed female streamer on Twitch as of 2021, Kaitlyn "Amouranth" Siragusa, had been banned and subsequently unbanned five times for various reasons, generally pertaining to their content existing within a gray area of Twitch's policies restricting the presentation of the female body (Colbert, 2021).
Amouranth's content frequently challenged Twitch's carefully cultivated reputation as a gaming platform, one whose most successful creators produce gaming content. In cultivating this reputation, Twitch has largely maintained a culture of gaming that is overwhelmingly White, cis-heterosexual, and masculine. Amouranth's frequent "controversies" exemplify the two-fold nature of visibility on Twitch: visibility is at once exposure of one's brand or product to an audience and exposure of oneself. Due to this, playing the visibility game may be quite painful.
Exploring the gendered dimensions of livestreaming, Ruberg et al. (2019) and Cullen and Ruberg (2019) consider the rhetoric and community regulations pertaining to female streamers who are harassed with derogatory labels like "titty-streamer" or "thot." Ruberg et al. (2019) explain that "titty-streamer" is a derogatory term for women accused of placing too little emphasis on gameplay relative to the presentation of their physical appearance. Gaming the platform's algorithm is only deemed acceptable when one games it "like a man"; streamers must present their bodies only in a manner consistent with expectations of "normal." While playing the visibility game, Amouranth (like any non-cis-male streamer) appears on camera presenting a feminized body, consistently receiving toxicity and harassment from gamers and, subsequently, disciplinary moderation from Twitch for evidently transgressing the norms of sexuality on its "game streaming" platform. While Twitch has walked back each ban, its initial responses reinforce the White masculine norms of gaming culture, and its decision to revise its community guidelines in response to Amouranth's subversions only serves to compartmentalize and marginalize femininity. Toxicity in gaming communities remains infamous, and inconsistent enforcement of platform policy against female streamers may discourage others from pursuing opportunities on the platform and, by extension, in the field of esports.
One student-athlete on Saints Gaming who identified as a trans woman noted that her experience had nonetheless been positive: "a lot of trans people have had bad experiences in gaming in general. So, I was really worried coming into the college. But I honestly expected to have some problems, there hasn't been one." Her expectation of encountering problems is symptomatic of the chilling effect described above.
Exploring the racial dimensions of streaming on Twitch, Gray (2020) argues that people of color are marginalized as deviants from the prevailing White-male (default gamer) culture. She draws a parallel between the discursive markings of the Black body and the discursive practices around the fears and anxieties those markings create. On streaming platforms like Twitch, Gray's work illuminates the efforts Black streamers must make to play the visibility game as people of color. Gray notes that Black streamers' practices are at once the same as and different from White streamers' practices: Black players are held to the normative practices of the default gamer—which work best to satisfy algorithmic governance—as well as to a subset of practices necessitated by the discursive markings of fear and anxiety attached to Black bodies in White spaces. Observing Black streamers, Gray describes this as a process in which they modulate in and out of Blackness. She describes streaming platforms as having an unequally permeable racialized border—one created in the real world and sustained in the digital (Gray, 2020, p. 85). Here, Twitch illustrates how a party's use of a border object from a position of power or privilege relative to another naturalizes their interpretation of the object's categorization onto others.
Likewise, Fletcher (2020) argues that the exclusion of Black bodies from esports is not just a matter of the physical body, but of Blackness itself. Illustrating this, he shares the example of Terrence Miller, a Black Hearthstone player who appeared on Twitch at a large tournament. At the sight of him, the chat exploded with racial slurs and "try hard emojis," a Twitch emote that has been co-opted as a White symbol for Black gamers (Fletcher, 2020, p. 2670). Writing about the esports field, Fletcher argues that the lack of Black players is due to both capital and cultural discrimination. He also notes that esports functions on a meritocracy, like that maintained through Twitch's algorithmic architecture. While determining success and visibility by merit is supposedly colorblind, Fletcher argues that it boils down to a judgment of skill, which is itself wrapped in social and economic practices. Developing skill is heavily influenced by the resources and opportunities available to players and streamers. The skills it takes to play the visibility game reflect how platform owners envision their recommendation algorithm promoting the best content, such as contributing long hours for consistency. Yet, as Fletcher points out, these skills are afforded by opportunities which are not equitably distributed among minority communities.
The visibility game also creates pain points for esports athletes with disabilities and mental health conditions, which Johnson (2019) explores alongside additional normative institutions. He cites long working hours, toxicity, and the taxing demands on a streamer's attention as the main obstructions to streaming with a chronic condition. Johnson (2019) also notes that streamers commonly play for long hours (e.g. streaming every day of the week or in 24- or 48-hour blocks) to gain visibility, and contribute further work hours off stream in community management and editing. These long hours benefit Twitch by driving engagement and maintaining active audiences. A Counter-Strike player on Saints Gaming conveyed how they found their schedule too busy to stream, with practice, competition, school, and a part-time job to juggle. These techniques for gaining visibility are once again indicative of privileged institutional norms that reflect the epistemology of the more powerful platform owners. They are normalized by algorithmic border objects and may discourage minority streamers, or those with disabilities or mental health conditions, from entering the workplace, while disciplining those who do enter—and no doubt some do. How, then, do we explain the diversity that does exist on Twitch and in the esports talent pipeline?
Through the lens of border objects, we can see how algorithmic governance of platforms normalizes a visibility game which may be experienced as painful by those outside of the privileged White-male gamer demographic. However, it is important to note that resilience, agency, and counter-practices are exemplified by communities that persist in gaming spaces against these norms. To avoid the deterministic claim that algorithms strictly define how streamers gain visibility on the platform, it must be recalled that gaming the algorithm is a process of decoding, one that is negotiated and potentially subversive. Bishop (2019) has observed the visibility game among beauty vloggers who engage in "algorithmic gossip" as a means of sharing socially informed theories. She argues that algorithmic gossip is a subversive reaction to power relations often used by marginalized groups (i.e. gaming the system), with the goal of sharing strategies for attaining financial consistency and visibility on algorithmically structured platforms. This process is further illuminated by Lomborg and Kapsch (2019) in their revisiting of Stuart Hall's (1973) work on decoding messages through preferred, negotiated, or oppositional readings. Beauty vloggers' algorithmic gossip, like Amouranth's manipulation of the recommendation algorithm, involves accepting or resisting the naturalization of the platform owner's classification system.
While the painful play necessitated by algorithmic platform governance discourages diversity in the field of esports, these practices simultaneously open a horizon for resistance and resilience. Users maintain agency through their interpretation of the algorithm, and marginalized users are often positioned in opposition, with the power to subvert and resist.
Conclusion
Let us return to Frostbite in Detroit. As I walked around the tournament hall, I noticed that the crowd around the main stage had become increasingly passionate and vocal, culminating when a local tournament organizer played a match on stage. I made a note in my field journal that the energy was not quite at the level of the traditional collegiate sporting events I had attended, but that the atmosphere was competitive and carried a spirit of sportsmanship. Players bumped fists before and after each match. They discussed the nuances of different attack combos. Some walked around with friends cheering them on from behind, and a member of the event staff complimented the Saints Gaming player's button displaying their pronouns. I noted the grassroots feeling which made me feel at home in this place. This was also true of the five tournaments I attended during my time with Saints Gaming. However, I also recognized that attendance at esports events is still dominated by White cis-heterosexual men. Games scholar Aaron Trammell (2023) conveys a general sentiment held within hobby communities: all are welcome, but the default assumption is that of White men, such as myself. What I have done with this article is surface and critique the socio-technological systems which support this sentiment on Twitch.
Critical scholars (Bishop, 2019; Bucher, 2012, 2017; Cotter, 2019) have explored how recommendation and moderation algorithms govern behavior and success on platforms. Content creators, such as esports student-athletes, use algorithms in pursuit of visibility, decoding the classification systems encoded by platform owners, and may play by the rules or game them. By framing platform algorithms as border objects, I have demonstrated an analytical lens for shared classification systems wherein a more powerful group's use of the object naturalizes their interpretation of the system over others. What this illustrates in the field of esports is how algorithmic governance gatekeeps and upholds a White patriarchal interpretation of behavioral norms, which restricts and discourages minority participants. Further research by media and game scholars should engage with border objects as a lens to understand the nature of the trial-and-error process which builds algorithmic literacy, particularly on a platform as dynamic as Twitch.
Through this conversation, it has become clear that varsity programs, professional organizations, tournament organizers, and athletics governing bodies in the esports field must work alongside athletes and other creators to disentangle themselves from algorithmic governance and the institutional norms it upholds. College programs need to be conscious of their relationship with platforms and of how algorithmic architecture positions student-athletes. This article has developed a tool to frame further analyses and outlined the known structures of the problem. Firsthand research alongside marginalized esports players and student-athletes is necessary to detail how they interpret and use platform algorithms, and to understand their relationship to visibility and career development. Doing so may highlight their ability to shape platform technology and its uses, as well as their relationship to the field's power structures. Importantly, resilient and oppositional practices may aid in empowering and encouraging more racial, gender, class, and ability-diverse talent.
Acknowledgements
The author would like to thank the anonymous reviewers for their constructive feedback. He would also like to thank his friends, family, and colleagues for their thoughtful conversations and for patiently reviewing earlier versions of this article: Dr. Alberto Lusoli, Maria Sommers, Dr. Philippa R. Adams, Amy Harris, Rowan Melling, Tvine Donabedian, and Luke Scholl.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
