Abstract
Online spaces offer fan communities and content creators many outlets for expressing their interests, but they also expose users to hostility, toxicity, and gatekeeping. In the case of online streaming on Twitch, users frequently encounter hostility based on identity and seek assistance from fellow users via social media. In this project, I highlight the ways that social media is used to organize against discriminatory cultures toward marginalized streamers. Ultimately, I find that much of the onus is placed directly on streamers themselves to circumvent, address, and keep themselves safe from harassment. I argue that this feeds into the structures and cultures that allow racist and sexist hostilities in online and gaming spaces by placing responsibility – and blame – on individual users from marginalized backgrounds. Although the community is frequently supportive of users who seek advice for addressing hostility, and although there are attempts at raising awareness through collective online action, the lack of apparent resolution leaves many feeling that these experiences are inevitable, immutable, and within the realm of individual responsibility.
Introduction
Online spaces offer opportunities for users to engage with one another, explore senses of self, and build communities. At the same time, people online encounter a multitude of social issues. Although online spaces were originally conceptualized as potentially revolutionary in promoting equality in public spaces (Nakamura, 2013), they carry strains and tensions that reflect and reinforce social inequalities. Many online platforms have become centers for toxic and hostile interactions (Alshamrani et al., 2020), which are frequently directed toward women, people of color, and members of the LGBTQ community (Gray, 2012, 2017). Among audiences for online gaming, many have become exhausted by the number of hostile interactions they encounter, even in cases where they are not themselves targets (Tomlinson, 2022).
Such hostilities are also experienced by online streamers, including those using Twitch (Chan and Gray, 2020; Gray, 2020), a popular site for personal streaming. Although the site is largely known for streaming video game play (Hamilton et al., 2014), it has expanded to a variety of other topics (Ruberg and Lark, 2021). Like many other online services, Twitch establishes behavioral standards through its terms of service (TOS), but these are often biased against marginalized groups (Ruberg, 2021). These approaches also appear to do little to mitigate online toxicity, and Twitch’s link to gaming communities may exacerbate hostility toward underrepresented groups (Catá, 2019).
Data for this project are drawn from observations of Twitch streams and conversations on Reddit and Twitter, investigating the underlying norms and expectations that allow toxicity to persist and thrive both in the general Twitch community and in stream-specific cultures. How do different sources of guidelines and expectations interact? For example, users receive information from broader Twitch culture (Jackson, 2020; Johnson, 2022), the terms of service, and streamers’ channels (Mihailova, 2022) that may contradict or counteract one another. Why are hostile behaviors persistent, despite notable discomfort among users? Specifically, I analyze users of Twitch as online streamers, as moderators, and as fans of video games and the streamers who play them.
This project adds to previous work on toxicity on Twitch by using conversations across online spaces to provide foundational, exploratory information on the discourse surrounding these experiences, how users frame these encounters and seek solutions, and the ways that engagement with different online communities can work in tandem to produce cultures of permissiveness and do-it-yourself protection. These dynamics are explored by first reviewing relevant literature, then discussing the methods applied in this project, highlighting the analysis and findings, and concluding by emphasizing the implications of the experiences discussed by Twitch users. In the analysis, I find that streamers use online communities to discuss discrimination, that hostilities become embedded in channel-specific cultures, that prevailing perceptions place most of the onus on users to protect themselves, and that the cultures that develop make organizing for change largely ineffective. The unique norms that develop among Twitch users can help us understand many of the online dynamics present where hostility and toxicity become commonplace. This study further highlights that many attempts to organize against these behaviors, or to seek assistance in handling them, lead to DIY methods of protection and a sense of helplessness when it comes to relying on Twitch to protect streamers and their fans.
Fans, fandoms, and harassment
Fans are able to bond and build communities around the things they collectively enjoy and online spaces make these communities especially accessible (Jenkins, 2006: 137). Online spaces also mean that fans have unprecedented access to one another (Jenkins, 2006: 142) and, in some cases, the people and things that they are fans of. Twitch is one of several online platforms that supports a highly participatory fandom, where viewers can provide monetary support to a chosen streamer (Johnson and Woodcock, 2019; Wohn et al., 2018).
Despite the emphasis on enjoyment, fandoms and participatory spaces are not always limited to positive and supportive expressions. Fandom has numerous facets and faces, from the political, to the transformative, to the aggressively reactionary (Stanfill, 2020). Harassment in and from fandoms is also not a new phenomenon. Social media allows fans to establish and explore reactionary viewpoints, including harassment campaigns of those involved with the creation side of their fan objects (Blodgett, 2020).
There are varying degrees of these issues with fans, however, with toxic fandom occurring across many fan groups and subsets of fans (Hills, 2018) and happening in short bursts or enduring for longer spans of time (Scott, 2020: 67). Harassment among and from fans also spans a wide range of behaviors on many platforms and even offline, ranging from stalking or cyberbullying to physical attacks (Reinhard, 2018). Unlike anti-fandoms, which are antithetical in many ways to fans (Gray, 2005), these Twitch viewers are sometimes invested in the streamers they view, but may be feeding into online cultural norms that support harassment.
Toxicity online and in video games
Platforms across the internet are increasingly being studied in terms of affordances for hostile interactions among users (Massanari, 2017; Ortiz, 2020). Anonymity is often identified as a primary source for these behaviors. This online disinhibition effect is understood as creating the necessary conditions to allow for hostile interactions due primarily to an experienced separation of identity from consequence (Lapidot-Lefler and Barak, 2012).
This intersects with other ideological and cultural issues online. For example, right-wing extremist views, trolling, and the expansion of these behaviors – including into gaming communities – have become an ever-present backdrop to discussions of toxicity online. Race is heavily linked to trolling behaviors, which can be used to police boundaries through reactions to perceived cultural threats (Ortiz, 2020), and recruitment into these causes is common online and frequently targets video game players (Condis, 2019). Gender also comes into play, with elements of geek masculinity driving and defining many of these behaviors and the gatekeeping linked to them (Massanari, 2017; Salter, 2018).
Massanari (2017) further discusses these elements of masculinity, noting that websites may be set up in ways that allow hostilities and toxic ideas to flourish. This underscores the wider toxicity and hostility that have become endemic in gaming culture (Dutton et al., 2011), particularly in recent years (Jhaver et al., 2018; Salter, 2018). In some cases, this manifests as targeting games featuring progressive content with negative reviews (Jhaver et al., 2018); it has also prompted harassment campaigns across various social media platforms against people – particularly women – involved in the games industry (Todd, 2015).
Growing toxicity among players has also prompted more research to identify its causes and possible solutions, including work showing that competition can make trolling among video game players more likely (Lee, 2016). Toxicity overall – whether related to trolling or to competition – has been experienced by approximately 50% of people playing massively multiplayer online role-playing games (MMORPGs), with about 35% of players acknowledging that they have initiated or participated in these behaviors (Ballard and Welch, 2017).
In many cases, players are increasingly tired of having to engage with and prepare for these behaviors (Tomlinson, 2022), but dynamics of hostility may be fostered by design, much as affordances in other online spaces can introduce or support opportunities for harassment. This has been connected to game design that bolsters hostilities (Sengün et al., 2019) as well as to video game content. For example, characters who are not white or not men are underrepresented, and writing often relies on stereotypes when people of color or women are included as characters in games (Srauy and Cheney-Lippold, 2019).
Gatekeeping against players of diverse backgrounds is a cultural factor as well. It often appears as an attempt to maintain a particular perception of the demographic composition of the audience for video games (Birk et al., 2016), which is associated largely with younger, white, straight men (Apperley and Gray, 2020: 41). Toxicity also frequently comes from players blaming one another for poor play and lack of skill (Lee, 2016; Sengün et al., 2019; Tomlinson, 2022), another area culturally associated with gender (Tomlinson, 2020). This influence has been observed in hostility toward feminine voices during game play, regardless of skill (Kuznekoff and Rose, 2013). In addition to gendered harassment, players receive directed hostile attacks based on race and ethnicity (Gray, 2012, 2017; Sengün et al., 2019). These hostilities are often intersectional in nature, and other elements of identity are also seized upon as part of toxic interactions (Gray, 2012, 2017), but negative attitudes and racial harassment are particularly aimed toward, and salient for, Black players (Gray, 2012; TaeHyuk Keum and Hearns, 2022).
Online streaming and Twitch
Online streaming communities and culture have strong connections to both online and gaming cultures. This is a relatively new area of research, in part because online streaming is a somewhat recent phenomenon, although its popularity has been steadily increasing (Taylor, 2018). Hamilton et al. (2014) point out that Twitch became a major platform for video game streaming, with individuals able to share their content with viewers while interacting via audio and receiving audience feedback through a chat box. The platform itself is open to a wide range of skill levels, from relatively skilled players to novices. This range of possibilities has contributed to Twitch’s growth into the most popular platform for streaming video game play (Mihailova, 2022; Taylor, 2018).
Cultures can vary from channel to channel on Twitch and norms can develop within these contexts that result in violations of broader or more standardized rules, such as the TOS of the site, being ignored (Mihailova, 2022). In part, this may be attributed to the temporary nature of discussions and exchanges on Twitch (Mihailova, 2022), but this project seeks to amplify the contributions of larger cultural considerations, such as increased personal responsibilization in online spaces (Sugiura and Smith, 2020) and online tendencies to victim-blame (Jane, 2017). Other relevant factors on Twitch include the use of humor to build and reinforce identity in the context of specific channels or streamers (Jackson, 2020), which may be fostered by Twitch’s more general cavalier approach to interactions and inclination to make light of more serious content (Johnson, 2022).
Although online streaming comes with participatory benefits linked to building communities and social spaces (Taylor, 2018), hostility based on race and gender is also found on Twitch. Women’s streaming experiences often involve other users reducing them to body parts and questioning their legitimacy (Ruberg et al., 2019). Twitch itself often provides vague information about actionable offenses on the site and tends to focus on behaviors that target women streamers (Cullen and Ruberg, 2019). Comments aimed at women streamers on Twitch are also highly gendered, focusing on objectification, and are applied most frequently to the most popular women streamers (Nakandala et al., 2017).
There is ample work on toxicity in online spaces and among video game players and increasing information on what these issues look like for streamers, but the ways that moderation, perceptions, and norms interact with fandom should be further investigated. There is also a dearth of information available on how broader cultural pressures, including those related to gatekeeping and limiting participation, may influence streamers who are women, people of color, queer, or people with disabilities to go along with fan actions that may make them uncomfortable, target them, or violate community standards. Based on current research, however, it is important to note that even in cases where streamers and their moderators can address violations of rules or TOS, the streamer often still experiences the harassment before action can be taken (Thach et al., 2022).
Methods and data
This project uses content analysis and online observations to explore the forms of harassment experienced in online streaming and how and why these behaviors persist. In particular, the cultures that develop in these spaces around issues of identity are assessed through users’ online conversations. Data are drawn from observations of Twitch streams and from publicly available online discussions gathered from several spaces on Reddit and through Twitter. Although the data are publicly available, to protect users’ privacy, specific streamers, subreddits, and users of any of these platforms are not discussed directly. Further, quotes are slightly reworded to reduce the likelihood of users being identified.
Observations of streams are based on two streamers of middle-range popularity (i.e. typically holding a few hundred to a few thousand average viewers). One streamer openly identifies as a gay man, while the other has discussed his disability on stream. Because marginalized streamers are difficult to identify through usernames, streams were initially viewed at random, and the observed streamers were selected because they self-disclosed marginalized status based on sexuality and disability during their streams. These two streamers serve as brief case studies of self-disclosure of identity and status on Twitch and as a comparison point for online discussions from a broader range of streamers.
Observations of these streams included noting streamed content, text chat conversations, and streamers’ reactions to those conversations. In total, approximately 47 hours of streams were observed over approximately 3 months, based on a few hours of streaming on each channel per week, continuing until thematic saturation (Hennink et al., 2019) in activities and interactions was reached. During streaming sessions, notes were taken on chat feeds, donation messages, interactions between streamers and viewers, and streamers’ reactions to messages and notes sent by viewers.
Online posts and comments were observed between January 2020 and July 2020, and additional data were collected through targeted searches between January 2020 and September 2021. In total, approximately 500 Tweets and 450 Reddit posts, along with each post’s top 20 comments, were collected while conversations were active. Approximately 200 additional Tweets and 300 Reddit posts and their top 20 comments were collected through targeted searches. Reddit allows posts and comments to be sorted by ‘upvotes’ (i.e. supportive votes from users); top comments were selected to gauge general community support for and agreement with problems, statements, and offered solutions. These data were then qualitatively analyzed to determine the cultures that have developed around harassment in online streaming communities. An open coding approach was used to identify patterns and themes present throughout the data (Corbin and Strauss, 1990), arriving at the major themes discussed in the analysis below. Coding was concurrent with data collection and continued once collection was complete. This process passed through several rounds to determine patterns and themes that occurred frequently in conversations among users discussing their experiences with streaming (Charmaz, 2006). The primary goal was to investigate and analyze how users approach, frame, and understand instances of hostility, toxicity, and harassment from fans and audiences in online streaming.
The coding process involved transferring online conversations to a word processing program and applying thematic codes with bolding and color coding. These codes were refined through multiple passes during and after data collection to determine the prevalent shared themes in conversation. This process highlighted a number of common conversational topics, tones, and experiences, including codes for words tied to emotion or experience (e.g. ‘frustrated’ or ‘thankful’), purpose (e.g. seeking advice), and specific experiences (e.g. ‘stalking’ or ‘developing block lists’). Due to the exploratory nature of this project, following the most common patterns in the conversations themselves took precedence.
Twitter and Reddit posts were collected based on discussions related to streaming through regular observation and by using targeted searches for terms related to harassment, hostility, race, gender, sexuality, disability/ies, toxicity, and moderation. In part, using a targeted search for specific terms allows for direct analysis of these experiences, how users discuss them, and how the community forms opinions of and ideas around causes of these problems and their potential solutions. A random sampling approach or isolating only live conversations would highlight many more discussions from non-marginalized streamers and conversations around other concerns, such as technical questions and suggestions.
Searches for conversations around identity and disability status were not directly linked to searches for terms related to hostility; nevertheless, the majority of conversations surrounding these social statuses emphasized experiences with harassment. The exception was disability/ies, which returned fewer results that were sometimes related instead to personal concerns about gaining and maintaining an audience. Among these posts, the top discussions and their top comments were recorded and coded. Twitter posts were collected and recorded similarly, through regular observation and by seeking conversations about Twitch, the terms searched on Reddit in relation to streaming, and Tweets regarding recent #TwitchBlackout campaigns. Specifically, themes related to identity, discrimination, cultural support for hostilities, and growing exasperation paired with self-responsibilization were observed.
Analysis and findings
Identity, discrimination, and experience
Although affordances on Twitch hypothetically grant the same access to all users, algorithmic biases create circumstances where white men streamers tend to be highlighted in ways that help them achieve larger audiences (Chan and Gray, 2020). Marginalized streamers – and particularly Black streamers on Twitch – tend to gain smaller dedicated audiences as a result (Chan and Gray, 2020). Marginalized streamers have complicated relationships with the identities they must juggle as part of the streaming experience, recognizing that harassment due to gender or race and ethnicity is commonplace on Twitch. However, despite widespread reports among users of being targeted due to race, comments in these online communities are generally supportive of streamers of color, in contrast to many of their experiences with viewers. Racism is one of the largest topics for which support and advice are requested in these online discussions, the majority of such requests coming from Black streamers. Streamers frequently note racism in the form of comments from users they deem ‘trolls’ in their stream chats. As one user mentions: Every single time I stream, I end up with racist trolls. I’m Black, so you can probably take a guess at some of their comments. I try to just be happy and positive, but these racist kids ruin my day….I know to stream, we have to harden our shells, but I’m at a breaking point. People are ruining what I love. Can anyone help me figure out how to stop this? (user, Reddit)
Twitch users report cases of hostility, racism, and toxicity as an average part of their experience. These issues are framed in two primary ways. The first is seeking advice on how to handle these issues or potentially avoid them. This is usually in the context of wanting to continue streaming as a hobby, but feeling unwelcome or unsafe. The second is expressing general frustrations. This is typically in conjunction with the viewpoint discussed below that hostilities are part of the experience. Hostilities and harassment are seen as negative and needing to be more fruitfully addressed, but there is some level of ambivalence about whether this is truly possible.
In comments on these posts, there is a great deal of support, and streamers who seek help from these online communities often edit their posts to express thanks for the advice. These posts suggest a dual experience in streaming and online spaces: the streaming – and potentially fan – community appears positive at a surface-level glance, or at the very least engages in performative support toward streamers of color outside of Twitch, while platform affordances allow dominative racism (Dovidio et al., 2017) to flourish in Twitch chat. Despite this apparent level of community support, these common instances of racism persist and are met with individual-level solutions that do not fully address the problem or streamers’ concerns, largely placing responsibility on the shoulders of the marginalized streamers encountering these issues, which will be discussed further below. In short, the advice given to streamers experiencing regular racist interactions during their streams recommends that they recognize the kinds of interactions they will encounter and either handle them themselves or quit streaming.
Users discussing gender-based harassment and hostility on these platforms also encounter blame for their experiences, though the blame is attributed to different sources. Harassment is still framed as an expected part of streaming, something unavoidable and largely unquestioned in the broader community. One woman, recalling her consistent harassment after a video from her stream was shared online, mentions: It’s sad that this behavior from men is so common that women become desensitized to it. People often dismiss this behavior by saying, “That’s just how it is.” One woman told me: “Don’t participate in a group known for being toxic to women and be sad over it.” Women are expected to be complacent with the lack of respect without trying to change it.... How long do women have to deal with this in gaming and online? (user, Reddit)
Women streamers express their discomfort and disappointment with harassment on Twitch and note community resistance to support, which is not always necessarily linked to the gender of the people responding. This, in conjunction with the frequency of such reports, also speaks to the broader cultural concerns and considerations that bolster these dismissive responses.
Women seeking similar kinds of advice on handling toxicity also tend to be assessed in terms of what they may have done to encourage harassment. This may relate to the use of labels like ‘titty streamer’, a derisive term suggesting that women on the platform gain views based on physical appearance (Ruberg et al., 2019). It may also link to gendered dynamics of victim-blaming for online harassment (Jane, 2017). While hostilities are generally viewed with some level of acceptance for all streamers seeking advice, women are more likely to be questioned about their intent when encountering hostility. In one such post, several users push back against a complaint about women being inherently seen as sexual on the platform. One commenter suggests, ‘…women wearing the tiniest possible bikinis…. This is obvious and intentional sexual behavior, so should it be on a platform that is against sexual content?’ This is also true in instances where users post about well-known harassment experiences tied to gender. As one commenter points out in response to an article covering these experiences, ‘The bigger issue is why is this promoted by Twitch? This is a gaming website and someone is just making things up about how video games are sexist’. While some users still offer support, there is more of an emphasis on turning the discussion toward explaining what women are doing incorrectly on the platform.
These experiences reveal a set of expectations that has been established for users of Twitch, whether they are streamers or audience members. In these cases, however, it is not as easy to avoid hostilities as it is in a multiplayer game, where one can mute a mic or hide one’s identity (Cote, 2017). In streaming, multiple factors put identity front and center for most of those using the platform, typically through being visible on camera and/or heard over a microphone. While there is resistance among streamers to these patterns of hostility on Twitch (Gray, 2020), the cultures at play may make it more difficult to see or feel a sense of progress or recognition. Many of the elements of online and gaming cultures discussed above, in conjunction with shared stories about these experiences, produce conceptualizations of which encounters are likely to occur, and these tend to place hostility at the forefront.
Hostilities embedded as culture
While streamers frequently seek community advice and support – and despite systems in place through the TOS and the use of moderators – channels can succumb to cultures of hostility among fans. For the observed streamers, viewers often integrated jokes about identity or ability into chat. Sometimes, though not always, this included the use of slurs aimed at the streamer based on sexuality or disability. For the streamers studied, this became part of the everyday discourse in their chats, even where it would typically violate Twitch’s TOS. As one example, the majority of viewers of an openly gay video game streamer often made hostile homophobic jokes, which became a routine part of chat. The streamer played along with the audience, channel moderators did not ban users or give warnings, and the stream would continue as usual despite the comments. In another instance, a streamer’s known disability became a common theme and target for aggressive jokes. Users would create text-based imitations of the streamer’s disability, to which the streamer would respond by laughing and playing along.
In each of these cases, the streamer would pause. While they never verbally communicated that they disliked or disapproved of the jokes, they would tense slightly and their demeanor would momentarily change. The streamer whose audience joked about his sexuality was also openly reliant on donations from the audience to regularly make rent. The pressures streamers may experience to grow and maintain an audience – particularly when finances are considered – can contribute to the establishment of cultures of permissiveness in cases where users are largely responsible for enforcing the TOS. This adds another element of emotional labor (Ruberg, 2021; Woodcock and Johnson, 2019) to the streaming experience for marginalized streamers and creates circumstances where hostilities may be reinforced as part of Twitch and streaming culture.
Part of this cultural embedding is also related to broader concerns that companies overstep in attempting to protect users. There is a divide among streamers and their fans discussing streaming and appropriate approaches to reducing hostilities. While many are uncomfortable with the toxic landscape that has developed, others are concerned about what intervention could mean for fans and viewers who engage in harassing behaviors, sometimes dismissing hostility as part of the culture and as relatively harmless joking. Humor, in addition to being a large part of Twitch culture (Jackson, 2020; Johnson, 2022), has also been used to explain why racism is a common part of online discourse (Hokka, 2021).
Streamers and viewers alike often give users who engage in these activities the benefit of the doubt. Still, discussions about humor as a vehicle for racist comments are notable among users. For instance, one user mentions: Most of the time it’s just people making tasteless jokes. It can get overwhelming, but Twitch chat just wants to get reactions out of people. A lot of Black streamers stopped reacting to the racist jokes and it gradually stopped. I’m not saying that makes it okay, but most of the time, it isn’t actual racists. They’re just trying to upset people and faking racism is an easy way to do that. (user, Reddit)
This dismissive and permissive point of view also provides some backing for hostile interactions and the perception of users as responsible for handling these encounters. Taken together, this provides pressure and justifications for cultures to be established among general users and fans that can foster these interactions. It becomes more difficult to make distinctions between what a user might deem ‘real’ or ‘just a joke’. The distress felt by streamers, however, does not diminish based on possible or assumed intentions of users.
These jokes occur in contexts with more pointed ideas about what Twitch culture is or is meant to be. As one streaming fan posits: Streaming information on Feminism shouldn’t be on Twitch. Politics, religious preaching, or anything else that’s not for games or creative, honestly. Twitch was built up as a streaming platform for gamers, I don’t think that it needs to change any more. Adding Creative and Music fits amazingly well. Adding Feminism or other ideology doesn’t belong here. (user, Reddit)
Together, these perceptions work with the broader landscape of gatekeeping that targets marginalized players in gaming spaces and positions them as others (Gray, 2012, 2017; Kuznekoff and Rose, 2013; Harmer and Lumsden, 2019), causing users to anticipate and expect hostility and toxicity online and while streaming. Given streaming’s position at the intersection of internet and gaming cultures, this is, perhaps, not surprising. Discrimination and harassment do not go unnoticed in these spaces. Streamers who receive these hostilities or targeted ‘jokes’ from fans take to forums to post about their experiences and ask for advice, and forum users frequently discuss and assess these events. Ultimately, there is an increasing sense that this is simply what one must expect, whether it is considered right or pleasant or not.
Perceptions of users’ roles
These discussions further delve into the position of streamers in these circumstances. The vast majority of advice provided emphasizes the streamer’s status in the context of experiences with harassment. Fellow streamers and moderators recommend building better bots to automatically handle troublesome fans and harassers, ignoring trolls, setting up better teams of moderators, and creating lists of blocked words that may include a streamer’s personal information and slurs that may be used against them. The foundation of this perception lies in the seeming inevitability of these interactions. As one user suggests to a streamer afraid of encountering racist hostility while streaming: I agree with these comments. Being a streamer means exposing yourself to this. The same things happen for [other public figures]. People are mean and you can’t do much about that. I would say ignore them and just try to enjoy your life and if it bothers you that much, think about how much you really want this. (user, Reddit)
While the community is typically supportive of those seeking help, there is also an underlying dismissiveness in many cases and a sense that complaints – even in cases of racist or sexist harassment – are in some ways misguided because online streaming is inherently hostile. This works to bolster the gatekeeping that is heavily applied against marginalized users, not only putting pressure on them to be responsible for their own safety, but placing blame on them if they fail to ignore discrimination and harassment. Further, there is a helplessness to these discussions. For streamers who are continually harassed across platforms, there are few remedies for the hostility that they experience.
This logic is common among Twitch users. It is your responsibility to recognize and accept the risks that come with streaming online, particularly if you are from a background that routinely encounters hostility. Additionally, this connects to other conclusions about participating as a streamer. If you are using Twitch, you are responsible for keeping yourself safe and dealing with harassment on your own. This is a pervasive idea across discussions, for users of Twitch as well as for video game players broadly. Another streamer discussing these issues likens the approach to playing video games and suggests handling harassers in a similar fashion. They suggest: For example, if you were playing with four random people using mics and one started to harass you because of your race or gender or anything else, you would mute them. You report them and you continue playing with the other three people. The other players should also report even if they weren’t harassed. (user, Reddit)
Most often, the discussion centers on users’ personal responsibilities and roles in protecting themselves from harassment and hostility, even in cases where it appears the affordances of a platform are likely contributing to these outcomes.
Personal responsibility also aligns with common perceptions of companies as largely not invested or as incompetent when it comes to protecting users and enforcing TOS. In a few discussions, users acknowledge that the TOS is being violated directly, but these pieces of advice continue to follow the trend of protecting oneself because official channels fall short. In one such exchange, a streamer came to Reddit for advice on how to handle an extended harassment campaign. The streamer and top commenter had the following exchange: I’m a small streamer and I ended up being at odds with another female streamer with a bigger following. She has been having her followers harass me for months, having trolls accounts follow me and spam friend requests. She’s been in trouble twice, but hasn’t stopped. I’ve made clips of everything and sent it to Twitch, but they haven’t done anything. I instantly block the accounts, but does anyone have any other ideas how I can stop this toxic behavior? (user post, Reddit) Hopefully Twitch will see this. I can’t believe someone would do this. It’s a violation of TOS. Get some trustworthy mods to make sure to kick these people out. You could also make chat limited to followers. I hope you’re alright. Don’t let their behavior get to you. (response comment, Reddit)
Other users echoed the sentiments in the initial response, checking to see if the streamer had a list of banned words, had their ban option at the ready, and were prepared to block all hostile users, with some noting that they would otherwise just need to keep reporting as, ‘it’s really all you can do’ (user, Reddit).
The growing awareness of these interactions and the sense that, realistically, they have become an expected part of the culture meshes with concerns about what can be done to alleviate or address the problems. This sentiment is reflected well by one streaming fan who expresses their exasperation with these issues while also highlighting the use of humor in these situations: Are you interested in the TRUE answer? I won’t sugarcoat this. Twitch is going to do the bare minimum (anything but the N word is allowed). Streamers have to decide what is okay or not. Most streamers are not racist, but money is going to be more important than anything else, so they coddle racists and allow offensive “humor” as long as they won’t get banned by Twitch. It’s a positive for profits. That means racists can run rampant, they can tell their friends about this channel that won’t punish them. Then there are younger users who are NOT racist, but impressionable and want to fit in with the same jokes. So they use the jokes and see the streamer is okay with it and then figure racism is okay. I do love Twitch, but this is something they need to work on. (user, Reddit)
There is a lack of trust in the company to be able to address and handle harassment on their platform. Additionally, viewers may recognize the role that pressures for profit play, the proliferation of humor as a vehicle for harassment, and the ways that these elements of streaming can foster a culture which embeds hostility. With these realizations, there is also a conclusion that users of Twitch are largely responsible for themselves and how their spaces develop and grow.
Twitch and gaming platforms are framed as unreliable, online interaction is viewed as inherently toxic, and users are left to the whims of harassers. There is, however, frustration among users. Especially on Twitter, users often call for better support and more action on the part of Twitch to protect users, rather than relying on individuals to counter and address the discrimination and broad harassment that they often face.
Online organizing and the pitfalls of DIY safety
Although much of this discussion has centered on disruptive and hostile viewers, fans have also increasingly used online spaces to highlight the toll that harassment can take on streamers. In mid-2020, streaming fans on Reddit and Twitter began discussing a spike in suicides among streamers. Although much of the advice that streamers receive emphasizes their personal responsibility and appears to treat this as a cultural norm, these conversations quickly turned toward the role of fans as well as the pressures and toxicity many streamers face.
Fans on Twitter and Reddit mention that several of these streamers were open about their struggles with mental health, discussing experiences with depression only to be met with messages encouraging them to “kill [themselves].” These moments provided a space online for fans to discuss the streamers that they enjoyed watching, the harassment they witnessed, and the impact that these streamers had on them. It also presented an opportunity for fans and streamers alike to discuss mental health and provide resources and support for one another.
This also prompted discussions about what could be done on a larger scale, including banning prominent users who frequently encourage negative interactions with streamers and highlighting negative fan interactions and their potential roles in these deaths. Some of these conversations emphasized community responsibility in different ways, including discussions of online communities dedicated to anti-fandom and harassment of specific streamers. As a user noted in the wake of one streamer’s suicide: It’s really sad that so little has changed since [streamer’s] passing last year. There was a lot of discussion about how it couldn’t be for nothing and how the community needs to stop being so toxic or some shit. How many more people need to be fucked by their mental health for us to take this shit seriously? This [site] is so dedicated to shitting on people whether they deserve it or not. Even this morning people were mocking [recent death] for his mania induced behavior. It's so disappointing that we have to lose figures in the community before we can see any change. (user, Reddit)
In some instances, users began conversations about the viewer’s direct role in potentially providing solutions to the problems that plague the community and wear on streamers who have been largely left to their own devices to cope. In some cases, this has gone further than individual posts calling attention to these behaviors. Social media is also used to organize among streamers as well as viewers in ways that aim to challenge and reduce the impact of toxic viewers.
The #TwitchBlackout campaigns have mobilized online to call attention to a variety of issues on Twitch, including hate raids, harassment of smaller streamers, racism on the platform, harassment due to gender and sexuality, and instances of sexual harassment or assault. Hate raids often overlap with other issues and resemble a mixture of anti-fandom (see: Gray, 2005) and trolling. These blackout events aim to raise awareness and place pressure on Twitch, which also provides official statements that reinforce personal responsibility in terms of addressing attacks that target streamers (Twitch, n.d.).
These events are also used and engaged with differently by streamers and their fans, with some sharing personal stories about why organizing in this way, by refusing to use Twitch for the specified day, is important, and others promoting general support for streamers who have been harassed by viewers. Others used the opportunity to outline specific demands and ways that Twitch could address users’ concerns and make a better space for everyone. As one Twitter user mentions, “Streamers need to make lists of banned words manually instead of Twitch banning those words site-wide. Racist and inappropriate usernames should not be allowed.” The gaps left by the emphasis on personal responsibility are felt widely and contribute to the shared sense of being on one’s own and not having access to enough official support.
Although many people have supported blackout events as a means of combatting these issues, others question this approach’s potential for success, given the compounding of personal responsibility and the perceived limited scope of blackout organizing. One example of these sentiments is reflected by a Twitter user who mentions: I see a lot of people planning a #TwitchBlackout this week. Great, but what about after that? If you participate, I support you. But beyond that day, what does this do? Let’s be real, some streamers taking the day off isn’t hurting the bottom line.
Users largely recognize the issues faced by streamers from viewers and acknowledge the limitations placed on them by the DIY approach that they must take to ensure their own safety. Although social media can be used to organize against a culture that makes harassment permissible and, by many accounts, seemingly inevitable, many users find little comfort in these events and the push to organize is frequently short-lived.
Conclusions
This project further investigates experiences among marginalized users on Twitch, highlighting the ways that they use external sites to build communities and express dissatisfaction with current approaches to addressing harassment and toxicity. While options exist for streamers to try to address these problems on their own – and the community has become increasingly savvy with these options – they place the burden on users and frequently produce feelings of helplessness and hopelessness when trying to reduce harassment. Further, despite attempts to organize beyond sharing advice in settings outside of Twitch, blackout events meant to raise awareness and influence change appear to gain little traction across the platform itself and decline rapidly.
The perceived and experienced failures of the platform discussed on Reddit and Twitter reinforce tendencies to victim-blame and, although this is often noted as rooted in misogyny (Jane, 2017), this extends to other marginalized users due to the emphasis on personal responsibilization that has also taken root in online spaces (Sugiura and Smith, 2020). These online norms extend beyond Twitch, with advice provided to streamers on other sites amplifying feelings of helplessness among the users studied. This highlights the importance of using intersectional approaches to understand these issues (Gray, 2020) and suggests that more spaces (e.g. social media and other areas of online interaction) should be taken into consideration as a larger part of the streaming and Twitch community experience.
The hostile interactions experienced by streamers and discussed online add to the emotional labor of performance (Ruberg, 2021; Woodcock and Johnson, 2019), which is not alleviated by the necessity of witnessing harassment before it can be addressed (Thach et al., 2022). Paired with the user-centric approaches to solving these problems as explored in previous research (Sugiura and Smith, 2020) and further illuminated in this study, this results in users trying to find solutions within the community but outside of Twitch. Streamers use these spaces to commiserate and seek advice, but these communications often occur in ways that ultimately reinforce many of the feelings of helplessness that they experience due to a lack of standardized response, the persistent cultures on Twitch that discourage dealing with issues seriously (Jackson, 2020; Johnson, 2022), and the emphasis on personal accountability.
Broader Twitch norms interact with channel-specific cultures (Mihailova, 2022), leaving streamers in a position where they must sometimes choose between monetary support (Johnson and Woodcock, 2019; Wohn et al., 2018) or protecting themselves. This may also link to larger issues found on the platform beyond algorithmic discoverability that emphasizes racialized and gendered results (Chan and Gray, 2020), including the proliferation of far-right radicalizing information (O'Connor, 2021) that has also been observed in gaming more generally (Condis, 2019).
The foundational findings of this study highlight the need for more work in this area, but also suggest that stream-specific cultures and norms (Mihailova, 2022) extend beyond Twitch itself and can become further reinforced in spaces used by streamers to seek assistance. Exasperated users take to Reddit and Twitter to seek advice, find solutions, and try to organize, but are often met with similar stories, suggestions for individual action, and a growing sense that change or improvement are impossible to attain. The ubiquity of hostility is felt deeply by users – especially those in groups that face higher rates of social discrimination – and yet is also perceived as inevitable and largely immutable. The current approach of placing the burden on streamers, who are often in a position of dependency on unchecked audiences, perpetuates and maintains these hostile cultures. As many frustrated community voices have increasingly made clear, policies and reactions from Twitch have not been enough.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
