Abstract
Social media sites such as Facebook depend on tens of millions of volunteer moderators across the globe to facilitate platform-based discussion forums. While research has revealed much about the work that these moderators do, some fundamental questions remain. For example, why do volunteer moderators commonly work as teams rather than individuals? In this article, I use data gathered through digital ethnography with Facebook Group moderators to explore the benefits and challenges of moderation team work. I develop a three-part framework to articulate how teams facilitate logistical, discursive, and emotional labor. Finally, I argue that this empirical analysis reveals otherwise hidden and unacknowledged dimensions of volunteer moderation work that make platform-hosted discussion groups possible.
Introduction
It only takes a few clicks to create a Facebook Group; beyond designating a group name and privacy level, no further configuration is necessary. So even though Toby (a pseudonym) had created his tag group as a joke for himself and his friends, it was open for other people to join and start sharing content. Before long, Toby found himself in charge of a Facebook Group with several thousand active members. He panicked. “I had no experience with—no, like, understanding [of] how to moderate a community, no experience knowing . . . how to even, like, mitigate conflict,” he told me.
How did Toby learn how to handle these complex situations? The Facebook Groups interface lacks any explicit guidance or mandatory training for admins. There is no tour of the tools, and there are no default rules. So, he asked for help from his friends and active group members, appointing them moderators of the group. As they ran into tricky situations, he and his fellow moderators “would discuss it with each other and try to figure out what the best course of action would be.” Essentially, they formed a team. When Toby encountered issues that he could not handle on his own, he turned to his team to discuss and decide how to respond.
Research on volunteer moderation practices spans decades, and the moderation team has been firmly established as a construct in the literature (e.g., Butler et al., 2002; Lo, 2018; Malinen, 2021; Seering et al., 2019). However, the moderation team has not yet been systematically explored. This research addresses that gap by interrogating the role teams play in moderation. It builds on approaches from sociology and management that highlight the vital role teamwork and structure play in shaping work and knowledge production (Eisenstein & Jacob, 1977; Ulmer, 1997; Vertesi, 2020) and applies these theories within the context of platform-based content moderation.
In this article, I use digital ethnographic methods to give an empirical account of moderation teams and develop a three-part framework of their logistical, discursive, and emotional functions for moderators. Furthermore, I demonstrate how each of these functions represents labor otherwise unaccounted for in descriptions of this type of free labor for digital platforms. I conclude by proposing that teamwork can be considered a trade-off for moderators, involving both costs and benefits.
Context and Literature
Facebook has been widely studied as the representative example of online social media platform structure and logics (e.g., Caplan & boyd, 2018; Gray & Stein, 2021; Schwarz, 2019; Zuboff, 2019). It is the largest social network worldwide with approximately 2.9 billion users at the time of writing (Dixon, 2022).
Content moderation on Facebook is a dense web of interconnected systems including both human and algorithmic actors (Gillespie, 2018). Meta employees craft content policies and put in place processes to review and revise these policies as new issues arise (Caplan, 2018). Because of its scale as a global platform, Facebook must also rely on the deployment of large-scale algorithmic surveillance to facilitate the review of all potentially problematic content (Caplan, 2018; Gorwa et al., 2020). When the algorithms cannot make determinations about individual pieces of content, these systems rely on commercial content moderators (CCMs) to make final decisions in conjunction with internal company standards (Chen, 2014; Newton, 2019; Roberts, 2019). CCMs often face harsh workplace conditions due to their status as low-paid contract workers frequently dealing with psychologically harmful content (Perrigo, 2022).
It is important to note that volunteer content moderators, the focus of this article, hold a distinct role from CCMs. While CCMs are employed by Meta or a third-party contractor to make content determinations in line with strict internal guidelines, volunteer moderators are not directly compensated for their work and therefore have the freedom to choose what spaces they moderate and make decisions about how they will appropriately govern these spaces (Malinen, 2021).
Facebook Groups enable users to connect over shared interests rather than through existing social ties. Users in Groups can share content like links, text, or pictures in the Group space, which will then also show up in fellow members’ News Feeds. Users who create these groups hold the title “admin” and, while the specific affordances of Facebook Groups are continually updated, admins can generally make decisions about their group’s structure and content, such as whether the group is visible to non-members and whether specific users are allowed in the group. Admins can assign other members of the group as additional admins or as moderators, who have slightly fewer privileges than admins. Here, I will identify the collection of admins and moderators as the group’s moderation team. Groups must have an admin (“What Happens when a Facebook Group Doesn’t Have any Admins?” n.d.), but beyond that, there is no requirement that a group have any specific number of admins or moderators relative to its size. Admins’ powers are limited, as content in these groups is still subject to platform moderation.
Group moderators serve a key structural role for Facebook. After the political scandals of 2016, Mark Zuckerberg prioritized Facebook Groups on the platform, publicly explaining that his company had a duty to help democracy by strengthening social infrastructure through interpersonal connection (Wagner & Swisher, 2017). The effort to reorganize Facebook around Groups appears successful: company announcements report that there are “tens of millions of active communities on Facebook,” that “[m]ore than 1.8 billion people use Groups every month,” and that 70 million people are admins or moderators (“We’re Launching New Engagement Features, Ways to Discover Groups and More Tools for Admins,” 2020). The relationship between Facebook Groups and civic health remains an open question (Silverman et al., 2022), but many people who have otherwise soured on the Facebook brand report that Groups are the feature that keeps them tethered to the platform (Petersen, 2022).
Thus, Facebook’s financial model is in part dependent on the continued free labor of these moderators (Li et al., 2022), even if such labor is knowingly given or even enjoyable to moderators (e.g., Seering et al., 2022). As Terranova (2000) articulated over 20 years ago, digital cultural labor is both “pleasurably embraced and at the same time often shamelessly exploited” (p. 37) as the appropriation and structuring of free labor online is “part of larger mechanisms of capitalist extraction of value which are fundamental to late capitalism as a whole” (p. 51). The nearly ubiquitous exploitative structural inequalities of digital platforms persist today in part because the pervasiveness of such systems has served to normalize and obfuscate the inherent economic relations between users and social media companies (Fourcade & Kluttz, 2020).
The tension over free labor in volunteer moderation has been highlighted in academic literature through case studies of lawsuits that volunteers brought against AOL in the 1990s (Postigo, 2003, 2009) and the moderator-coordinated Reddit blackout in 2015 (Matias, 2016). More recently, using a moderation-log-based method, Li et al. (2022) calculated that the monetary value of the volunteer labor performed by Reddit’s roughly twenty thousand active moderators was worth, at minimum, USD $3.4 million to Reddit in 2020 alone. If even one percent of Facebook’s volunteer moderators contributed a similar amount of labor in basic tasks such as deleting user comments, that labor would be worth over USD $110 million to Meta’s bottom line.
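This extrapolation can be reconstructed as a back-of-the-envelope calculation; the arithmetic below assumes Li et al.’s (2022) per-moderator valuation transfers linearly to the 70 million admins and moderators Facebook reports:

$$ \frac{\$3{,}400{,}000}{20{,}000\ \text{moderators}} = \$170\ \text{per moderator per year} $$

$$ 0.01 \times 70{,}000{,}000\ \text{moderators} \times \$170 \approx \$119\ \text{million} $$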
Teams
The field of organizational studies has been considering the effectiveness of teams for several decades because of the key structural role they play in firms. The influential team-effectiveness scholar J. Richard Hackman (1998) claims that the best way to get a great team “has much more to do with how teams are structured and supported than with any inherent virtues or liabilities of teams as performing units” (p. 248). Such enabling conditions include teams having real interdependent work, compelling direction, thoughtful team design, a supportive organizational context for their work, and available expert coaching (Wageman et al., 2005).
Vast changes in firm structure and organization in the last several decades have led to a corresponding transformation in teams in firms (Hackman, 2012; Oldham & Hackman, 2010). Indeed, many scholars question whether “team” is even a useful construct anymore when studying how groups carry out work. In many firms today, just like the volunteer moderators examined here, employees decide for themselves what their jobs will look like (Oldham & Hackman, 2010). Subsequently, group members have a say in how interdependent their work with the rest of the group will be, and such collaboration may ebb and flow over the time that the group exists (Wageman et al., 2012).
Volunteer moderation teams serve as a prime example of these borderline teams. While moderator roles have defined technical capabilities, there are no explicit expectations about how moderators do their work. For example, teamwork may involve any degree of interdependence, depending on individual members’ preferences and time commitments. Some teams may collaborate on decisions every day, whereas other teams may hardly ever communicate, choosing to pursue their own work with little to no coordination. Moreover, Facebook Group moderation team members are not initially given any substantive guidance on how to do their work beyond the technical capabilities of their roles, and therefore teams are not provided with the enabling conditions needed for successful teams as described above.
Method
I conducted a total of 41 semi-structured interviews with English-speaking volunteer moderators on Facebook between 2020 and 2022. I solicited interviews by sending an IRB-approved text through Facebook Messenger to Facebook Group admins from my personal Facebook account, contacting admins either through referrals or by randomly sampling from Facebook’s “Discover” tab in the Facebook Groups tool. Interviews generally lasted approximately one hour, though they ranged from 30 minutes to two hours. All interviews were conducted over Zoom or the phone. Approximately three-quarters of the participants were compensated US$25 for their participation, depending on whether they were interviewed before or after a grant to support the research was awarded. Participants were mostly American, but the sample also included Australian, European, Canadian, and Mexican participants.
Many participants moderated more than one Facebook Group; altogether, the sizes of the groups moderated by participants ranged from approximately 250 people to almost 2 million people, and teams ranged from a single admin to more than a dozen admins and moderators. Most participants moderated fan groups devoted to content like 19th-century authors, podcasts, television shows, and role-playing games. The demographic characteristics of participants are described in Table 1.
Table 1. Demographic Characteristics of Participants.
Source. Interview data with volunteer Facebook Group admins and moderators.
Note. Data comes from interviews with 41 participants.
Interview participants often invited me to join and observe their groups. In these groups, I took screenshots of discussions or events that participants described during our interviews. I also captured screenshots of contentious discussions that I saw taking place, and recorded field notes about the contexts of these discussions.
After each interview I captured my initial thoughts in a memo, and interviews were then transcribed using Otter.ai and Rev. I coded these interviews in NVivo 12 using a grounded theory approach (Charmaz, 2006; Glaser & Strauss, 2009), beginning with an open-ended set of codes describing accounts at a sentence-by-sentence scale. Codes were subsequently analyzed and grouped into larger themes to find emergent patterns and theories. I then adjusted the questions in my protocol for the next round of interviews and coded these with special attention toward testing my theories (Charmaz, 2006). For the findings described in this article, after my pilot round of six interviews I added a question to my interview protocol about the role of teams, using the counterfactual format (Jiménez & Orozco, 2021). In the three subsequent rounds of coding, I identified specific practices and beliefs associated with teams and iteratively developed the three categories of work described below.
Results
Overall, my results suggest that moderation teamwork can be divided into three categories: distributing tasks, pooling knowledge resources, and commiseration. Within these categories, moderators respectively perform process labor, discursive labor, and emotional labor. The boundaries between these categories are not always clear-cut and may overlap, and the specific benefits were not uniformly experienced across teams or participants. See Table 2 for a summary of these results.
Table 2. Summary of Results.
Participants in this study presumably over-represent the perspectives of moderators having positive experiences in their work, as their work was ongoing and they were willing to speak to a researcher about it. However, many moderators also recounted past experiences of teams that were unhappy or ultimately failed. They named many situations, such as imbalanced power structures or toxic cultures, in which the benefits of teamwork were outweighed by the amount of labor required to sustain the team. In these situations, they experienced immense personal stress, and some felt they may even have been chased out by other team members. Several moderators expressed at the end of our interview that they found our discussion to be deeply helpful in processing the lingering emotions they had from participating in prior groups that had failed despite their strenuous efforts. At least two people started crying during our interviews when recounting their past experiences.
Below, I describe each of the three categories of moderation work as experienced by study participants.
Distributed Tasks
Increased Workload Capacity
The most straightforward reason that there are multiple moderators for the same group is that there is often more work than one person can do alone: enlarging the team increases capacity. The people I spoke with had many commitments in their day-to-day lives, such as child care or full-time jobs, and therefore reported that the time and energy they could commit to moderation work was limited. As such, dividing tasks decreases stress and makes managing large groups feasible.
One type of day-to-day labor for moderators is asynchronous group maintenance: curating posts, monitoring membership, and responding to content that group members have reported. Carlos, a moderator for a shitposting group, told me that if he were the only moderator in his group, he would have much less time to consider whether to approve or deny each individual post in the queue. To clear the queue, he explained, he would probably save time by just hitting Facebook’s option to “Approve All” posts much more regularly. Participants also reported that if a group became popular quickly, they would need to add moderators to handle the added work in a timely fashion. Several participants’ groups were mentioned in a media outlet or otherwise went “viral” in a way that caused a tidal wave of new membership applications. Just as a toy store brings on seasonal workers to deal with the holiday rush, these groups recruited new moderators to deal with the sheer volume of maintenance work that the influx of attention created.
A second kind of day-to-day labor involves synchronous participation: monitoring ongoing discussions and communicating policies and discipline to group members in real time. Teams have more bandwidth to keep tabs on ongoing discussions than any one individual. For example, Regan, an admin for a very large fan group, described how she alerted fellow moderators even when she didn’t have the time to deal with a situation herself: “If I . . . don’t have time to really investigate, then I’ll maybe tag some of the admins from [my group] or I will go into our chat [and] say, ‘Hey, you guys might want to check this out. . .’” Such coverage is also helpful when moderators are too tired to deal with fights in a productive way. Jennifer, who was on the same team, told me that it’s “better when you have a large admin team, so you . . . can, like, take a break on refereeing some of the fights.” Azalea, a moderator for a large group dedicated to a different fandom, also expressed gratitude for distributing work across the team: There are some days where in our mod group chats, people will be like, ‘Hey, I’m having a rough day mentally today. Is somebody else able to handle this?’ We’ll have like three people be like, ‘Yeah, I got it. You take a break.’ Not being able to pass it off to people sometimes would definitely be exhausting.
Many moderators highlighted that diverse time zone representation within a team significantly expands its monitoring capacity over a given day, supporting the finding from Seering et al. (2019). Simon moderates an international fan group of fifty thousand members, so having someone in a different time zone is essential to keeping discussions running smoothly. He described how having someone on the other side of the world increases the team’s capacity to deal with ongoing issues: “Like in one group, we have a moderator who lives in Japan, so they can moderate the overnight while the rest of us sleep.” In contrast, Anastasia is the only admin of her group of about three thousand people. She lives in Europe while most of her community is in the United States. The difference in time zones means that she sometimes feels overwhelmed encountering the results of a flame war long after it began: “[T]hings can go off in the middle of the night and I don’t know anything about it until the morning . . . I have to read through hundreds of comments trying to figure out who’s in the wrong, who’s in the right.”
Specialization
Dividing labor also means that team members can specialize by “claiming” tasks that take them relatively less time or effort. This may happen explicitly, through discussion, or without explicit coordination, with moderators simply choosing to focus on specific tasks and ignore others. Explicit task specialization seemed to happen in groups where teams were able to communicate with each other about their preferences and relative skill sets. For example, writing up policies for the group may take some moderators much more time and effort than others. Christiana, who moderates a career-focused group, finds that she has a really hard time writing up policies, so when a policy needs to go up, she asks her sole co-admin, who she says is “better at writing . . . She gets it out and it’s perfect.”
Other members may be uncomfortable with administering discipline and rely on fellow members to do the “dirty work” for them. In her group with millions of members, Isabella works closely with one other moderator who is much more comfortable with disciplining members. “I’m the people person, he’s the logistics person, good cop, bad cop, if I do say so,” she told me. This “good cop/bad cop” dynamic was echoed by several other moderators, who appreciated offloading disciplinary work onto teammates they knew would be more willing to take action against members.
Moderators may also specialize in tasks because they find it fits better with their time or energy constraints. Hope and Max both work on the same large moderation team for their meme group of 600,000 people, but they specialize in very different tasks. Hope likes to attack the long queue of new member requests: “I can just sit there eating dinner and just whack out a couple hundred of new member requests. It’s great. Because that’s how I contribute and I enjoy it a lot.” In contrast, Hope’s team member Max finds such maintenance work overwhelming and likes to focus exclusively on real-time tasks like monitoring sensitive discussion in posts: “I tend to [step into conversations] because I just find it less overwhelming than say going through literally thousands of people trying to come into the group and looking over their answers.”
In summary, moderators report that adding moderators to a team increases the collective capacity for both maintenance and monitoring work, allowing individual moderators to focus on the work they find enjoyable and lessening the guilt that any one moderator might feel for not being able to attend to all parts of the work at once.
Pooling Knowledge Resources
The second key feature of moderation teams is that they give any one moderator access to more information and experience than they would have on their own. Moderators for very heterogeneous communities may need to assess comments and posts in a wide variety of contexts, especially as they become aware of their own gaps in cultural knowledge. Participants described situations in which they turned to fellow moderators to seek out knowledge about specific identities or cultures, such as being gay, trans, Black, or from a particular country. Teams allow moderators to exchange relevant information about those identities. Similarly, teams allow the dissemination of moderation strategies between or among groups. Finally, moderation teams create space for moderators to learn how a situation may look from different perspectives.
Access to Diversity of Knowledge and Experience
Moderators reported that having a diverse team was very important for dealing with the issues that may come up in a large, diverse community. Specifically, they felt diversity mattered because it allowed them to defer to someone with more knowledge: if an identity issue came up, the person with that identity could provide the rest of the team with information about how to handle the situation and provide justification for decisions. Many moderators described deferring decision-making to the team member seen as having particular expertise, such as geographical or cultural knowledge. For example, Toby, a white moderator based in the United States, described how the diversity of moderator location is important to expand the team’s capacity to engage with politically sensitive international issues: [T]here’s very frequently terrorist attacks happening all over the planet. And we will have to defer to the moderators who, if there is a moderator who is close or lived in that community, what do you feel needs to be done here? What precedent should we be setting? Where should we be drawing the line? And they will help us come to a decision about those certain topics.
Similarly, Connie, a Black moderator, was recruited to the moderation team of her large fan community to better represent Black perspectives shortly after the widespread American racial justice uprisings in June 2020. She told me that white co-moderators will sometimes screenshot posts and send them to her and the other Black moderator to get input about whether certain comments about race are acceptable. I asked her how she felt when these white moderators asked her about racial comments in the community, and she said she was actually very glad: I really like it because I think it just kind of, I think communicating in that way, in terms of, ‘Hey, I just wanted to check in and make sure this is okay,’ is really good. I really want that to be a thing that is just, that just happens. . .
In other contexts, a white person asking a Black woman about the appropriateness of a post may be seen as entitled or unfairly demanding of her time (“White People, Stop Asking Us to Educate You about Racism,” 2017), but in the context of a moderation team, highlighting particular moderators’ race and cultural knowledge may not only be acceptable, but encouraged.
Many described the moderator discussion group as a “safe space” where moderators could learn about different cultures and ideas, asking questions they would feel uncomfortable expressing publicly in the group. Eleanor, a Jewish woman moderator, articulated that her moderation team channel is a space where she can ask questions, a place “where I know that I’m not overstepping by asking for that.”
Of course, inclusion may only be symbolic. Moderators reported that identity issues were sometimes fraught when the team didn’t trust a particular member, or when that member felt they were not being taken seriously. Two Black women admins on different teams reported that they felt they were only added to moderation teams as tokens, and that once on the team, they were not actually listened to. One of these admins, Shady Boy, told me that she sometimes feels like she is just the “token Black woman” on a team and that her concerns are not taken seriously. Larry, a white man, is an admin of a sports team fan group whose membership is about three-quarters men and which has a harassment issue with “creepy dudes” sending women unsolicited private messages. He told me that the team had only recently added a woman moderator for the first time, a move they felt was important to communicate respect and solidarity to the women of the group. He did not feel that this woman substantively changed the team’s response, but that the symbolic value of her presence was in itself worthwhile.
Access to Second Opinions and Outside Perspectives
Jennifer, a Black woman, was very aware and open about her own personal biases against men and white women in making moderation decisions because of her own experiences with them: Well, I already don’t like men, I mean, so they’re already starting with at least two strikes. White women get every single [fandom] group shut down, so I’m just, phew!, I’m always on guard . . . So I just have to really rely on like, well, this is what I’m seeing, from my perspective, you guys [i.e., the moderation team] already know these things about me. So if you’re seeing something else, let’s put those things together and hopefully we get enough pieces of the pie.
This “pieces of the pie” metaphor captures a crucial belief about the value of team deliberation: that multiple perspectives on a specific issue will lead to greater overall knowledge about that issue, and therefore to better outcomes. Across the board, moderators were keenly aware that their knowledge of some topics was inherently limited and that, as individuals, they would therefore be limited in their ability to make the best decisions for a diverse group. Derrick, for example, described how his own “way of doing things has evolved because of this commitment to talking it out.” Moderators told me that doing moderation work helped them realize that they had blind spots, which made them more convinced of the value of deliberating with other group moderators.
This is not to say that moderators perceive deliberation as uniformly beneficial. Moderators may struggle with either the implicit or the explicit power structures of the moderation team. Along with a diversity of experiences, moderators also come to the team with varying degrees of conviction about their opinions. Moderators with stronger feelings, or simply those more willing to express their feelings, may tend to dominate discussions. And when there is an explicit hierarchy within the group, moderators with admin or founder privileges may sometimes simply take unilateral action. Jennifer, who also moderates a spiritual group, described how she felt strongly that the moderation team should not condone the use of sage by members, and most of the moderation team deferred to her cultural knowledge. The group’s admin, however, overruled the rest of the team, leaving Jennifer feeling helpless. Deliberation does not inherently ensure equity in decision-making.
Commiseration
Moderators feel a range of emotions during their work—anger, frustration, boredom—but often feel an obligation to come across as neutral, rational decision-makers to maintain face and communicate the legitimacy of their position to group members. Hochschild (1983) described such management of personal expression for instrumental purposes as emotion work. In the context of Facebook Groups, this emotion work extends beyond interactions with members: moderators often turn to their teammates to express their real emotions and, in return, must provide each other with empathetic displays of support. The third function of moderation teams emerges from moderators being able to commiserate with one another over their shared experiences of mostly invisible labor, feeling supported by others when making decisions, and feeling a shared sense of responsibility for group decisions.
Emotional Support
Moderation is a kind of work that many people encounter online only in negative disciplinary contexts and therefore have very strong feelings about. However, the moderators I talked to uniformly said that the work was much more complex and nuanced than they would have imagined from the outside. Having team members who understand the pressures of moderation therefore helps alleviate the anxieties that a moderator may have about making bad or wrong decisions. As one moderator described it, in many ways the moderation team serves as a “support group” within a group.
After complaining to the current admins about the chaotic state of a large, public fan group, Lance simply woke up one morning and found that his Facebook layout now included moderation controls for that group. He told me that before he was made a moderator, he did not really understand what moderators did or what tactics trolls would use to disrupt the group. He directly compared getting a moderator’s view of a community to seeing the backstage of a theater for the first time: It’s like when you work anywhere or do anything, like when you work in a play, and you get to be behind and see all the stage and all the things and how things are actually being done, you go to a play you never see it the same way again because you are thinking of all that other stuff. I guess that’s how I would look at that.
As a result, the moderation team is a unique space where moderators can affirm each other in their shared moderation experiences that other members of the group do not have access to.
Because of this unique, shared context, the moderation chat is often a space for moderators to share frustrations about their group members. Knife told me that it was often cathartic to share user-submitted posts in a space where other moderators would appreciate it: It’s like a support group of, ‘You won’t believe what was submitted today!’ Especially in the big group. I can’t tell you how many times people would just submit a random [object]-with-sexy-legs art. I’m like, “What does this have to do with [fandom]? What? In what economy? Where?” So you have people that can [relate to] that.
Andrea’s group for an extreme sport tends to attract a lot of extreme personalities. As a result, she told me that she and her fellow moderators use their shared chat to express frustrations with particularly difficult people by “making fun of people behind their back, yeah, or discussing why is [group member] such a weirdo.”
As explored in the previous section, moderators may ask their team members for advice because they are seeking knowledge about specific perspectives they can use in the group. However, I found that questions in the moderation chat sometimes served more to give moderators support and affirmation for the choices they were making. Such “checking in” may not actually change a moderator’s decision; rather, it gives them confidence that the rest of the moderation team supports them in their work. Without such affirmation, moderators can feel quite lonely or insecure. I asked moderators what their work would be like without the rest of the team, and many expressed that it can be quite hard to deal with the emotional impact of making decisions by themselves. The configuration of the platform helps produce this negative affect; in turn, this affect demands the creation of an additional layer of social connection among moderators.
Previously, I showed how group discussion gives moderators more access to information about a situation’s context. I also interpret these discussions as a means for moderators to express solidarity with each other. By asking for opinions, team members may not just be seeking information but also expressing that they feel others’ opinions are equally important and valid. Simon illustrated this process by describing how moderators will ask for others’ opinions simply “as a courtesy” and thereby demonstrate respect.
Moderators are often the targets of harassment by members, especially in response to moderation decisions that members are frustrated with. One of the benefits of having a space where moderators can privately discuss issues is that publicly they can appear united. When Lance was suddenly thrust into the role of a moderator, he felt that he was the only person actively fighting spam and trolls. However, within a few weeks, new moderators started to defend his choices: One of [the other moderators] put up a post the other day saying, ‘You can all stop messaging me because I’m not looking at any other posts that any other people have canceled no matter what you say. We are just going to keep on doing what we are doing here. . .’ . . . We are being a united front anyway, those three or four of us. So it’s really gotten a lot better.
Even if these moderators disagreed with specific decisions, they were willing to publicly defend them to the group at large, creating this “united front.”
This united front gives moderators a sense of shared identity, increasing trust among the team. Chuckie works as a school counselor where he helps students with emotional issues. He was recruited to a moderation team for a very large game group because of his skills in dealing with interpersonal and emotional issues. He told me that the moderation team chat was valuable because he felt like he could trust that other moderators would “have his back,” even if they disagreed: [S]ince we’re not getting paid, those sorts of emotional intangibles mean a lot more. If you are going to volunteer your time, but then feel like you are constantly making bad decisions you’re not going to stick with that volunteer position very long . . . And so knowing if you make a call, other people are going to go with it is an important thing.
Many of my participants echoed Chuckie’s sentiment that such group support is what allows them to keep doing the challenging, uncompensated work of volunteer moderation.
Creation of Shared Responsibility
Finally, teams allow moderators to feel a sense of collective responsibility for decisions. When I asked participants what their work would be like if they were suddenly the only moderator on their team, one emergent theme was that taking on all the work would be stressful not only logistically but morally. Anastasia told me that as the only moderator of her group, “I don’t have to answer to anybody else. It’s like being your own boss. But then if something goes wrong it relies on me to end up on my shoulders, so I have to deal with it by myself.” She uses the evocative language of small business here: “being your own boss.” There is some pleasure in having full control of her group, as she doesn’t have to “answer” to anyone else above her. However, there is no one to share moral responsibility with her when something goes wrong.
Toby ran into an issue when he started declining a group member’s posts about fascist terror in her country as off topic for the lighthearted shitposting group. She, in turn, accused him of supporting fascism, an accusation that shook him deeply: I had to talk to my moderators and be like . . . am I the fool for thinking I’m not like supporting these fascists or by accepting, by declining these posts, am I accepting this fascist terrorist group? And the rest of my moderators agreed, I was totally in the right.
The moral implication of supporting fascists was too great for Toby to bear alone: it was a genuine dilemma. Talking to his team allowed him to share the responsibility of the decision with other people, who in turn became implicated in handling a morally tricky situation.
Moderation teams also express this shared responsibility through the language of their announcements. In most groups, a single moderator will post an announcement but credit the statement to the entire moderation team. The phrase “On behalf of the mod team,” or similar language, begins many of these announcements. As we saw in the previous category, not all members of a moderation team may be equally on board with a decision, but the formulation of these announcements communicates a sense of shared responsibility among the team to the rest of the group. In this case, unity is expressed through the shared shouldering of team decisions, taking on the risk of displeasure from group members as well as the moral risk of taking a stand on an issue the team may be uncertain about.
Discussion
The Value of Teams
Platform companies make profits by inviting continued participation and attention from users (Couldry & Mejias, 2019; Zuboff, 2019), and thus, the popularity of Facebook Groups provides immense value to Facebook and Meta. I demonstrate here that for a full accounting of the value of online communities, we must include all the labor of moderators, including the logistical, discursive, and emotional labor needed within the moderation team. Such labor may increase the value of online groups through a number of mechanisms, such as increasing the quality of the group or enticing moderators to participate more frequently. Indeed, my data suggest that moderators find great personal rewards from teamwork, and thus continue to contribute free labor for Facebook. However, further research is required to establish these causal links.
The architecture of digital platforms affords cheap scaling (Hanna & Park, 2020), and thus new groups require little up-front investment by either moderators or the platform. Moderators must therefore be self-reliant in learning both how to moderate and how to be a member of a moderation team. (Facebook does provide educational material for volunteer moderators about running Groups, but it is difficult to find, and not a single participant in this study reported having used it.) The low cost means that the platform loses nothing if any given group ends up inactive, while there is much to be gained from active and growing groups. The risk of new groups is passed on to moderators themselves, who must put in the investment and face the potential mental or emotional fallout of teams or groups that turn toxic. Indeed, research on teams tells us that in an environment like Facebook, we should not be surprised if most teams are unable to achieve personal or collective goals (Hackman, 1998). I also found that the specific benefits and labor involved in moderation teamwork were not uniform; for example, Black and women team members reported feeling tokenized. This suggests that such differences in costs and benefits reinforce existing racial and gendered inequality.
Surprisingly, several moderators from “super groups” of more than one million members told me that Facebook reached out to them to design moderation tools to better support their moderation teams. This is an example of tiered governance (Caplan & Gillespie, 2020), wherein platforms selectively give personal care and attention to their most “valuable” members while leaving general users at the mercy of automated algorithms. Ironically, the moderation teams that get the most help from Facebook are those who have already had the most success in accumulating members.
Thus, despite the many benefits individual moderators experience through teamwork, we should not understand moderation teams as categorically positive forces. Jiang et al. (2022) argue that trade-offs define practices of content moderation: for example, in enforcing site rules, moderators must balance leniency (which may lead to harm for group members) against harsh discipline (which may stifle community voices). Similarly, I suggest that the creation and maintenance of moderation teams also involve trade-offs. A well-functioning moderation team can decrease individual moderation labor and provide support, but only when individual moderators invest in their relationships with other moderators. This investment may prove fruitful, as it did for many of the participants in this study, but there is no guarantee that any given self-created and unsupported team will prove worth the time and energy for all team members.
Going Forward
This article has used ethnographic methods to detail the complexity of the social processes at work within volunteer content moderation teams, shedding light on the costs and benefits such teams provide for individual moderators, group spaces, and platforms, and how such costs and benefits are not equitably distributed. Research questions pursued by scholars of content moderation should take these factors into account.
In addition, platforms like Facebook are one place where people find community online, but they are not necessary for such connections. Moderation practices predate the era of platform companies (Hafner, 1997), and digital community spaces of the future may not include platform companies at all, especially in their current configurations. Interpersonal labor, both paid and unpaid, has supported digital social worlds and will continue to do so. For a more just and equitable future, researchers and designers of such worlds must consider how practices of process, discursive, and emotional labor can be recognized and supported beyond the status quo.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Stanford University School of Humanities and Sciences, Stanford University Graduate Research Opportunity.
