Abstract
The 2018 passing of FOSTA (Allow States and Victims to Fight Online Sex Trafficking Act) and SESTA (Stop Enabling Sex Traffickers Act) set new limits on the free speech protection of Section 230 of the 1996 Communications Decency Act. In the aftermath, many social media sites shut down forums used by sex workers, and on some sites sex workers were systematically deplatformed or shadow banned. Social media platforms are critical spaces for sex workers. Beyond enabling sex workers to do their work more safely, these platforms offer a place to communicate explicitly about their profession and to find community support, particularly for sex workers from marginalized communities who are often physically isolated and lack support networks. We performed a qualitative analysis of the Community Guidelines of seven social media platforms (Twitter, TikTok, Facebook, Instagram, Reddit, Snapchat, and Tumblr), focusing on guidelines related to nudity, sexual content, and solicitation. The restrictions these platforms place on content have serious implications for the ability of sex workers to do their jobs safely, advocate for themselves, and find community support online. Community guidelines, as they relate to sex workers, need a more nuanced approach to balancing free speech rights against preventing harm, one that prioritizes the sometimes competing needs of marginalized communities.
In April of 2022, media attention turned to Twitter when billionaire and “free speech absolutist” Elon Musk bought the site and issued a statement proclaiming, “free speech is the bedrock of a functioning democracy, and Twitter is the digital town square” (Allyn, 2022). While debates emerged over Musk’s proposed changes to the site’s content moderation practices, one community that expressed trepidation about the platform’s future was sex workers (Cole, 2022). The 2018 passing of FOSTA (Allow States and Victims to Fight Online Sex Trafficking Act) and SESTA (Stop Enabling Sex Traffickers Act) set new limits on the free speech protection of Section 230 of the 1996 Communications Decency Act (Jackman, 2018). In the aftermath, sex workers were systematically deplatformed from sites like Facebook (Lee, 2018), Tumblr (Leskin, 2018), and TikTok (Dickson, 2020). Twitter, as one of the few mainstream social media sites that allows explicit sexual content, has a sizable community of sex workers ranging from escorts, sugar babies, and erotic masseurs to sexcam workers, fetish models, and porn actors (Cole, 2022). Adult content creator Lucy Banks noted that, beyond the possible loss of a platform that provided public forums and valuable community resources, the proposed changes to Twitter were worrisome because in the aftermath of FOSTA and SESTA it had become “difficult to just exist online as a sex worker” (Cole, 2022). The sexual content restrictions set forth in community guidelines have serious implications for the ability of sex workers to participate in public spaces, advocate for themselves, and find community support online. The passing of FOSTA and SESTA has encouraged social media platforms to use community guidelines to over-police sex workers.
This essay provides an analysis of community guidelines that sees them not as neutral documents governing all user behavior, but as culturally situated statements that can disproportionately impact stigmatized groups.
Section 230 and the Regulation of Sex Work on Social Media
In 1996, Congress passed the Communications Decency Act in an attempt to regulate pornography online (Kosseff, 2019). Most of the law was struck down by the Supreme Court in the 1997 case Reno v. ACLU. What remained was Section 230, a short provision that reads, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (Kosseff, 2019, p. 2). Jeff Kosseff explains that this provision, which distinguishes platforms from publishers, has been critical to the way platforms regulate speech. Tarleton Gillespie (2018, pp. 48–49) identifies the emergence of two common philosophical approaches to regulating speech: (1) the platform as a speech machine model, which focuses on a “commitment to free speech, self-expression, and access to information,” and (2) the keeper of community model, where “the platform makes possible a diverse but fragile community, one that must be guarded so as to survive.” These two philosophies map onto the different approaches to Section 230. In one interpretation, Section 230 is a safe harbor law, and it provides cover for a broad interpretation of free speech online (Medeiros, 2017). Danielle Citron and Mary Franks (2020, p. 46) take issue with that interpretation, arguing that in the context of the original law Section 230 was meant not as a safe harbor but as a “good Samaritan” provision that was supposed to encourage platforms to police content by not punishing them for how the content was policed. Social media platforms have used both the safe harbor and the good Samaritan approaches seemingly interchangeably to protect speech that increased their profits (Cohen-Almagor & Lehman-Wilzig, 2022). Often, speech related to sex work on social media platforms has been both overtly and covertly allowed because of its profitability. The passing of FOSTA and SESTA made the legal protections of Section 230 in relation to sex work more precarious.
The internet has given consensual sex workers a level of safety that was unavailable with strictly in-person work—“screening clients before meeting them, sharing information about dangerous clients, and finding work without relying on pimps”—and under Section 230 sites that allowed this were immune from prosecution (Tripp, 2019, p. 219). Prior to FOSTA and SESTA, Craigslist, one of the key testing grounds for Section 230 and sex work, was brought to court multiple times with demands that it shut down its “erotic services” section (Lingel, 2020). Court rulings repeatedly pointed to the platform’s neutrality—it was not actively encouraging illegal behavior—and ruled that the site could not be held responsible for the content of ads placed by users. These court cases, Jessa Lingel argues, gave social media platforms overall “a vested interest in presenting themselves as ideologically agnostic” (p. 54). Backpage, a classified advertising site similar to Craigslist, used Section 230 as “both a shield and a sword” by first claiming complete immunity from prosecution because it was not “the speaker, publisher, or distributor of the content on its site,” and then arguing that law enforcement attempts to shut down payment methods on the site were harming its right to free speech (Franks, 2019, pp. 146–147). With the passing of FOSTA and SESTA, Section 230 no longer protected these sites (Lingel, 2020). Backpage swiftly shut down, while Craigslist began charging for and screening certain types of ads. Other sites, seeing the legal action against Craigslist and Backpage, preemptively began constraining sex workers. Where previously certain activities might have been ignored because of their profitability, Coombes et al. (2022) note that the legal climate “exacerbated” social media platforms’ “whorephobic” perspective, which intensified the over-policing of sex work.
This is particularly true where sexual content is concerned, as social media sites tend to err on the side of regulating based on hypothetical users that might be offended (Spišák et al., 2021).
Creating space for the speech activities of a diverse community of sex workers online contributes to overall public conversations about sex by drawing attention to perspectives often ignored by mainstream media. Melissa Gira Grant (2014) explains that public conversations typically focus on sex work as a set of activities or a form of content, and that framing is used to remove sex workers themselves from the conversation. Sex workers are invited into public conversation primarily as victims of sex work. The victim lens both decreases the agency of sex workers and allows for a very narrow interpretation of the value and meaning of sex work. Social media platforms potentially allow sex workers to operate without the variety of media industries that typically act as third-party managers for sex work, which creates more options for sex workers to enter public conversation (Are & Briggs, 2023). Swords et al. (2021) note that social media platforms are important spaces for sex workers in terms of communication that is explicitly sex work—for example, advertising services and connecting with potential customers or consumers—but also for having conversations about sex work. Sex workers use social media platforms to share health and safety information, seek legal advice, and engage in a community with other professionals. Sex workers in general, and those who are disabled or represent gender minorities in particular, are often physically isolated from their community (Coombes et al., 2022; York, 2022). Multiple scholars have pointed out that the passing of FOSTA and SESTA has made the rights of sex workers online more precarious than ever before and has increased this isolation (Bronstein, 2021; Gerrard, 2020; Paasonen et al., 2019; York, 2022). Social media platforms are thus critical sources of connection and community as sex workers seek support and advocate for themselves.
Methods
To understand the free speech restrictions placed on sex workers on social media platforms, this essay focuses on community guidelines. Community guidelines are important documents for identifying how the values and philosophies of platform creators are translated into concrete practices (Maddox & Malson, 2020). While terms of service lay out in legalistic language exactly how a site will be moderated, Tarleton Gillespie (2018, p. 45) explains that community guidelines are “like a constitution, documenting the principles as they have been forged over routine encounters with users and occasional skirmishes with the public.” Though insufficient for understanding the full experience of social media platforms, they are the place where “the human rather than the machine comes to the fore. . . they are the spaces where interpretations of values and rules are consciously conveyed” (Gerrard & Thornham, 2020, p. 1272). This essay follows the lead of Gillespie (2018) to take a qualitative approach to analyzing community guidelines, with an emphasis on what Amanda Cullen and Bonnie Ruberg (2019, p. 2) term a “socially-informed critique of community guidelines.” The approach involves an analytical reading of community guidelines with a focus on the way those guidelines pertain to the cultural practices of a particular set of users. As Kate Klonick (2018) points out, standards are a reflection of values, and how they are translated into practice in particular situations is strongly related to social norms. In our research, that analytical lens was focused on sex work and sex workers with an eye toward social norms and stigma surrounding both. 
We used Swords, Laing, and Cook’s (2021) four general categories of sex work as we were reading the guidelines to help isolate places where the guidelines would allow or restrain different types of sex work online: (1) in-person, direct sexual experiences, (2) at-a-distance, indirect “live” experiences, (3) indirect purchasing or consumption of material, and (4) asynchronous consumption and interaction. Our research took place in 2022 during the months of June and July. Community guidelines are constantly evolving, and a critical reading is meant to provide a snapshot of a particular moment (Cullen & Ruberg, 2019; Maddox & Malson, 2020).
This essay focuses on seven social media platforms: Twitter, TikTok, Facebook, Instagram, Reddit, Snapchat, and Tumblr. We were interested in the regulation of sex work and sexual content on popular sites where sex workers would have the ability to speak to the broader public, as opposed to sites specifically designated for sex work. In 2022, Facebook, Instagram, TikTok, Twitter, Snapchat, and Reddit were all ranked among the top 10 most popular social media sites (Walsh, 2022). Tumblr was included because of its complicated history with sex work, which will be discussed later in the essay (Pilipets & Paasonen, 2022). In choosing the sites, we were conscious of not simply focusing on sites where sex work and communities of sex workers were readily visible. One issue with researching a marginalized community online is that decisions platforms have already made can produce silences, making it appear that certain voices were never present (Gerrard & Thornham, 2020). Large-scale deplatforming and shadow banning, both discussed in more detail later in the essay, can give the appearance that a community is not present when it has actually been removed or hidden. Furthermore, in online spaces, awareness of regulation can make users more likely to self-censor by simply not posting certain content or by deleting content they are nervous will be flagged for content moderation (Duffy & Meisner, 2022; Gibson, 2019). Ysabel Gerrard and Helen Thornham (2020) explain that researchers must “acknowledge the dynamic and iterative processes of platforms always-already coming into being, rather than considering them as static or fixed objects of study.” Certain communities of sex workers may not be visible on a social media platform, not because they are uninterested in the platform but because they have been removed from it.
Community Guidelines and Sex Workers
In reading community guidelines across the seven platforms, we found three categories of restrictions that pertained most directly to sex workers: regulation of sexual content, regulation of nudity, and regulation of soliciting. In Tarleton Gillespie’s (2018) analysis of the standard issues discussed in content policies, he noted that nudity and sexual content are commonly regulated in some way across most social media platforms. He collapsed the categories of nudity and sexual content into one, but for the purpose of our research we found that policies on nudity and sexual content have distinct and nuanced implications for sex work that require them to be treated separately. With that said, as Carolina Are (2022) has noted, content policies themselves often conflate nudity and sexual content in ways that reflect US attitudes toward these topics. Regulation of soliciting is, to some extent, the same across platforms, as almost all platforms acknowledge it as an illegal activity. The variation lies in how platforms attempt to define when solicitation is happening and which activities should be restricted as solicitation. In our final area of analysis, we discuss enforcement of community guidelines.
Nudity
Social media nudity policies have received considerable scholarly attention, especially as they relate to breastfeeding (Gillespie, 2018), gender affirming health care (Albert & Haimson, 2022), and male versus female-presenting nipples (Gerrard, 2020). In our reading, Facebook, Instagram, and TikTok had the most extreme bans on nudity. Both Facebook and Instagram explicitly prohibited nudity; Instagram’s guideline stated that nudity was not allowed and linked to Facebook’s more explicit definition of what constitutes nudity. Both sites did make provisions for certain forms of nudity or implied sexuality, such as childbirth scenarios and gender confirmation surgery, but the content needed to be flagged as sensitive so that it is only available to users 18 and older. Facebook had a particularly interesting nudity guideline: it does not allow images with a “visible anus and/or fully nude close-ups of buttocks unless photoshopped on a public figure” (emphasis added). The site bans nudity so strictly that it seems incongruous to allow something as graphic as a visible anus if it is part of political speech; in their attempts to be precise, some guidelines read as strange. TikTok’s restrictions on nudity were strict and offered no caveats or exceptions. The platform guidelines read: “do not post, upload, stream, or share” any content that “depicts genitals, buttocks, the pubic region, or female nipples.” Furthermore, where Facebook makes an exception for some photoshopped images, TikTok specifies that its policy includes “digitally created or manipulated content, of nudity.”
Since its creation, Tumblr’s policy on nudity, pornography, and sexually explicit content has swung from highly permissive to excessively strict and back to permissive. The microblogging site, founded in 2007 by David Karp, was acquired by Yahoo! in 2013, and a major controversy during the acquisition was the amount of pornography on the site (de la Merced et al., 2013). Many of Tumblr’s most popular accounts trafficked exclusively in pornographic content (Perez, 2013). The site allowed nudity and sexual content so long as it was tagged as NSFW (not safe for work). Karp openly defended the company’s policy as a free speech issue, explaining that “we’ve taken a pretty hard line on freedom of speech, supporting our users creation, whatever that looks like, and it’s just not something that we want to police” (Dickey, 2013). In 2018, in the wake of the passing of FOSTA and SESTA and of Apple banning Tumblr from its app store because of sexual content, the company announced it was banning nudity from the site (Masnick, 2018). Elena Pilipets and Susanna Paasonen (2022) argue that this move had a detrimental impact on marginalized communities that had migrated to Tumblr because it had a very relaxed policy on nudity and was not expressly a porn site. For sex workers, being banned from Tumblr was just another moment of being “pushed to the margins and pushed off the internet. . . Sex workers share important info on Tumblr like providing folks with education, and resources, and really just much-needed community” (Darby, 2018).
When we looked at Tumblr in July of 2022, its nudity policy was very brief, but it specified “Don’t upload images, videos, or GIFs that show real-life human genitals or female-presenting nipples.” In November of 2022, Tumblr revised its policy to say “Nudity and other kinds of adult material are generally welcome.” The site’s rapidly shifting policies make the status of sex workers precarious, as a welcoming policy may revert to a more restrictive one at any moment. Conversely, during the time we were conducting our research, Snapchat did not mention nudity in its policy at all, but an updated version of the policy from 2023 included a ban on “nudity where the primary intention is sexual arousal.” The constant changing of platforms’ policies makes it difficult for users to navigate a site and develop community.
Reddit and Twitter had much more permissive policies when it came to nudity. Reddit makes no reference to nudity in its community guidelines. On Twitter, the phrase “adult nudity” is used to distinguish nudity that is “intended to cause sexual arousal.” Adult nudity is not allowed in “highly visible” locations on the site, such as live videos that would autoplay in someone’s feed, profile photos, or banners. This is likely because these are the types of content a user could be unintentionally exposed to. If an account includes this kind of content in posts, it has to be marked as sensitive. Content marked as sensitive on many social media platforms is blurred, and a user must click on the image to make it visible. One of the difficulties previously discussed is balancing the harm of exposing users to unwanted sexual content against the harm of depriving an already marginalized community of a critical technology it can use to speak to the public. Twitter seems to have struck a workable balance here: sex workers are able to place nude images or sexual content on the site in a way that requires users to consciously choose to view it.
Sexual Content
When the phrase “sexual content” is left vague in community guidelines, or is clustered with other difficult-to-define terms like “obscene” or “pornographic,” in practice it becomes a tool to restrict the participation of individuals a community finds objectionable. Bonnie Ruberg (2021), in her research on the video game live streaming platform Twitch, found that while the site banned “sexual content,” it did a poor job of defining the term. As a result, interpretation of the ban was heavily informed by the sexist cultural logic of the community on the site. Further research into Twitch’s guidelines indicates that the process of enforcing the vague standard created a situation where men felt empowered to police the participation of women on the site (Zolides, 2021). How specifically a site defines the phrase “sexual content” is therefore critical.
Facebook and Instagram have some of the most specific guidelines when it comes to what constitutes sexual content. They start their policy by stating that they “default to removing sexual imagery to prevent the sharing of non-consensual or underage content.” Carolina Are (2022, 2023) points out that Instagram and Facebook have a history of using the justification of protecting audiences from sexual content to protect themselves from accusations of restricting speech. Facebook and Instagram do attempt to specify what constitutes sexual activity—for example, “explicit sexual intercourse or oral sex,” “explicit stimulation of genitalia or anus”—and they also specify that implied sexual activity through the presence of an “erection” or “by-products of sexual activity” is prohibited. One of their guidelines prohibits depictions of “squeezing female breasts,” detailing how one can tell that the breasts are being squeezed and noting that squeezing is allowed if breastfeeding is taking place. These specifics were added to the policy after leaked memos showed moderators asking how they were supposed to distinguish between someone covering their breasts with their palms to prevent nudity and someone squeezing their breasts to imply sexual activity (Are & Paasonen, 2021). While these guidelines would certainly make it difficult to use the site to engage in certain types of sex work, the guidelines make exceptions throughout for depictions in a “medical or health context.” This opens up the possibility that sex workers could use the site to share information.
The most permissive sites—Reddit and Twitter—had policies that in some ways were deceptively open. Reddit’s only prohibition in this area says, “keep it legal,” and the site is often lax in enforcing that policy. Twitter’s policy started out by expressly stating the goal was to “balance allowing people to share this type of media with helping people who want to avoid it to do so.” Worthy of note in Twitter’s policy is that while an individual tweet with sexual content may be allowed, the policy states that accounts “may be permanently suspended if the majority of your activity on Twitter is sharing sensitive media.” The placement of this warning implies that the sensitive content in question is “adult media.” Certain identities, particularly those of sex workers, tend to be culturally coded as sexual content. A sex worker’s very existence on the site, even if they are not posting primarily about their work, can be treated as sexual content. Genesis Lynn, the owner of Fetish Con, had her Twitter account banned because of adult content (Dickson, 2021). Much of the account was logistical information about the convention she runs, and the account contained no nudity or pornography. Because of who owned the account, all of its content was treated as sexual content. Stories like this are worrisome because they indicate the way that removing sex work from platforms ultimately polices the amount of sexual discourse on platforms. As Carolyn Bronstein (2021) points out, these policies are not simply decreasing sexual content; they are decreasing discourses about sex. The policy is also concerning for transgender sex workers, for whom social media platforms are a critical space (Pezzutto, 2019). If someone understands all trans content to be sexual content, then a user posting frequently about trans identity issues and bodies would have an account that is primarily sexual content. The policy opens the door for certain identities to be more rigorously policed.
Three sites had strikingly vague sexual content bans. First, TikTok did not allow content that “explicitly or implicitly depicts sexual activities.” There are some ways around this policy, as TikTok only removes content once it has been reported. The site has a growing community of strippers who post under #striptok and use the site to share safety and health information, advice on supplies, and tutorials about technique (Connors, 2021). Using the hashtag means that the videos are generally only viewed by users who are interested in them and less likely to report them as sexual content. With that said, the strippers who use TikTok say it is always a gamble, because even content featuring something like hygiene information may be flagged as depicting sexual activities. Second, Tumblr had a ban on “adult content” that included a link to a statement from 2018 offering some explanation of the ban. The linked statement specifies that they ban “any content—including photos, videos, GIFs and illustrations—that depicts sex acts,” though they still allow “written content such as erotica.” The statement also specifies that older blogs with explicit content will be removed from search results. Interestingly, even though Tumblr lifted its ban on nudity in November of 2022, the new guidelines still do not allow “visual depictions of sexually explicit acts.” Finally, Snapchat does not talk about nudity in its policy, but it does “prohibit accounts that promote or distribute pornographic content.” As Gillespie (2018) points out, policies that allow nudity but prohibit pornography leave a lot of leeway for content moderators to use discretion. In this case, the leeway is tremendous because the policy gives no elaboration about what constitutes pornography. Snapchat’s policy got some attention when the porn company NaughtyAmerica created a filter for the site (Roettgers, 2019).
Snapchat filters can function as a form of augmented reality: the user takes a picture, and the filter changes their environment. In this case, the filter made it appear that porn stars in various stages of undress were in the picture. The site specifically pointed to its prohibition on pornography when having the filter removed. Because Snapchat messages disappear after viewing, sex workers use the platform to deliver custom content requested by customers. Sex work is often advertised on other social media sites and then distributed through Snapchat. These tools help sex workers establish themselves in one place while offering content in another.
When platforms have these types of vague bans on sexual content, the sex workers impacted most are the ones working independently (York, 2022). These bans often disproportionately impact queer and female-presenting bodies. For disabled sex workers who may be immunocompromised, being able to do sex work virtually is critical for safety, and these bans both compromise their safety and push the visibility of erotic representations of disability further into the margins of fetishism (Coombes et al., 2022). Bans directed at the avenues amateurs have available for building an audience allow commercial porn sites to determine what is visible. The question of visibility is almost as important as the question of what is explicitly banned. Filmmaker Erika Lust explains:

When pages that promote female pleasure are hidden, we understand that our pleasure is invalid . . . when drawings of vaginas are removed, we learn that we should be ashamed of our bodies. When female nipples are censored but male nipples are not, we know that we must police our own bodies to ensure we do not arouse men. . . The bodies, sexualities, and desires that are allowed online translates itself into the bodies, sexualities and desires that are accepted in society. (as quoted in York, 2022, p. 157)
Access to social media sites for sex workers is in part about who is able to make a stable income in the sex work industry, but it is also about whose voice is heard in the conversation about what sex is supposed to look like.
Soliciting
Paid sex is a common taboo, and “sex work is generally excluded from the notion of good sex, that is, socially acceptable sexual practices that should take place in a relationship void of monetary exchange” (Paasonen, 2010, p. 1305). After FOSTA and SESTA, platforms became legally liable if users were using them to facilitate paid sex. In this way, FOSTA and SESTA linked paid sex with sex trafficking. Several sites use the lens of sex trafficking to talk about soliciting. Facebook’s policy says that banning soliciting is about avoiding “facilitating transactions that may involve trafficking, coercion and non-consensual sexual acts.” Tumblr’s only reference to solicitation is actually specific to human trafficking and “illegal prostitution.” Similarly, TikTok’s policy said they did not allow content that promoted “sex trafficking or prostitution.” Despite FOSTA-SESTA, some platforms had very little to say about solicitation. Twitter addresses solicitation in their community guidelines only in the context of harassment and abusive behavior. The “solicitation of sexual acts” is listed under “unwanted sexual advances” as a prohibited behavior. The implication seems to be that the site does not necessarily ban solicitation in general, but it can be reported by a user as a form of harassment or abuse. Snapchat had no reference to either solicitation or paid sex in their guidelines.
Several sites use the term solicitation as an umbrella for a variety of sex work involving paid sex, but it was not always clear what constituted solicitation. Facebook had some of the most detailed guidelines regarding solicitation. They specify that solicitation includes, in addition to paid sex, things like advertising “strip club shows” and “tantric massages” and the offering of “sexual fetish items.” Facebook attempts to draw a line between conversations about sex work and conversations that are sex work, stating that “We also allow for the discussion of sex worker rights advocacy and sex work regulation. We draw the line, however, when content facilitates, encourages or coordinates sexual encounters or commercial sexual services between adults.” Instagram linked to Facebook’s policy to specify different forms of solicitation. In 2019, there was a conflict between Instagram and the Adult Performance Artists Guild, a union that represents actors in the porn industry, over what constituted solicitation (Steadman, 2019). Accounts were being taken down for violating rules regarding solicitation when, by any legal definition, they were not engaged in solicitation. Carolina Are (2021, 2023) explains that this is part of a larger problem of Instagram treating sexually suggestive posts on an account whose profile talks about sex work as solicitation, even if the post itself is not soliciting. Reddit has similarly overreached in interpreting its solicitation policy. Reddit has a general policy against content meant to “solicit or facilitate any transaction or gift involving certain goods and services,” and it specifies that this includes “Paid services involving physical sexual contact.” Despite this policy, Reddit has historically been relatively safe for individuals engaging in various forms of paid sex, and the platform has had large and active communities of sugar babies, prostitutes, and escorts (Lam, 2020).
Raven Lam (2020) notes that Reddit has been an important site for sugar babies to communicate with each other about their feelings toward the work, health and safety issues, and professional practices. When FOSTA and SESTA were passed, Reddit began banning a number of subreddits that served as gathering sites for escorts and sugar babies (Brown, 2018; Dickson, 2022). Many of the banned subreddits were not forums for transactions; they were forums for conversation. Vague policies allow for this kind of broad interpretation.
Enforcement
Enforcement of community guidelines varies greatly: content may be removed, accounts suspended, or users deplatformed entirely. Crackdowns on sexual content have been so prevalent in recent years that some scholars have argued sites are “deplatforming sex” (Blunt et al., 2021; Paasonen et al., 2023; Tiidenberg, 2021). Social media platforms are critical sources of connection for sex workers, and suspensions and bans exacerbate physical isolation and create conditions where sex workers are more likely to engage in risky behaviors (York, 2022). In the case of a user banned from a site, “there may be other platforms available, but the banned user cannot take with her an entire network of people, an accumulated history of interactions, or a personal archive of content” (Gillespie, 2018, p. 177). When community guidelines include threats of suspension or deplatforming, users are likely to self-censor out of fear of losing their community (Gibson, 2019). Even in the cases where users are not suspended or removed from the site, sites that hide certain users and their content from search results prevent those users from reaching public audiences (Are, 2022; Are & Briggs, 2023).
The sites we looked at took a variety of approaches to moderating content that violated community guidelines. Facebook and Instagram, in their community guidelines, state that they will remove content that violates policy, and after multiple violations they will remove a user. Reddit has many stages of enforcement, which include being asked to “knock it off” and having a forum quarantined, flagged as NSFW, or taken off the main page. TikTok openly talks about its use of shadow banning and filtering, saying that for content that potentially violates policies they will “reduce discoverability, including by redirecting search results, or making videos ineligible for recommendation in the For You feed.” Snapchat and Tumblr do not expressly state what will happen if content violates policies. However, Tumblr has a history of hiding certain topics from searches, a practice called shadow banning (Are, 2022; Pilipets & Paasonen, 2022; Wright, 2022). In this case, the content is not taken down; it is just hidden from other users. Sites can also regulate content at a platform level by using the site algorithm to decide what is prioritized in feeds and making certain types of content unsearchable. The more subtle forms of filtering can be just as damaging as all-out bans.
While the research in this project did not focus on content moderation processes, it is worth noting that previous research indicates that moderation done by both humans and artificial intelligence (AI) has been problematic (Klonick, 2018). Instagram relies partly on AI technology to detect content that violates community guidelines (Are, 2023; Jones, 2023). The site uses AI that searches pictures for nipples, and if a nipple is detected, it uses the amount of fatty tissue surrounding the nipple to determine whether the nipple is part of a female-presenting breast and whether the image needs to be removed from the site. This has led to complaints about the AI removing images of nipples for male-identified users in cases where the person was trans and had not had top surgery, or where the person was fat. Sex work, in particular, is precarious on sites like Instagram, and pushing back against automated takedowns is difficult for users who may not want to draw attention to their presence on the site (Are, 2023). Conversely, Twitter relies on human reporting to identify content that violates policy. In 2019, Vice reported that sex workers across the site were having their accounts locked after being reported. The users were told they had to provide their phone number to verify they were not bots. Many were reluctant to give up that information to a site with a history of being hacked, and they were locked out of their profiles permanently. Twitter’s policy gives sex workers considerable leeway, but its moderation mechanisms open them up to harassment.
Deplatforming and shadow banning have a disproportionate financial impact on independent sex workers. These enforcement mechanisms disproportionately impact “Black performers and other people of color, fat people, and members of the LGBTQ+ community” (Bhalerao & McCoy, 2022, p. 10). With the shifting profit structures of the porn industry, the majority of profit for actors now comes from ancillary sex work, such as “webcamming, escorting, and strip club performances managed under one’s star brand” (Paasonen et al., 2019, p. 59). For this work, social media platforms are critical: “they provide (in many contexts) legal spaces of work and help to facilitate safety strategies for sex workers . . . sex workers utilize platforms to advertise and communicate with potential customers and consumers, take payment and provide services or content” (Swords et al., 2021, p. 2). Models and actresses make money off of their circulating work by maintaining intimate relationships with their audience through social media, often via direct messaging and conversation (Paasonen et al., 2019). Basically, they do the branding work that was previously done by a major production house to make their streaming work profitable. This is particularly important for actors and models not employed by a major studio: “amateur porn needs to come across as more real in the bodies, acts, and relations that it conveys . . . it takes some work to develop a desirable amateur porn commodity, and the relatable authenticity of amateur pornography necessitates crafted performances of intimacy and domesticity” (Paasonen et al., 2019, p. 67).
Social media sites allow amateurs to develop relationships with their audience that promote their work in the absence of large studio backing. What’s more, sex workers who work primarily online experience less violence and report fewer instances of assault than their offline counterparts (Sanders et al., 2018). One peer organizer explained, “It’s important to remember that what makes a site essential for sex workers is not that it simply allows sex work technically but that it allows sex workers to be a part of society and interact with other people” (Dickson, 2022). The enforcement of policies regarding sexual content, and the philosophical preferences that shape them, have implications both for sex workers using online spaces to build community and for those looking to advocate for themselves to the public at large.
The Future of Free Speech Online for Sex Workers
Part of the argument against the participation of sex workers on traditional social media platforms is that sex work sites exist, and sex workers could simply migrate over to those platforms. Moving all sex work onto designated sites, away from more mainstream social media platforms, comes with its own problems. It sends a message that sex work needs to be hidden, and it allows economic forces to control conversations about what sex looks like by deciding how different forms of work are categorized, whose work is prioritized, and what bodies are most desirable for sex work. Katrin Tiidenberg (2021) warns that the focus on commercial gain means that the regulation of sex is overly influenced by moral panics that result in undervalued groups being over regulated. This has a broader impact on not just sex workers, but on conversations about sex in general: “By forbidding any potentially sexual content, tech companies are furthering the sexual ideals proliferated by mainstream pornography sites—ideals that many feminists have long considered harmful. And, in particular, by banning positive and realistic depictions of women’s bodies—many of which are created and shared by women—Silicon Valley companies are ensuring that the status quo will remain” (York, 2022, p. 156).
Beyond the very narrow view of sex that comes with forcing sex workers to move to designated sites, there is the problem of sex workers losing public spaces to hold sex-oriented sites accountable. Susanna Paasonen et al. (2019) point out that assumptions about the universality of exploitation and abuse in the porn industry arise from a false sense of unity within the industry. While the porn industry was never truly cohesive, the shifting business models that arose with the internet have made for an even more disparate community. We argue that the same can be said for most of sex work, which, largely by virtue of being illicit and either unregulated or ignored, has never been an industry with clear health and safety standards. In 2015 and 2016, Twitter was a key site for porn actresses drawing attention to sexual violence in the industry (Paasonen et al., 2019). It gave them a platform to tell their stories and generated media attention they were able to leverage to create some change in the industry. Access to the general public through social media is about safety.
When FOSTA and SESTA were first passed, law enforcement complained that the elimination of sites like Backpage did not make sex traffickers go away; it just made them much more difficult to track (Fischer, 2018). The ads on these sites were being used for investigations, and when they were banned, law enforcement lost a valuable source of information. What’s more, sex workers said that sites like Craigslist and Backpage had allowed them to screen clients, something they could not do in person, and the passing of these laws made in-person sex work more dangerous (Romano, 2018). While it has become increasingly difficult for sex workers to find safe platforms online, FOSTA has proved to be an inconsistent tool for victims of sex trafficking seeking to take social media platforms to court. Section 3 of FOSTA specifically deals with the use of “an interactive computer service,” and it allows victims of sex trafficking to sue platforms. A group of child pornography victims sued Reddit, claiming that the site did not make efforts to police child pornography and profited from message boards that hosted the content (Poritz, 2022). In October of 2022, the San Francisco–based Ninth Circuit ruled that, despite FOSTA, Section 230 shielded Reddit from liability. The court said that there had to be proof that the site was aware of the problem. This ruling came a year after the anti-pornography National Center on Sexual Exploitation was allowed to move forward with a suit against Twitter claiming that the site knowingly hosted material depicting child sexual abuse (Cole, 2021). That lawsuit is not yet resolved. A U.S. Government Accountability Office (2021) report found that there had been no charges brought or civil damages awarded under Section 3 of FOSTA. As Carolyn Bronstein (2021) points out, the law caused a decrease in the arrest and prosecution of sex traffickers, while it functionally deplatformed sex workers.
Sex workers need online platforms to have safe ways to avoid in-person sex work, to screen clients for in-person work, and to be able to make decisions about what their work looks like without the control of the pornography industry. The difficulty here is that some site users may not want to be exposed to sexual content, and some may not be old enough to consume it. Sites like Twitter make explicit in their guidelines where sex workers can and cannot post sexual content. Making these things clear in guidelines can help multiple communities co-exist on the platforms. Furthermore, social media sites are important places for sex workers to find community, where they learn best practices for health and safety and where they are able to explore their feelings about the work away from conversations that demonize what they do. Guidelines like Facebook’s and Instagram’s, which carve out exemptions for content related to health issues, provide a framework for sex workers wanting to participate on the site. Making guidelines explicit, and specifying where content is allowed, can help sex workers figure out how to navigate sites. Rasika Bhalerao and Damon McCoy (2022), in their analysis of Terms of Service, have argued that one way to make policies easier to navigate is to invite sex industry workers to participate in the process of drafting the policies that impact them. Finally, having a public platform allows sex workers to advocate for themselves when abuse happens. None of the guidelines we looked at carved out exemptions for sexual content in the context of individuals reporting experiences of abuse or sexual violence. This may be an area for platforms to explore.
The restrictions placed on sexual content by social media platforms in the wake of FOSTA and SESTA have serious implications for the ability of sex workers to advocate for themselves and find community support online, highlighting the need for a more nuanced approach to balancing free speech rights and preventing harm that prioritizes the needs of marginalized communities.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
