Abstract
Discussions of online content moderation often focus on platforms; however, credit card networks and payment processors determine what content can be monetized and therefore placed on adult platforms. Through fieldwork and interviews among adult industry stakeholders and a survey of adult content creators, this paper demonstrates how these financial actors impose a moral ordering of sexuality that prioritizes credit card brand reputation and optics over the autonomy and integrity of sexual subjects. Visa and Mastercard, via payment processors, suppress kink content in the name of ensuring consent and safety. This process inappropriately broadens the definition of ‘harmful’ sexual content—what we call ‘definitional creep’—such that private financial entities can effectively create de facto global obscenity law that suits corporate rather than collective interests.
Introduction
With e-commerce and the digitalization of payment, financial providers are necessary to monetize sexual content on adult platforms (Tusikov, 2017). However, these financial providers, including credit card networks and intermediary payment processors, do not treat all content equally. Visa and Mastercard maintain stricter guidelines and policies for platforms that host sexual content such as pornography and live webcam shows. Although these policies are ostensibly designed to ensure that all content is created consensually, legally, and safely, their actual outcome is the censorship of certain erotic fantasies, themes, and aesthetics and the denial of adult sexual autonomy, diversity, and bodily integrity. In this paper, we argue that the rules set out by financial companies and enforced by adult platforms (re)produce a normative moral ordering of sexuality, enacted through a particular misreading of safety, risk, harm, and consent. The corporate interests of credit card networks, passed on to payment processors and platforms, determine how sexual content is moderated on adult websites (Franco, 2024; Hill, 2024). We argue that, in prioritizing brand reputation, these rules define many sexual acts and fantasies as harmful or non-consentable according to normative moral hierarchies. We warn that a ‘definitional creep’ is underway, in which this understanding of harm is expanding to encompass more and more sexual conduct and content, demonstrating how private financial entities are in a position to create de facto obscenity regulation on a global scale.
Previous literature has examined how governmental porn legislation, particularly in the UK and Australian contexts, enforces a misogynist, heteronormative, white supremacist, ciscentric, ableist, and anti-kink moral ordering of sex acts (Bloom, 2015; Thorneycroft, 2020). These governments are at least nominally beholden to the public and subject to the democratic process, whereas payment intermediaries are primarily responsible to their private shareholders and are not required to offer transparent decision-making (Stardust, 2018). Consequently, the ‘threshold test for content has moved from obscenity and offensiveness under classification law to profitability and risk under capitalist enterprise’ (Stardust, 2018: 170). Most research on how commercial interests drive sexual content moderation, censorship, and deplatforming investigates at the level of platforms (Paasonen, 2021; Tiidenberg, 2021). Less attention has been paid to how payment intermediaries are deputized to enforce this moral ordering of sexual content without accountability to sex workers and other adult industry stakeholders (Franco, 2024).
Building on a growing body of work examining the role of payment intermediaries in content moderation (e.g., Beebe, 2022; Busch, 2022; Hill, 2024; Franco, 2024; Franco and Webber, 2024; MacDonald, forthcoming; Stardust, 2018; Stardust et al., 2023; Swartz, 2020; Tusikov, 2021), this paper investigates how payment intermediaries establish and enforce regulations that impose a moral ordering of sexuality, effectively suppressing non-normative content on adult platforms. First, we discuss how the moral ordering of different sex acts manifests in porn-abolitionist advocacy and pornography legislation. We then explain how the moderation of sexual content on adult platforms is shaped by their relationship to credit cards, payment processors, and performers. After a brief description of our research methods, we outline the mechanisms that platforms use to ensure adult content is created and posted consensually. We then describe the specific types of sexual content that are subject to strict moderation guidelines and increasingly banned across platforms, as well as the overmoderation that results from enforcement of these rules. Through this illustration, we show how the reputational interests of credit card networks, processors, and platforms are prioritized such that the depiction of consent takes precedence over the actual consent of those producing and consuming the content. We follow this with a discussion of what we call ‘definitional creep’: a progressive expansion of the definition of ‘harmful’ sexual content, and finish with some recommended guiding principles for the moderation of sexual content on adult platforms.
The moral ordering of (online) sex
‘Non-consentable sex’
Not all sex is considered equal: as anthropologist Gayle Rubin has argued, there exists a moral hierarchy whereby some kinds of sex are considered healthy and enriching while others are considered deviant, dehumanizing, and antithetical to human flourishing (Rubin, 1984). This moral ordering of sex is central to both historic and contemporary criticisms of porn, which often hinge on the condemnation of rough sex, controversial fantasies, and BDSM. 1 These sex acts are described as inherently degrading or ‘body punishing’ expressions of sexual and gendered violence (e.g., Dines, 2015: 40–41). The presumption is that no one, particularly women, would or should willingly engage in these sex acts, and that depicting these acts in pornography normalizes violence against women (West, 2022). Thus, a particular understanding of consent and of the desires we can conceivably consent to has become a central organizing principle in justifying this moral ordering of sex.
This conceptualization of consent is invoked in the legislative treatment of porn in many countries, determining which filmed sex acts are prohibited or not. In Canada, the 1992 Butler decision asserted that porn is obscene and may justifiably be censored if it portrays ‘violent’, ‘degrading’, or ‘dehumanizing’ sex; these descriptors are assumed to be self-evident and have historically applied to both queer and heterosexual BDSM and rough sex (Khan, 2014: 186). In 2014, the UK’s Audiovisual Media Services Regulations banned porn featuring consensual spanking, fisting, face-sitting, squirting, and other distinctly queer, kinky, or female-centered sex acts (Khan, 2020; Monea, 2022). Australia’s porn classification law prohibits ‘consensual depictions which purposefully demean anyone involved in that activity’, listing acts like bondage, spanking, urine play, and fisting (Thorneycroft, 2020: 154–55; see also Stardust, 2018). Across these contexts, the condemnation of kinky porn does not hinge on the subjective experience of the performers or whether they portray the sex as enjoyable or not: the sex is deemed innately harmful and degrading, antithetical to the principle of consent, and a contributor to gender inequality and sexual violence.
Prohibitions of kinky pornography in the name of justice and gender equality represent a ‘contradictory form of paternalism’ whereby ‘censoring images of consenting sexual activity’ is justified ‘in order to protect the principle of consensualism’ (Khan, 2014: 198). This paradox is made possible through what Khan (2014: 184) calls ‘sexual literalism’, where the performance of mutually agreed upon sexual play is interpreted as if it is a literal endorsement of non-consensual sex. That is, if the fantasy or scenario would be problematic in real life, then it is problematic to depict in pornography. ‘Consensualism thus emerges as a contingent value’, where the consent of porn performers is only recognized if they are portraying normative sex (Khan 2014: 189). This reading aligns with legal scholar Karla O’Regan’s (2019) assertion that consent, in regard to sex as well as other domains, is always a normative concept. Rather than giving primacy to people’s autonomy and choice, ‘consent’ is routinely deployed tautologically to designate some acts as rational, reasonable, and therefore ‘consentable’, and others not. Consentable actions are those that uphold ‘public interest’, ‘community standards’, or ‘social values’—interests, standards, and values that are ‘invariably White, heterosexual, able-bodied, politically conservative, and middle class’ (Thornton, 1996: 2, cf Thorneycroft, 2020: 161). On the other hand, actions that are unintelligible to the dominant imaginary are dismissed as the product of coercion, manipulation, false consciousness, or misinformation. The unintelligible pleasures of kinky sex, considered ‘an unequivocal threat to society’ (Khan, 2020: 366), are judged according to ‘whether the legal gaze deems the act to be degrading or dehumanizing’ rather than ‘the subjective state of the participants’ (Khan 2014: 197).
As we will show, the regulation of sex on adult platforms not only employs but intensifies this logic, given that private platforms, impelled by the protection of their corporate interests, have the impunity to moderate broadly and without public accountability as compared to the law.
Payment regulation and reputational risk
The regulation of sexual content on porn platforms orders sexuality according to corporate logics of reputational risk and brand management, skewed to ensure shareholder profits and advertisement revenues. It is well established that mainstream social media platforms problematize sexual content as a potential source of controversy or abuse that poses a risk to their profitability, categorizing it as ‘harmful’ content and limiting or prohibiting it (Griffin, 2022; Hill, 2024; Paasonen et al., 2019; Ruberg, 2020; Stardust, 2018; Tiidenberg, 2021). Adult platforms, for whom sexual content is the core of their business model, also limit the type of sexual content that may be posted and profited from. Internally, platforms prioritize content according to data-driven algorithmic feedback loops based on features such as tagging and search functions that reinforce a heteronormative bias and make it difficult to both create and find diverse pornographies (Craig, 2024; Monea, 2022; Rama et al., 2023). But external forces also determine the content of permissible porn.
Research has begun to explore how payment intermediaries’ rules affect platform terms and conditions by determining what content they will allow processing for (Beebe, 2022; Busch, 2022; Franco, 2024; Franco and Webber, 2024; Hill, 2024; MacDonald, forthcoming; Stardust, 2018; Stardust et al., 2023; Swartz, 2020; Tusikov, 2017, 2021). As with other types of e-commerce, different financial actors work in concert to provide the necessary infrastructure that enables digital payments on adult platforms (Tusikov, 2017). Credit cards are at the apex of this financial infrastructure, but they alone do not process payments. They are networks that rely on processors and banks to move money between the card holders and the ‘merchant’ (i.e., the adult businesses operating the platform, not the individual performers). Processors are ‘Independent Sales Organizations’ or ISOs and provide the infrastructure and customer services to process the payments between the card issuer’s bank and the acquiring banks, as a sort of ‘payment service wholesaler’ for the merchant businesses (Swartz, 2020).
Importantly, businesses in the adult industry are categorized as ‘high risk’, meaning these merchants have to find adult-specific processors who are willing to work with them in order to access online payment services (Swartz, 2020). In general, businesses considered more susceptible to fraud, other illegal activity, or chargebacks – disputes that card holders can raise after the service or goods have been bought – are categorized as ‘high risk’. While some adult businesses may deal with chargebacks because of, for example, disgruntled or embarrassed customers, 2 all adult businesses are categorically defined as ‘high risk’ regardless of their actual chargeback rates (Stardust et al., 2023; Swartz, 2020). Some scholars working on sex work have argued this is because of stigma and discrimination against the adult industry (Beebe, 2022; Easterbrook-Smith, 2023; Stardust et al., 2022), while research on payment systems emphasizes how the high risk category also creates a business opportunity for processors, because their ideal customer is one that is considered risky but does not actually operate illegally or have high chargeback rates (Swartz, 2020). This ‘high risk’ category thus creates a distinct payment ecosystem specifically servicing and regulating the adult industry (Franco, 2024). High risk processors specializing in adult businesses charge significantly higher fees to offset the increased costs of fighting and absorbing chargebacks and of maintaining compliance with bank requirements such as KYC (Know Your Customer) and with credit card network requirements. Through these requirements, processors exert substantial control over the kind of content that can be monetized, as we will set out below.
In this ecosystem, credit card networks have a unique position and role in regulating adult platforms. Because some credit card networks (e.g., American Express) outright refuse to process payments for the adult industry, Visa and Mastercard have a near duopoly. Adult platform merchants are therefore dependent on Visa and Mastercard to access payments from credit card holders, and they must abide by their terms, which are enforced through the processors (Franco, 2024). In this position, Visa and Mastercard have set broad rules for the moderation and verification of adult content, motivated by a mix of the reputational risk posed by association with adult content and the legal liability of processing payments for illegal content (Franco, 2024; MacDonald, 2023). Research has shown that in the process of developing these rules, legitimate concerns around preventing the production and dissemination of child sexual abuse material (CSAM) and other non-consensual intimate images (NCII) have become entangled with anti-pornography interests (McDowell and Tiidenberg, 2023; McKee and Lumby, 2022). Pornography platforms are disproportionately targeted as uniquely responsible for hosting non-consensual content, even though this is a challenge for scalable content moderation efforts on adult and social media platforms alike (McKee and Lumby, 2022). Campaigns led by groups with clearly stated goals to abolish pornography and sex work conflate pornography with CSAM and NCII and pressure credit card networks to pull financial services from large adult platforms (Cole, 2020). This was exemplified in 2020, when a New York Times op-ed accused Pornhub of lacking sufficient safeguards to prevent the proliferation of CSAM and NCII on the platform and accused Visa and Mastercard of indirectly profiting from this content. In response to public pressure, Visa and Mastercard revoked Pornhub’s processing services (Celarier, 2021).
A few months later, Mastercard introduced updated global requirements for adult content merchants (Verdeschi, 2021), followed by Visa in 2022. 3 Credit card networks already had special requirements for adult content merchants, but this incident motivated the introduction of more restrictive rules that have had devastating creative and financial consequences for performers (Franco and Webber, 2024; Holston-Zannell, 2023; Webber, 2024).
Given the catalyst, these tightened rules focus on verifying that content is created and uploaded by consenting adults and on ensuring there are sufficient systems in place to remove offending content swiftly. Our findings describe how consent is deployed by payment intermediaries to regulate sexual content on adult platforms and how adult platforms comply with these rules. After a brief description of our methods, we (1) describe the mechanisms that platforms use to verify content is uploaded by consenting adults, (2) outline the genres of pornographic content that are being more heavily moderated since the introduction of these updated adult content merchant guidelines, (3) demonstrate how platforms interpret these vague guidelines broadly in an effort to remain compliant, leading to the overmoderation of content, and (4) discuss the implications this has for a ‘definitional creep’ whereby more and more content is labeled ‘non-consensual’, ‘objectionable’, or ‘harmful’. We then offer a few alternative guiding principles for the moderation of sexual content on adult platforms.
Methods
Our findings combine document analysis, stakeholder ‘expert’ interviews, and fieldwork at three adult industry conferences, carried out by Franco, with an online survey of adult content creators conducted by Webber.
The document analysis was conducted on the rules and standards of credit card networks Visa and Mastercard, the policy guidelines and terms and conditions of adult payment processors Segpay, Vendo, and CCbill, and the terms and conditions of several webcam and subscription-based fan platforms (Skyprivate, Cam4, LiveJasmin, MintStars, MyClub and OnlyFans), retrieved from the respective websites of the providers and platforms. This document analysis was complemented by sixteen semi-structured interviews with key stakeholders working in or adjacent to the adult industry, based in either the US or the EU. All interviewees held ‘expert’ positions within their respective organizations, working for platforms (4), the adult trade association (1), payment processors (4), content moderation and verification services (2), or industry law firms (3), or as models and creators active in advocacy (2). They were recruited through purposive sampling for their extensive expert and insider knowledge of regulatory and political issues that are not accessible through media and policy documentation analysis (Van Audenhove and Donders, 2019). Fieldwork was conducted at three industry conferences: XBIZ in Los Angeles, US (January 2023), the Webcam Summit in Bucharest, Romania (May 2023), and XBIZ Europe in Amsterdam, the Netherlands (September 2023), at which Franco attended 29 panels on industry issues. Ethical approval was granted by the Ethics Advisory Board of the Amsterdam Institute for Social Sciences Research (2022-AISSR-15640).
These data are further contextualized with the results of a survey asking online sex workers how they were impacted by the new payment processor policies that followed the introduction of Mastercard’s October 2021 guidelines. Survey questions were drafted by Webber and piloted and refined in consultation with online sex workers, adult industry advocates, and queer research community members. The survey was administered via Qualtrics research platform and launched on Twitter in November 2021; it was subsequently shared on several adult content creator listservs/online communities. It remained open until January 2022 and collected a total of 117 responses from creators who were primarily white (68%), US-based (70%), queer (66%), cis women (60%) between the ages of 25–39 (68%) (for full demographic details see Webber, 2022). Creators reported working across various cam, clip, fansite, and sexting platforms, reflecting the reality of platformized sex work wherein workers typically operate across multiple overlapping revenue streams. Reflecting the myriad roles they encompass and the diversity in industry terminology, within this article, we interchangeably refer to online sex workers as performers, models, and content creators. The survey included several open-ended questions, from which direct quotes are drawn and may be lightly edited for clarity. Ethics approval was granted by Memorial University of Newfoundland and Labrador’s Interdisciplinary Committee on Ethics in Human Research (# 20180439-ME).
Defining and verifying consent
How then do credit card networks and payment processors define consent, and how do adult platforms verify it? Credit card rules stipulate that platforms must have processes in place to verify the age, identity, and consent of performers. The Child Protection and Obscenity Enforcement Act of 1988 already mandated that porn producers selling content in the US obtain and maintain age and consent documentation for each piece of content (known as ‘2257’ paperwork), 4 and in 2004 the law was modified to apply to online distributors as well. This paperwork has long been considered the ‘gold standard’ throughout the industry, since many companies outside the US complied in order to lawfully access the American market. Visa and Mastercard’s updated rules compel platforms to implement additional requirements beyond this legal threshold, including for content that was previously verified. For example, OnlyFans requires current identification for performers whose ID documents have expired since the content was filmed, as well as images of the reverse side of ID cards. On other platforms such as Streamate, performers must conduct biometric liveness checks to match their face to their ID on record. Consent documentation and affirmation must be submitted for each piece of content and at the beginning of each livestream. As we have explored elsewhere, these verification requirements place additional burdens on performers, in part because automated verification tools frequently malfunction (Franco and Webber, 2024).
Verifying ‘consent’ in creator uploads, however, moves beyond the point of production to what is portrayed in content and how it is marketed. Platforms are restricted in the tags and search terms they may use, as these must ‘not give the impression that [the platform’s] content contains child exploitation materials or depiction of non-consensual activities’ (Mastercard, 2021: 2). Platforms are also required to review content prior to publication and conduct real-time moderation of livestreams to ensure that content does not ‘depict’ non-consensual or otherwise ‘objectionable’ sex. Echoing the logic underlying the pornography legislation discussed above, the optics of consent – whether or not something looks consensual according to normative sexual parameters – are prioritized over the subjective experience and documentation of consent on the part of performers.
Because credit card networks design rules for adult platforms that prioritize their legal protection as well as brand reputation, Visa and Mastercard prohibit illegal material as well as legal content they consider ‘unauthorized’ or ‘brand damaging’ and thus ‘unacceptable to sell in connection with [the brand logo]’ (Mastercard, 2023: 114). This interpretation consolidates the conflation of reputational risk and illegality. As Stardust argues, corporations ‘use criminal laws as an excuse to establish requirements that are not based on law or community standards but rather on market risk—the potential of bad publicity, a tarnished reputation, public outcry or loss of their investors’ (2018: 164, emphasis in original). The process through which these rules are drafted and implemented is not accountable towards the adult industry or the sexual subjects creating and consuming adult content (Franco, 2024; Franco and Webber, 2024), despite being ostensibly concerned with ensuring safety and consent on the platforms.
As illustrated in our findings below, these rules are implemented in broad terms, which as we have written elsewhere are applied inconsistently from case to case and across different platforms (Franco and Webber, 2024). In varying ways, the ‘depiction’ of non-consensual activities is interpreted as pertaining not only to recordings of sexual assault, which are clearly illegal and objectionable, but to consensually produced sexual performances that portray themes of coercion and power exchange.
Banned kinks
The rules and content guidelines set out by credit card networks have far-reaching consequences for the types of content that can be sold on adult platforms. As outlined above, credit card executives assert that the purpose behind their adult content merchant guidelines is to ensure that content is made by adults under consensual working conditions and that they agree to its distribution on the platform. While the requirements for verifying consent have intensified, some sex acts and storylines are banned because they are interpreted as ‘depictions’ of non-consent or otherwise ‘objectionable’ and ‘brand damaging’ content, despite the presence of consent documentation.
The creators most affected by this interpretation of payment rules are those who produce kink and fetish content. When asked about the impacts of Mastercard’s (2021) revised adult merchant guidelines, over half of respondents to Webber’s survey (58%) reported having content banned from a platform because of the theme, storyline, or sex acts it contained. This most often entailed BDSM-related material such as bondage, CBT, 5 impact play, and breath play. 6 Content featuring bodily fluids 7 and acts like gaping, 8 sounding, 9 and ‘large’ insertions were also frequently banned. According to Ash, 10 a squirting, fisting, and big toy creator 11 who had to remove some of their ‘large fantasy toy content’, platforms ban these acts ‘for being too “extreme”’.
Creators also had to remove fictional role plays including hypnosis, intox, and mind control 12 fantasies. Larry, a bondage, sleepy, and mind control creator, wrote that in an effort to comply with Mastercard’s rules, platforms ‘are labeling totally consensual scenes with model releases, IDs, and professional models as “non-consensual” due to themes in the plot/storyline’. Hypnosis, BDSM, and CNC creator Joanie similarly argued that: The original ‘reason’ for [Mastercard’s rules] has been just in case the person in the video didn't consent. This is why I can’t do hypnosis or use the word ‘hypnosis’. Yet [even though] now creators must show proof of consent, the word hypnosis [is] still banned.
Other role plays that respondents reported being flagged included money-related fantasies such as findom 13 and blackmail, 14 as well as storylines involving vampires, religious themes, taboo, 15 and age-play. These removals occurred despite deliberate choices creators make to prevent misinterpretation. For example, because she is aware that age-play content is highly scrutinized, Lenore (roleplay, BDSM, and DDLG 16 creator) noted that she is ‘very curvy, clearly an adult woman, and keep[s her] pubic hair specifically to try and avoid any potential issues [with moderation]'. Nevertheless, all of her age-play content was removed from the platform she uses.
Finally, consensual non-consent (CNC), 17 forced bi, 18 and kidnapping fantasy roleplays were frequently banned, and several respondents had to remove content featuring fake weapons. As BDSM creator Winston wrote, this severely limits his options for monetizing content: Common bdsm/power exchange fantasies are also off the table, including consensual non consent, kidnapping, anything that could be considered ‘forced.’ It’s extremely restrictive. The only way I’ll be able to continue making and selling content is to launch my own paysite.
The unifying theme across these types of content is that they include some form of consensual power play or fictional coercion, or they include non-normative sex acts perceived as ‘extreme’ and therefore non-consentable. In the next section we describe how a broad interpretation of what constitutes ‘objectionable’ content and ‘depictions of non-consent’ leads to the overmoderation of content that is neither illegal nor non-consensual.
Compliance and overmoderation
Neither Visa nor Mastercard lists specific sexual acts or kinks in its guidelines. Payment processors must interpret the networks’ broad definitions, which conflate illegal and brand-damaging content and prohibit marketing the depiction of non-consent. Processors tend to do so conservatively to protect their relationship with the credit card networks, and they subsequently pass their terms and conditions down for platforms to comply with (Franco, 2024; Franco and Webber, 2024). In this way, compliance can become a game of ‘broken telephone’: Visa and Mastercard’s vague rules are interpreted and relayed across multiple layers, from banking regulations, to payment processor policy, to platform moderation (Franco, 2024; Franco and Webber, 2024). As a result, prohibited content is defined imprecisely and can fluctuate. While there are trends in the type of content being banned across platforms, every platform has different rules. As a group of platform representatives argued during an industry panel that Franco attended, while there is a baseline of universally prohibited content, each platform allows different ‘forbidden fruits’ according to its particular interpretation.
This imprecision is intensified by the increased use of AI-powered moderation services to enforce platform rules, leading to overmoderation. Prohibited content needs to be classified and interpreted by machine learning; however, AI is typically designed not for porn but against it, with the goal of capturing all adult content (Monea, 2022). It is well recognized that AI moderation processes are problematically opaque, have difficulty interpreting important nuance or context, and tend to favour certain bodies in ways that replicate the same racism, sexism, ableism, and other biases informing the human moderation decisions they are trained on (Gorwa et al., 2020; Gregory, 2024; Pilipets and Paasonen, 2022). When tasked with differentiating between types of adult content, AI struggles to distinguish subtleties—for example, to recognize the difference between a slap that is definitively ‘violent’ and one that is consensually erotic. As Arthur, an AI content moderation service representative, told Franco: We are often asked if we can, for example, check for real violence [...] Slapping, whipping, flogging, whatever you do not want to see anymore. And this is not typically possible or easily possible by looking at an image.
This imprecision creates difficulties for performers, whose content is overmoderated due to AI’s limited ability to comprehend context and nuance and its tendency to fixate on possible indicators of non-consent, including text and language. Interviewee Mindy (creator and educator) found that: Whatever automation they’re using to try to police [content], it errs way too far on the side of being conservative [...] I saw a model tweet the other day that she had an audio message that was telling a fan they’re doing a good job, ‘they’re killing it’, like they’re doing great. And because she said ‘killing it’ in the audio, it was taken down.
Moderation that cannot take context into account often relies on lists of banned words, flagging their presence in content titles, descriptions, or chat logs. Survey respondent Claire (cuckolding creator) stated that ‘performers can get caught due to AI for innocent exchanges or for explanation of rules.’ Ironically, a webcam performer’s explanation that she cannot engage in a specific kink because it is banned might itself trigger a flag that shutters her account.
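The mechanics of this kind of context-free keyword flagging can be sketched in a few lines. The banned terms and messages below are purely illustrative and hypothetical, not drawn from any real platform’s list; the point is that matching words without context reproduces exactly the false positives creators describe:

```python
# Illustrative sketch of context-free banned-word moderation.
# BANNED_TERMS is a hypothetical list, not any platform's actual rules.
BANNED_TERMS = {"hypnosis", "killing", "forced", "kidnap"}

def flag_message(text: str) -> list[str]:
    """Return any banned terms present in the text, ignoring all context."""
    words = {w.strip(".,!?'\"").lower() for w in text.split()}
    return sorted(words & BANNED_TERMS)

# A supportive message and an explanation of the rules are both flagged,
# just as a prohibited message would be:
print(flag_message("You're killing it, great job!"))            # ['killing']
print(flag_message("Sorry, hypnosis content is banned here."))  # ['hypnosis']
print(flag_message("Just talking to the camera."))              # []
```

Because the filter sees only isolated tokens, praise (‘killing it’) and a performer’s refusal to perform a banned act are indistinguishable from the content the rule was meant to catch, which is the overmoderation dynamic described above.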
Creators also had content flagged and banned for simply describing certain sex acts, rather than actually doing them. Many creators make solo POV videos where they speak directly to the camera, weaving a fantasy through roleplay without necessarily acting out any of the described behaviours. Survey respondent Joanie (CNC, BDSM, and hypnosis creator) noted her frustration with having to adapt this type of production to avoid moderation: The majority of my content is solo performer and non nude, non penetration. In other words, me talking to the camera about dirty things. The most nudity is usually my breasts in pasties. I wear more clothing than most pop stars and simply talk to a camera. So why am I having to change so much content?
Creators pointed out that much of their banned content is extremely campy and lacks any realism, making it difficult to imagine its moderation is based on a valid concern that the content depicts actual violence or assault. Several survey respondents described creating fantasy storylines featuring vampires or witches, specifically noting implausible scenarios and low-budget production values. Maddy (hypnosis, CNC, and castration fetish creator) described her approach to hypnosis fetish as, mostly silly roleplaying that isn’t harmful at all. Filters with spiral eyes, pretending to walk with zombie arms etc. Hypnosis is banned on all platforms and [is] being heavily enforced now.
The logic behind many platforms’ decisions to ban hypnosis content is reflected in Pornhub’s ‘Non-Consensual Content Policy’, which has likely been fortified in part to revamp the site’s reputation, to regain Visa and Mastercard processing, and to appease the processors it currently uses. The policy names ‘Spells, Mesmerizing, or Possession’ in its list of ‘Sensitive Themes’ and prohibits content where performers are ‘depicted as incapacitated’ (Pornhub, 2024, para. 19). Regardless of the identity and consent verification processes in place, ‘Content featuring or promoting non-consensual acts, real or simulated’ is prohibited, because ‘Consent must be determinable by the reasonable observer from the material itself’ (paras. 5 & 16, emphasis added). This interpretation of consent blends the moderation of coerced sex with that of hypnotic sexual control fantasies, reflecting Khan’s (2014) definition of sexual literalism.
When adult content, fictional in nature and created for entertainment purposes, is moderated on the basis of what the creators perform within it rather than on the basis of the conditions under which it was made, it establishes a threshold that no other form of performance is held to. Specific signifiers of coercion or violence (such as weapons, restraints, or vampire teeth) are taken out of context and moderated as a direct indicator of non-consent. In the next section we expand on this phenomenon, which we refer to as ‘definitional creep’, whereby benign adult content is deemed ‘harmful’ or ‘non-consensual’ through a false, abstract association with actual exploitation and coercion.
Definitional creep: De facto obscenity regulation
Through the enormous power they wield, private financial companies are being deputized to create and enforce de facto obscenity regulations, dictating the kinds of sexual expression and pornographic content that can be sold online. This regulatory function is not subject to any public oversight or democratic policy process. This is troubling given these decisions do not appear to be guided by an interest in protecting the health and safety of performers but by a corporate interest in managing reputational risk. The actors in the payment processing ecosystem introduce changes that quietly and collectively broaden what constitutes harmful sexual content, as processors continually move the goalposts of this ever-expanding definition and platforms subsequently over-moderate. We call this process ‘definitional creep’.
Visa and Mastercard guidelines, developed by corporate financial actors and designed to avoid controversy, are principally concerned with ‘ensuring consent’ and making sure the content appears ‘consentable’ according to normative sexual mores. The reputational optics of banning content featuring BDSM or storylines about power play take precedence over honouring the paperwork that demonstrates those scenes were filmed consensually. In this way, the depiction of consent is prioritized not just in favour of, but at the expense of, people’s actual consent to engage in and monetize diverse sexual acts. This denies the bodily autonomy and sexual integrity of the people involved, nullifying their consent. Credit card networks and payment processors thus reinforce and consolidate a moral ordering of sexuality that pathologizes non-normative and kink sexual practices (Rubin, 1984). Survey respondent Sam (toilet fetish creator) lamented having to participate in this policing: I hate that I have to turn clients away for niche fetishes that are now banned and it feels like kink shaming, which goes against the whole reason I got into the industry: to give people a safe space to explore their kinks.
Several content creators who responded to the survey discussed how the increased prohibition of diverse forms of sexual content has had a chilling effect. They described credit card compliance as ‘massive censorship that won’t stop with us’ (Francesca, taboo, vampire, and findom creator), referencing the double standard applied to pornographic versus mainstream media: The things censored [in porn] are shown on Netflix all the time and Mastercard certainly makes no attempt to shut down their payment gateways. [...] The public hears the compliance rules and thinks, ‘well really, how is that bad?’ [...] [We need] to make people understand the draconian intent (Joanie, CNC, BDSM, and hypnosis creator).
In her interview with Franco, the CEO of a crypto-based fansite platform shared her fear that credit card networks wield their vast censorial power in increasingly restrictive ways for which they are not qualified: I think there’s no signs that it’s going to get better anytime soon. It just feels like the credit card companies are getting stricter and stricter and putting themselves in spaces that they don't understand.
Similarly, Daniel, an industry lawyer, offered Franco a bleak prediction, highlighting how these rules gradually expand the definition of prohibited content to encompass any ‘fringe’ sexual behaviour: I think there’s going to be a slow change in the industry to where anything that is outside of vanilla porn is going to be erased off the internet. Rébecca: Why? Because of the potential liability for Visa and Mastercard. They’re just not going to allow it. Anything that has any type of kink to it, gone. [...] So anything that isn’t really kind of like, very vanilla. And what I mean by vanilla, I don't mean just boy/girl sex. I’m talking just about, you know, when you get into the fringes of sexual behaviour, okay, those fringes are going to be brought back in. [...] And it’s going to continue. Slowly, slowly, slowly, slowly.
Discussion
Assessing the effects of payment processor rules in practice demonstrates that they are being applied through a lens of sexual literalism (Khan, 2014) that places disproportionate stock in the optics of consent at the expense of respecting sex workers’ agency. Performers provide extensive documentation representing the complex negotiations that have gone into making content and asserting their desire to distribute and monetize it. In the spirit of contradictory paternalism (Khan, 2014), however, what matters is that porn looks consensual according to a normative, corporate understanding of sexual morality that views sex as ‘dangerous and largely defined in terms of its potential for harm’ (Hill, 2024: 185).
As credit card executives face pressure to disengage from adult businesses over claims of reckless moderation practices, performers’ consent is deemed necessary but not sufficient for their content to be monetized on platforms. The sexual literalism applied to interpretations of consent in adult content is fueled in part by the negation of the labour that goes into producing it, as porn is widely perceived as documentary rather than fiction or entertainment. ‘[A] fundamental confusion between the content of an image and the conditions of its production’, in conjunction with the moral ordering of sexuality, upholds assertions that certain kinds of porn are documents of violence filled with sex acts that are ‘so inherently distasteful that no one would do them willingly’ (Rubin, 1993: 267, 268). Far from protecting porn performers, this form of paternalism dismisses workers’ agency outright. Ignoring sex workers’ ‘yes’ renders their ‘no’ meaningless, because it suggests they are the same (Shane, 2013), and this allows coercive and exploitative labour conditions to go unchecked. This includes the widespread financial discrimination and payment processor relations that make platformized sex work deeply precarious, as we cover in a companion publication (Franco and Webber, 2024). While it is necessary to have productive conversations about how to reduce harm on adult platforms, this is impossible to do effectively unless respect for adult content creators’ agency is foregrounded (Berg, 2021; West and Horn, 2021).
Elaine Craig (2024) reminds us that whenever sexuality is policed, whether through the application of obscenity law or criminal law, it is queer, racialized, and other marginalized communities that are inevitably targeted. She further warns that increasing platform liability or requiring more robust content moderation policies risks producing similar effects. The evidence is mounting that this is not a hypothetical risk; it is precisely what is happening (Franco and Webber, 2024; Hill, 2024; MacDonald, forthcoming). An ever-expanding definition of ‘harmful’ content is primarily sweeping up representations of niche and non-normative sexualities, especially those produced by queer, fat, Black, and other creators of colour (Franco and Webber, 2024; Webber, 2024).
Prioritizing a literal optics of consent over the declarations of adult creators making fetish content feels especially misguided, given the consent cultures of these communities. Sex workers and kinksters grapple with the politics and practices of consent in ways that far exceed mainstream conversations, which tend to treat consent as a box-ticking contract or a unidirectional request for permission rather than as an entry point for interrogating labour under capitalism or developing good sexual ethics (Buggs and Hoppe, 2023; Kukla, 2019; West and Horn, 2021). Sex workers have pushed understandings of what constitutes meaningful consent by highlighting the double standards applied to sexual labour, calling out the inadequacy of the choice/coercion binary, and articulating the pervasive constraints that shape voluntariness in all forms of work (e.g., Lee, 2019). Sex workers have also stressed the shortcomings of ‘enthusiastic’ consent as a benchmark for ethical sexual conduct (e.g., Calida, 2023; Shane, 2013). Regarding kink and BDSM practitioners, it is well documented that, on average, they hold lower levels of rape-supportive beliefs and engage in extensive and nuanced communication about boundaries, including protocols developed for the express purpose of safely enjoying fantasies of non-consent (Dunkley and Brotto, 2020; Klement et al., 2017). Contrary to presumptions that they are necessarily working under duress or false consciousness, sex workers creating kink content are, on the whole, particularly well equipped to articulate and practice informed sexual consent.
While Mastercard and Visa are ostensibly interested in reducing harm and ensuring their network is not used to monetize non-consensual material, their interests ultimately lie in the safety of their brand. Their policies enable a ‘definitional creep’ whereby more and more types of sexual content are deemed harmful and eliminated from platforms, which does not address concerns regarding platform safety for performers. There are alternative policy suggestions, however, that could allow for diverse sexual content on adult platforms while also reducing the exploitation faced by workers.
Recommendations and conclusion
Content moderation, and scholarly analysis of it, primarily deals with how to moderate sexual content on mainstream platforms. Many thoughtful recommendations have been proposed for more sex positive, human rights-based approaches to moderation that move beyond simply identifying and eliminating nudity and sex to allow for more consumer choice and agency (e.g., Hill, 2024; Stardust et al., 2022; Tiidenberg, 2021). We have instead addressed how sexual content is moderated on adult platforms, where sex is the core of the business model. Here and elsewhere, we have argued that financial brand managers should not be granted the power to establish moderation rules and processes that determine online access to sexual materials and structure the workspaces – adult platforms – used by online sex workers to upload, stream, and sell their content (Franco, 2024; Franco and Webber, 2024; Rand, 2019). More work is needed to develop appropriate and equitable moderation policies and technologies that are not principally grounded in the logics of profitability and reputational risk, but rather centre sex workers’ needs, encourage sexual diversity, and reduce the perpetration of harm on adult platforms. We believe a few guiding principles can be derived from our findings.
Verification and moderation should be limited to ensuring sexual content is produced and distributed by consenting adults, disentangling this from the depiction of consent or normatively ‘consentable’ sex acts within the performed scenarios. Drawing the line anywhere else requires making a moral sexual judgment that falls beyond the purview of these decision-makers. Development of these verification procedures should be led by adult content creators to identify the most effective and acceptable ways to demonstrate meaningful consent from all performers without being overly burdensome or endangering their privacy and security. This includes swift and effective mechanisms for flagging and removing potential CSAM and NCII, as well as mechanisms for creators to contest moderation decisions believed to have been made in error. Avenues should be pursued to prohibit platforms from closing creator accounts without warning or recourse due to moderation decisions, and from seizing any outstanding funds. In short, adult platform guidelines should be treated first and foremost as labour policies dictating workplace processes and procedures, rather than as moral dictates determining what constitutes appropriate media content.
Credit card networks have been deputized with widespread censorial powers, producing de facto obscenity law in the name of corporate interests. As a result, adult platforms are increasingly inhospitable to sexual expressions that fall outside of a narrow normative moral order. This reflects a trend of technological and financial companies increasingly holding power over digital markets and infrastructures, within a climate that is deeply suspicious of marginalized sexualities. Corporate interests of financial companies and platforms should not be the guiding logics for a moral ordering of sexuality. This approach to moderation devalues the consent, autonomy, and self-determination of content creators and their viewers, and undermines the expression of diverse sexualities on adult platforms.
Acknowledgements
We would like to thank all of the interview and survey participants who generously shared their experiences and insights in support of this work. We would also like to thank the two anonymous reviewers for their constructive comments on an earlier version of this paper.
Declaration of conflicting interests
The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Val Webber holds a paid advisory board position with Ethical Capital Partners, the private equity firm that acquired Pornhub's parent company in 2023. This advisory role does not confer decision-making authority and includes contractual protection of their academic freedom.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was funded by the Nederlandse Organisatie voor Wetenschappelijk Onderzoek (406.DI.19.035).
