Abstract
This article examines the role of payment intermediaries in regulating the platformized adult industry and demonstrates how the adult industry responds to their power and the rules they set. Based on 16 expert interviews, fieldwork at 3 industry conferences, and document analysis of rules, content guidelines, terms, and conditions, the author teases out the intricate interplay between credit card networks, payment processors, and adult platforms. Visa and Mastercard’s rules, enforced by payment processors and implemented by platforms, create a selective, private ordering of permissible content that surpasses legal requirements. This process is impelled by the brand safety and commercial interests of global corporations, without accountability to the industry or consideration for sex workers’ needs. The article argues that payment intermediaries, as de facto regulators of online sexual commerce and key actors in platform governance, must be held accountable to the industry and the workers they affect.
Introduction
“Visa and Mastercard is not . . . they’re not a government. . . . Ultimately, they’re a company. They can choose who they want to do business with and who they don’t want to do business with” (Joshua, 1 industry lawyer). As Joshua points out, payment intermediaries like credit card networks can choose with whom they do business. On 14 April 2021, Mastercard published new requirements for platforms that sell adult content, stipulating new rules for content moderation and verification. These rules were established in response to public outrage following a New York Times article that accused Pornhub of hosting and profiting from non-consensual content and child sexual abuse material (CSAM). The article also criticized Visa and Mastercard for processing payments for the platform, leading Mastercard and Visa to withdraw their processing services (Celarier, 2021; Kristof, 2020). This chain of events indicates a regulatory reality of which many in the adult industry had long been aware: payment intermediaries play an important role in determining who can buy and sell what types of sexual content online, by setting requirements for platforms on the allowed types of sexual content and how this content is moderated and verified (Free Speech Coalition and SexWorkCEO, 2023; Herrman and Redman, 2021).
The role of payment intermediaries has received relatively little scholarly attention, especially in comparison with social media platforms and platform-level governance. Still, growing research reveals that payment intermediaries play an imperative role in regulating online content in the platformized adult industry because their regulations affect platforms’ content guidelines and content moderation practices (Swartz, 2020; Tusikov, 2017, 2021). By “platformization,” I refer to the shift of market activities onto digital platforms, reflecting the growing dominance of these platforms as the primary organizational structure for economic processes (Helmond, 2015). This concept encompasses not only the computational aspects of platforms but also their economic and participatory dimensions as multi-sided markets (Gillespie, 2010). Because contestations over moderating appropriate online content foreground sexual content, payment intermediaries play a particularly important role in regulating it (Blegen, 2023; Tusikov, 2021). Recent research examining their role in regulating online sex work argues that they can function as extra-legal regulation (Beebe, 2022; Stardust et al., 2023). These studies make an important contribution to understanding the regulation of sexual content and adult platforms. Yet, there is still a need to disentangle how the relationships between credit card networks, payment processors, and the adult industry are structured, and how the adult industry experiences, understands, and responds to the power of and rules set by payment intermediaries.
This article does so, drawing on 16 expert interviews, fieldwork at 3 adult industry conferences, and document analysis of rules and terms and conditions from credit card networks, adult payment processors, and webcam and fan platforms, as well as compliance-related articles from payment processors. While existing research typically concentrates solely on terms, conditions, and policy documents, my analysis also relies on fieldwork and expert interviews to provide insight into the regulatory process from the perspective of the actors involved, thereby contributing to understanding the layered governance of the platformized adult industry.
I examine platforms that facilitate (audio)visual, indirect, Internet-based sex work (Sanders et al., 2018), focusing specifically on adult livestream platforms (webcamming) and subscription-based fan platforms. These serve as “digital, multi-media, interactive spaces that host exchanges between sex workers and customers” (Rand and Stegeman, 2023: 2103). Transactions on webcam and fan platforms typically involve sexual content, including webcam shows, photos, and prerecorded pornographic videos, which customers access through subscriptions, tips, or pay-by-minute models. The emergence of digital sex work and the platformization of the adult industry have empowered sex workers to directly upload and livestream content for consumers, blurring the lines between traditional studio pornography, direct client–sex worker interactions, and digital content creation, all the while bolstering digital platforms’ power to set the conditions for sex work and pornography (Hamilton et al., 2023; Rand and Stegeman, 2023; Sanders et al., 2018). I use the general term “adult platforms” to refer to these platforms and “sexual content” to describe the content they host.
The platformization of sexual content has given rise to regulatory concerns about content moderation and identity verification (Blegen, 2023). Partly in response to such concerns, major payment intermediaries, predominantly US-based entities, regulate online content and commerce on a global scale, even as they remain subject to national legislative frameworks (Blegen, 2023; Tusikov, 2017). While both these payment rules and the platformized sex industry have a global character (Jones, 2020), the analysis here is limited to legislative contexts where pornography and webcamming are not criminalized, where most platforms and credit card networks are based, and where access to financial services and the Internet is high, namely, the United States and the European Union.
Regulating sexual content on the Internet: a literature review
With the platformization of sexual commerce, concerns about what sex and sexual content can be sold, and where, now play out on platforms. The process of platformization involves the reshaping of markets, governance, and foundational infrastructures that support economic and cultural activities (Poell et al., 2021). The actors involved in regulating sexual commerce therefore include not just state actors but have come to encompass those providing the infrastructure and markets for digital sexual commerce: namely, as I demonstrate below, platforms and payment intermediaries.
Social media platforms widely over-moderate, ban, and de-platform sexual content and sex workers (Are, 2023; Blunt and Stardust, 2021; Ruberg, 2021; Stegeman, 2024; Sybert, 2022; Tiidenberg, 2021). Scholars of platform studies argue that, in the process of platformization, private corporations can decide what content they allow, based on their commercial values (Klonick, 2018; McChesney, 2013; Mirrlees, 2021). Sexual content may jeopardize profits for non-adult platforms, as it constitutes a risk for “brand safety,” and consequently for advertising revenue (Griffin, 2023). Commercial gain is a driving force in defining, moderating, and consequently, deplatforming sexual content (Paasonen, 2021; Tiidenberg, 2021). Less research has explored content moderation on adult-specific platforms like those for webcamming and pornography (Stardust, 2018; Stegeman, 2024). These platforms have undergone a shift in the threshold test for acceptable pornographic content, which has expanded beyond indecency and offensiveness to encompass profitability and market risk (Stardust, 2018: 156).
Because hosting sexual content poses a high legal risk to platforms, legal liability contributes to deplatforming sex workers and banning their content. Generally, law and policy affect content moderation, but these are intertwined with platform interests (Gorwa, 2019a, 2019b). US legislation, particularly Section 230 of the Communications Decency Act (CDA), protects platforms from liability over user-generated content and determines that Internet corporations have “the right but not the responsibility” to moderate content while giving them discretionary power to do so (Gillespie, 2018: 44). Still, platforms are not protected from liability when it comes to content such as obscenity and CSAM. Moreover, the US 2018 legislative package known as the Allow States and Victims to Fight Online Sex Trafficking Act/Stop Enabling Sex Traffickers Act (FOSTA/SESTA) removes the core principle of platform immunity under Section 230 of the CDA when the facilitation of prostitution occurs. To minimize state interventions, platforms generally favor profitable content that does not risk legal complications (Denyer Willis, 2023). This leads to over-moderation that particularly affects sexual content and content posted by sex workers (Griffin, 2023: 45). FOSTA/SESTA’s passage has led to widespread deplatforming of sex workers and adult content across platforms to avoid legal liability (Blunt and Wolf, 2020; Tichenor, 2020; Tripp, 2019).
While there is research focusing on platform-level content moderation and deplatforming, more inquiry is needed to understand the role intermediaries—such as app stores and payment providers—play in affecting platform governance. Intermediaries shape the relational dynamics between parties (Benjamin and Wigand, 1995). Van Dijck’s (2021) metaphor of the platform ecosystem as a tree with interconnected layers is helpful for conceptualizing the power intermediaries situated within the “trunk” of the tree have over sectoral apps and platforms, visualized as “branches” connected to and dependent upon the “trunk.” Studies reveal that the moderation policies that intermediaries in the “trunk” set often lead to sweeping, imprecise content control across individual platforms, against which there are no avenues for appeal (Busch, 2022; Swartz, 2020; Swords, 2020; Swords et al., 2023). For instance, bans on sexual content in app stores are applied to all hosted platforms (Pilipets and Paasonen, 2022; Tiidenberg, 2021). As a way to promote self-regulation, advertisers individually and collectively pressure platforms to establish brand safety measures to avoid advertising on “offensive” content (Cunningham and Craig, 2019: 267), through which advertisers implicitly define “appropriate content” that typically excludes sexual content (Griffin, 2023).
Similarly, because they provide essential infrastructure, payment intermediaries operate at the level of the “trunk,” acting as effective regulators of Internet activity, including online content (Goldsmith and Wu, 2006; Tusikov, 2017, 2021). They exercise discretion that leaves room for “private, strategic, and unaccountable decisions” (MacCarthy, 2010: 1122). Individual platforms located in the “branches” must abide by their rules (Busch, 2022; Swartz, 2020). A growing group of scholars researching payment intermediaries’ effects on the moderation of sexual content and online sex work show that intermediaries either blatantly ban sexual content or set up specific, discriminatory rules, terms, and conditions for adult platforms (Beebe, 2022; Hill, 2024; Stardust et al., 2023; Tusikov, 2021). This, in turn, affects platform terms and conditions and labor relations between platforms and sex workers (Easterbrook-Smith, 2022; Stegeman, 2024).
The adult online industry’s payment ecosystem
Before discussing methods and findings, it is helpful to briefly sketch the payment infrastructure for the exchange of goods on the Internet, and in particular for the exchange of digital sexual services and content. This has been set out in detail by Tusikov (2017, 2021) and Swartz (2020). Credit card networks, banks, and payment processors together provide the payment infrastructure that facilitates processing for adult platforms. At the pinnacle of this structure stand Visa and Mastercard, which wield near-duopolistic control over credit card transactions in the adult industry. These credit card companies operate as networks: when customers want to purchase goods or services from an adult platform with a credit card, the networks rely on processors to move money between the issuing bank of credit card holders (the customers who buy sexual content) and the acquiring bank of “merchants,” in this case the adult businesses operating a platform. The processors are Independent Sales Organizations, which provide the infrastructure and customer service to transfer payments, acting as a sort of “payment service wholesaler” for merchants’ businesses (Swartz, 2020). The processors maintain relationships with banks and credit card networks and mitigate the risk of chargebacks—transactions that card holders dispute after a service or good has been bought—and help merchants fight them.
Adult businesses cannot use regular payment processors, because they are categorized as high risk for credit card payments, as stipulated by the International Organization for Standardization. On paper, high risk means that the business is considered highly vulnerable to fraud or chargebacks. For the adult industry, however, the basis for this categorization is contested (Beebe, 2022: 144–145; Stardust et al., 2023; Swartz, 2020). Most processors that specialize in adult merchants also ensure their merchants comply with banking regulations and credit card requirements. Because many banks likewise do not want to work with adult businesses (Swartz, 2020), processors specializing in adult content often also provide merchants with a bank account. Therefore, a specialized payment ecosystem for sexual commerce on the Internet exists. This makes processors and credit card networks—the payment intermediaries—vital and powerful players that the adult industry has to navigate, as I will demonstrate.
A note on methods
I base my findings on a document analysis, notes from fieldwork at 3 adult industry conferences, and 16 interviews with key stakeholders working in the industry carried out between January and November 2023. Taking an investigative and iterative approach, I first conducted an initial document analysis and interviews with trade association representatives and industry lawyers, through which I gradually shifted my focus to the regulatory power of payment intermediaries, as my data increasingly underscored its importance. This approach ensures that the article’s emphasis on the role of payment intermediaries and their impact on the adult industry aligns with the industry’s view of the most pertinent regulations shaping its practices.
Based in either the United States or the European Union, all the interviewees held “expert” positions in their respective organizations, working for platforms (4), the US adult industry trade association (1), payment processors (4), content moderation and verification services (2), industry law firms (3), or as online sex workers active in advocacy (2). I recruited participants primarily by attending adult industry events, namely, the XBIZ conference in Los Angeles, United States (January 2023), the Webcam Summit in Bucharest, Romania (May 2023), and the European XBIZ conference in Amsterdam, the Netherlands (September 2023). While some prospective interviewees in various positions within the industry welcomed my interview requests, I faced challenges in recruiting higher-level representatives from large platforms. This is in line with the challenges researchers experience when accessing representatives of the tech industry (Bonini and Gandini, 2020). Thanks to my time spent connecting with people in the industry, I was also able to recruit participants through referrals, as many expressed their willingness to help with my research, among other reasons to counter the stigmatizing narratives against their industry. I used purposeful sampling to recruit participants based on their extensive expert insider knowledge of regulatory and political issues inaccessible through media and policy documentation analysis (Van Audenhove and Donders, 2019). I also conducted fieldwork at the abovementioned industry events, where I attended 29 panels on industry issues in total.
Participants varied in their levels of public visibility and held different positions within their respective companies. Sex workers are typically categorized as “vulnerable,” whereas participants like lawyers are seen as elite. Recognizing that concepts like “vulnerability” and “harm” to research participants are not static (Lancaster, 2017), I carefully weighed the potentially conflicting interests of participants regarding anonymization, the need for consistency in anonymization choices, and the need for transparency and specificity in the findings. I have pseudonymized interview participants but have used company names when these are not directly connected to the interview participants and when I refer to sources available in the public domain, for example, the publicly available conference schedules or the platform websites. This was decided in deliberation with the Ethics Advisory Board of the Amsterdam Institute for Social Sciences Research, which granted approval for this project (2022-AISSR-15640).
I performed document analysis of the policy guidelines, terms, and conditions that credit card networks Visa and Mastercard set, along with those of the three major adult payment processors Segpay, Vendo, and CCbill—all of which have freely accessible policy documents on their websites. In addition, I have examined the terms and conditions of four webcam platforms (Skyprivate, Amateur.tv, Cam4, and LiveJasmin) and two fan platforms (MyClub and OnlyFans), selected based on relevance for the interview data. To understand how guideline implementation functions, I reviewed 29 articles by payment processors spanning the period from April 2021 to July 2023 that offer guidance on compliance. These I retrieved from XBIZ, a leading publisher of adult business news and information.
Employing the abovementioned methods, I generate a unique analysis of the regulation of sexual content and adult platforms at a granular level, rooted in the perspectives of stakeholders involved in the regulatory process. Using iterative, inductive open and axial coding of interviews in Atlas.ti, I initially coded the descriptive elements of the regulations and their effects. From this, I identified the actors and relationships in the regulatory process. For these actors, I thematically coded regulatory concerns, such as “company values” and “legal liability,” and regulatory tools and devices, such as AI moderation. I then coded the overarching themes across these layers, such as “vagueness and opacity,” “accountability,” and “over-moderation.” In my analysis, I recognize expert knowledge as comprising both exclusive, elite knowledge and practical, interpretive knowledge derived from personal and professional experiences (Meuser and Nagel, 2009), allowing for contradictions and variations in expert accounts that reflect the diverse interests interviewees represent. By centering the key aspects highlighted by interviewees regarding the regulation of adult platforms, I integrated their insights with my analysis of policy documents and articles, thus triangulating the data. The coding and triangulation enabled me to reconstruct the adult industry’s response to the regulatory power of payment intermediaries and the relationships and negotiations between credit card networks, payment processors, and platforms, revealing the various stakes and interests involved, and their impact.
“Protecting our network, protecting you”: payment intermediaries as regulators
With the rise of user-generated sexual content, verification and moderation of prohibited content are at the forefront of social and regulatory concerns. Visa and Mastercard have developed rules and guidelines that attempt to address these concerns. As private companies, they can determine for which types of goods they will process payments and on what terms. Most national and supranational governments have legislation in place that regulates permitted sexual content, for example, through obscenity legislation and anti-CSAM legislation. Visa and Mastercard limit the types of sexual content that can be monetized on a global scale by prohibiting not only illegal material but also what they consider to be damaging to their brand. Mastercard’s global requirements for moderation and verification came into force in October 2021; Visa followed suit in 2022. These rules set strict requirements for documenting age verification and consent and for content marketing and moderation, requirements that are not stipulated by law (Mastercard, 2021) (see Table 1 for an overview of Mastercard’s requirements). Mastercard titled its requirements “protecting our network, protecting you” and highlighted its commitment to setting industry standards (Verdeschi, 2021).
Table 1. Overview of Mastercard rules.
This dedication to establishing industry standards is grounded in the understanding that credit card networks’ commercial interests extend beyond mere transaction volume; they also encompass maintaining an appearance of integrity, safeguarding their brands, and mitigating brand risk. They must thus contend with the reputational damage that servicing adult platforms may cause. Pressing regulatory questions around efforts to combat CSAM and the content moderation of user-generated social media platforms have become entangled with anti-porn advocacy groups (McKee and Lumby, 2022). Christian Evangelical anti-porn organizations such as Exodus Cry and the National Center on Sexual Exploitation (NCOSE), formerly called Morality in Media, have long petitioned credit card networks to cease processing for adult platforms, efforts that scored successes in the controversy around Pornhub in late 2020 and 2021. Visa (2021) emphasizes the importance of “safeguards against risks that may negatively affect brand or reputation.” This focus results in disproportionate scrutiny of the adult industry, leading to specific rules and requirements. The new standards for adult content enhanced and revised Visa’s Integrity Risk Program (which replaced its Brand Protection Program) and Mastercard’s Business Risk Assessment and Mitigation Program; the programs’ titles illustrate their focus on brand protection and reputational risk (Mastercard, 2023b; Visa, 2023).
Beyond risks to their reputation, credit card network executives are also concerned with increased legal liability. FOSTA/SESTA legislation passed in the United States in 2018 increased the legal risks for credit card networks by making them liable for potential involvement in criminal or civil litigation related to user-generated content. FOSTA/SESTA makes platforms criminally and civilly liable for hosting content facilitating prostitution, thus conflating sex trafficking with prostitution (Albert, 2022; Elkin-Koren, 2020; Gorwa et al., 2020; Tripp, 2019). The law criminalizes communication facilitating prostitution on platforms, not the act itself. Though it does not directly target payment intermediaries, the legislation increases the likelihood of legal action if intermediaries operate on an adult platform because FOSTA/SESTA eliminates immunity from prosecution for entities facilitating prostitution or sex trafficking. One example of how this plays out is the ongoing 2022 civil lawsuit against MindGeek, Pornhub’s former owner, in which Visa was included as a defendant for allegedly profiting from Pornhub’s “trafficking venture” (Fleites v. MindGeek S.A.R.L, 2022: 145). The complaint cited Visa’s supposed knowledge of sexual abuse and trafficking material based on its awareness of video titles and tags and its correspondence with advocacy groups (Fleites v. MindGeek S.A.R.L, 2022: 146). The judge in the case referenced the New York Times article mentioned earlier to argue that “Visa draws the informal boundaries of what types of content are fair game for profit, or fair game for its payment network, the mechanism through which MindGeek earns profit” (Fleites v. MindGeek S.A.R.L, 2022). Visa is thus potentially liable and open to litigation because it has the power to affect Pornhub’s governance, which in turn compounds the reputational risk stemming from civil litigation.
Following the judge’s decision to allow the case to proceed, both Visa and Mastercard pulled their financial services from MindGeek’s advertising program, having already ceased processing customers’ cards.
Despite the reputational risk and legal liability, the credit card networks have no substantial interest in cutting off the entire adult industry from processing. According to Christopher, a payment processor representative, “Visa is a business, as is Mastercard. They make money from this space, from transactions. . . . Why would [they] want to switch that off?” Globally, Visa and Mastercard account for 90% of credit and debit card transactions (Krauskopf, 2020). Although precise figures are unavailable, Joshua, an industry lawyer representing major adult platforms, contended that these statistics were similar for the adult industry. Instead, they have stipulated rules on content, content moderation, and verification, driven by commercial interests, reputational concerns, and legal liability.
While credit card networks establish the rules and requirements, processors specializing in the adult industry are responsible for ensuring compliance. Processing for the adult industry carries reputational risks and potential legal liabilities, necessitating heightened compliance requirements and efforts. This in turn necessitates and sustains the high-risk category. Although the high-risk category is, on paper, based on higher chargeback rates, during my interviews industry and payment processor representatives alike pushed back on the categorization of the adult industry as high risk, because typical chargeback rates for adult platforms such as webcams are lower than for other industries categorized as high risk. While scholars researching the payment industry have underlined the role of processors in having to fight the “friendly fraud” of disgruntled customers who ask for chargebacks (Swartz, 2020), other scholars have highlighted social stigma as a key factor behind the high-risk categorization (Beebe, 2022: 144–145; Easterbrook-Smith, 2022; Stardust et al., 2023). The categorization may stem not only from the number of chargebacks but also, as Florin, a payment processor representative, stated, because of:
the way you conceive of the thing. It’s like people see [it] not [as] strange, but [as] unusual to market services related to adult [entertainment]. So that’s why it’s considered high risk. But . . . in my opinion, it’s not high risk.
Another payment processor representative, Natalia, argued “why do I think it’s still considered high risk? Strictly because of the social stigmatization that this industry has.”
The classification produces, and makes financially viable, a distinct role for payment processors, not least in ensuring compliance. A high-risk yet legal industry may offer a business opportunity for processors (Swartz, 2020).
The high-risk category for the adult industry includes inflated fees: CCbill lists processing costs for adult businesses at 10.8%–14.5%, in comparison with 5.9% for other ventures classified as high risk and 1.5%–2.9% + €0.30 per transaction for major non-adult processors (CCBill, 2022; Stripe, 2023). Furthermore, credit card networks collect an adult-specific annual fee of €1000. Andras, a webcam platform representative, said, “Who is splitting [the money]? Everybody is splitting. Visa, Mastercard is getting more. The bank is getting more. Of course, Segpay, the payment processing [company], is getting more.” Florin, the payment processor representative, argued that a portion of the fees is required to make it financially worthwhile to work with adult businesses. Another part is necessary to pay for the increased costs of compliance and disputing chargebacks. Major processors, such as Segpay, invest these inflated fees in customer service and in the labor involved in ensuring good relationships with banks and credit card networks. As Christopher, a payment processor representative, maintained:
We represent the industry. We serve the industry. Our clients are in the industry. We see ourselves as part of the family, an important, integral part of that connection between the adult industry and the outside world of payments.
Adult-specific processors are integral to providing adult platforms with financial services and are a vital part of the industry as mediators and enforcers of compliance. Even though they charge high fees, representatives of major payment processors such as Segpay and Vendo see their role as one of helping the adult industry. They are important and welcomed participants at the XBIZ industry conferences, where they sit on panels and sponsor events. Segpay’s CEO even serves on the board of the Free Speech Coalition, the US adult trade association. While the adult industry welcomes the major adult processors as important players and allies in keeping the industry thriving, industry members criticize Visa and Mastercard for unfairly regulating the industry, as I will demonstrate below.
“The unaccountable masters of our fate”: (un)accountability and opacity
Mastercard and Visa can dictate and set rules for platforms on the type of content that can be hosted and streamed, and on what terms. These rules and standards make duopolistic credit card networks private, top-down regulators of the adult industry, for which processors become mediators and enforcers. As industry lawyer Joshua put it, Visa and Mastercard “have the power, they control the keys to the Golden City.” Adult platforms are dependent on payment intermediaries for vital financial infrastructure and thus have to comply with these rules.
This creates de facto regulations set up and implemented in a way that is not accountable to the adult industry. Across the industry, there is a consensus that, as industry lawyer Joshua noted, “Visa and Mastercard acted because they were faced with a tremendous amount of pressure,” referring to the reputational risk and potential legal liability that the outcry against Mastercard and Visa’s processing for Pornhub posed in the wake of the New York Times article. Mastercard stated that in the process of drafting the rules they “shared plans with industry experts who are certified in preventing child exploitation” (Cole, 2023). Some of these organizations are explicitly anti-pornography. 2 However, the executives did not substantially involve people from the adult industry in drafting the rules. Mastercard met once with industry representatives, but only after the new requirements had been published, and according to trade association representative Olivia, “it took a lot to get that meeting.” In the aftermath of putting the new rules into force, Mastercard faced criticism from sex workers and their allies for discriminating against them. Notably, the American Civil Liberties Union filed a complaint against Mastercard with the Federal Trade Commission in August 2023, arguing that its requirements discriminate against sex workers (Holston-Zannell, 2023). In response, Mastercard defended its position, stating, “we welcome dialogue and different perspectives about our policies and programs. But let’s be clear—allegations of bias against adult content creators are demonstrably untrue” (Cole, 2023). This stance suggests that while Mastercard has included voices from anti-pornography groups, it is less inclined to address concerns raised by sex workers and the adult industry.
As private actors, the credit card networks create a regulatory framework through their rules and standards without a decision-making process that reflects the interests of different stakeholders. Credit card networks do meet with processors for adult merchants, which are in direct contact with the platforms. “You know, none of these rules are created in ivory towers in isolation of the industry,” said Christopher, the payment processor representative. However, this means only a subset of the industry, particularly the large platforms, has the opportunity to voice its concerns through payment processors. As Olivia argued:
They’re kind of like the regulators that we have no ability to influence. If these rules came down from a legislative body, one could work with that body. You could protest publicly, and it may matter. There’s no democratic process for when Visa or Mastercard decides to change the rules.
Many interviewees expressed strong criticism of the lack of accountability in the decision-making process, highlighting that Visa and Mastercard operate as private entities rather than governmental regulators. As Ava, an activist and performer, stated, “this is not even government regulation; this is a private company,” while Andras, a representative from a webcam platform, pointed out that their rules are driven by “commercial interests.”
According to industry and platform representatives and lawyers, the rules have a repressive character that leads to an environment of uncertainty for both processors and platforms. Payment processors are accountable to the credit card networks and the banks for compliance and have the responsibility to the merchants to keep the money flowing. Under the Mastercard and Visa requirements, payment processors are prohibited from processing transactions for merchants that fail to comply. Yet, interpreting the rules can be difficult. At industry conference panels, representatives of payment processors such as Segpay and Vendo explained that the current rules the credit card networks impose are complicated to understand and enforce.3 Even though processors have frequent meetings with Mastercard and Visa to ensure compliance, payment processor representatives stated that, a week before the new requirements took effect, Mastercard had still not specified how they should be met. As Olivia, the trade association representative, stated, “These rules aren’t well defined. And so that can be really tricky.”
The credit card rules are vague yet punitive: non-compliance leads to heavy fines and the risk of losing payment processing. Platforms thus must maintain compliant relationships with processors to keep their payment infrastructure intact. As Joshua argued, echoing what multiple industry lawyers said in my interviews with them, for platforms, “the majority of your income is derived from Visa and Mastercard [so] that’s what you have to protect. You can’t make money. You’re not going to stay in business.” Andras, the webcam platform representative, stated, “Visa, Mastercard: you just need to comply with them, there is just no working around.” This regulatory process prioritizes the relationships between platforms and payment intermediaries and their interests in the development and enforcement of the rules. In the following section, I explore how this plays out in terms of compliance with moderating allowable content.
“A game of broken telephone”: defining and moderating prohibited content
The rules on prohibited content, set up by credit card networks and interpreted by processors, illustrate the layers, stakes, and contestations in the enforcement of payment intermediaries’ rules in practice. As mentioned, credit card networks set requirements on prohibited content. The 2021 requirements state that their aim is to “monitor, block and remove unlawful content,” which is not just “illegal adult content” but also “unauthorized adult content.” Mastercard (2023a) defines “unlawful content” as follows in their Standards:
The sale of a product or service, including an image, which is patently offensive and lacks serious artistic value (such as, by way of example and not limitation, images of nonconsensual sexual behavior, sexual exploitation of a minor, nonconsensual mutilation of a person or body part, and bestiality), or any other material that the Corporation deems unacceptable to sell in connection with a Mark (p. 118).
Moreover, the 2021 rules specific to adult merchants further restrict the boundaries of allowable content by tightening permissible terminology for marketing content and the use of search terms and increasing moderation and compliance requirements. Credit card networks thus broadly define prohibited content in a way that conflates illegal activities, such as sexual exploitation of minors, with credit cards’ brand safety.
In turn, processors set up their own terms and conditions based on compliance with credit card rules, the laws in the jurisdictions in which they work, and banking regulations. They interpret the limits of prohibited content based on their own risk tolerance and their past experiences interpreting credit card networks’ rules. For example, CCBill (2021) defines prohibited content as
“posting or display of any image or wording depicting or related to extreme violence, incest, snuff, scat or the elimination of any bodily waste on another person, mutilation, or rape anywhere on the site in a sexual or erotic manner.”
Segpay specifies that bestiality includes content depicting “acts with non-human creatures (aliens, mythological creatures, etc.)” (Segpay, n.d.). Most content is legal in both the European Union and the United States and permissible under the guidelines set by the credit card networks. Specific content, such as CSAM or bestiality, is both illegal and prohibited in content guidelines. However, some content is legal, yet prohibited by these guidelines, as I will show.
It can be challenging for processors to interpret which content is prohibited because credit card networks do not clearly define prohibited content. Christopher, the payment processor representative, describes the process as a game of broken telephone. For example, Visa and Mastercard prohibit the use of the term “force” because non-consensual content is illegal. But, as Christopher explained:
I recently had a discussion around the word “force,” and it’s a prohibited word from a list that Visa, Mastercard use. But again, common sense prevails here. In my mind, it depends on the context. If I’m demanding force in order to have sex with somebody and have that part of the content, that’s prohibited. That’s not acceptable. But if I’m saying, “May the force be with you.” Then, that’s a different context.
Visa and Mastercard define prohibited content and banned words, with no concern for meaning and context. The processors need to interpret and decide on their meaning in specific contexts, using “common sense.” However, such interpretations lead to ambiguity regarding what content and terminology are permissible.
Processors instruct platforms to “self-police” to maintain “healthy relationships” with payment intermediaries (Corona, 2023). Contending with processors’ terms and conditions, platforms strategically navigate the fine line of determining permissible content and implementing moderation that satisfies processors, banks, and credit card networks, all while avoiding overcensoring profitable content. Although platforms are risk-averse when facing the potential loss of payment processing, they also have a financial interest in not excessively restricting the content from which they profit. As Chloé, a platform representative and former performer turned coach for performers, said, “The sites are also playing a little bit of a game of don’t get caught or don’t get slapped by these super vague rules that have been handed down from the processors saying what is and is not allowed.” Platforms may therefore define terms and conditions and moderate content in an intentionally vague manner.
At the same time, when processors identify existing violations or notify platforms of new prohibited content, the platforms need to comply swiftly. Processors, credit card networks, and banks typically employ third-party entities like G2 and Webshield4 to conduct thorough scans of platforms for prohibited content to ensure compliance (Vendo, n.d.). As Chloé noted:
But rules change and admin people have to kind of interpret it as they will. And then if they get scared, say that they get an email from someone saying, we’re cracking down on regulations, they’re like: “Oooh my God.”
Platforms may scrub sites of prohibited content to avoid payment shutdown. According to my interviewees, enforcement of prohibitions on content has intensified since the introduction of the 2021 requirements. Consequently, vagueness across the chain of command combined with punitive measures renders the specific boundaries of prohibited content both ambiguous and susceptible to change.
This process results in the prohibition of legal content. Returning to the example of the term “force,” content from individual creators may be flagged and deleted when the word “force” is used in a scene or describes it, even when no non-consensual force is used. An illustrative case is the “hypnosis kink,” widely acted out by financial dominatrixes. As Charlotte, a subscription platform representative, stated: “Because their [payment intermediaries’] compliance teams have deemed that this is non-consensual, it is therefore banned.” Chloé clarified that with the implementation of Mastercard’s requirements, processors promptly communicated via email to notify the platform of the newly imposed prohibition on such content: “And then all things that had to do with hypnosis were taken down.” However, Charlotte noted, “In reality, most of the hypnosis content is literally just silly stuff, like putting a swirl over your boobs and being hypnotized by my boobs.”
Moreover, platforms increasingly use AI-powered technological services to enforce ambiguous yet stringent rules regarding prohibited content, but without consideration for or safeguarding against the consequences. This is propelled by credit card networks’ requirements that encourage the use of automated tools to review content prior to publication and moderate live streams, as well as third-party solutions to validate government identification cards, which revolve around biometric data. Payment processors underline the importance of using AI to comply with the rules (Corona, 2023). While there are variations among platforms, industry lawyers and platform representatives stated that many already employed different forms of moderation. For example, some platforms use systems that take screenshots every 4–5 minutes and scan them for prohibited content using AI, after which a human being reviews the flagged content. Leading AI-powered content moderation and verification services now advertise on their websites that their services are necessary to comply with Mastercard requirements (Airis: protect, n.d.; VerifyMyContent, 2023).
Platforms’ introduction of AI-powered moderation tools reinforces selective over-moderation. While such tools may be valuable for scalable moderation and verification, ample research also points to possible algorithmic injustices and obscuring of decision-making processes (Elkin-Koren, 2020; Gorwa et al., 2020) that disproportionately affect marginalized communities (Haimson et al., 2021; Siapera, 2022). Vague, opaque terms and conditions reinforce disparate and unequal moderation (Quintais et al., 2023). As Mila, a content creator and educator for online sex workers, stated, “You’re having whatever their software is being put in place to decide if something violates or doesn’t. And because they’re trying to err on the side of safety, they’re over-policing instead of under.” The implications of automated content moderation crystallize in payment processors’ warning that pets wandering through performances constitute prohibited content (Corona, 2023), as this could be flagged as “bestiality.”
It is important to keep in mind that user-generated platforms like webcamming or subscription-based fan platforms are workplaces for sex workers. Sex workers earn their income directly from sales made on the platform, so banning content or suspending livestreams jeopardizes their ability to earn a living. At the same time, adult platforms deny that the labor performed on the platform is work (Stegeman, 2024). The regulatory process set out in this article illustrates that sex workers’ interests are not represented throughout the process of developing and enforcing the rules. While it is beyond the scope of this article to develop these effects in full, research on the Mastercard rules from 2021 shows that their effects on sex workers are devastating: 90% of them experienced adverse effects, unequally borne by queer, Black, kink, and fat sex workers (Webber, 2024).
Conclusion
This article demonstrates how the platformized adult industry responds to and navigates the rules, terms, conditions, and guidelines developed by the payment intermediaries, and with what effects for the regulation of adult platforms. While contending with valid concerns regarding moderation and verification of user-generated content in a context where traditional regulation has not always been adequate, payment intermediaries have developed a process that is top-down and largely unaccountable to the industry it de facto regulates. In this process, credit card networks, processors, and platforms institute a selective, private ordering and moderation of permissible content that exceeds legal limitations and is instead based on their different commercial interests. Credit card networks need to mitigate reputational risk and increased legal liability and do so by setting rules and guidelines specific to adult platforms. Payment processors and platforms interpret and implement the rules set in ways that favor platforms’ interests. The amalgamation of vague definitions credit card networks set to safeguard brand safety, processors’ risk-averse interpretations of those definitions, and adult platforms’ interests in protecting their relationship with payment intermediaries, all the while finding leeway in compliance, creates an opaque and punitive regulatory framework.
Sex workers’ interests are not reflected in the regulatory process even though the livelihoods and labor practices of sex workers are directly impacted by platform governance, and, as this article demonstrates, therefore also by payment intermediaries. Emphasizing the importance of creating labor platforms that prioritize the needs and interests of workers over profits (Scholz, 2023), sex workers have been advocating for worker-led and worker-centered platforms (West and Thornhill, 2023). A platform centered on workers could help mitigate the impact of payment intermediaries’ rules by prioritizing workers’ interests in their implementation and enforcement, while also conveying these priorities in its communication with payment processors. Still, given that the payment infrastructure forms the “trunk” of the platform ecosystem and affects all adult platforms situated in the “branches,” changes on a platform level can only mitigate, not change, the effects set by payment intermediaries.
This article ultimately highlights the need to investigate payment intermediaries as regulators of online sexual commerce, and more generally as key actors in platform governance, so as to hold them accountable. The power that Visa and Mastercard wield as large private corporations over which types of sexual content may be sold on adult platforms and on what terms is a case in point. In addressing the payment intermediaries’ role in the blatant deplatforming of sexual content, some scholars have emphasized the need to move beyond a narrow focus on the legal concept of freedom of expression to evaluate the moderation of online sexual content (Hill, 2024; Stardust et al., 2023). Hill (2024) calls for a more comprehensive approach that considers the responsibilities of intermediaries in the context of workers’ livelihoods, alongside regulatory frameworks grounded in sex positivity, consent, and choice. Building on this call, I also emphasize the need to shift accountability in the regulatory process, both toward the industry as a whole and more specifically toward workers, to ensure that regulatory principles, such as “consent,” and decisions are informed by workers’ lived experiences and interests. This would prevent such principles from being broad, underdeveloped standards that primarily serve to protect corporate reputations and minimize legal risks, and from being inconsistently enforced to safeguard platform profits. This commitment points to the need to further research the comprehensive effects of this regulatory framework on online sex workers, both in terms of labor practices and precarities, as well as in terms of what types of sexual expression, identities, and practices are prohibited and why.
Acknowledgements
The author extends her gratitude to all participants in this study, who took the time to generously share their expertise. I would also like to thank the people with sex work experience who have shaped my thinking by informally sharing their experiences with me. I am grateful to Olav Velthuis, Thomas Poell, Hanne Stegeman, Emilija Jokubauskaitė, and the anonymous reviewers for their constructive and helpful feedback.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by Sociale en Geesteswetenschappen, NWO [Grant Number 406.DI.19.035].
