Abstract
To mediate the tensions between state regulation and content-creator incentives, Chinese social media platforms have developed a distinctive practice of using platform official accounts to communicate their rules to the creator community. These accounts anthropomorphize platforms, enabling platforms to represent their regulatory bodies using fictional human characters or animated figures. The phenomenon of platform anthropomorphization in the Chinese context stems from a different ontological understanding of platform governance. The first part of this article discusses the logic of platform governance in China, and highlights a different state–platform relationship in comparison to the US and European countries. In the second part, the article turns to focus on Douyin as a case study to further investigate how Chinese social media platforms establish rules and govern content creators. By analysing Douyin's public-facing policy documents and its platform official account, Douyin Safety Centre, we reveal a mechanism of playful governance.
Introduction
Social media platforms often claim to be neutral in their role as aggregators that connect content creators and end-users, justifying their infrastructural or algorithmic changes as providing better services (Gillespie, 2018a). Nonetheless, digital media studies scholars have extensively discussed how changes in social media platforms’ technological architecture, algorithms, and policies can impact users’ practices (Ruberg, 2021; van Dijck, 2013). Content creators are, without doubt, governed by platforms in the sense of following the rules and boundaries created and imposed by platforms. Western platforms’ content curation and/or moderation process often relies on automation systems, which allows platforms to put on a mechanical face and downplay the factor of human intervention (Gorwa et al., 2020; West, 2018). Because social media platforms enact automated control and offer limited opportunity for human interaction in their content moderation systems, content creators grow increasingly frustrated. Accordingly, West (2018: 4381) calls for ‘considering what a content moderation system designed to educate and engage with users as community members might look like’.
In stark contrast with Western platforms’ efforts to position themselves as neutral conduits, Chinese social media platforms appear to blatantly admit and showcase human intervention in the process of content regulation and moderation. One of the most prominent social media platforms in China, Douyin, for instance, has created a 平台官方账号 (platform official account), namely, 抖音安全中心 (Douyin Safety Centre, henceforth DSC). Through DSC, Douyin regularly announces its updated rules and policies in alignment with the state's regulatory actions. In addition, Douyin directly engages with users and content creators, posting short videos on this account. Other Chinese social media platforms, including Kuaishou, also created similar platform official accounts. These accounts anthropomorphize platforms, enabling platforms to represent their regulatory bodies with fictional human characters or animated figures. These accounts are part of a contemporary trend of brands and entities increasingly engaging with personification as a tactic to interact with other users on social media (Sligh and Abidin, 2023). In this research, however, we focus on how platform official accounts enact communicative and discursive functions that allow platforms to discipline content creators through anthropomorphization.
This phenomenon of platform anthropomorphization in the Chinese social media landscape stems from a different ontological understanding of platform governance in China. The first part of this article discusses the realities of platform governance in China, and highlights a different state–platform–user relationship in comparison to the US and European countries. In the second part, the article’s focus turns to Douyin as a case study to further investigate how Chinese social media platforms establish and promote their own rules and policies, and govern content creators. For this inquiry, we conducted a thematic analysis of four of Douyin's public-facing policy documents that set rules for users and content creators. By doing this, we reveal how Douyin adopts a logic of governing content creators, in contrast to the approach of governing content that Western social platforms such as Instagram and YouTube undertake. Following this analysis, we collected 132 short videos posted by DSC, and conducted a discourse analysis on the selected materials to reveal how Douyin relies on anthropomorphization to achieve communicative and discursive functions for regulation. This process is conceptualized as a playful governance mechanism, which helps the platform deliver legal and moral commands to content creators, and optimize a self-disciplinary content creation process.
Contextualizing platform governance in China
In the realm of cultural production, platforms play a crucial role in affecting the process of content creation, distribution, and/or monetization, bringing forth economic, governmental, and infrastructural changes to the operation of cultural industries (Poell et al., 2022). Platforms provide economic and infrastructural conditions for cultural production constituting multi-sided markets, while they constantly interact with various stakeholders/actors, including governments, industries, content creators, and end-users. These interactions are structured within ‘the layers of governance relationships’ (Gorwa, 2019: 855), underlying the foundation for discussing and conceptualizing platform governance. Nonetheless, academics often discuss the two sets of governance relationships – between governments and platforms, and between platforms and users – separately.
Yet, these governance relationships are intertwined. One strand of scholarship focuses on the aspect of ‘governance of platforms’, viewing platforms as private companies whose conduct is directly informed by local, national, and supranational laws and policies (Gillespie, 2018b; Gorwa, 2019). This approach is often used to typify the Chinese situation. However, legislation often takes a long time to go into effect, enabling platforms to take actions for specific interventions by changing terms of service and policies, resulting in a mode of self-governance in relation to the platform's users (Gorwa, 2019). This self-governance approach, where the state plays a reduced role, has been emphasized as typifying internet governance in the US and Western European countries (Helberger et al., 2018; Stockmann, 2023). It grants a high degree of autonomy and flexibility to platforms in practice (Stockmann, 2023), yet there is a lack of research focusing on exactly how platforms practise self-regulation amid pressures for policy change.
The reason social media platforms self-regulate is not simply to meet state commands or legal requirements but also to safeguard (certain) users from hate speech or harassment and create a so-called ‘healthy’ community, to maintain their appeal to new users, advertisers and commercial partners, as well as the public (Gillespie, 2018a; Roberts, 2019). Hence, another strand of platform governance scholarship attends to the aspect of ‘governance by platforms’, focusing on how platforms actively take up the role of ‘governor’ to police the activities of content creators and users (Gillespie, 2018b). Existing research has revealed how social media platforms design their content policies to discursively prompt content creators to act according to their business models (Caplan and Gillespie, 2020; Petre et al., 2019). In addition, platforms justify and frame their interventions in content circulation as following community values and safety principles but avoid addressing their own responsibilities and accountability (Gillett et al., 2022; Scharlach et al., 2023). Platforms rely on algorithmic/automatic systems to classify, monitor, and remove content, and render the issue of transparency as one of technical measures or errors (Cotter, 2023; Gorwa et al., 2020). Nonetheless, Roberts (2019: 34) reminds us that ‘the complex process of sorting user-uploaded material into either the acceptable or the rejected pile is far beyond the capabilities of software or algorithms alone’. Therefore, social media platforms extensively hire or collaborate with commercial content moderators. While commercial content moderation has become a global industry, the existence of human moderators is never made explicit by the prominent social media platforms: they are treated as ‘hidden custodians’ (Gillespie, 2018a). In these ways, platforms attempt to take an ‘objective’ stand and downplay their power in defining and promoting certain economic and moral values.
Instead of having content moderation rules and procedures ‘coalesced into functioning technical and instructional systems’ and pushing them into the background (Gillespie, 2018a: 6), Chinese social media platforms tend to show more human intervention in the process of content moderation (Chen and Yang, 2023). In the case of Douyin, it performatively presents rules and procedures to users and creators through its platform official account. This interesting operation reflects Chinese social media platforms’ unique understandings of their political and social role in mediating state regulation and individual creator incentives.
The interconnectedness between the legal regime, the platform and the user is also dealt with by the emerging co-governance approach favoured by the European Union (EU), as reflected in the recent deliberation of the Digital Services Act (Stancil, 2022). The co-governance approach engages multi-stakeholders in the regulatory process to ensure that platforms, users, and governments take on cooperative responsibilities (Helberger et al., 2018; Gorwa, 2019; Stockmann, 2023). In this approach, the state no longer plays a reduced role, and platforms – as digital intermediaries – are allocated increased obligations under the legal regime of the EU (Leerssen, 2023). Hence, platforms need to navigate the laws and regulations, and the social and moral responsibilities that civic societies lay on them. Platform governance in China shows some similarities with this co-governance approach, but the Chinese logic of governance differs from the EU approach in terms of its legitimation and operations.
Understanding China's state–platform–user relationship with a different ontology
Hence, there is a need for a different ontological understanding of the state–platform–user configuration in the Chinese context. This state–platform–user configuration is far beyond the simplified premise whereby the Chinese state enacts direct censorship and control. Instead of solely looking at the state's role, we need to pay attention to platforms: how Chinese social media platforms developed in the context of omnipresent government regulation, and how they can expand our thinking about the platform society more generally (Davis and Xiao, 2021; de Kloet et al., 2019). For Chinese platforms, the precondition for accomplishing their commercial goals is not to antagonize the state (Chen and Yang, 2023; Chen et al., 2021; Wang and Lobato, 2019). Yang (2021) uses the term ‘state-sponsored platformization’ to describe the complexity behind the central and vital role that the party-state plays in shaping the development of commercial social media platforms in China. However, state-sponsored platformization does not necessarily serve the state's political goals, as the state's influence is not imposed in a top-down model or through a command-and-control method (Yang, 2021).
To unravel the state–platform–user configuration, we first need to understand that the relationship between the party-state and social media platforms is embedded within an institutional logic of governance in China. In an important recent work, The Logic of Governance in China: An Organizational Approach, Zhou (2022: 26) offers a detailed account of how the Chinese configuration of governance manifests through ‘the relationship between the central and local governments’ as well as through ‘the interaction between state and society, as mediated by government bureaucracies’. Here, the societal response to state governance includes network ties and informal institutions perpetuated in market relations and financial resource mobilization accompanying the state organizational apparatus (Zhou, 2022). He suggests that the coexistence of the central authority's arbitrary power and bureaucratic power based on administrative positions, rules, and procedures is fundamental (2022: 9) to the formation of an institutional logic of governance in China. Using a Weberian lens, Zhou (2022: 42) argues that the logic of governance in China contrasts with Weber's view on a hierarchically structured bureaucracy that is validated by clear lines of authority and rational rules. Instead, the party-state dominated governance in China is supported by the intertwining, and sometimes contradictory, legal-rational legitimacy and charismatic authority. ‘The charismatic authority relies on officials at different levels to transmit and implement its directives and instructions’ (Zhou, 2022: 47), which often take the shape of abstract rules and ideological values.
Applying Zhou’s (2022) theorization to platform governance in China, we see a significant similarity: the central government and the Cyberspace Administration of China hold arbitrary power in determining encompassing internet regulation (Miao and Lei, 2016). Meanwhile, platform companies are granted autonomy and rights to develop informal practices, coexisting with and complementing formal institutions, ranging from local Internet Information Offices to many other regulatory agencies such as the Ministry of Culture and Tourism and the State Administration for Market Regulation (Xu et al., 2022; Ye, 2023). Chinese platforms have become an integral part of the state's governance structure and they comply with this party-state dominated governance. But platform companies are also commercially driven and have their own agenda for business development. The tensions and interactions between state and private business in developing the digital economy (Hong, 2017) lead the state to adopt campaign-style internet governance to redefine the boundaries of flexibility from time to time.
In practice, platforms developed a set of informal practices and patronage ties to implement state policies or tame undesirable behaviours of users. These platform operations align with local bureaucrats’ logics of government behaviour in Zhou's (2022) work: platforms are in a subordinate position and sensitive to the targets and directives imposed by the central authority, thus they strategically form alliances with the regulatory agencies. For example, in April 2018, the National Office against Pornography and Illegal Publications initiated a meeting with representatives of 18 platforms to discuss content regulation and, in response, livestreaming platforms such as Huya and Douyu organized self-regulation in the following months, resulting in the temporary or permanent shutdown of thousands of user accounts that could be associated with harmful content (Ye, 2023). To some extent, the state, platforms, and users form a configuration in which relationships constantly shift between tight coupling and loose coupling, resembling Zhou's (2022: 105) model of ‘principal–supervisor–agent relations’ as a result of governance.
Because internet laws and regulations issued by the central authority and the multifaceted regulatory agencies are often arbitrary and sometimes feature conflicting interests (Shen, 2016; Miao et al., 2021), platforms need to react with flexibility in their regulatory operations, mediating formal policies and rules into cultural and social norms to effectively govern users. This brings us to the second layer of the state–platform–user configuration, which is the relationship between the platform and the user. Scholars point out that play has long been an integral component of the Chinese internet, where public expression is infused with jokes, jingles, videos, and various forms of visual memes as an idiosyncratic way of responding to strict regulation and content control from the state and the platforms (Xie et al., 2021). Chen and Yang (2023) add that the interests of users and the party-state are not inherently incompatible, hence successful governance requires platforms to find and cultivate common ground between users and the party-state, and proactively facilitate communications between the two sides. To navigate diverse forms of users’ expression, social media platforms adopt the logic of incentive provision and discount factors beyond control, as in bureaucratic settings (Zhou, 2022: 168). On the one hand, platforms turn individual content creators into entrepreneurs who shape their approach to content production in line with the platform's regulations in exchange for commercial success (Hou and Zhang, 2022; Lin and de Kloet, 2019). On the other hand, platforms target proactive users, such as feminist activists, and ‘bomb’ their accounts to limit the visibility of politically sensitive content (Fang and Wu, 2022; Gu and Heemsbergen, 2023). Platforms provide services to users to achieve their own commercial goals while simultaneously mobilizing users not to go against the party-state's political demands (Chen et al., 2021; Xu et al., 2022).
It is within this context that we see the rise of a playful governance mechanism, where social media platforms rely on anthropomorphization to playfully communicate and interact with content creators and skilfully enact governance.
Methods
Understanding how social media platforms respond to and incorporate legal requirements into their decision-making process remains a challenge for researchers and policymakers. One common approach is to conduct critical analysis on platforms’ public-facing policies and notices in order to reveal their cultural politics (Caplan and Gillespie, 2020; Gillett et al., 2022; Ruberg, 2021). In this research, we first selected the four most recently updated public-facing policy documents published by Douyin as research data: 1. 用户服务协议 (Terms of Service); 2. 社区自律公约 (Community Self-regulation Convention); 3. 规则解读 (Rules Explanation); 4. 社区评审团规范 (Community Jury Guidelines). The Terms of Service and the Community Self-regulation Convention are the two most important documents for our investigation. Terms of Service is a legal agreement between users and the platform that allocates the rights and responsibilities of both parties, while the Community Self-regulation Convention functions similarly to what other social media platforms would call ‘Community Guidelines’, outlining how users should interact with each other. The Rules Explanation and Community Jury Guidelines are supplementary materials provided by the platform to further explain its rules and regulatory mechanisms.
Employing a thematic analysis, we analysed these documents in three rounds. First, we coded anything relevant to our research question (open coding). Next, we identified patterns and themes that emerged across these documents, presenting the standpoint of the platform in relation to state regulation, community protection, and/or social responsibility (axial coding). Last, and more specifically, we looked into what types of content are allowed or prohibited, what mechanisms are available for users to report ‘harmful content’ or circumvent hate speech, and what consequences are faced by content creators when violating the rules (selective coding). These three themes – allowed and prohibited content, available mechanisms, and consequences – formed the guiding principles for the discourse analysis of video material posted by DSC (cf. Tonkiss, 1998).
We also collected all 132 short videos posted from June 2021 to August 2022. It was in June 2021 that Douyin's platform anthropomorphization started, quickly developing into a persona, Mei Loufeng, who speaks for Douyin. In August 2022, the persona of Loufeng morphed into ‘Community Jurors’. Our analysis specifically looked into what Foucault (1970) calls ‘procedures to maintain the status quo’. These procedures, internal and external to the discourse, inform us what the current status quo is: what is normal and what is not, and why this is the case; and who is to be taken seriously and who is not (that is, who is the subject). Hence, this part focused on three subsequent questions to distil the discursive practices of Douyin that mobilize playful governance mechanisms: what is governed, who is governed, and who can govern. The first question responds to the aforementioned three themes under regulation, while the latter two investigate the power relations between Douyin and users within the regulatory mechanisms.
Reading Douyin's rules and policies
The four selected public-facing documents of Douyin function both as strategic documents responding to the broader regulatory landscape, and normative documents articulating visions of how users should express themselves and interact with others on the platforms (Scharlach et al., 2023). To register an account on Douyin, the user needs to provide personal data, including real name, national ID number, and contact details. This measure shows Douyin's compliance with China's strict internet laws that require users’ real-name registration (Jiang, 2016). Moreover, similar to other social media platforms that impose verification interventions for users, such as Facebook, this measure reflects an assumption that digital spaces are safer when users create accounts tethered to their real identities (Gillett et al., 2022: 6). Douyin's Terms of Service reflect this government regulation, as the document includes an item stating that if the platform and its affiliated companies were to be penalized by state institutions due to a user's illegal or unconventional behaviour, this user should compensate the platform for all economic loss. In addition, Douyin stipulates that it is not allowed to use an account name, profile image, or description that feigns or suggests another person's identity, neither can the registered real-name account be rented out or transferred to others. In this way, the platform emphasizes the close connection between an individual person and the user of an account.
Chinese internet users have long been using creative and subversive expressions – including metaphors, homophones of sensitive phrases, and sarcasm – to voice their political opinions, which is difficult to detect if only keywords-matching software is in use (Zidani, 2018). For Douyin, the circulation of subversive expressions on the platform could have political consequences, hence it is important for it to address this. In Douyin's newly updated Community Self-regulation Convention, one section was added: ‘We suggest content creators focus on the correct use of language. Avoid using wrongly written characters. Reduce using abbreviation as expression. Comply consciously with the standardized use of language.’ That is to say, Douyin recognizes individual creators as the sources of content production and circulation, and aims to co-opt them to use the correct language in order to better facilitate automatic detection of politically sensitive content. In this section, we reveal how Douyin relies on its policy documents to build an approach towards governing content creators.
First, Douyin's policy documents are dynamic and are subject to regular changes, presenting the platform as a constant mediator between tightening state regulation and the content creators’ activities. Douyin's Community Self-regulation Convention (henceforth, Convention) starts with a vision of the platform as ‘a healthy, harmonious, open, and friendly life-sharing platform’, followed by a statement of ‘deeply knowing the importance for users and the platform to have a regulated, equal, and positive community environment, we created this convention according to relevant laws and regulations’.
Notably, the most recent revisions of Douyin's Terms of Service and the Convention (published on 6 July 2022 and 21 September 2022 respectively) can be seen as responses to the state regulators’ announcement of the Code of Conduct for Livestreamers on 22 June. The Code of Conduct for Livestreamers, introduced by China's National Radio and Television Administration (NRTA) and the Ministry of Culture and Tourism, listed 31 categories of illegal, immoral and undesirable livestreaming activities (NRTA, 2022). These activities and behaviours mentioned in the Code of Conduct were then all incorporated into and elaborated in the Convention under the section ‘平台禁止及不欢迎 (Platform prohibits and discourages)’, covering themes such as national security, sovereignty and dignity, party leadership, ethnic unity, and mis/dis-information. The similarities between these documents demonstrate that one of Douyin's main goals of self-regulation is to comply with the state legal institutions. Additionally, these rules and regulations are reactive and incidental, functioning as a form of patchwork platform governance (Duguay et al., 2020; Gu and Heemsbergen, 2023) dealing with ambivalent issues outside the legal framework. Section 3 in Douyin’s Convention regulates actions and content of ‘侵犯人身权益 (Infringing personal rights)’, while section 4 on ‘违法和不良内容 (Illegal and harmful content)’ singles out infringements that have been heatedly discussed in Chinese society and have caused social unease and moral panic, such as soliciting suicide, animal abuse, and dangerous pranks.
Second, these documents also articulate what kinds of interaction between the platform and users, as well as between users, are promoted and what kinds are prohibited. In the Convention, Douyin writes: ‘We advocate for an equal and friendly Douyin community, for showing respect to other users in the community.’ The use of ‘we’ as a subject, and the personal tone of the language, reflects the platform's attempt to communicate with its users on seemingly equal grounds. What was newly added to Douyin's revised Convention in September 2022 was a section specifically devoted to tackling the issue of online violence. Under this section, Douyin elaborates on online violence, describing four detailed categories: humiliation and trolling; rumours and slander; harassment and blackmail; and 人肉搜索 (Human flesh search), referring to citizens’ voluntary collective participation in a people-powered search to identify a target or reveal the truth (Huang, 2021). It also promotes the platform's technical design, such as privacy settings and the function of 举报 (reporting) for protecting users from online violence. These formulations show how Douyin presents itself as a community gatekeeper to safeguard the common values of care and equality, suggesting that any rule-violation should be seen as the culpability of individual users.
Lastly, Douyin's documents indicate that the platform plays a role as governor, holding individuals accountable for their activities. In the Convention, Douyin states that: ‘we encourage you to document beautiful life and express your true self. Avoid posting false and buzzworthy fake-documentary behaviours, and avoid exaggerated and fake personas.’ The ‘true self’ discourse undertaken by the platform, instead of by content creators themselves as many studies revealed (cf. Duffy and Hund, 2019), is significant because it shows the platform's standpoint of treating the account created on the platform and its data profile as equal to an individual person.
From what Douyin set out in its policy documents there emerges an approach of governing content creators. Though this approach overlaps with what Fang and Wu (2022: 3562) called ‘user-targeted censorship’, as individual users are directly punished for their online activities and punishments are enforced at the platform level, this research chose to focus on the governing effect on content creators instead of on general users.
Governing content creators
The approach of governing content creators affects how Douyin circumvents the issue of legal liability when hosting content. The platform's mechanism for combating copyright violation differs fundamentally from the automated systems used by other mainstream social media platforms. For example, YouTube introduced a Content ID system to identify videos that match the same audio and visual materials as the content submitted by certain platform-approved copyright-owners, and block or remove copyright-infringing content (YouTube Help, 2023). This automated system, where content is subjected to investigation and detection, enables YouTube to maintain its conditional liability while hosting the sheer volume of user-generated videos (Poell et al., 2022: 83). Douyin, however, introduced a system of ‘Community Jury’ in 2022, similar to Bilibili's Disciplinary Committee mechanism (Chen and Yang, 2023). The Community Jury Guidelines explain how it works: ‘Adult users with good records’ are invited to participate in the verdict of copyright-related disputes on the platform. When a copyright-related dispute arises, these invited users can review the case and decide whether it constitutes a copyright violation; once more than 31 votes are collected, and at least 19 of them (over 60%) declare a copyright violation, the account hosting the disputed content is penalized. As required by the EU's legal regime and the deliberation of the Digital Services Act (Leerssen, 2023), YouTube and other Western social media platforms are held accountable for copyright infringement and hence adopt a ‘notice-and-takedown’ approach. Douyin, on the other hand, holds the related account owner/content creators accountable.
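The jury's quorum-and-supermajority rule can be sketched as a simple decision function. This is our reconstruction of the threshold described in the Guidelines, not Douyin's actual implementation; the function name, parameter names, and the reading of the rule as 'quorum of 31 votes plus a share above 60 per cent' are our assumptions.

```python
def jury_upholds_violation(votes_violation: int, votes_total: int) -> bool:
    """Sketch (our reading) of the Community Jury copyright-dispute threshold.

    Assumptions: a verdict requires that at least 31 votes be collected, and
    the dispute is upheld when the share of jurors voting 'violation' exceeds
    60% - 19 of 31 votes is the minimal case the Guidelines cite.
    """
    QUORUM = 31       # minimum number of collected votes before any verdict
    THRESHOLD = 0.6   # share of votes that must declare a violation

    if votes_total < QUORUM:
        return False  # quorum not reached; no penalty is triggered
    return votes_violation / votes_total > THRESHOLD
```

On the Guidelines' own numbers, 19 of 31 votes (about 61 per cent) upholds the violation, while 18 of 31 falls just below the threshold.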
Content creators who violate the rules face several consequences, ranging from a temporary account suspension to a permanent ban from Douyin. In the Rules Explanation section, Douyin communicates its suggestions to rule-violating content creators through educational videos. Each of these videos ends with: ‘If you are a first-time rule-breaker, follow the instructions on screen to enter the creator centre and get informed about the rules. You can unlock your account by participating in a quiz about the rules. If your account is banned permanently, you can only submit an appeal and wait for evaluation by platform personnel.’
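The remediation path the videos describe can be summarized as a small decision rule. The sketch below is ours: the function name and return labels are hypothetical shorthand, not Douyin's terminology, and only the branches stated in the videos and policy documents are represented.

```python
def remediation_path(first_violation: bool, permanently_banned: bool) -> str:
    """Map a rule-violating creator's situation to the remedy Douyin describes."""
    if permanently_banned:
        # A permanent ban allows only an appeal, evaluated by platform personnel.
        return "submit appeal and await evaluation"
    if first_violation:
        # First-time rule-breakers study the rules in the creator centre
        # and unlock their account by passing a quiz.
        return "enter creator centre and pass rules quiz"
    # Remaining cases fall within the stated range of temporary suspension.
    return "temporary account suspension"
```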
However, instead of dismissing rule-violators as ‘bad actors’ and banning their accounts directly, Douyin regards them as ‘astray actors’, who need the platform to (re)inform and educate them about rules and values. The platform, therefore, holds a moral authority over content creators, fostering a dynamic of platform paternalism (Petre et al., 2019). What is more complex and nuanced about Douyin's authority position is how it diverts the state's logic of governance over citizens into the platform space. The party-state has a long tradition of using political education and knowledge exams as key techniques to exert ideological influence on citizens, and even introduced the state-owned platform Xuexi Qianguo to enact a ‘platformized mode of persuasion’, nudging citizens to participate in political learning in a gamified manner (Liang et al., 2021: 1863). Douyin integrates political and moral values into carefully designed educational videos and quizzes, to construct the interrelation between nationalist and patriotic citizens and content creators with ‘positive energy’ (Chen et al., 2021). To maintain access to their cultivated networks, content creators tend to avoid the de-platforming experience (which entails economic loss) and try to reactivate their temporarily suspended accounts by aligning with Douyin's regulation (cf. Hou and Zhang, 2022).
From these examples, it is worth noting that the platform prioritizes controlling content creators over administering data/content. Douyin's policy documents articulate the Foucauldian idea of governmentality as a form of activity aiming to shape, guide, or affect the conduct of people, or ‘the conduct of conduct’ (Gordon, 1991: 2). Governing content creators is not about forcing them to do what the platform wants; rather, it establishes what Foucault and Blasius (1993) called a subtle integration of the structures of coercion into the techniques of the self. Through these techniques of the self, creators conduct themselves in ways that produce desirable outcomes for the platform, turning themselves into the subject of governance.
Interacting with Douyin Safety Centre: towards a mechanism of playful governance
Douyin's strategy of co-opting creators to behave according to its regulations is further amplified by the platform official account (DSC), which anthropomorphizes the public-facing policy documents. Specifically, we look at DSC's anthropomorphization and how this process contributes to DSC's communicative and discursive functions for regulation – what we call a playful governance mechanism. The purpose of playfulness is twofold. First, it ‘connotes a light-hearted tone and something that is intended for amusement rather than to be taken seriously’, marking the autotelic characteristic of content circulating on Douyin (Chen et al., 2021: 111). When enacting regulation, DSC creates content imbued with this playful culture to please the content-creator community. Second, it softens Douyin's political intentions into light-hearted episodes of social incidents and helps the platform dissolve users/citizens’ creative manoeuvres against rules (Xie et al., 2021; Zidani, 2018). In this way, the platform controls political risks in a playful manner. This section further illustrates the disciplinary power of playful governance, which leads content creators to undertake self-discipline.
DSC was created in November 2019. Initially, the account posted announcements and explanations of the platform's rules and policies, but this text-heavy video content (see Figure 1, left) failed to attract the attention of Douyin users and content creators. However, DSC gradually shifted its style of content creation. On 8 June 2021, DSC posted a video featuring two content moderators humorously reacting to and judging some problematic livestreaming performances (see Figure 1, middle). This video generated more than 312,000 ‘likes’, setting the account's record for visibility. From then on, DSC embraced a content curation logic that anthropomorphized the platform, first through a series of posts playfully revealing the work of human content moderators and, later, another series featuring a fictional character, Mei Loufeng, as an embodiment of the platform. The name Mei Loufeng (see Figure 1, right) is a homophone for ‘Mr Not-forgetting-to-ban-any-account’. In most of the posted short videos, he dresses as a 捕快 (captor), a police-officer-like figure in ancient Chinese society.

From left to right, the screenshots show a change of content on DSC.
In terms of what is governed, DSC produced different short videos of content moderators playfully reacting to some banal activities deemed dangerous, harmful, or vulgar.
To effectively communicate the changing rules and regulations, the platform has had to develop an engaging style. This is where the persona of Mei Loufeng comes into play. For instance, DSC posted a video (14 December 2021) addressing men's nudity as a problem of vulgarity, featuring Mei Loufeng talking straight to camera: ‘Let's see who we are going to deal with today – the Anubis dancing boy group! [see Figure 2] Shaking the flesh, performing the fat, and showing vulgarity. The platform will penalize relevant content. Now it is winter; gentlemen, please wear your clothes so as not to catch a cold, and improve your aesthetics when posting content.’

Screenshots of the aforementioned video posted by DSC.
The playful style of these videos enables the platform to package its power over content creators in a light-hearted and amusing way, and in a discourse of care for the community (Ruckenstein and Turunen, 2020; West, 2018). As West's (2018: 4376–77) ‘affective dimensions of content moderation’ make clear, creators often complained about not knowing exactly how they had violated community guidelines, and the lack of channels to communicate with a real person led to frustration. The difference is that DSC relies on its anthropomorphization to clearly, yet playfully, communicate its rules to content creators. This practice demonstrates, to some extent, the platform's transparency in the rule-making process and increases creators’ willingness to align with its rules. As a result, most content creators respond positively to this strategy. For instance, some of the half-naked men whom DSC reminded to ‘put on clothes’ commented under the video, saying that they had deleted the related content, and amusingly apologized for being inconsiderate in posting it. The positivity and compliance of content creators suggest that playfulness creates a common ground for communicating and negotiating with the platform about what (not) to do. These creators, as Xie et al. (2021: 371) put it, ‘played within the rules of the system’ in order to avoid being treated as ‘bad actors’ and suspended from the platform, which would hamper their commercial opportunities.
The discourses of who is governed and who can govern are generated through the co-construction of close affective relationships between DSC and content creators. DSC frequently collaborates with celebrities and wanghong 1 creators to produce playful yet educational short videos, optimizing its content visibility and ensuring creators’ alignment with the rules. For one thing, DSC partakes in the memetic culture of the platform, creating imitation videos based on viral and trending content – a primary means of attaining visibility and sociality on Douyin (as well as TikTok), owing to its technological features (Zulli and Zulli, 2022). For instance, DSC created a video (14 September 2021) with the popular hashtag #qingshuihebeat, which referred to trending content of a middle-aged woman cracking a whip on the ground to create a rhythmical beat. In this video, Mei Loufeng and his colleague announced that some ‘tainted accounts’ were permanently banned from the platform while vibing with the beat, delivering punitive decisions in a fun and popular form.
Additionally, DSC actively seeks interaction with content creators and encourages them to practise surveillance among their peers. The most prominent example is the ‘直播巡检 (Inspection livestreaming)’ sessions organized by DSC, in which Mei Loufeng enters different wanghong creators’ livestreaming channels and interacts with them. The highlights of each livestreaming session are then edited into funny short videos posted on DSC. One of these highlight videos, which received more than 1.5 million ‘likes’, was pinned to the top of the account, becoming the first post users see when they visit DSC's homepage. The video (25 November 2021) shows Mei Loufeng conducting a livestreaming inspection, asking three famous wanghong creators in turn whether they knew or suspected anyone who might easily violate community rules. All three creators responded playfully, turning the question into an entertainment show. The first creator, for instance, gave Mei Loufeng two (account) names and added: ‘they’re both rule-breakingly handsome (帅得违规)! That's a form of rule-violation, right?’ Mei Loufeng laughed at her joke and commented: ‘being rule-breakingly handsome would not cause account suspension’.
It is noteworthy that DSC's first livestreaming inspection was organized the day after the Cyberspace Administration of China issued a regulatory notice on ‘further strengthening regulation of online entertainment celebrities’ on 23 November 2021 (People.cn, 2021). This is by no means a coincidence: the state, platforms, and entertainment industry associations often work in conjunction with each other in drafting and deliberating regulatory notices or measures (Xu et al., 2022; Ye, 2023). In other words, the livestreaming inspections are an adaptive and performative move developed by DSC to appeal to state regulators, demonstrating the platform's close monitoring of what happens within its digital space.
DSC continued to conduct livestreaming inspections regularly from November 2021 to January 2022. During its last livestreaming session, DSC's own account was suspended for showcasing examples of ‘dangerous behaviours that involve scamming’, as explained in a video (21 January 2022) posted shortly afterwards. In the same video, Mei Loufeng introduced and promoted Douyin's AI-assisted safety system, which automatically collects data to evaluate the risk of online scamming and sends notifications to users. Whether this account suspension was accidentally triggered by the automatic detection system or was a carefully staged campaign to promote the AI-assisted safety system, one result is certain: users paid attention to DSC, reacting to this video with more than 163,000 ‘likes’. Through this incident, DSC established an image of itself as equal to other creators on the platform – even as a platform official account, it remains subject to the same regulation as everyone else.
While these examples allow DSC to present the discourse that everyone is subject to the platform's governance, the account also establishes a discourse that ‘everyone can practise governance’. DSC often collaborates with wanghong creators to produce short-video content, usually to promote certain campaigns or introduce new features of the platform. On 28 July 2022, Douyin launched a campaign called ‘人人都是梅楼封 (Everyone can be Mei Loufeng)’. In the campaign video, after the usual opening by Mei Loufeng, Li Shufan, a lawyer and wanghong creator, appeared in the same captor costume, took over from Mei Loufeng, and introduced the platform's campaign against online trolling. In another video (26 August 2022), 10 wanghong creators (including Papi Jiang) appeared in sequence and jointly announced the platform's Community Jury system. These creators, representing ‘good creators’ who produce creative and original content, showed their support for the Community Jury system in dealing with copyright-infringing content and participated as jury members. By setting up the Community Jury system, Douyin hides its fundamental position of authority behind a seemingly fair system of participatory surveillance in which everyone takes part. The jury system adopted by Douyin, as well as by other Chinese platforms such as the food delivery platform Meituan, enables users to feel ‘the fun of passing judgement’ (Yang, 2023). This system enables content creators to participate in the platform's regulatory regime, marking a scene of participatory surveillance (Albrechtslund, 2008). Albrechtslund (2008) argues that participatory surveillance is embedded in the social and playful aspects of social networking sites, allowing users to practise mutual surveillance and build their subjectivity.
DSC's practices of creating playful videos based on viral content and organizing livestreaming inspections to interact with wanghong creators reveal that the platform embeds its disciplinary power in the culture of visibility (Zeng and Kaye, 2022). It relies on trends or wanghong creators’ fame to generate visibility for its rules and norms, while the interactions with DSC, in turn, grant creators a certain degree of visibility and legitimacy. Hence, creators willingly partake in participatory surveillance, feeling a sense of autonomy and empowerment legitimized by the platform (Chen and Yang, 2023). In this way, Douyin optimizes a self-disciplinary content creation process in which creators’ subjectivity is shaped by the platform's monologic and playful discourses.
Conclusion
Building on Zhou's (2022) theorization of the logic of governance based on bureaucratic organizational settings in China, we articulate the symbiosis and isomorphism between state–platform and platform–creator relations under the topic of platform governance. Our analysis of Douyin's policy documents and its official account, Douyin Safety Centre, sheds light on the phenomenon of platform anthropomorphization as a means of playful governance. By explaining Douyin's playful governance mechanism, we bring out the nuances of how these playful discourses contribute to the formation of multiple regimes of governmentality in China: governance is an activity that concerns relations between Douyin as a commercial platform and the party-state that holds centralized power, and relations between Douyin as a social and cultural institution and the creator community. In these entanglements, power is ‘never a fixed and closed regime, but rather an endless and open strategic game’ (Gordon, 1991: 5).
What lies at the core of this strategic game is a different ontological understanding of the political, economic, and moral position that social media platforms occupy in China, and how they approach human users/content creators as the subject of governance, nudging creators to join the game and play within their rules. We recognize the limitations of this research in explaining how creators perceive the playful governance mechanism. Future research could advance our analysis by studying the contestation and negotiation of content creators facing this state–platform symbiosis, and the micro-powers operating between content creators and their peers.
Acknowledgments
The authors wish to thank the editors of this special issue for their editorial guidance, the anonymous reviewers for their invaluable feedback, and the participants at the Global Perspective of Platforms and Cultural Production workshop for their support and help.
Funding
The authors received no financial support for the research, authorship, and/or publication of this article.
Author biographies
Zhen Ye is a PhD candidate in the Department of Media and Communication at Erasmus University Rotterdam. Her PhD research project studies the cultural production practices in China’s livestreaming industry.
Qian Huang is an assistant professor in the Centre for Media and Journalism Studies at University of Groningen. Her research interests and expertise include digital vigilantism, digital harms, creator culture, and digital culture in general.
Tonny Krijnen is an associate professor in media studies at the Erasmus University Rotterdam. Her research interests lie with popular television, morality, gender, and qualitative research methods. Her current research concentrates on big data and TV production, gendered morality and emotions, and transnationalization of the TV industry.
