Abstract
Based on in-depth interviews with 18 Chinese TikTok account nurturers and three months of online ethnographic observation, this study reveals how platform algorithms reconstruct users' physical bodies. The research shows that users seeking platform visibility “voluntarily” accept systematic algorithmic modifications to their bodies. Users: (1) manipulate their physiological rhythms, forming an “algorithmic clock”; (2) correct body postures, developing highly precise “movement patterns”; and (3) utilize painful bodily experiences to create “suffering capital.” I conceptualize these findings as the “algorithmicized body.” These practices present a significant paradox. Users willingly accept and intensify their alienation while pursuing autonomy. They instrumentalize basic physiological functions as traffic resources. By analyzing this microprocess of self-exploitation, the research reveals how platform algorithms not only control behavior as disciplinary devices but also penetrate and reshape users’ physiological foundations and perceptual systems. The study extends disciplinary theory to algorithmic environments and offers important insights for platform design ethics and user rights protection.
Introduction
The era of the algorithm has profoundly reshaped the interactive relationship between humans and their bodies (Crawford, 2021; Lupton, 2014, 2016; Zuboff, 2019). This process is particularly evident on China's short-video platforms—a competitive ecosystem with more than one billion users. Here, users make unremitting strategic efforts to obtain algorithmic visibility. Although this echoes the global “visibility game” (Cotter, 2019; O’Meara, 2019), the Chinese practice is called “account nurturing”—a distinctive and intensified phenomenon.
The core of account nurturing is a set of well-designed and often high-intensity strategies through which users carefully manage their content-publishing schedules and interaction patterns.¹ The main purpose is to improve the account's perceived “weight” within the initial “Liuliang Chi” (traffic pool)² and to display high activity, signaling to the platform algorithm that the account belongs to a real, active human user rather than an automated bot or a dormant account. This distinction is crucial because the algorithm is designed to prioritize and amplify content from what it considers “real” sources.³ The ultimate goal of account nurturing is therefore to obtain coveted algorithmic recommendations and significantly improve traffic exposure (Zhang, 2023), the key prerequisite for gaining visibility and influence on these platforms. It is crucial to note that while platform officials deny the existence of account nurturing, insisting instead that visibility is driven by content quality or paid promotion tools like DOU+, these practices are nonetheless accepted as truths within the community.
The practice of “account nurturing” is a basic component of so-called account operation. It is widely regarded as an essential step by aspiring creators and has become a routine procedure for a wide range of users, from independent individuals to talent signed to large multi-channel networks (MCNs). For new accounts, the initial registration period is a high-risk “novice training camp” stage, which may last from as little as 7–15 days to as long as one to six months. However, once an account gains influence, this labor does not stop. For established creators, nurturing work turns into a continuous effort to maintain the “weight” of their accounts and their relevance in the face of ever-changing algorithms. This intense investment, whether in the initial sprint toward visibility or in the endless marathon of maintaining it, is a defining feature of account nurturing. Account nurturers often nurture not just one account but multiple accounts simultaneously. This raises a key question largely ignored in the wider literature on algorithmic games: What is the physical cost of this relentless pursuit of the algorithm?
Although existing research emphasizes the body as a field of power in the platform economy (Bucher, 2017; Duggan et al., 2020), it has paid little attention to the concrete bodily costs that users incur in pursuing algorithmic visibility.
This study investigates how users enact self-exploitation in the process of account nurturing. I conducted in-depth interviews with 18 account nurturers on Chinese TikTok,⁴ complemented by three months of online ethnographic observation. I focus on “self-harming” behaviors, such as inverted sleep schedules, disordered biological clocks, and self-imposed pain. These reveal how users turn their bodies into platform “fuel” while pursuing algorithmic visibility.
I find that users embed their bodies into the algorithmic system. They change their sleeping habits and develop specific operational movements. Some even cultivate a distorted ethos that beautifies self-harm into “culture” (Cheney-Lippold, 2017; Lupton, 2014, 2016). I conceptualize these findings as the “algorithmicized body.” This describes how platforms penetrate and manipulate users’ physical behaviors while users remain “voluntary” yet “uninformed” (de Certeau, 1984; Introna, 2016; Schüll, 2016).
These behaviors show that users are profoundly subject to algorithmic power, operating within a state controlled by the platform structure (Gane, 2012; Sewell, 1992). I found an obvious contradiction in user subjectivity: while pursuing freedom and exposure, users actively accept and even intensify “digital slavery” (Lordon, 2014; Zuboff, 2019). To be seen and recognized on the platform, users must abide by the rules of the algorithm (Beer, 2019). As a result, users are not only governed by algorithms but also actively help maintain this control structure.
This study provides an important lens for understanding “body politics” in the era of the algorithm and prompts a rethinking of the ethical issues in platform design.
Literature review
Disciplinary power: From external control to internalized self-discipline
Disciplinary power works by gradually turning external compulsion into self-management. It relies on subtle techniques, such as timetables and posture training, to incrementally normalize the body (Foucault, 1977). Foucault called this “anatomo-politics,” meaning that the body must be rendered both “useful” and “docile.” This micro-discipline not only changes the movements of the body but also slowly internalizes norms. What ultimately changes is not what the body can do but what the body is “molded into” (Crossley, 1996; Foucault, 1977).
As these techniques become more sophisticated, the body becomes the basis for maintaining social order. Control not only affects behavior but also penetrates bodily rhythm and sensory experience (Turner, 2008). The body, therefore, does not passively accept rules but is itself the site where social order is constantly rebuilt (Shilling, 2003). For example, industrial society required the body to keep up with the rhythm of the machine (Thompson, 1967), while consumer society promotes certain aesthetic standards (Featherstone, 2007). These norms leave traces inscribed upon the body itself (Bordo, 2023).
The most powerful feature of this form of power is that people begin to self-monitor and restrain themselves without external supervision (Zuboff, 2019). Through what Foucault called “technologies of the self,” individuals actively manage their bodies and minds in pursuit of an ideal self. As a result, people are both watchers and watched, perpetually in a state of self-examination (Bartky, 2015).
In neoliberal contexts, this self-discipline intensifies into a personal responsibility and moral obligation. Freedom is reframed as a regulated field of action, compelling individuals to address what were once political responsibilities as matters of private choice and personal endeavor. The body becomes the core field of self-realization, and discipline shifts from external pressure to internal drive, leading people to actively discipline themselves (Qiu, 2017).
Algorithmic discipline: Body politics in the platform economy
The platform economy adopts disciplinary methods different from those of traditional organizations. Platform discipline relies not on visible supervisors but on invisible, ubiquitous algorithmic monitoring and information asymmetry, which induce users to discipline themselves (Rosenblat and Stark, 2016; Zuboff, 2019). This monitoring extends beyond work outcomes to the entire labor process (Calo and Rosenblat, 2017). Platform-based discipline can be analyzed from three perspectives.
First, the algorithm acts as a new apparatus of power that realizes comprehensive discipline. The platform economy builds a digital “panopticism” (Mateescu and Elish, 2019) that permeates the labor process, using rating systems and algorithmic decision-making to cultivate workers’ dependence on data. Workers therefore voluntarily pursue the performance standards defined by the platform (Fourcade and Healy, 2017; Woodcock and Johnson, 2018). This system guides workers to adjust themselves through data feedback, creating a more efficient and hidden form of discipline than traditional command-and-punishment mechanisms (Chen, 2018; Moore, 2017).
Second, the algorithm reconstructs labor control by inheriting Taylor's logic to decompose and reorganize labor in real time (Wood et al., 2019). The platform redefines the employment relationship as “partnership” and the salary as “reward,” thus circumventing the labor laws and covering up the extraction of surplus value (Duggan et al., 2020). By mediating power through the so-called objective data, the platform weakens workers’ resistance consciousness (Gandini, 2019; Lei, 2021), although the algorithm design itself embeds specific control logic and value orientation (Griesbach et al., 2019; Shapiro, 2018).
Third, the algorithm achieves fine-grained behavioral discipline by redefining the “ideal body.” On the physical level, platforms regulate the speed, route, and posture of bodily movement (Ravenelle, 2019; Rosenblat and Stark, 2016; Schor et al., 2020). On the performative level, platforms require specific emotional and service attitudes (Ticona and Mateescu, 2018). The uniqueness of this discipline lies in its numerical precision (Griesbach et al., 2019): rating systems evaluate bodily performance such as smile frequency and intonation change, achieving precise management of emotional labor through “emotional digitalization” (Macdonald and Giazitzoglu, 2019).
Existing research has thoroughly examined disciplinary methods and power mechanisms. However, it has not revealed in depth how the “disciplined” adapt to and internalize algorithmic rules through daily bodily practice, nor how these rules translate into specific bodily rhythms and habits. These microscopic bodily practices are the material foundation of digital life, yet they have received insufficient emphasis in current research. Moreover, the particular digital environment of the Chinese mainland, featuring a highly competitive platform market, a massive online population, and complex socioeconomic pressures, has produced a distinctive relationship between body and technology. The practice of account nurturing is not only a techno-practice but also a social and cultural phenomenon with its own logic of survival and value orientation. Current research has not delved deeply into the micro-body politics of this phenomenon.
Methodology
What is account nurturing?
The behavior of “account nurturing” is a local enactment of the academically recognized “visibility game” found across platforms (Bishop, 2019; Cotter, 2019; O’Meara, 2019). Much like their counterparts on Instagram or YouTube, Chinese creators skillfully decode and adjust themselves to complex, opaque algorithms to achieve visibility and influence. Nevertheless, treating account nurturing as merely another instance of the “visibility game” would overlook the profound differences produced by the particular environment of the Chinese digital world. This environment is characterized by over one billion users and a competitive climate often described as “involution,” which renders the online world a harsh arena in which competition for visibility is exceptionally fierce (Sun and Chen, 2021).
In China's short-video ecosystem, “account nurturing” is a strategic process that aims to make an account appear “valuable” to the platform's opaque algorithm, thereby increasing its “weight”—the community's jargon for algorithmic goodwill—in order to obtain recommendations. Because the platform provides no official guidance, users rely on “folk theory” spread through online communities and tutorials (Issar, 2024; Obreja, 2024). The process involves two main activities. First, by watching videos in their entirety, leaving comments in specific fields, and maintaining a fully completed profile, users present themselves as “ideal users” to demonstrate authenticity rather than appear as bots. Second, users strategically publish high-quality content at a carefully designed rhythm to establish topical identity and ensure high initial engagement, a key signal to the algorithm. The whole process operates as an iterative feedback loop in which users refine their strategies by interpreting data signals such as view counts. This has even become a business: people pay for “weight analysis” or buy and sell high-weight accounts, proof that “weight” is treated as a real, priced asset rather than mere superstition.
This environmental pressure, and the specific practices it generates, has placed extreme emphasis on the initial stage of account nurturing, intensifying it into a foundational and desperate “novice training camp” stage. Although nurturing is a continuous process, the perilous journey from “0 to 1”—from an unknown entity to an account with algorithm-recognized “weight”—is the most critical. This is not a gradual optimization process but a high-risk probation period. Failure means the account is effectively demoted to obscurity, forcing users to abandon it and restart the whole expensive process. It is the acute severity of this initial stage that marks “account nurturing” in the Chinese context as a practice distinct from typical algorithmic gaming elsewhere.
Sampling strategy
This research employs a qualitative, phased research design, generating data through a combination of semistructured interviews and digital ethnography to interpret the experience of the account nurturers within the Chinese TikTok platform. I conducted the research in two distinct but interconnected phases, designed to first generate and then rigorously test a grounded theoretical framework.
Phase 1: Foundational inquiry and initial theory generation (N = 12)
The initial stage of the study was guided by the principle of theoretical sampling. My main goal was not statistical generalization but conceptual depth. Selection focused on identifying participants who: (1) had rich experience on Chinese TikTok; (2) showed sustained, substantial investments of time and energy in account-nurturing activities, far exceeding those of leisure users; and (3) self-identified as consciously adjusting their bodily practices and daily rhythms to meet the requirements of the algorithm. These participants were chosen because their intensive participation made them ideal cases for generating the initial foundational categories of my theory.
Phase 2: Theoretical expansion and validation (N = 6)
To move beyond the initial findings and test their conceptual generality, a second stage of data collection was carried out. The main goal of this stage was not statistical representativeness but theoretical triangulation. By deliberately recruiting a more heterogeneous sample in terms of age, gender, and professional background, I could systematically test whether the observed mechanisms of bodily discipline were mainly driven by the algorithm itself or were better explained by other power dynamics. This expanded sample was more diverse in composition, including three new male participants and a wider age range, and covered creators from different content fields.
The final sample consisted of 18 Chinese TikTok users. Across the sample, participants’ motives were consistent, mainly including the desire to enhance personal influence, achieve commercialization, and develop their careers. While sharing these common goals, participants came from different socioeconomic backgrounds (Patton, 2014).
Data collection
In-depth interviews
My core data come from in-depth semistructured interviews. Each of the 18 participants was interviewed once or twice, with each interview lasting 25–120 min. All interviews were conducted by online video and, with informed consent, recorded and transcribed. The interviews centered on five aspects: (1) specific account-nurturing strategies and techniques; (2) participants’ understanding and perception of the platforms’ algorithms; (3) the impacts of account nurturing on participants’ everyday lives and physical condition; (4) their subjective experiences and emotional responses during the process; and (5) their reflections on the relationship between the platform and the self.
Following a funnel approach, interview questions progressed from the general toward specific, detailed information about account-nurturing experiences and bodily experience (Creswell and Poth, 2016; Roulston, 2010). To ensure interview quality, I considered it paramount first to build a relationship of trust with respondents. I took a nonjudgmental attitude, encouraged free expression of complex feelings and contradictory experiences, and used probing techniques to explore key concepts in depth, paying special attention to participants’ rich descriptions of bodily sensations (Holstein and Gubrium, 1995; Seidman, 2006).
Digital ethnography
To supplement the interview data, digital ethnography allowed me to observe participants’ actual behavior in their natural environment. This method is vital for overcoming the limitations of self-reporting by capturing the dynamic processes and situational characteristics of these practices (Hine, 2020). Unlike traditional ethnography, digital ethnography is particularly suited to studying bodily practice and subject construction in digital environments, revealing how users negotiate their identities and adjust their behaviors between online and offline life (Postill and Pink, 2012).
Specifically, I conducted a three-month online ethnographic observation in two account-nurturing communities on Chinese TikTok. With participants’ consent, my observation focused on: (1) the time patterns of their content release and platform activity; (2) their interactive behavior and algorithm-engagement strategies; (3) their content adjustments in response to algorithmic feedback; and (4) their online sharing and discussion of account-nurturing experiences.
Iterative analysis and theoretical saturation
My data analysis followed an iterative, constant-comparison method. First, the first-stage data were coded to establish initial theoretical categories. Subsequently, data from the more diverse second-stage sample were analyzed and systematically compared with these categories. This process revealed that the core mechanisms exist across different demographic groups. No fundamentally new theoretical categories emerged in the second stage; rather, existing concepts were strengthened and verified. This strongly indicates that my analysis reached theoretical saturation and supports the validity of my core propositions in a population wider than the initial sample.
Data analysis: A phased and iterative grounded theory approach
My analysis was not a discrete stage undertaken after all data were collected; rather, it was an iterative process deeply intertwined with my two-stage research design. Guided by constructivist grounded theory (Charmaz and Thornberg, 2021), my main goal was to build a theoretical explanation from the ground up.
Stage 1: Initial coding
The first stage involved deep immersion in the raw data from the 12 first-stage participants. I carefully coded all interview transcripts and ethnographic field notes line by line. To stay as close as possible to participants’ lived experience, I initially created in vivo codes using their own words. For instance, participants’ accounts of staying awake until 3 a.m. to produce videos were coded as “time-difference reversal strategy”; their meticulous explanations of swiping actions were coded as “algorithm-perception behavior”; and their lists of daily scheduled activities were coded as “bodily discipline establishment.” This first round of coding generated 47 codes in total.
Stage 2: Focused coding and saturation testing
After stage one, I undertook two distinct analytical steps, reflecting the nature of my research design. First, working with the first-stage data (N = 12), I generated a preliminary theoretical model. This involved examining the relationships among the initial 47 codes, comparing and grouping them. This analytical transition, from concrete description to more abstract concepts, enabled me to condense the initial codes into 17 core analytic categories. For example, I combined initial codes such as “reverse sleep schedule,” “use melatonin to force sleep,” and “scheduling meals around traffic peaks” into the higher-level category “algorithmic clock.” Similarly, initial codes such as “performance fatigue” and “dramatizing pain to produce content” were integrated into the core category “suffering capital.”
Second, after establishing the initial 17 categories from the first-stage data, I introduced the data from the six second-stage participants. The main function of this second, more heterogeneous sample was to rigorously test the transferability of the emerging theory. I systematically compared these new data with the existing 17 categories, looking for confirming or disconfirming evidence and for any new categories that might emerge. This comparative analysis confirmed that the core experiences and mechanisms exist across different demographic and content fields, and no fundamentally new core categories were needed. This strongly indicates that my analysis reached a stable state of theoretical saturation.
Stage 3: Theoretical coding
With the 17 core categories verified across the full sample (N = 18), the final stage integrated them into a coherent, structured theoretical framework. I analyzed the relationships among the 17 categories and grouped them into broader theoretical dimensions that tell the central story of the data. This final integration revealed three dominant themes around which the core categories cluster, forming the main theoretical dimensions of my findings: (1) bodily rhythm, integrating categories such as “algorithmic clock” and “segmented sleep strategy”; (2) bodily movement, grouping categories such as “precise swiping” and “bodily discipline”; and (3) bodily perception, bringing together categories such as “suffering capital” and “commercialization of exhaustion.” These three dimensions constitute the core of my central argument: the emergence of the “algorithmicized body.”
Weaving together interviews and ethnography
My analysis was driven by an iterative dialogue between my two data sources, comparing the “backstage” private pain revealed in interviews with the “frontstage” public strategies observed in online communities. This triangulation was essential for determining whether the “algorithmicized body” is the product of algorithmic training or of other power dynamics (such as youth culture or gendered labor).
The two-stage sampling design played a decisive role in this respect. It revealed that the core mechanisms of algorithmic discipline—temporal control, instrumentalized movement, and the commodification of pain—appear across different participants. This cross-demographic consistency shows that although social background shapes its expression, the shared experience of navigating an opaque, data-driven system is the fundamental force forging the algorithmicized body.
Findings: Composition of the algorithmic body
Bodily rhythm: Algorithm manipulating the body's biological clock
The platform algorithm reconstructs the body by colonizing its physiological rhythm. For the aspiring workers in this study, staying aligned with the algorithm's preferred times is an economic necessity, not an option. This forced recalibration has created what I call the “algorithmic clock”: a new temporal regime determined not by nature or industry but purely by the flow of traffic. This regime fundamentally reshapes the body into the material carrier of the platform's economic logic.
Users generally report that the TikTok platform has never explicitly informed them of specific rules about the “best release time.” Instead, it indirectly guides users to adjust their biological rhythms through unpredictable traffic fluctuations. One interviewee said with exhaustion and helplessness: “Between 1 and 3 a.m. is when emotional knowledge content gets the most traffic. For three months, I've been working during these hours and sleeping during the day. The platform never tells you these things. You can only understand what the algorithm likes through constant trial and error…” (P5, Male, 24) The interviewee showed material evidence in his home. Walls, refrigerator doors, and even the bathroom mirror were covered with notes detailing traffic fluctuations and interaction indicators at different times. Most striking were the items neatly arranged on his bedside table: instant coffee and melatonin. The former forces vigilance at night; the latter forces daytime sleep. “Without them, my body cannot sustain staying up late…” (P5, Male, 24)
Another account nurturer explained her time adjustments based on data analysis: “… I've changed my schedule. I go to sleep at 2 a.m. and wake up at noon. Although my family jokingly calls me a ‘rat person,’ the follower count proves it's worth it.” (P1, Female, 21) She showed me her personal notes. These records track her posts and their overall performance and become the basis of her physiological rhythm adjustment.
Account nurturers generally recognize that missing algorithm-preferred optimal time slots may cause their content to be buried. This urgency drives them to adjust their schedules even at the cost of sleep quality. Some creators adopt more demanding schedules to maximize their perceived activity. One interviewee described how she adapted her sleep: “To keep my account feeling active, sometimes I have to break up my sleep. I might wake up in the middle of the night just to check my feed and reply to comments, then try to go back to sleep.” (P3, Female, 20) When I asked whether the platform explicitly requires this, she pointed out: “TikTok never says it directly, but it educates you with view counts and follower numbers…” (P3, Female, 20) This often involves setting alarms at unusual hours, a practice that blurs the line between rest and work. As she said: “The hardest part is… um… you can never really switch off. After I post a video, my brain just goes into overdrive. I'm constantly, and I mean constantly, refreshing the data panel, you know? Especially for the first hour. It feels like you can't even put your phone down for a second. I get so paranoid about missing a sudden traffic spike… I mean, the phone comes with me everywhere.” (P3, Female, 20) “I don't pull all-nighters anymore, but my ‘golden hours’ for posting and engagement are definitely between 9 p.m. and midnight. My entire evening is structured around that.” (P18, Male, 32).
The “algorithmic clock” not only affects sleep but also permeates basic physiological functions such as eating. In the interviews, I noticed that interviewees generally lacked appetite at noon. One said: “My original lunch time was 12 p.m., but now it's 2 p.m. I no longer eat because I'm hungry.” (P11, Female, 21)
The reconstruction of physiological rhythm is ultimately embodied in specific bodily marks. Account nurturers are often pallid. They frequently exhibit a physiological state inconsistent with the actual time: unusually alert at midnight, their eyes glinting in the dim light, yet tired and drowsy in the daytime, often falling into brief sleep in public places. Their eating patterns are completely displaced: they force themselves to eat when not hungry or delay eating when extremely hungry. Their mood swings are decoupled from the natural circadian rhythm yet highly synchronized with the platform's traffic cycle. Many interviewees reported unusual excitement or irritability during their most active period at midnight.
It is worth noting that information about optimal release times and interaction patterns comes from user community discussions and unofficial tutorials, not from official platform notifications. These unofficial online communities circulate folk knowledge: for example, that the platform algorithm has a “midnight snack period,” in which content pushed at 11 p.m. has a 60% higher chance of being recommended again the next morning; or that different video content should match different “prime times”—morning for knowledge content (when people's mental state best suits high-quality knowledge), noon for gaming content (to accompany meals), evening for all types, and midnight for knowledge and emotional content (because people are more emotionally sensitive at night). The platform has never explained these algorithmic details; its official guidance mentions only providing high-quality content, without defining quality. Yet even when account nurturers later decided to completely reverse their biological clocks to suit algorithmic recommendation, they received no health warnings or restrictive suggestions about the practice.
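These community heuristics can be summarized as a simple lookup. The sketch below is purely illustrative: it encodes the folk beliefs circulating in account-nurturing communities, with names and structure of my own invention, and does not describe any documented platform behavior.

```python
# Folk-theory mapping of content type to believed "prime time" slots,
# as circulated in account-nurturing communities. These are community
# beliefs, not documented platform rules; all names are illustrative.
FOLK_PRIME_TIMES = {
    "knowledge": {"morning", "evening", "midnight"},
    "gaming": {"noon", "evening"},
    "emotional": {"evening", "midnight"},
}

def believed_good_slot(content_type: str, slot: str) -> bool:
    """Return True if community folk theory deems `slot` favorable for
    `content_type`; evening is believed good for all content types."""
    return slot in FOLK_PRIME_TIMES.get(content_type, {"evening"})
```

The point of the sketch is not predictive accuracy but to show how discrete and rule-like this folk knowledge is in practice, despite the platform never confirming any of it.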
For new users joining the ranks of account nurturing, this reconstruction of biological rhythm proceeds without any official guidance. All the account nurturers I interviewed said that even after many content releases, they never received any advice about adjusting their biological clocks. New users can only infer the algorithm's preferences by observing the behavior patterns of experienced account nurturers or by stumbling upon unofficial guides. The platform vaguely encourages users to “stay active” and “update regularly” without defining specific standards for these terms. These vague guidelines drive users toward overinterpretation and overexecution, eventually leading to a comprehensive reconstruction of physiological rhythm.
Bodily movement: Algorithm correcting body posture
For the content creators in this study, as aspiring workers, the opaque algorithm is the ultimate gatekeeper of economic viability. Deciphering it is not an intellectual exercise but a core labor activity. Through collective consultation and research on online “grey literature,” they built a powerful “algorithmic imagination” (Bucher, 2017)—a shared strategic framework for optimizing their productive efficiency.
This widely accepted "folk theory" holds that the algorithm constructs user profiles by analyzing a series of interaction data, including viewing time, likes, comments, shares, and, most importantly, video completion rate. Within this framework, different user behaviors carry different "weights," with completion rate generally regarded as the most important indicator. Although the exact calculation remains a "black box," the creators' action model is essentially an attempt to simulate the complex formula they imagine—they describe it as a dynamic system integrating elements such as "behavior weight," "access duration," and a "content attenuation factor."
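To make this imagined "dynamic system" concrete, the folk model the interviewees describe can be sketched in code. This is purely illustrative: the weights, the factor names, and the formula itself are hypothetical reconstructions of the creators' shared "algorithmic imagination," not the platform's actual ranking logic, which remains undisclosed.

```python
import math
from dataclasses import dataclass


@dataclass
class InteractionSignals:
    """Engagement signals that creators believe the algorithm tracks."""
    completion_rate: float    # fraction of the video watched, 0.0-1.0
    likes: int
    comments: int
    shares: int
    watch_seconds: float      # the imagined "access duration"
    hours_since_post: float   # feeds the imagined "content attenuation factor"


# Hypothetical "behavior weights" from the folk theory; completion rate
# is believed to dominate. None of these numbers come from the platform;
# they stand in for the creators' collective guesswork.
WEIGHTS = {"completion": 5.0, "like": 1.0, "comment": 2.0, "share": 3.0}


def folk_score(s: InteractionSignals) -> float:
    """Imagined recommendation score: weighted engagement, boosted by
    watch duration and decayed as the content ages ("attenuation")."""
    engagement = (
        WEIGHTS["completion"] * s.completion_rate
        + WEIGHTS["like"] * math.log1p(s.likes)
        + WEIGHTS["comment"] * math.log1p(s.comments)
        + WEIGHTS["share"] * math.log1p(s.shares)
    )
    duration_boost = math.log1p(s.watch_seconds)
    decay = math.exp(-s.hours_since_post / 24.0)  # folk "attenuation factor"
    return engagement * duration_boost * decay
```

The sketch captures the folk theory's key intuitions: completion rate outweighs other signals, and a video's chance of further recommendation decays over time, which is precisely why creators feel compelled to be awake and responsive in the hours immediately after posting.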
The research shows that account nurturers transform their bodies into an optimal interface with the algorithm through systematic training and sensorimotor modification, forming a specific "movement pattern." This modification involves not only adjusting body movements but also establishing a closed feedback loop: the body is instrumentalized as the ideal executor of the algorithm. One interviewee explained while demonstrating his carefully honed screen-swiping technique: "When you like similar content, the platform tags you with labels. It categorizes your account into that category, and you receive traffic. But in this process, liking and swiping content has tricks. The frequency is crucial. You can't be too fast, or the platform will judge you as a bot and you get 'Xianliu' (traffic throttled) 5 … Of course, you can't be too slow either. You know, that would make your day inefficient……" (P10, Male, 23)
Interviewees also described in detail how they determined optimal operation parameters through experimentation. These include screen brightness adjustments (turning off auto-brightness during the day, not having the screen too dark at night, enabling eye protection mode), grip optimization (for right-handed people, try to hold the phone with the left hand), and even breathing rhythm control (breathing through the nose can maintain more regular swiping)—all to create the most algorithm-friendly body posture. One account nurturer showed his specially designed phone stand and chair: “…45 degrees is the optimal viewing angle. It allows the face to be captured perfectly by the camera while keeping the wrist in a relaxed state… The neck doesn't feel uncomfortable either.” (P8, Male, 22) He demonstrated different ways of interacting with the screen, developed through trial and error: “You learn there's a difference. For some content you just swipe past quickly, but for the really good stuff, you might slow down. You also get a feel for where the ad links are and try to avoid them…” (P8, Male, 22)
My research observed that a clear majority of interviewees demonstrated similar precise finger movement patterns. These movements have been internalized as muscle memory. They generally emphasize that establishing this standardized set of interaction movements aims to make the algorithm judge them as “real people.” As one interviewee said: “The important thing is to interact genuinely with content, not too… mechanically… You can't maintain the same set of movements all the time. Sometimes you have to change because the algorithm will recognize it.” (P12, Female, 24) Ironically, account nurturers generally develop a set of robot-like “mechanical movements” to prevent being identified as “bots” by the algorithm.
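The pacing rules interviewees describe ("not too fast, not too slow," with constant deliberate variation) can be sketched as a simple jittered-delay function. This is an illustrative reconstruction of the described behavior, not any platform mechanism: the timing bounds and the quality parameter are hypothetical stand-ins for the thresholds creators discover by trial and error.

```python
import random

# Hypothetical pacing bounds inferred from the interviews: swiping too
# fast risks being flagged as a bot ("Xianliu"); too slow wastes the day.
MIN_GAP_S, MAX_GAP_S = 2.5, 20.0


def next_swipe_delay(content_quality: float, rng: random.Random) -> float:
    """Choose a human-like pause before the next swipe: linger longer on
    'good' content (quality in 0..1) and add jitter so the interval
    pattern never repeats exactly, i.e., the variation P12 describes."""
    base = MIN_GAP_S + content_quality * (MAX_GAP_S - MIN_GAP_S)
    jitter = rng.uniform(-0.3, 0.3) * base  # +/-30% randomness
    return min(max(base + jitter, MIN_GAP_S), MAX_GAP_S)
```

The irony the paragraph above notes is visible even here: producing "natural" randomness is itself a mechanical, parameterized routine.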
Additionally, account nurturers translate the platform's algorithmic demands into quantifiable mechanical behaviors. They establish a "bodily discipline"—converting daily behaviors into measurable account-nurturing indicators: "Oh, for sure. You can't just do it randomly. I've got my own… well, I guess you'd call it a daily routine, a checklist in my head. It's not something I write down, it's just you do it so much it becomes automatic. Just seeing what others are posting, leaving comments. Then around lunchtime, that's my dedicated window to reply to my own comments, clear out the inbox. The evening is the main event, you know? I'll post, usually around ten, and then for the next hour. Just watching the phone, seeing how the numbers tick up, replying to people right away. It's exhausting, honestly, but you get this feeling that if you miss your window, the whole day's wasted." (P11, Female, 21)
Accompanying this bodily discipline is a rigorous self-monitoring system. It transforms one's behaviors and states into analyzable, optimizable data objects, forming an internalized algorithmic gaze. Users view their bodies as projects for continuous testing and improvement. They constantly iterate to optimize their performance: “Sometimes I'll try out different things with a new account, just to see what works. Like, I might try posting at different times or using a different style of comment. You have to experiment a bit, it's the only way to get a sense of the patterns and what the algorithm seems to reward.” (P6, Female, 19)
Interaction postures are also systematically instrumentalized. One interviewee showed a large collection of categorized comment templates in her phone's notes app, with different interaction language for each content type. “I have preset comment copy for different types of content, stored in my phone's clipboard. I can type comments without looking at the keyboard while watching videos.” (P11, Female, 21) These preset comments aren't just time-saving tools but algorithmic optimization strategies. They believe certain word combinations and expressions make them appear more like “normal people,” reinforcing algorithmic judgment and improving interaction quality scores.
Bodily perception: Algorithm utilizing the body's painful experience
The research reveals a disturbing economic logic: the algorithm makes it possible to turn physical pain into productive assets. For aspiring workers who lack other forms of capital, the body itself becomes the last production tool, and its pain becomes a resource to be exploited. This logic underpins a value system of "no pain, no gain," in which pain is redefined as a mark of "great sacrifice" and professionalism. This reconstruction of perception creates a self-perpetuating cycle: users earn algorithmic rewards by commercializing their pain, which in turn deepens their alienation and forges a new subjectivity—a "pain container" that is proud of its own exploitation.
I observed that the platform algorithm cultivates a systematic inspirational narrative, continuously instilling in account nurturers the belief that pain equals success. Interviews confirmed this: multiple interviewees reported frequently seeing feeds about successful people who sleep only a few hours a day, or "suffering quotes" from famous historical figures. Such content often pairs dramatically impactful images (someone exercising under dim lights or laborers doing physical work, implying that competitors are secretly working hard and that you should suffer more) with intense, highly provocative background music (lyrics often praising life's hardships). One interviewee stated: "As I started staying up late to publish work, I would scroll through many motivational videos, like… calm seas don't make skilled sailors… tough days will pass… be open and honest… you always dream of success but are trapped in laziness… these kinds of videos." (P4, Female, 22)
After long-term exposure to this narrative environment, many interviewees showed substantial changes in pain perception, becoming numb and insensitive to physical discomfort. They not only ignore the body's warning signs but also interpret chronic pain as a "necessary path to growth." This desensitization gradually internalizes into the body itself; respondents grow accustomed to it and no longer need conscious effort to sustain it. I recorded a wide range of reinterpreted physical symptoms, and their prevalence across all 18 participants is striking. The most commonly reported symptoms include deteriorating eyesight, chronic neck pain, and frequent migraines, while many also cited wrist discomfort and back pain. These symptoms should serve as warning signals; in the interviewees' cognitive system, however, they are translated into "proof of professional commitment" and "signs of an imminent breakthrough." Ironically, these symptoms themselves become the reason users must keep sacrificing their bodies, in order to maintain this image of struggle and satisfy the algorithmic labeling of their accounts.
It is worth noting that some interviewees dramatized their pain, consciously turning personal struggles into content material. These performances include emotional outbursts such as screaming or sudden manic laughter; displays of physical pain such as kneeling, slapping oneself, or contorting the body; and the strategic highlighting of one's own physical flaws to attract audience attention. The vast majority of respondents said that adding "painful elements" to content usually produces above-average engagement. More notably, a considerable number of participants admitted to deliberately exaggerating or prolonging painful experiences in order to gain more sympathy and recognition. "Sometimes the work is actually finished, but I deliberately… add edited footage at the end of the video… often publishing a 'finally completed' status after staying up until dawn. Publishing at dawn earns more sympathy and praise." (P11, Female, 21) One interviewee candidly shared: "It's a strange thing you learn… the more worn-out you look, the more people seem to connect. So you learn to capture and even lean into those moments of pure exhaustion. You post that raw, tired selfie after a long night, with a caption about the hustle, and the sympathy just pours in. Honestly, in a weird way, all that encouragement becomes the very fuel that makes you do it all over again." (P7, Female, 21)
This disturbing yet strategic commodification of pain, or "suffering capital," proved to be a robust concept that extended beyond the initial sample. A knowledge creator who shares career advice videos confirmed this logic, illustrating how the performance of struggle is perceived as a marker of authenticity and value. "In academic vlogging, showing your messy desk, your stacks of books, and talking about burnout is a key part of the genre. It's a performance of intellectual suffering. It makes you seem more 'real' and 'relatable' than someone who just presents the polished final product. The struggle itself becomes part of your brand" (P17, Female, 25).
Physical pain is no longer an obstacle to production but a resource for it. It creates the illusion of "professional identity" and reframes algorithmic discipline as career development. Users and the algorithm form a complex parasitic relationship, together converting health damage into data capital and social recognition. This distorted culture of pain is not imposed wholesale by the platform; it is constructed through a series of vague, indirect yet effective mechanisms. The platform's algorithmic reward system, the reinforcement of community culture, and users' own desire for success jointly produce this phenomenon.
Discussion and conclusion
This study reveals how algorithms deeply reconstruct users’ physical existence by analyzing Chinese TikTok account nurturers’ bodily practices. It particularly analyzes how algorithms form an “algorithmicized body” phenomenon by “manipulating physiological rhythms,” “correcting postural movements,” and “utilizing painful experiences.” This process not only confirms platform capitalism's disciplinary techniques but also shows how users “unconsciously” embed their bodies in algorithmic environments to redefine self, body, and value.
Transformations of disciplinary power in the algorithmic era
My findings challenge and extend traditional disciplinary theory, revealing a key transformation of power in the algorithmic era. Unlike Foucault's (1977) account of discipline, which relies on visible supervisors, the platform algorithm creates an invisible, decentralized, and opaque disciplinary mechanism (Rahman, 2021). This invisibility forces users to overexecute under fuzzy rules, confronting a constantly changing technical system rather than an identifiable monitor. The interviewees' extensive self-monitoring systems, such as their detailed personal notes and systematic routines for tracking performance, evidence this point. Whereas Lupton (2016) describes voluntary self-tracking, my research reveals a quasi-mandatory quantification: a necessary means of coping with an opaque system rather than a purely elective project (Lupton, 2014). This data collection is less about introspection and more about appeasing the algorithmic "other."
Compared with Foucault's concepts, algorithmic discipline achieves a more covert and effective form of internalization: users turn platform rules into self-imposed bodily discipline. Take the "algorithmic clock" as an example: users invert their biological clocks to chase peak traffic. Discipline thus permeates basic physiological rhythm, transcending Foucault's concern with thought and behavior to directly reconstruct physiological instinct. Although Lupton (2014) discussed self-governance through self-tracking, here it is deeply shaped by platform logic: "consciousness" becomes sensitivity to algorithmic feedback, and "governance" becomes the alignment of bodily function with the platform's perceived preferences, usually at the cost of health. The body itself becomes a data point, constantly measured against external, algorithmically defined ideals.
The algorithmicized body: A fundamental transformation beyond habitual adaptation
I propose the concept of the “algorithmicized body” to capture what my findings suggest is not merely an adaptation but a fundamental recoding of corporeal ontology. This concept describes a process in which the body's materiality—its physiological rhythms, perceptual systems, and even its experience of pain—is systematically reconfigured to align with the operational logic of a nonhuman, data-driven system. To substantiate this claim of a fundamental shift, the phenomenon must be distinguished from the more familiar logics of rational self-regulation and social conformity.
First, the process transcends the teleology of rational self-regulation. A student modifying their sleep for an exam operates within a human-centric narrative: a temporary sacrifice for a clearly defined, finite goal, followed by an expected return to normalcy (Vohs and Baumeister, 2016). The algorithmicized body, in contrast, operates under a machinic, non-narrative imperative: a chronic, open-ended struggle for perpetual optimization. The goal is not a future state of achievement but a continuous state of legibility to an opaque system whose demands are infinite. This marks a profound shift from a body managed in the service of a life-narrative to a body reconfigured as a standing resource for a data-processing apparatus.
Second, this phenomenon is distinct from the logic of social conformity. An individual conforming to a peer group seeks social legibility—recognition from a human gaze that interprets shared cultural signs (Turner, 1991). The account nurturer, however, seeks statistical legibility. This calculated performance for a nonhuman audience is born of economic necessity, with the goal of becoming a more efficient data-producing node for a system that demands constant optimization. This is evidenced by the interviewees who systematically test and track different engagement strategies—such as varying their posting times or comment styles—embodying a calculated fusion of flesh and data logic.
These distinctions reveal why this constitutes a “fundamental transformation.” While earlier theories powerfully described how culture inscribes itself onto the body's surface (Bordo, 2023), the algorithmicized body entails a deeper, more participatory process. The subject becomes a willing data-analyst of their own flesh, actively reengineering their corporeal existence to function as an optimized infrastructure for the algorithm. The most profound evidence of this ontological recoding lies at the perceptual level: the transformation of pain from a biological signal of self-preservation into “suffering capital”—a productive asset to be endured, performed, and commodified for algorithmic visibility. This instrumentalization reveals the body's deep and willing integration into the logic of data-driven capital itself.
The paradox of algorithmic labor: Perceived autonomy and embodied alienation
The most contradictory finding of this study is the profound paradox users exhibit, which helps answer why these particular users accept such deep bodily modifications. They engage in "digital enslavement," yet experience it through a powerful illusion of autonomy (Lordon, 2014). This paradox is actively sustained by a symbiotic, dual mechanism that serves the power structure of the platform economy the algorithm represents. On one hand, the platform's deliberate opacity—its "black box" nature—compels users to seek guidance within folk communities, a process I directly observed in my ethnographic work within online account-nurturing forums. Here, they collaboratively construct an "algorithmic imaginary" (Bucher, 2017), a shared set of beliefs and "folk theories" that offer a map for navigating the unknown (Zhang et al., 2023). This act of collective sense-making provides a crucial foundation for their feeling of agency. These communities function as informal institutions where users share strategies and collectively legitimate certain practices, transforming individual guesswork into a shared, strategic endeavor (Obreja, 2024). This social construction of the algorithm creates a powerful sense of control and community, masking the underlying precarity of their labor (Issar, 2024).
On the other hand, these imagined strategies are immediately and relentlessly subjected to the algorithm's direct, material judgment. The platform's logic manifests not through formal rules but through the cold, immediate feedback of data: the rise and fall of view counts, engagement rates, and follower metrics. The human body becomes the primary interface where these two forces—collective imagination and machinic feedback—collide. An imagined tactic, like sacrificing sleep, is performed by the body and instantly evaluated by the algorithm. If the data rewards the act, the bodily practice is validated and internalized; if not, it is discarded. This relentless cycle transforms their labor into a Sisyphean task: users are condemned to perpetually push the boulder of content creation up the hill of algorithmic approval, only to see it roll back down with the next unpredictable shift in traffic or platform logic.
In the process of account nurturing, creators’ bodies become productive tools, fueling a form of alienation that runs far deeper than mere data extraction (Gerbaudo, 2024; Zuboff, 2019). Through their unpaid labor, they generate the platform's core asset—the audience commodity (Jones, 2023), yet paradoxically internalize this exploitation as an act of “autonomous choice.” This misrecognition is rooted in an algorithmic update to Foucault's “technologies of the self,” which recasts self-optimization as a strategic dialogue with a nonhuman system. By learning to reframe their physical suffering as a professional “badge of honor,” creators willingly participate in their own alienation, leaving their exhausted bodies as the most tangible evidence of this modern paradox (Lupton, 2014).
Limitations and future directions
Although this study provides insights into account-nurturing behavior, it has several notable limitations. First, my sample consists mainly of platform users in mainland China, which may limit the cross-cultural applicability of the findings; digital labor practices in other cultural contexts may take different forms, shaped by different values and institutional environments. Second, although the qualitative methods used provide rich experiential description, they cannot precisely quantify the prevalence or magnitude of the various account-nurturing behaviors. Future research could combine large-scale surveys to more systematically assess the distribution and determinants of these mechanisms. Finally, although this study identifies the main mechanisms underlying account-nurturing behavior, it offers limited insight into how these mechanisms evolve as platform algorithms update. As platform technology develops rapidly, account-nurturing practices may take on new adaptive forms that require continued observation and analysis (Lupton, 2014).
Supplemental Material
Supplemental material for "Algorithmicized bodies: Account nurturing behaviors on Chinese short video platforms" by Yizhang Quan, published in Big Data & Society, is available online (sj-docx-1-bds-10.1177_20539517251407955).
Footnotes
Acknowledgements
The author sincerely appreciates Editor-in-Chief Matthew Zook, Associate Editor Jing Zeng, and the three anonymous reviewers for their generous intellectual engagement with this work. Their thoughtful critiques and patient guidance were instrumental in refining this paper to its present form. The journey of completing this article was a challenging one, marked by the personal loss of my dear grandfather. His unwavering belief in me was a guiding light through a difficult time. The author dedicates this work to his loving memory.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Data availability
Data are not publicly available due to ethical restrictions.
Notes
References
