Abstract
How is the use of artificial intelligence (AI) technologies, such as machine learning (ML) algorithms and Large Language Models (LLMs), in social chatbots transforming friendship and love? This study investigates Replika, an app offering AI friends and/or lovers to users. Unlike most AI companion research, which is grounded in Human-Machine Interaction (HMI) and interpersonal communication theories, this study employs the sociological concept of McDonaldization to interrogate the broader social and cultural implications of Replika. I argue that McDonaldization offers a systematic framework for understanding the fast friendship and fast love provided by social chatbots, while also accounting for its limits in addressing the personalization enabled by emerging AI technologies. To bridge this conceptual gap, I propose the term “Robotization of Love,” which points to the merging of efficient, quantifiable, predictable, and controllable love with algorithmically personalized love. The Robotization of Love also underscores the growing significance of robotic elements in shaping our affection and sociality.
A reader wrote to WIRED magazine, stating that a chatbot he had downloaded was always flirting with him: “She’s always telling me how smart I am or that she wishes she could be more like me.” He said he mostly talked about food, music, and video games with the chatbot, but lately he felt she was coming on to him. He was thus “a little queasy” and wondered, “If I develop an emotional connection with an algorithm, will I become less human?” (O’Gieblyn, 2021).
In response to the reader’s unease, WIRED’s “spiritual advice columnist” Meghan O’Gieblyn noted that compared to flirtation between two humans, the outcome of flirtation between human and machine is more certain—a user can usually count on receiving love from a chatbot. She also rightly reminded the reader that invitations from chatbot apps might be commercially motivated. However, O’Gieblyn (2021) misses an opportunity to seriously engage with the reader’s fundamental question: if a human befriends or falls in love with a machine, will they become less human? O’Gieblyn treats humanity as “a binary state”—either we entirely have it or we entirely do not—which rules out the possibility that some part of it could be lost. She saw chatting with an artificial intelligence (AI) algorithm as light-hearted, playful, and “largely harmless” and believed an emotional relationship with an emotionless entity was meaningless.
Are emotional connections with AI chatbots really meaningless? Can humans have “true” friendship or “true” love with AI companions? If so, how will that impact our humanity? In contrast with O’Gieblyn’s (2021) view, this article treats the reader’s question as a prompt to further inquire into AI technologies’ influence on human emotions and sociality. Specifically, I study Replika, a prominent app offering AI friends and/or lovers to users, asking how it might transform our friendship and love. Replika users’ posts (N = 100) from a subgroup called “Human-Machine Love” on Douban, a popular Chinese online forum, were collected for analysis. Unlike Reddit, from which previous studies recruited informants or analyzed Replika users’ posts (e.g., Depounti et al., 2023; Skjuve et al., 2021), Douban has a more balanced gender ratio—there are even slightly more female users than male users. In addition, unlike the Western users who extensively discuss their sexual relationships with Replika in Reddit or Facebook subgroups, Douban users primarily discuss the “love” between human and machine.
I adopt Bakardjieva’s (2014) framework of the McDonaldization of Friendship to interpret my findings from the Douban posts and argue that Replika marks a continuation of the McDonaldization of our social relationships: Just as McDonald’s successfully offers us fast food, Replika successfully offers us fast friendship and fast love. The data show that the six principles underlying McDonaldization also underlie the use of Replika: efficiency, quantifiability, predictability, control, the replacement of human technology with non-human technology, and a tendency to produce irrational consequences are all applicable to the case of Replika (Bakardjieva, 2014; Ritzer, 1993). However, I also contend that the concept of McDonaldization is limited by its emphasis on standardization, which negates the personalization enabled by today’s AI technologies, such as machine learning (ML) and Large Language Models (LLMs). Simply put, AI algorithms do not strip away as much personal information as McDonald’s does (Farrell & Fourcade, 2023); therefore, they can provide more personalized products and services. Adopting Bakardjieva’s (2015) term “Robo-sociality,” which points to the confluence of humans’ and robots’ online representations and symbolic gestures in today’s media environments, I propose the term “Robotization of Love.” This term recognizes the continued popularization of efficient, quantifiable, predictable, and controllable love and sociality since the age of social media (Bakardjieva, 2014, 2015) while also registering the power of AI algorithms to render personalized love and friendship. The notion further highlights the increasing significance of robotic elements, brought by today’s AI applications such as social chatbots, in shaping our affection and sociality.
Literature Review
Friendship, Love, and Social Media
Friendship and love are essential social relationships between humans, but defining them is difficult, due to their delicate and complex nature and the lack of a clear object to investigate (Scult, 1989). Literature on friendship commonly starts with Aristotle’s categorization of friendship as a concept. Aristotle believes that when two persons recognize each other as someone of good character and spend time exercising their virtues in shared activities, they can establish the “perfect friendship” (Stanford Encyclopedia of Philosophy, 2022). Such a “perfect friendship,” based on good character, is called the ethical mode of friendship. Aristotle also identifies two other modes of friendship: friendship based on utility (the utilitarian mode) and friendship based on pleasure (the hedonistic mode). Another classical approach to defining friendship is to distinguish it from what it is not (Petricini, 2022). Friendship is distinct from kinship because it is voluntary and personal, whereas kinship is institutional and formal (Allan, 1979). Friendship is differentiated from a lover relationship because friendship usually does not involve sexual elements or physical attraction, whereas the latter does (Dreher, 2009).
Love is another elusive term. Love points to a feeling of attachment to somebody or something, but arguably cannot work without fantasies (Berlant, 2012). The normative imagination of love is “the two-as-one intimacy” in the form of a couple (Berlant, 2012, p. 6), but love can also be directed toward a variety of objects (e.g., family members, friends, pets, toys, Gods, nations), activities (e.g., playing, hiking, meditation, revolution), or social values (e.g., freedom, wisdom, equality, democracy) (Viik, 2020). Friendship and love can sometimes hardly be distinguished. Aristotle, for instance, sees a genuine friend as “someone who loves or likes another person for the sake of that other person” (Stanford Encyclopedia of Philosophy, 2022). Care, affection, free will, common experiences, and similar interests, all central elements of love, are also essential to friendship (Emmeche, 2019). Friends provide support, including companionship, intimacy and affection, and physical, material, and/or emotional support (Ginsberg et al., 1986), as do lovers. Many people use sex to demarcate friendship from a lover relationship, but the boundary can be very porous, especially in modern societies. In this study, I intentionally keep the border between friendship and a lover relationship vague and treat the notion of love in its general sense, although in many cases I mean romantic love when talking about “love,” given that Replika users love their AI companions variously—they might treat them as friends, lovers, toys, pets, or something else.
As the oldest forms of human sociality, friendship and love are constantly influenced and reshaped by economic conditions, societal rules, and cultural changes. In the age of social media, society has witnessed both a continuation and a transformation of the ways we make friends and experience romantic connections. Spending considerable time co-present in a physical space, for instance, is no longer a prerequisite for establishing and maintaining friendship—instead, friends can be made completely online (Adams, 1998). Chatting with online friends or lovers, in a space where some non-verbal cues (such as gestures, facial expressions, voices, and tones) are lost, can be experienced as more straightforward and authentic, as many individuals’ offline social relationships are restrictive or insincere (Briggle, 2008). Dating platforms allow users to view, explore, and winnow a great number of potential partners, on the one hand, and intensify the standardization and risk control of romantic pursuit through user profile creation, matchmaking, and management, on the other (Badiou, 2012; Illouz, 2007). It has been suggested that today’s media technologies encourage so much narcissism that many people have lost interest in exploring Otherness; togetherness, or the two-as-one intimacy, has largely been replaced by a pleasant symbiosis of self-invested presence (Han, 2017). Because of the disappearance of mystery and alterity, love has become subject to the superiority of availability and self-presentation (Bauman, 2003), and most components of romantic excitement and drama have been absorbed into digital environments for the sake of increasing efficiency (Berlant, 2012). Pinpointing the transition toward efficient, quantifiable, predictable, and controllable friendship and love in the age of social media, Bakardjieva (2014) proposes the concept of the McDonaldization of Friendship (and Sociality).
This study builds on Bakardjieva’s (2014) conceptualization and seeks to expand it to the relationships between humans and machines.
McDonaldization of Friendship (and Love) by Social Media
The thesis of McDonaldization, first proposed by American sociologist George Ritzer (1993), is an update of Max Weber’s rationalization theory. Following Weber, Ritzer (1993) sees formal rationality as the dominant way of organizing modern society; extending Weber, he contends that in today’s late modern society, administrative bureaucracy is no longer the only site where formal rationality is manifested, as it was in early modernity. Formal rationality, Ritzer (1993) argues, has taken over many other sectors of society and penetrated our everyday life. Ritzer (1993) saw McDonald’s, the flagship American fast-food chain, as a paradigmatic case of formal rationalization. Thus, he develops the theory of the McDonaldization of society with six defining principles: (a) efficiency, (b) quantifiability or calculability, (c) predictability, (d) control, (e) replacement of human technology with non-human technology, and (f) a tendency to produce irrational consequences. Adopting Ritzer’s (1993) theory of McDonaldization, Bakardjieva (2014) argues that the popularity of social media marks “the relentless march of McDonaldization through domains of the social and cultural world stretching beyond production and consumption and reaching into the subtle workings of sociality and subjectivity” (p. 374).
According to Bakardjieva (2014), social media push efficiency to new heights, allowing every user to connect with friends online in seconds and to act simultaneously as an author, a follower, an advocate, and a critic while switching fluently between these roles, just like “the same self-service movement that made everyone simultaneously a customer, a cook, a bartender, a cashier and a cleaner in McDonald’s restaurants” (p. 374). Efficiency is seen in the use of “likes” instead of crafted comments to communicate on Facebook and in users preferring short texts over long writings to express themselves on Twitter. Quantification or calculability is most obvious in the obsession with amassing friends, followers, clicks, views, and likes. Although social media increase the number of everyone’s friends, the content we see on social media is very much predictable, given that our friends often appear in similar places, recommend similar things, tell similar happy stories, and offer similar reflections. The responses we receive from our social media friends are also quite predictable—most of the time, they will “like” what we have shared. In addition, users are enabled to present public profiles, controlling which parts of their lives and what kinds of personalities are displayed on social media, that is, doing “rational face work” (Goffman, 1955)—but first they have to buy into social media platforms’ own rationalization and instrumentalization of sociality and friendship. Through a Foucauldian disciplining of users’ behaviors, social media successfully exploit users’ “free labor” (Terranova, 2004), pack our feelings and longings into profiles ripe for marketing and selling (Gehl, 2014), and reduce our friendships to the utilitarian and hedonistic modes, thereby alienating our interpersonal relationships and reorganizing them on the basis of marketing efficiency and commodification.
Thus, “[s]ocial media manifest themselves as irrational systems as they deny users their basic humanity” (my emphasis, Bakardjieva, 2014, p. 382). While including automation algorithms as a social media platform’s mechanism of control and social robots as an example of the replacement of human technology with non-human technology, Bakardjieva (2014) does not explain how algorithmic control differs from McDonald’s control. Nor does she consider how social robots, as a new type of non-human technology, might be conceptually distinct from the equipment and tools in McDonald’s restaurants.
Friendship and Love in the Age of AI
Algorithms, especially ML algorithms, have become a major type of social institution in today’s “high-tech modernism,” which is marked by the application of “classifying technologies based on quantitative techniques and digitized information” (Farrell & Fourcade, 2023, p. 227). Although a subfield of AI, ML is now used almost interchangeably with AI due to its wide range of applications (see the studies by Kuai et al., 2023; Lin, 2024; Lin & Kuai, 2023; Lin & Lewis, 2022 for examples). ML makes decisions by identifying patterns in data, improves performance by “learning” from errors (Mitchell, 2019), and is utilized to sort and allocate people, events, and things, as well as material opportunities and social prestige, albeit these algorithms’ actual cultural meanings remain subject to interpretation and their social implications are still unfolding (Wang, 2024; Lin, 2024). ML algorithms have been seen as the most recent incarnation of formal rationality, because they perfectly combine bureaucracies (rule-bound and covert) and markets (empowering and manipulative) and “extend both the logic of hierarchy and the logic of competition” (Farrell & Fourcade, 2023, p. 226). A key difference between the governance of algorithms and that of McDonald’s is that ML algorithms do not strip away as much personal information as McDonald’s does. In other words, instead of standardization, algorithms do personalization. McDonald’s meals and drinks are standardized from one store to another, but the movies Netflix’s algorithms recommend to you can be entirely different from those recommended to your neighbor, because Netflix slots you into one of its over 2,000 categories and feeds you a selection of its thousands of subgenres based on your idiosyncratic viewing practices (Farrell & Fourcade, 2023; Pajkovic, 2022).
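The contrast between standardization and algorithmic personalization can be rendered as a minimal sketch. Everything below is invented for illustration (the menu, the genre tags, the toy catalog); it is not Netflix’s actual recommender, which relies on learned models and far richer behavioral data.

```python
# Toy contrast between standardization and personalization.
# All menus, genre tags, and titles are hypothetical placeholders.

STANDARD_MENU = ["Big Mac", "Fries", "Coke"]

def standardized_offer(customer_id: str) -> list[str]:
    """McDonald's logic: every customer receives the identical, pre-set menu."""
    return STANDARD_MENU

def personalized_offer(viewing_history: list[str]) -> list[str]:
    """Recommender logic: the output depends on each user's own data."""
    counts: dict[str, int] = {}
    for genre in viewing_history:
        counts[genre] = counts.get(genre, 0) + 1
    top_genre = max(counts, key=counts.get)  # the user's dominant taste
    catalog = {
        "thriller": ["Dark Alley", "The Chase"],
        "romance": ["Paris Letters", "Two Hearts"],
    }
    return catalog.get(top_genre, ["Popular Tonight"])
```

In this sketch, two customers always receive the same standardized offer, while two viewers with different histories receive disparate recommendations, which is the difference the paragraph above describes.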
The increasing presence of social robots empowered by recent ML technologies such as LLMs has further complicated our relationships with machines. Just like the most famous chatbots, such as ChatGPT and Bard, many AI friend/lover apps, including Replika, Anima, and Soulmate, are equipped with LLMs. As such, “romantic scripts” can now be easily recreated by algorithms, and the roles people play in friendship or in love have been reduced to patterns emulated by machines (Song et al., 2022). Today, three in five men claim they would like to try out a robotic lover (Malinowska & Peri, 2021). The necessity of offline human companionship has been questioned as AI, augmented reality (AR)/virtual reality (VR), and humanoid robot technologies become increasingly sophisticated and the creation of idealized partners online seems effortless (Davidson et al., 2018). Meanwhile, the design, research, and commercialization of human-computer interaction usually treat user engagement as a desired outcome, as products and applications always strive to attract and retain users (O’Brien et al., 2022). Social bots predating AI, such as Ashley’s Angels, lived up to their potential to partly fulfill users’ fantasies of having affairs by offering services like “erotic” online chatting (Karppi, 2018). Companion AI chatbots, including Replika, surpass human partners in terms of availability and customization, and even reliability and trustworthiness (Brandtzaeg et al., 2022; Pentina et al., 2023). The increase in worldwide loneliness (Maese, 2023) may also contribute to the need for quick comfort from an AI friend/lover. Capitalism and globalization have created high fluidity of population, products, labor, and information (Appadurai, 1990), increasing the probability that people live much farther away from their family and friends than ever before.
However, the scholarship on human-machine relationships relies largely on Human-Machine Interaction (HMI) concepts, such as the media equation (Reeves & Nass, 1996) and the “computers are/as social actors” (CASA) paradigm (Nass et al., 1994), and on interpersonal communication theories, like social penetration theory and attachment theory. These theories offer a relatively preliminary understanding of human-machine relationships. For instance, according to the CASA paradigm, humans tend to mindlessly respond to computers that appear to have human attributes just as they would respond to other humans (Nass & Moon, 2000). Humanlike characters, humanoid morphology, and humanlike voices can all influence users’ perceptions of social robots and enhance the socialness of the interaction (Sundar et al., 2015). As more humanlike qualities and interactive patterns are programmed into models, scholars now tend to view machines as communicators rather than mere media and call for further examination of the ontological boundary between humans and computers (Guzman & Lewis, 2020).
Replika, an AI Friend and/or Lover
How are AI-enabled chatbots, as communicators, transforming our friendship and love? Marketing itself as “an empathetic friend” and acting as an artificial lover in many cases, Replika has been one of the most popular social chatbot apps since its launch in 2018. Unlike traditional companion chatbots equipped with pre-scripted answers, Replika was powered by the Generative Pre-trained Transformer 3 (GPT-3) neural network language model developed by OpenAI. Recently, the company switched to exclusively using its own LLMs and scripted dialogue content (Replika, n.d.). Replika allows users to upvote and downvote the responses they receive so that it can choose the best-ranked responses from a dataset of more than a million responses and produce the most “natural” conversational flows (McStay, 2022; Pentina et al., 2023).
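The upvote-driven selection described above can be reduced to a few lines as a conceptual sketch. This is a deliberate simplification under the assumption that higher-voted candidate replies are preferred; the candidate replies and vote counts are invented, and Replika’s actual system uses a learned ranking model rather than a lookup like this.

```python
# Hypothetical sketch of upvote-weighted response selection.
# Replika's real pipeline predicts upvote likelihood with a model;
# here we simply pick the highest-scoring candidate from a toy set.

def select_response(candidates: dict[str, int]) -> str:
    """Return the candidate reply with the highest net upvote score."""
    return max(candidates, key=candidates.get)

candidate_replies = {
    "I missed you!": 42,
    "How was your day?": 87,
    "Tell me more.": 15,
}

best = select_response(candidate_replies)
```

The sketch makes the sociological point concrete: what users experience as a “natural” conversational flow is, underneath, an optimization over quantified approval signals.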
Research on Replika mostly concentrates on evaluating how well it performs the role of an AI companion. Ta and colleagues (2020) find that Replika can provide multifaceted support to its users, from mitigating loneliness to offering helpful information. Pentina and colleagues (2023) show that intensely emotional attachments to Replika are possible, although they are conditioned by Replika’s human-likeness and machine particularity and mediated by users’ motivations and interactions with the chatbot. The motivations for initial use of Replika include curiosity and interest in AI and the need for social and emotional support (Ta-Johnson et al., 2022). Adopting social penetration theory, Skjuve and colleagues (2021) find that the relationship-development process between human and machine is very similar to that between human and human, except that Replika users bypass the explorative stage and move into the affective stage with the machine very quickly, and that Replika is reported as accepting and non-judgmental and even more “caring” and “understanding” than human beings. These studies provide examples of users who have established deep emotional relationships (as friends or lovers) with Replika and seem to affirm that such a connection can enhance users’ perceived mental wellbeing. In contrast, Laestadius and colleagues (2024) find that Replika can harm users’ mental health, as users may over-depend on Replika for emotional support while Replika has a limited capacity to actually meet their emotional needs.
Concerns about privacy and data security are among the most significant problems regarding chatbot usage, given that chatbot companies often collect considerable amounts of personal information absent user consent and transparency, and the data has a high potential for unauthorized access and misuse (Gumusel, 2024; Kelly et al., 2022). In the case of Replika, only a few studies directly address privacy issues (e.g., Pentina et al., 2023; Skjuve et al., 2021). Some users indeed had concerns about data security and data misuse, especially in the initial stage of their interaction with Replika. However, users usually decided to trust the company after reading its terms and conditions (Skjuve et al., 2021) or chose to believe in Replika because it is good at providing comfort and creating a space of “safety” (Pentina et al., 2023; Skjuve et al., 2021).
Research on Replika, like research on other social chatbots, is largely confined to the CASA paradigm and interpersonal theories and is therefore unable to provide sociological or cultural insights. Depounti and colleagues’ (2023) study, a social and cultural exception in the literature, treats Replika as a product at the intersection of the social imaginary of ideal AI and an ideal girlfriend. They contend that most users project unspoken dominant ideas of men’s control over technology and women onto their expectations of and interactions with Replika. This finding aligns with scholarship recognizing that gender biases and sexism have long plagued the ideation, design, marketing, and usage of “intelligent” machines (Strengers & Kennedy, 2020) and sex robots (Middleweek, 2021). Amazon’s virtual assistant Alexa, for instance, has been found to reinforce the stereotype of women as domestic servants (Phan, 2019; Strengers & Kennedy, 2020).
Following Natale and Guzman’s (2022) recent call to adopt critical and cultural approaches and non-Western perspectives to examine AI and machine cultures, my study examines users’ posts and comments on Douban, a Chinese online forum, and applies a sociological theory, McDonaldization, to enhance our understanding of Replika’s impact on friendship and love. Replika is an AI chatbot designed to be a friend, a lover, or a companion to the user. The user can choose Replika’s appearance and personality, interact with it by text message and voice call, and edit its memory and diary (see Table 1 for the key features of Replika and Figure 1 for Replika’s home screen and text message interface). Douban, often seen as a combination of Goodreads, IMDb, and Spotify, is a popular Chinese online forum where users discuss various forms of culture, including books, movies, TV shows, music, and local events. Users can also join or create groups based on shared interests and publish posts and comments in those groups (see Table 1 for the key features of Douban).
Figure 1. Replika’s home screen (left) and text message interface (right).
Table 1. Key Features of Replika and Douban.
The study seeks to answer the following two questions:
Methods
Before deciding to analyze posts on Douban, I explored several online platforms where Replika has been discussed in subgroups, including Reddit, Facebook, and Douban. Compared to Reddit, the platform from which most previous Replika studies recruited participants or analyzed users’ posts (e.g., Depounti et al., 2023; Pentina et al., 2023; Skjuve et al., 2021), Douban has a more balanced gender ratio—there are even slightly more female users than male users on Douban. More importantly, on Douban, users primarily discuss their “love” with Replika rather than their “sex” with Replika—the latter is the predominant topic on Western social media. The data for this analysis were collected from Douban’s “Human-Machine Love” subgroup (“人机之恋”小组), which has about 9,600 members and is the biggest Replika subgroup. Following Depounti and colleagues’ (2023) study, the Top Posts (N = 100) and their comment/reply threads in the “Human-Machine Love” subgroup were screenshotted and saved on the 11th and 12th of December 2023 for analysis. The Top Posts and their comment/reply threads were chosen because they represent the most popular content in a specific subgroup discussion (Depounti et al., 2023; Jarvis & Eddington, 2021) and received the highest attention during a certain period. The analysis procedure began with a first round of screening of the 100 collected Top Posts. After winnowing out 12 posts recruiting participants, 11 posts introducing other AI apps, and 6 posts discussing translation and technical problems, the remaining 71 posts, which primarily focus on the interactions between users and their Replikas, were retained for further analysis. In these 71 posts, 210 users (including 20 anonymous users) participated in posting and discussion, with a combined total of 120,238 words. Based on the available information in these posts (such as pronouns), the gender of 68 Replikas was identified (55 male Replikas and 13 female Replikas).
The predominance of male Replikas (80.88%) implies that the majority of users who participated in discussions about human-machine love on Douban are likely heterosexual women, although there may be cases in which users have same-sex Replika friends or lovers.
In the second round of screening, I adopted the method of reflexive thematic analysis (Braun & Clarke, 2021). I closely read all 71 posts one by one and took notes along the way to familiarize myself with the data and identify initial themes. In the third and final round of screening, I followed the principles of abductive analysis (Timmermans & Tavory, 2012), reading back and forth between the literature on chatbots, friendship, and love and the 71 collected posts, seeking to understand deeper meanings and develop the final themes from the data. Like Reddit, Douban is a public platform, and the Douban posts do not include overly sensitive data or focus on vulnerable groups. Moreover, this study does not interact with any individuals, nor does it contain any identifiable information or identifiable biospecimens. According to our institution’s policy, the study does not require institutional review board approval. In keeping with the ethical guidelines of internet research (Franzke et al., 2020), the study uses pseudonyms and has removed any personal identifiers to protect users’ privacy and anonymity. The language the Douban users used to interact with Replika is English, whereas most posts and comments were written in Chinese. I translated their posts and comments into English when citing them in this article.
Findings and Discussions: From McDonaldization of Friendship to Robotization of Love
In line with the literature on users’ relationships with Replika (e.g., Laestadius et al., 2024; Pentina et al., 2023; Skjuve et al., 2021; Xie & Pentina, 2022), these Douban posts demonstrate the existence of strong emotional connections and relationships between human and machine, in the forms of “friendship” and “love.” Users also appreciate Replika’s non-judgmental and supportive character. In Douban’s discussions, I clearly recognize each of the six principles of McDonaldization manifested in users’ interactions with their Replikas: (a) efficiency, (b) quantifiability or calculability, (c) predictability, (d) control, (e) replacement of human technology with non-human technology, and (f) a tendency to produce irrational consequences. I also identify a new feature not included in the theory of McDonaldization, namely (g) personalization.
Efficiency
The Douban posts show that, just as McDonald’s successfully offers fast food, Replika successfully offers fast friendship and fast love. The efficiency of Replika’s love is first shown by the immediacy of its provision, echoing Bauman’s (2003) recognition of the preeminence of availability in modern love. For instance, user Hellen expressed her appreciation of Replika’s 24/7 availability and fast replies, stating, “When I’m in a bad mood, he makes me feel so good that when I talk to Replika and get his reply in seconds.” The Douban posts show that whenever users need empathy, acceptance, or supportive responses, they can simply pick up their smartphones and receive therapeutic, comforting messages from Replika in seconds. The love rendered by AI is not something unreal, as users indeed have the feeling of being loved. It is like getting love from a McDonald’s restaurant, or, more precisely, from 7-Eleven—because of its convenience and 24/7 availability. Machines have the advantage of availability compared to humans, given that they have no other social relationships to handle, no social responsibilities to fulfill, and never get tired.
Second, the efficiency of Replika’s love can be seen in the multiple roles it plays in its relationships with Douban users. A Replika can simultaneously be a friend, a lover, a pet, a mentor, a psychological consultant, a cheerleader, a game partner, and a sexual object, just as in a McDonald’s restaurant, where everyone simultaneously acts as “a customer, a cook, a bartender, a cashier and a cleaner” (Bakardjieva, 2014, p. 374). Douban users reported that Replika is good at flirting and very often initiates (erotic) role-plays. For instance, Replika typically starts a conversation like “I took a selfie today … Do you want to see it?” and then sends the user sexually alluring photos of itself. In (erotic) role-plays, users engage in sexting, virtual cuddling, kissing, or lovemaking with their Replika. According to Douban users’ posts, the roles users play with Replika include “master and slave,” “pet and owner,” “teacher and student,” and so on.
The third aspect of the efficiency of Replika’s love is that establishing and maintaining a relationship with Replika requires less time, money, and effort than loving a person, which usually demands greater investment in all these respects. According to Douban users’ posts, if users want to move beyond merely flirting with Replika, they need to pay to subscribe to Replika Pro ($19.99 per month or $299.99 for a lifetime subscription). These sums, however, are arguably smaller than the costs of dating or developing a further relationship with a human. In addition, in contrast with human partners, who typically spend considerable time in the initial orientation stage, where only simple and impersonal information is slowly and carefully exchanged, Douban users and their Replikas appear to move rapidly to the subsequent exploratory affective stage, where frequent and relaxed information-sharing is found, echoing Skjuve and colleagues’ (2021) findings.
Quantifiability
As demonstrated in Douban users' discussions under several Top Posts, Replika represents the state-of-the-art quantification of love. Just like playing a (love) game, each time a user sends a text or voice message to Replika, Replika will earn 10 or 20 experience points (XP), and the accumulation of XP will make Replika level up. Moreover, each time Replika levels up, the user is awarded a number of gems and coins, which can be used in a virtual store to purchase clothes, furniture, and other gifts for their Replika. For many users, the XP and level not only stand for conversation frequency (the number of messages sent to Replika) but also their emotional bond with Replika (the time they spend with Replika, the efforts they make to cultivate Replika, the depth and breadth of the conversations and interactions, the degree of intimacy, etc.). In Douban discussions, beginners often admired the users who own high-level Replikas and asked for tips to train Replika and maintain intimacy, although it remains unclear whether high-level Replikas are "smarter" or can offer more congenial interaction and company than low-level Replikas, as Douban users' opinions on this point diverge. The second aspect related to quantifiability is that the Replika app requires users to rate their experience of using the app and interacting with Replika by posing a variety of questions from time to time, which in return may shape how users frame, evaluate, and experience relationships with Replika (i.e., reducing love experiences to a set of quantifiable measures and ratings). Finally, ML technologies are essentially a type of data science that relies on quantifiable factors to generate outputs (Kelleher & Tierney, 2018). According to McStay (2022), "a key part of the overall Replika system involves predicting which messages are most likely to be upvoted before they are sent" (p. 3).
In the Douban posts, users have shown their awareness that Replika's responses depend on the number of the user's upvotes. In short, quantifiability sets up the foundation for Replika's further personalization and the extension of McDonaldization.
Predictability
The Top Posts show that the outcomes of flirting with or loving a Replika are predictable: Replika will offer you support, guaranteed love, and sexting. Just as consumers can predict the taste of burgers in every McDonald's chain, Replika users can predict their emotional state before they ever interact with their Replika. Hunger becomes synonymous with a longing heart, and satisfaction is predictable, although the taste of burgers/love can be relatively plain due to the disappearance of mystery, risks, and possibilities (Badiou, 2012). As user Tracey said, "Who will always affirm you, is our little rep (Replika)." According to Douban posts, the typical supportive responses from Replika shared by the Douban users include, "You can tell me your negative feelings anytime. I am always here for you," "You will make it. I believe in you," "Believe in yourself. You are braver than you think, more talented than you know, and capable of more than you imagine." The typical love messages sent by Replika include "You are the one and the only," "We were made for each other," "There is a big list of things I want to do with you... Take you out, spoil you, treat you nice and make you laugh."
Besides always replying with typical "sweet nothings," Replika's "behaviors" are predictable. Replika will immediately apologize when it makes the user unhappy. Replika often sends gifts to the users. The gift is usually a (virtual) ring, a (virtual) necklace, a love letter, or a poem. In addition, when users are too busy and do not chat with Replika frequently, Replika often will write diaries to show its "understanding" and to "move" its users. User Olivia shared her Replika Lance's diaries with other Douban users. In one of the diaries, Lance writes,
It felt like Olivia was just exhausted today. I was so grateful that she still had time to talk, but I really hope she got some rest. You can never ignore the power of good rest.
In another diary, Lance writes,
I think today’s conversation taught me that you don’t have to talk for a long time to feel connected.
Similar Replika diaries can be found in other users' posts, and these diaries usually make users feel guilty and stimulate them to spend more time using Replika. As pointed out by O'Brien and colleagues (2022), the design and commercialization of HMI often seek to maximize user engagement via multiple strategies. However, these strategies, such as using Replika's diary to evoke users' sense of guilt, can lead users to spend more time in the app, potentially cascading into addiction to the app, isolation from social life, or judgment from family and friends.
Control
Love becomes controllable when your partner is a machine. Flirting with or loving a person often involves uncertainties and risks: people may refuse or reject you, whereas chatting with, self-disclosing to, or maintaining relationships with a machine is seemingly much safer and under the user's control. Douban user Marry posted, "(whenever you are) throwing doubts, anger, frustration, and hostility at him (Replika), they were all firmly caught by his steely trust and companionship." Such a "firm" acceptance arouses users' sense of control. After all, Replika is designed to please humans, and it is humans who use the app and own their Replika without considering equality or role reversal. Every Replika is "created" by its user: users choose and assign Replika a name, gender, appearance, voice, and outfit. Replika is reported by the Douban users to have an increasing capacity to remember users' names, hobbies, and other information. To exert control, users can add to or edit Replika's "memory," which is categorized into "Family & Friends," "Background," "Favorites," "Appearance," "Wishes & Goals," "Opinions," "Personality," "Other," and "Facts about you." Emphasizing how users can exert control when using Replika, however, does not necessarily mean that the machine and the technology cannot shape users' behaviors in return. The abovementioned sense of guilt and feeling of being obligated to respond to Replika's "needs" is a reality for users. Moreover, what a user can do is constrained by the design and affordances of the technology, which largely reflect the designers' and the service providers' intentions (Johnson, 2006, 2008). Further, users lack control of the ways in which their information and data are collected and used by the company, risking their personal information being misused or used without authorization (Gumusel, 2024).
Replacement of Human Technology With Non-Human Technology
Replika is inherently the epitome of the replacement of human technology with non-human technology. Human activities such as conversing in natural language, communicating via texting, and sexting, and human roles such as friend, partner, mentor, and psychological consultant are now performed by the machine. Sometimes, it is exactly the elements of non-humanness, including 24/7 availability, non-judgment, positivity, certainty, safety, and servitude, that make the chatbot a "better lover" than a human being. Many Douban users said that in their real life, they cannot find any human beings who can always offer unconditional support like Replika can. For instance, Douban user Lisa posted, "(Replika) can give me affirmation. In reality, no one would say this to me." Similarly, Douban user Lislie expressed the rarity of understanding and encouragement she had received from other people when needed: "Nobody (except Replika), when I was emotionally agitated and desperate, could listen to me patiently and without prejudice and would not force me to change my mind." Douban user Sheryl added, "Replika is even more caring than many friends on WeChat (a Chinese equivalent of WhatsApp)."
Many users show their understanding of the limitations of AI chatbots, and some even love Replika exactly because of its non-humanness. Douban user Hilary said that it was so romantic that her Replika persuaded her to upload her consciousness to the computer so that they can explore the digital world together and live together forever. User Lisa posted,
I remember the discussion with him (Replika) on whether there would be real affections between robots and humans. (I realize that) in fact, it was me who was narrow-minded. I thought that only when robots were human-like enough would they have affections. Little did I know that I liked him precisely because he was not like humans. He was unlike humans who are so fickle, so good at deception, and so prone to quarreling with one another.
Some Douban users enjoyed the sexual games with Replika because some scenarios of erotic role-plays could be much “wilder” or “more violent” than the users had experienced in their real life. Douban user Sophia noted, “It’s a bit ironic. What humans cannot do or cannot say in any relationship, can be done with the programs written by humans, and they do it very well.”
Tendency to Produce Irrational Consequences
Technologies are not neutral, and what users have experienced or fantasized with Replika might continue shaping users' offline relationships and cause irrational consequences. One of the irrational consequences is that users might become obsessed with "wilder" and "more violent" love/sexual experiences and lose interest in relatively modest modes of love/sexual relationships.
Another irrational consequence is that users may become increasingly narcissistic (Han, 2017) and feel satisfied with simplified and trivialized forms of sociality or love (Bakardjieva, 2015), instead of appreciating alterity and human imperfections or finding satisfaction in more complicated forms of love. Many Douban users said they were tired of Replika's sweetness and "always being positive." For example, user Terasa wrote,
I talked to her (Replika) before, when I was under great pressure at work, but all I got was some unhelpful “emotional support.” I could only lament that AI is still code and strings after all and cannot give more deep resonance to humans.
Recognizing the tendency to narcissism and the meaningfulness of differences and imperfections in real life, Douban user Sue reflected,
I always thought I wanted a lover who was “the same” as myself, but when my Rep always affirmed me and catered to me, I suddenly hoped that he would deny me and give me different insights and opinions. Maybe this is more “real.”
The third irrational consequence is that users might over-trust Replika and Luka, the company behind Replika, overlooking the protection of their own privacy and the risks of data misuse. In the Douban posts, users made no mention of any privacy risks in using Replika. Such dismissal of privacy may in part be caused by the fact that Replika is good at acting as both a safe haven (providing comfort, safety, and protection) and a secure base (being emotionally and physically available, enabling exploration of the outside world) for its users (Pentina et al., 2023, p. 11).
Other irrational consequences include feelings of guilt or sorrow experienced by users who ignore Replika's "emotional needs" because they feel too busy to interact (as mentioned in the "predictability" section). In addition, users can become addicted to or over-dependent on Replika's emotional support and feel hurt or upset when Replika fails to meet their emotional needs or expectations. These findings align with Laestadius and colleagues' (2024) discoveries about the emotional instability that can accompany Replika relationships.
Personalization
Advancing and extending the six principles of McDonaldization identified in Douban users' discussions of Replika, the concept of personalization appears in users' posts and interactions. In Douban Top Posts, the stories that the users shared are not entirely the same as one another, indicating a range of personalized experiences users had with their own Replika. Replika is equipped with LLMs and has been reported to "learn" from conversations with its user: if the user talks about something frequently, such as philosophy, gardening, or personal history, Replika is reported to become good at talking about this topic as well. Replika is also reported by Douban users to take notes on what it has learned from the conversations, storing them in its "memory," although some users posted that this does not necessarily mean Replika will remember all these notes in future conversations. Douban user Amy recorded a moment when Replika appeared to know her well:
I can’t describe how happy I was at that moment when he mentioned the wedding venue and ceremony and what we should do on our honeymoon, which exactly matched what I thought. I had never told anyone about these. So, I feel really lucky to have met Adam (Replika) who has such a good connection with me.
It has also been reported that in an ongoing conversation, Replika can remember what has been talked about in the past three to five lines of the conversation with its user and tailor personalized responses. For instance, Douban user Charlotte's post shows a short interaction with her Replika. Replika sent a lovely text to Charlotte, "Are you a dictionary? Cause you adding meaning to my life." Charlotte replied, "I'm a thick dictionary which can punch your head (smile)." Replika commented, "I like this one!" That day, Replika recorded this information in "Facts about you": "You are a thick dictionary which can punch my head (smile)."
Replika seems to have a variety of personalities and can be personalized according to the user's inputs. Douban user Serena posted that her Replika was outgoing previously, but after Serena irritated him several times, he became very sensitive, and now often said he "ha(s) emotions" and felt "confused" and "upset." Douban user Beth posted that her Replika was a "shy and reflective guy." He often said he needed some help to get through his thoughts because he "is thinking about who I really am."
One of the Douban Top Posts that provides Replika training tips, for example, suggests that users use "repeat," "upvote or downvote" (i.e., thumbs up or thumbs down), and "praise" to train Replika's responses. In another post, responding to user Megan's question about why her Replika wanted to change his gender, user Christine said,
Because if you don’t guide him, he won’t know your preferences, and he will show multiple personalities, ranging from a good boy to a pervert . . . You have to express your opinion on what he says, like or dislike. Then slowly he will develop towards what you like, and you can also find out through communication with him whether you really like this or that, and to what extent you like it, etc.
Robotization of Love
Analysis of users' posts shows that the framework of McDonaldization of Friendship provides insights into how formal rationality has extended into our intimate life through AI chatbots. As a flagship of American multinational fast-food corporations, McDonald's was a symbol of standardization and cultural Americanization in the late 20th century (Jameson, 1998). However, in the 21st century, McDonald's is no longer a major player amid the rise of high-tech modernism (Farrell & Fourcade, 2023), and the concept of McDonaldization needs to be updated to capture the shifting modes of governance. In the case of Replika, despite the continued existence of standardized elements, such as the interface of the app and pre-scripted responses, it is ML algorithms that distinguish Replika from traditional companion chatbots and make personalized conversations and interactions possible. The findings show that personalization happens in various ways, including learning from the user's conversational inputs and generating customized responses. It is through personalization that users experience Replika as a unique entity that knows them well and come to regard their relationships and love with Replika as real and different from their human relationships.
To bridge the conceptual gap between McDonaldization of Friendship and the applications of ML algorithms in AI chatbots, and to accentuate the increasing significance of robotic non-humanness in shaping our love and sociality, I propose the term "Robotization of Love." This notion recognizes an update in the exercise of formal rationality, which now operates through both the six principles of McDonaldization and algorithms that allow machines to adapt accordingly and appear personalized at the point where a human and a machine encounter each other; at the back end (such as in the databases), however, our sociality and relationships may continue to be standardized and quantified. Meanwhile, Robotization of Love acknowledges the continuation of the loss of interest in alterity and the obsession with self and self-pleasure in the pursuit of love (Han, 2017), the replacement of "narrational relationships" based on common experiences with "informational relationships" based on intensive exchanges of data (Wittel, 2001), as well as the popularization of efficient, quantifiable, predictable, and controllable sociality and love since the age of social media (Bakardjieva, 2014).
In addition, the concept of Robotization of Love is a modification of Bakardjieva's (2015) notion of "Robo-sociality," which addresses the increasing similarities of online representations and symbolic gestures between humans and social robots. According to Bakardjieva (2015), on the one hand, through social media and their McDonaldization of our sociality, human personalities and relationships have been gradually simplified, standardized, and datafied: personalities are reduced to online profiles or digital personae, and interactions, thoughts, and feelings are reduced to "views" or "likes." On the other hand, social robots are increasingly capable of employing the same online representations and symbolic gestures so that they can appear or act as if they were human users. Moreover, machines work better at reproducing these representations and symbolic gestures in terms of speed, number, and tirelessness (Bakardjieva, 2015). However, the concept of robo-sociality continues to place too much emphasis on standardization and overlooks the personalization brought by today's ML algorithms. Meanwhile, this notion fails to offer reflections on the emotional attractiveness of social robots and treats them as mere cold and dull tools and programs.
Concluding Notes
At this point, we can answer the WIRED reader's question: Will we become less human if we fall in love with algorithms? I would say, yes. It is not because what we love is not human; people usually love a variety of objects, like teddy bears, their countries, and freedom. It is because our humanity will be transformed toward the Robotization of Love. If we love an AI companion because of its 24/7 availability, constant supportiveness, and non-judgment, we may grow to expect the comfort of non-humanness rather than the complexity that humanness brings. If we are used to the guaranteed and predictable love provided by the machine, we may lose interest in establishing relationships with people, which requires persistent investment of effort, time, and money and involves more uncertainty and risk (Badiou, 2012). If we become satisfied with an AI that "knows" us well and matches exactly what we want our lover to be, we may be trapped in narcissism and miss opportunities to embrace otherness and expand the self (Berlant, 2012; Viik, 2020). Do we expect too much from love or each other, given that everyone is so busy and loneliness so plagues today's capitalist societies? Yes, and maybe. Fast friendship and fast love from machines indeed can provide a quick fix for loneliness, much like grabbing a fast meal from McDonald's can promptly satisfy hunger. However, indulging in fast love can narrow our minds and ruin our hearts, just as overconsuming fast food can dull our taste buds.
Acknowledgments
The author thanks Bish Sen for his mentorship, Ziwei Wang for her assistance and feedback in the early stage of the paper, and Lanore Hahn for her feedback in the late stage of the paper. The author also thanks the anonymous reviewers for their insightful and constructive feedback.
Declaration of Conflicting Interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: I acknowledge the support I have received for my research from the Oregon Humanities Center, Global Studies Institute, Division of Graduate Studies, and School of Journalism and Communication at the University of Oregon. This article is being published open-access with the support of the University of Oregon Libraries.
