Abstract
In this article, we explore how a very simple telepresence robot avatar becomes a technology multiple when interacting with humans. Drawing on Mol’s notion of the body multiple, we examine AV1 – a social telepresence robot avatar designed to act as a substitute in schools for homebound students. The analysis is based on 105 interviews, including interviews with homebound students and kindergarteners in Norway using AV1 and/or their guardians, interviews with school workers, and focus group interviews with classmates. In the analysis, we explore AV1 as a plastic bust, a toy, a creep, an avatar, and a reverse cyborg. The different perceptions come into being in interaction with human bodies, and the technology thus arguably emerges and is co-constructed with human bodies, creating a technology that is more than technological.
Introduction
The body can be understood in many ways. In her influential book The Body Multiple, Mol (2002) specifically proposes an understanding of the body as multiple, meaning ‘that there is manyfoldedness, but not pluralism’ (p. 84). This means that the same body (part) can be many things even if it is just one single body (part). Taking atherosclerosis (an ordinary disease) as her point of departure, Mol describes how a single object can be enacted in different ways. Atherosclerosis is an encroachment, but it is also pain, bad skin, and so forth. Furthermore, in a hospital practice, atherosclerosis appears to be many other things; through different apparatus, specialties, x-rays, images, e-mails, records, doctor discussions, and interventions, a slightly different atherosclerosis occurs, which again influences the perceptions and experiences of atherosclerosis. As such, the body is multiple and is perceived and experienced in many ways.
Increasingly, technologies are understood to influence embodied experiences as well as how the human body is understood and transformed. With technological developments, the human body is extended, and when using a technology, the said technology becomes part of the human user (Kember and Zylinska, 2012). Thus, the user emerges and is co-constructed with the technology (Kember and Zylinska, 2012), and technologies thereby become ‘integral parts of how life and the body is lived, experienced, and continuously transformed’ (Stage et al., 2020: 4). As technologies interact with human bodies, the understanding of technology arguably changes as well. With this article, we focus on what technology is and what it becomes as it interacts with humans. Specifically, inspired by Mol’s (2002) work on The Body Multiple, the aim of this article is to explore the manyfoldedness of a specific and very simple technology, and how, in interactions with humans, this simple technology turns into a technology multiple.
Technologies have been understood in terms of multiplicities before. In a book chapter about the fluid materiality of tablets, Burnett (2017) introduces the iPad multiple and explores ‘the multiple ways in which iPads get taken up in classrooms in practice and the implications of this multiplicity for teaching and learning’ (p. 16). Burnett (2017) argues that the ‘idea of the multiple is useful in drawing our attention to how tablets can [. . .] be “actor-enacted” in various ways’ (p. 16). In her analysis, Burnett (2017) lists four actor-enactments of the iPad – namely, the iPads as schooled devices, playthings, community artefacts, and objects among many objects. The list is not definitive but ‘provides examples of what iPads seemed to become, or how they came to be known, in the classroom’ (Burnett, 2017: 25).
In this article, we similarly wish to explore the ‘fluid materiality’ of technology, thus arguing for a wider application of the notion of the technology multiple. Understanding the manyfoldedness of technologies can, for instance, help to explain the varied and contrasting experiences and thoughts people may have concerning the use of technology, as well as how technologies are domesticated and what they afford. The technology we focus on in this article is the telepresence robot AV1 – a social telepresence robot avatar that is designed to act as a substitute for homebound students, which we will introduce as a case in the Methods section. Based on an analysis of the materiality of AV1, and the ways in which the AV1 robot avatar is understood by those who interact with the robot, including homebound students and kindergarteners, their classmates, and their teachers and other school workers, we ask the following research question: How does a very simple telepresence robot avatar become a technology multiple when interacting with humans?
Robots have previously been found to be simultaneously ‘enacted as a thing and as a social agent’ (Alač, 2016: 519), and a social robot’s character can be multifaceted (Alač, 2016). The multifaceted character of the robot is a situational achievement, and Alač (2016) shows ‘how these seemingly contradictory features – a thing and a living creature – unproblematically coexist’ (p. 533). With our research question, we wish to further elaborate on the understanding of the multifaceted nature of not just robots, but of technologies in a wider sense as these are enacted in interaction with other actors.
Bodies and technology
In the age of digitalization, the human body is increasingly intertwined with technologies. Boundaries between the body and the technology it uses have collapsed (Shilling, 2012), resulting in the need for more complex understandings of what a body is and what it can do (Hvidtfeldt and Stage, 2019). New materialists stress the importance of analysing human and non-human assemblages (Brownlie and Spandler, 2018), and several terms have been coined and used in attempts to understand and conceptualize the intertwinement between body and technology. Famously, Haraway (1991) introduced the cyborg figure, arguing that ‘[b]y the late twentieth century, our time, a mythic time, we are all chimeras, theorized and fabricated hybrids of machine and organism; in short, we are cyborgs’ (p. 150). As cyborgs, Haraway argues, our bodies are recrafted by communication technology. Similarly, Braidotti (2013) argues that the digital ‘second life’ has blurred the traditional distinction between the human and its others (such as animals, plants and humans that are distinct from the naturalized white, middle-aged, middle-class man), exposing the non-naturalistic structure of the human. The human, Braidotti (2013) further argues, is a historical construct that became a social convention about ‘human nature’ (p. 26), and individualism is not an intrinsic part of ‘human nature’, as liberal thinkers are prone to believe, but rather a historically and culturally specific discursive formation – one which, moreover, is becoming increasingly problematic. Braidotti thus problematizes the categorical distinctions between the human and other species, seeds, plants, and animals, and raises concerns about the sustainability of the planet as a whole.
The understanding of the posthuman and the cyborg figure can help us to understand and theorize about how human flesh and technology interact (Lupton, 2015). Through digital technologies, the human body has become increasingly datafied (Lupton, 2018), and in recent years, research building on the cyborg figure has been developed, further taking into account the digitized and datafied formats of human bodies. Bringing the cyborg together with the term assemblage – a term used ‘to encapsulate the idea that human bodies are complex and dynamic configurations of flesh, others’ bodies, discourses, practices, ideas and material objects’ (Lupton, 2015: 571) – Lupton (2015) argues that the term cyborg assemblage can be ‘understood as a melding of body with technologies that are able to provide cybernetic (feedback) mechanisms’ (p. 572). Consequently, human embodiment can be argued to be ‘always already more than human: entangled and relational with things and places’ (Lupton et al., 2022: 3), thereby creating blurred boundaries between humans and nonhumans, including technologies. Digital technologies, for instance, facilitate the ability to upload bodily representations on social media, the ability to engage in self-tracking, or even to three-dimensionally (3D) print parts of one’s body, just as various apps and devices can generate digital information about people’s bodies (Lupton, 2018; Lupton et al., 2022). From a phenomenological perspective, robots can be experienced as part of a human body through ‘robot suits [. . .], robot arms, or robot “exo-skeletons”’ (Coeckelbergh, 2011: 198), robots can help to mediate between humans and the outside world, and we can understand robots ‘as more than a thing: an other to which I relate’ (Coeckelbergh, 2011: 198).
The assemblages that human bodies form with digital technology ‘configure in the contexts of social relations, discourses, and materialities of place and space’, and assemblages constantly come together and apart through various encounters with other bodies – both human and nonhuman – shaping embodied experiences (Lupton et al., 2022: 13).
Much of the research conducted on the relationship between the body and technology focuses on the blurred lines between bodies and technology. Although the research mainly helps us to understand how embodied experiences are influenced by technology, some research also focuses more on what technology becomes in interaction with humans. Turkle (1995, 2005), for instance, explains how the children she studied in the 1970s and 1980s ‘appropriated computers by thinking of them as psychological machines’ (Turkle, 1995: 80). Children talked about the attributes, capacities, and personalities of machines, and discussed whether computers could feel, think, cheat, or know things. Furthermore, children sometimes treated computers as though they were alive (Turkle, 2005). The way in which computers were understood changed as new generations became more accustomed to computers, and whereas children in the 1970s and 1980s, for instance, thought that computers could be alive, children in the 1990s no longer thought of computers as alive, but rather as inanimate objects that could think and have a personality (Turkle, 1995). As Rose (2023) suggests in her studies of the world’s first AI toy, the Furby, which was introduced in 1998, Furby fans knew that Furbies were not alive in the true sense, yet they found that the Furbies could appear to be ‘sort of alive’. Furbies and other human-like technologies or toys can be understood in relation to anthropomorphism, namely ‘the attribution of human characteristics or traits to nonhuman agents’ (Epley et al., 2007: 865, cited in Frazer, 2022). Anthropomorphism is relevant to how people engage with technologies such as robots, and through anthropomorphism, robots can become (sometimes too) human-like (Festerling and Siraj, 2022), thus further blurring the lines between bodies and technology.
Building on this research focusing on the nature of machines and how computers and other technologies come into being in people’s thinking, we wish in this article to further explore how computers and machines enter the thinking of humans. As such, we are concerned with how AV1 appears to the people engaging with it, a concern related to a phenomenological understanding that ‘what matters for understanding and evaluating human-robot relations is how the robot appears to us’ (Coeckelbergh, 2011: 198). We focus specifically on the polysemic nature of how technology enters human thinking and the implications of this manyfoldedness.
Methods
Case: AV1
Robots can play various roles in education; they can for instance be pedagogical or teaching tools, or they can ‘deliver the learning experience through social interaction with learners’ (Belpaeme et al., 2018: 1). In this article, we focus on the telepresence robot avatar AV1, a robot avatar that was developed for school students mainly for social interaction. As opposed to social robots that foster human-robot interaction, telepresence robot avatars foster human-to-human interaction. Since AV1 only mediates communication between humans and cannot autonomously respond to input, it does not facilitate human-robot interaction. Telepresence robot avatars are increasingly used around the world for children who are unable to be physically present in the classroom (Johannessen et al., 2023b; Page et al., 2021; Powell et al., 2021; Weibel et al., 2023), and the robot avatars give students the opportunity to participate in school, both socially and academically (Weibel et al., 2020). A variety of telepresence robots exist, including, for instance, AV1, Fable Connect, PEBBLES, VGo, Double, Beam, and Kubi (Bouquain et al., 2023; Johannessen et al., 2023b; Newhart and Olson, 2017; Powell et al., 2021; Weibel et al., 2023). AV1 itself has a hard outer shell with a TPE (thermoplastic elastomer)-coated backside, and it has various features, such as a microphone, a camera, a speaker, and motors that can turn parts of the robot avatar. The robot avatar has an anthropomorphic design: it is designed to look like it has a head, a neck, a body, and eyes. AV1 is quite small and lightweight, and the individuals who are in the same physical space as AV1 are able to lift the robot avatar up, move it, and carry it around.

AV1. Picture from No Isolation’s website.
AV1’s design allows communication between the individuals sitting with the robot avatar and the homebound user. AV1 is controlled by an app that is installed on the homebound student’s phone or tablet. From home (or the hospital, or wherever else they are), the homebound student should ideally be able to control the robot avatar – for instance, by controlling the motors to move the ‘head’ up and down and thereby see what the robot ‘sees’ through the camera. Furthermore, the speaker and microphone allow sound to be transmitted between the app user and the individuals in the same space as the robot avatar. The homebound student can control AV1’s lighting via the app: they can, for example, make the top of the ‘head’ light up, and they can change the shape of the robot avatar’s ‘eyes’ (if they have a new version of the robot avatar).
In Norway, AV1 can be acquired in two main ways, either through a (sparingly used) centralized model ‘where the school (or its municipal owners) acquires a pool of robots that they distribute to those in need’ (Johannessen et al., 2023b: 155) or through a more commonly used decentralized model in which the homebound students or their guardians are typically responsible themselves for convincing schools to allow them to use AV1 (Johannessen, 2024). This also means that there is a risk for homebound students and their guardians that school workers will decide not to use AV1. In the Norwegian AV1 case, although not all school workers interpreted the technology similarly, some school workers specifically believed AV1 ‘to threaten widespread ideals of schools being pedagogically oriented, physically co-present and bounded institutions, a series of concerns that reflect ideas about schools and schooling and how technologies tend to (not) function within this context’ (Johannessen et al., 2023a: 12).
Data collection
The analysis is based on 105 semi-structured qualitative interviews. The interviews are part of a larger project about AV1 in Norway – a country in which schools are currently undergoing large-scale digitalization (Johannessen et al., 2023a); specifically, since 2006, the ability to use digital tools has been ‘one of the five basic skills running through all subjects and levels of schooling’ in the national curriculum (Erstad and Silseth, 2022: 31), and internationally, ‘Norway has been a front-runner in providing schools with ICT’ (Blikstad-Balas and Klette, 2020: 57). This project was motivated by the importance of understanding how technology can help children in a vulnerable situation, and the project interviews were conducted between 2018 and 2021 by the second author and two other researchers. The interviews included in this analysis comprise 67 interviews with homebound students (n = 34, attending primary, secondary, and upper secondary school) and kindergarteners (n = 2) who use the AV1 telepresence avatar in Norway, and/or with their guardians; 35 interviews with 30 school and kindergarten workers; and 3 focus group interviews with 4 classmates in each (n = 12). We decided to include these interviews in the article’s analysis because classmates, school workers, and homebound children and/or their guardians are the ones primarily experiencing the robot in use, as opposed to, for instance, people working in the municipality and the developers of the technology. During the interviews, among other questions, the interviewees were asked about their experiences with AV1 as a social tool and as a learning tool. Most of the interviews were conducted before the COVID-19 pandemic, but we also conducted follow-up interviews with some of the teachers during the pandemic (n = 4). 1
Interviews were conducted both physically and by telephone. 2 The project received ethics approval from the Norwegian Centre for Research Data, and all participants consented to participate in the study.
Data analysis
The analysis was conducted in two main steps. Initially, the first author coded all interviews in the computer-assisted qualitative data analysis software NVivo to capture the manyfoldedness of AV1 – that is, a code was created for each ‘thing’ that AV1 ‘was’ in the data material. The initial analysis was then discussed several times by both authors, who structured the analyses according to various enactments of AV1. The analyses do not focus on differences in experiences between homebound students, kindergarteners, guardians, school and kindergarten workers, and classmates, as the focus of the analysis is on the manyfoldedness of the technology and not on different perceptions between the various individuals or groups of individuals involved in the use of AV1.
Analysis
Various enactments of AV1
The simple robot avatar becomes multiple in its interaction with humans, including the homebound user (who is typically not in the same space as the robot avatar), the homebound user’s classmates, and the homebound user’s teachers and other school workers. In this first part of the analysis, we wish to provide a list of different enactments of AV1. Specifically, we present AV1 as:
A plastic bust
A toy
A creep
An avatar
A reverse cyborg
Similar to how Burnett (2017) presented her analysis, our list is not exhaustive but helps to show the manyfoldedness of the AV1 technology (Mol, 2002). In our data material, AV1 was more frequently referred to in ways we would characterize as a plastic bust, an avatar, or a reverse cyborg than in ways we would characterize as a creep or a toy.
A plastic bust
AV1 is sometimes merely described as a plastic bust. In these cases, AV1 is not ascribed anything but hardware and software functions, as, for instance, seen in this quote by a teacher:
Yes, I actually compare the robot to a modern web camera. That it . . . yeah, with a microphone, and just that it [the webcam] doesn’t have these blinking lights, I mean.
In this quote, AV1’s features are in focus: a web camera with a microphone and lights. As similarly suggested by another teacher, ‘it’s not a robot either, it’s actually Facetime in a box’. When the technology is merely a plastic bust, homebound users, teachers, and classmates refer to the plastic on the robot avatar, the robot avatar’s speaker, the robot avatar’s standby mode, and the robot avatar as being something that has a charger. They refer to the size of the robot avatar and how, because of its size, ‘[i]t is very easy to carry around’. They refer to the ‘clean, defenceless . . . cute design’, the cost of the technology, and AV1 is referred to as ‘AV1’ or ‘the robot’. AV1 is also considered a plastic bust when something is wrong with it, whether that be the hardware or the software of the technology. Many informants, for instance, shared frustrations about the robot avatar’s internet connection and explained how AV1 repeatedly connected and disconnected from WiFi. Similarly, when AV1 breaks, it is referred to as a plastic bust, as seen in this interview with a homebound student’s classmates:
Classmate 1: The first time was when we played Danish longball [a bat-and-ball game], and someone had to hit, and then the person dropped the robot on the ground. And then the head fell off . . . Classmate 2: Just the top. [. . .] Classmate 3: And after that I guess we got another one.
In this conversation, AV1 is referred to as something having ‘a head’, thus anthropomorphizing AV1. However, in the continued conversation, one of the homebound student’s classmates says that ‘just the top’ of AV1 fell off, thus de-anthropomorphizing AV1 again, as the classmate is specifically referring to the building materials of the robot avatar – a technology that, when it is only a plastic bust, was seemingly easily replaced by a new one.
A toy
AV1 can sometimes become a toy. One kindergarten teacher specifically said, ‘it was like a toy’. Classmates of homebound students explain how they ‘drew on it [AV1] with whiteboard [markers]’ to draw ears, a t-shirt, a moustache and glasses on AV1. A teacher also described her surprise that ‘ninth and tenth graders were so fascinated by it and started dressing it up like a doll and walking it around and cradling it’. Similarly, one homebound student explains that, in the classroom, ‘this [AV1] is a cool thing’ that his classmates were able to have in the classroom because of his illness. This is also reflected in a quote by a teacher talking about the classmates’ reaction to AV1:
As I said, they were very positive, and probably experienced this as something VERY cool. And that the class was VERY lucky that they were going to have a ROBOT. ‘It’s the one we saw in the commercial’, you know, and ‘it’s THIS student who got it for us’. They knew there was a reason the student had got it for them, but they thought it was very COOL.
As a toy, AV1 can be given a name of its own. Some classmates have, for instance, named AV1 Robert: ‘We call it “Robert” . . . Because it’s kind of “robot”, “Robert”, it’s kind of similar’. In this case, by naming the robot avatar, it is given its own identity, different from the student it represents, which personalizes the robot as something other than the homebound student – similar to how children may name their teddy bears.
It is not unusual to view robots as something that has worth similar to that of a human. Previous research has, for instance, found that ‘children think that one should avoid harming a social robot’ (Ayalon et al., 2023), that anthropomorphism can ‘evoke caring or nurturing behaviors’ towards robots (Festerling and Siraj, 2022: 712), and that ‘children consider robots to have moral worth, although not to the same extent as they do living creatures’ (Sommer et al., 2019: 2). This can also be seen with objects such as teddy bears, which often receive names, can be ascribed language, thoughts, feelings, intentions, and actions, and can both comfort and be comforted (Borovski Lübeck et al., 2016). In our study, we also found that classmates would care for AV1 in ways that did not necessarily extend to caring for the student, as explained by a teacher:
[. . .] for the girls, it’s a bit CUTE, it’s something they want to – even if they can’t – cuddle with. It looks very simple and nice at the same time. Also for some of the boys – [but] for many of the boys, it’s the person behind it [AV1].
The teacher here sets up a distinction between the boys seeing AV1 as the person behind it and the girls seeing AV1 as something cute that they want to cuddle. Another study has also found that classmates of a homebound student treat AV1 as if it were a human through actions such as gently touching the cheeks of the telepresence robot (Weibel et al., 2020), and that the classmates of the student using a telepresence robot ‘did not want to return the telepresence robot as they cared about it’ (Weibel et al., 2023: 1404).
A creep
On a more negative note, AV1 can also be a scary piece of technology. One mother of a student with a long-term illness – a student who had largely not been physically present in the classroom where the robot was supposed to be placed – explains this as follows:
Well, it’s that you don’t know one or the other – they don’t know anyone, either on one side of the robot or the other; it becomes a bit scary then too – it would have been better if, for instance, this had been in middle school or elementary school. Then she would have known the students, the students would have known her. Then it’s sort of harmless, you know – then it becomes her. But now a robot came instead of a person they don’t know.
In this case, AV1 represents someone unknown, and the technology’s features become something that one can fear could be misused by a largely unknown other. The robot avatar is a representative of the unknown: the voice that comes out of the robot avatar is that of a stranger, and the camera that films you transmits to someone you do not know. Having the robot avatar in the classroom can therefore be scary, as it can be unnerving to have something (or someone) in your class when you do not know what it represents, and in this case, the robot avatar does not represent either a friend or a (known) classmate. In addition, as also seen with other telepresence technologies, AV1 allows people to ‘arrive’ unnoticed (Bouquain et al., 2023) – that is, a mother or father could, for instance, come into the homebound student’s room without anyone knowing, adding to the creepiness.
An avatar
At times, AV1 is an avatar of the student. In these instances, AV1 may be given a short version of the student’s name plus Bot, for example, Chris-bot. When AV1 is an avatar, the ties between the homebound user and AV1 are closer than when AV1 is a toy. This can, for instance, be seen in that when AV1 is a toy, it receives a name of its own (similar to how one gives teddy bears names), but when AV1 is an avatar, its name is closer to the homebound student’s name. As such, as an avatar, AV1 is less of a thing in its own right; rather, it is more strongly connected to the homebound user. AV1 is, for instance, placed in the student’s seat (or on the student’s desk) in the classroom, symbolizing the student. As one teacher explained,
Then it was on the student’s desk, and I guess it wasn’t entirely in the front, more on the second row, a bit to the right, and I didn’t place it on the chair but on the desk. And the students thought that was nice. Because then it was sort of a reminder that like, ‘that’s where he usually sits, and he’s not here today, but we have the robot instead’. And then they gave the robot the nickname of the student. So that was quite nice.
In such cases, AV1 represents the student, and, as another teacher said, ‘We [the class] are not complete until AV1 lights up’. Other studies have similarly shown that an AV1 robot was placed on the homebound student’s desk in the classroom (Weibel et al., 2023). AV1 thus functions as a reminder of the student. Furthermore, as AV1 is increasingly explained as a representation of the student, AV1 is also increasingly assigned the rights of the student. This is the case as ‘the robot is there equally as if it’s her as a person’, when, for instance, the children in kindergarten are taking turns talking. Also, in relation to attendance, when AV1 represents the homebound student, the student is considered present in the classroom when AV1 is present in the classroom: ‘if the robot’s there – then you’re there’, as one teacher said, referring to how a student will be counted as attending if AV1 is present. In such cases, the homebound student and the AV1 technology are starting to meld together (Lupton, 2015), blurring the lines between what the technology is and what the student is (Lupton et al., 2022).
A reverse cyborg
From being a representation of the student, AV1 will sometimes be the student, thus collapsing the boundaries between student and technology (Lupton et al., 2022; Shilling, 2012). As such, it is like a cyborg, but in reverse; just as bodies are recrafted by technology (Haraway, 1991), the technology is recrafted by human bodies, similarly becoming a hybrid of machine and body in which the body becomes a small part of what the technology is. This is largely visible in teachers’, classmates’, and homebound students’ use of pronouns and names, but it is also visible in how different occurrences with AV1 are experienced and how AV1 is referred to. For instance, when asked if her classmates considered AV1 a toy, a homebound student answered: ‘No. They looked at me as a person’. As such, AV1 is the homebound student in this case; it is not a toy, despite being that in other instances. When AV1 is the student, what happens to AV1 happens to the student. As one homebound student said, ‘they forgot me in the copy room’. Another student also mentions that ‘I was left standing in the cabinet’ when AV1 was not picked up by a teacher. In these cases, it was the student and not AV1 that was forgotten in the copy room or in the cabinet. Similarly, another homebound student said that ‘I didn’t get to participate in the same way when I was a robot’. This statement testifies to great unfulfilled expectations; the quote suggests that, to a certain extent, and like a cyborg, the student experienced transforming into a robot and was disappointed as a robot when that robot avatar was not assigned the same rights as a human student. Sometimes, however, the robot avatar does have the same rights as the student, as seen in the following quote from a teacher:
I know that in the beginning there were some parents . . . who often did not oppose it, but who were sceptical [of AV1]. But the pupil’s right to be in the classroom trumps the parents’ fears and their opportunities to opt out of it.
In this quote, the teacher is not explicitly referring to AV1 as the student. However, he talks about the student’s right to be in the classroom, implicitly suggesting that when AV1 is in the classroom, the student is in the classroom. As opposed to the earlier quote, in which the student was disappointed to not have the same rights when being a robot, the teacher’s quote suggests the opposite – namely, that the robot and the student should have the same rights of access to the classroom.
Furthermore, we can see how the robot avatar sometimes is the student in students’, teachers’, and classmates’ use of pronouns. Homebound students will, for instance, refer to AV1 as ‘me’, as seen in this quote: ‘my friend looked after me really well. She was even the only child allowed to carry me!’ In this statement, AV1 is talked about as something that must be cared for, and care is experienced through a personification of the robot avatar as the homebound student. Another homebound student also suggests being the robot avatar when asked if AV1 has a name: ‘I guess I just call it “me”, [. . .]. I refer to it as “me”, you know?’ Similarly, classmates will refer to AV1 using the homebound student’s pronouns: ‘Also, we took him with us on trips’. A homebound student also explains an instance in which classmates considered AV1 to be the student:
Then he took me into the classroom, and then he said, ‘Now [homebound student’s name] is here! It’s [homebound student name]!’ And then . . . everyone was just like, ‘Huh, is that [homebound student name]?’ And it was lunchtime, then, so there were a lot of people who were in the cafeteria buying lunch, and then they kind of came in one by one and like, ‘Huh, do you have [homebound student’s name]?’ and sort of flocked around.
The homebound student referring to the telepresence robot as ‘me’ and the use of the homebound student’s pronouns and name when referring to the robot have also been seen in other studies on telepresence technology (Newhart and Olson, 2017; Weibel et al., 2020, 2023). In relation to virtual environments such as Second Life, Bloustien and Wood (2013) also explore individuals’ use of the possessive pronoun ‘my’ to describe a feature of an avatar in the environment. According to Bloustien and Wood (2013), this use of the ‘my’ pronoun is a slippage in discourse, making it seem as though a Second Life avatar is a mirror of the person talking. With AV1, the use of the ‘my’ and ‘me’ pronouns does not necessarily appear as a slippage. One homebound student, for instance, specified, when her parent talked about AV1, that the robot avatar was in fact her:
Parent: She was very fond of watching the gatherings that took place in the mornings. Everyone explained what they had been up to and . . . And then there were some trips that they had been on, in the woods and fields – and they had taken AV1 with them [on the trips].
Homebound student: Me!
As such, AV1 is her, in this case, and thus becomes a reverse cyborg.
The fluid materiality of AV1
In this second part of the analysis, we wish to look more closely at the fluid materiality of the technology multiple. In our empirical material, we see several different cases of how the same robot avatar can change what it is, again suggesting manyfoldedness in the case of AV1 (Mol, 2002). This can, for instance, happen if the situation changes, as we see in this quote from a homebound student:
No, it’s like if I’m there [in the classroom] talking to it [AV1], then you’re talking about the ‘robot’. Whereas if I’m at home and the robot is there [in the classroom], then it’s ‘I’m going to go and talk to [homebound student’s name]’, you know?
But this can also change within the same situation:
If they were eating or something, then I would turn [the head of AV1] myself. But there were times when they kind of went . . . for instance, there was this one time when we went to the cafeteria. Then they turned me towards people [. . .] it was natural for them to turn it [AV1] towards the people I wanted to talk to.
In this quote, the homebound student refers to AV1 both as ‘me’ and as ‘it’. The same student also points out that AV1 can be referred to in different ways, even within the same situation: ‘Either they say “put AV1 in the charger” or “put [homebound student’s name] in the charger”’. In this case, AV1 is both a plastic bust and a reverse cyborg.
Sometimes the extent to which the robot is considered linked to the student depends on whether or not it is turned on, as seen in this interview with a homebound student’s classmates:
Classmate 1: So, we can kind of come up to, say, somebody and say: ‘OK, do you want him?’, you know?
Classmate 2: Yeah.
Interviewer: Ask someone else, ‘Do you want him now?’
Classmate 3: Yes, because it can be a little annoying to carry it [AV1] around.
Interviewer: It’s a little annoying?
Classmate 3: Yes, because if he logs off or something like that, then you’re kind of just walking around with a robot that’s unconscious.
In other instances, the robot avatar is something different for different individuals. For instance, teachers can put AV1 in a bag, presumably considering it a piece of hardware to be moved, but, as a homebound student said, ‘I was still on while he moved me and put me in the bag’. As such, the student assigns the robot her own spirit, consciousness, and psyche, and it becomes questionable whether it is moral to treat the robot avatar simply as a piece of hardware. Similarly, one homebound student talked about classmates having different experiences regarding the extent to which AV1 was the student: ‘Someone said, “It’s so weird that you can see us, but we can’t see you”. That was one response, but then another said, “But he is here!”’ The classmates therefore express different perceptions of whether the homebound student is there when AV1 is present, suggesting simultaneous manyfoldedness (Mol, 2002). Similar to how Burnett (2017) finds that the iPad’s materiality is fluid, in that iPads ‘are actor-enacted in multiple ways as they combine with other things, people, ideas, priorities, practices and so on’ (p. 16), the telepresence robot avatar also changes with circumstances, experiences, and people. As also suggested by Alač (2016), ‘[i]n interaction, the robot presents its multiple facets so that each theme can resurface at any particular moments, articulating each other dynamically’ (p. 530).
Discussion
According to the developers of AV1, the robot avatar is ‘a blank slate’ that can be named and decorated (NoIsolation, n.d.). As such, the developers have designed AV1 to become more than it is in itself, and in our analysis, we found that the telepresence robot avatar became multiple: different across slightly different situations, and sometimes different even within the same situation. Just as the perception of a leg will depend on whether it is part of a living body or has been cut off (Mol, 2002), there are several versions of what the robot avatar is – versions that are sometimes incoherent. Nothing ever is alone; it is always related (Mol, 2002). In her research, Mol (2002) describes how surgeons will switch from focusing on organs to focusing on patients during an operation, depending on whether they are attending to ‘the physical being on the operation table’ (p. 124) or the patient as a social being. Similarly, in our analysis, we see that students, parents, and school workers shift and switch in their descriptions of what AV1 is – sometimes depending on the situation, and sometimes not. AV1 therefore is and becomes several things depending on the people that interact with it and, to a certain extent, on the different situations in which it is used.
A technology’s features will influence the manifold perceptions of said technology. In the case of AV1, the anthropomorphic design of the plastic bust and the telepresence qualities of the robot avatar are arguably part of the reason the robot avatar is considered a toy, a creep, an avatar, and a reverse cyborg. AV1 is cute, and its design could be that of a toy. For decades, toy manufacturers have provided children with toys and objects ‘that almost begged to be taken as “alive”’ (Turkle, 1995: 77), and AV1’s design, with what look like eyes and a head that can tilt and turn, similarly asks that the robot avatar be regarded as ‘alive’, thus paving the way for it to be considered as having a – potentially creepy – identity. Similarly, the telepresence qualities of the robot are arguably part of transforming the robot avatar into the homebound student: the student’s voice comes out of the robot avatar’s speakers, and the homebound student controls the robot avatar’s movements – movements that can add to how present the homebound student appears (Bouquain et al., 2023).
However, the technology’s transformation in interaction with human bodies varies, and the manyfoldedness of the technology can create not just multiple but also conflicting perceptions of what the technology is. As such, no individual can alone decide what the technology becomes, as it becomes in interaction with various actors and in various situations. In the case of AV1, the manyfoldedness of the technology includes a sometimes strong link to the user of the technology – namely, the homebound student. That the technology sometimes is the student has various implications. First, there is a difference between forgetting a plastic bust in a copy room or a cabinet and forgetting a human student, but if the technology is the homebound student, the homebound student may feel that they themselves have been forgotten. This reflects the findings of research on avatars and their relations to bodies in artificial reality. In studies of artificial realities, people have been found to feel enmeshed with the avatars they create for themselves. For instance, in a study about avatars in Second Life, Bloustien and Wood (2013) argue that, when individuals are enmeshed with their avatars, attacks on the avatar can be ‘experienced phenomenologically as though they were actual physical violations’ (p. 60). This results in ethical dilemmas and poses questions as to how the robot avatar should be treated. Second, the perception of what the robot is matters to the rights that are associated with the robot avatar. When the teacher in the analysis talked about the student’s unquestionable right to be in the classroom, he could do so because of a perception of AV1 as the student. When AV1 is not the student but is instead considered a piece of technology, AV1 does not have as many rights. In other schools, AV1 was, for instance, not allowed to be taken outside of the classroom because other students’ parents had not consented to AV1’s presence.
Similarly, in some classes, AV1 could not be used without parental consent. Although telepresence robot avatars are typically developed as well-intentioned devices, school workers in Norway have been shown to be sceptical of AV1 (Johannessen, 2024), and a student’s unquestionable right to be in the classroom thus only extends to AV1 when AV1 is the student – a reverse cyborg. This regulates the use of the technology and the opportunities that the technology provides for potential human users. When the manyfoldedness of the technology creates conflicting perceptions, it can therefore pose unexpected and potentially unacknowledged challenges. When AV1 is placed in a bag, the teacher may not consider that act as anything other than a practical activity needed to move a piece of hardware; however, when the same AV1 is simultaneously a student, the simple activity raises questions as to how to treat and handle the technology. Similarly, although not being allowed to use a piece of technology in class may be frustrating for a student, when the technology is the student, the denial of use is not necessarily just a rejection of a technology but also a rejection of an ill, homebound student.
Limitations
Our study is based on 105 interviews. While interviews provide rich material for this analysis, observational data might have provided additional insight into how AV1 becomes what it is in relation to human actors.
Conclusion
In this article, we answer the question: How does a very simple telepresence robot avatar become a technology multiple when interacting with humans? We argue that a simple technology can be many things, sometimes at once. As a supplement to the understandings of how bodies emerge with technologies (e.g. Lupton, 2015, 2018; Lupton et al., 2022; Stage et al., 2020), we have shown how technologies emerge with bodies as well. Similar to how Alač (2016) found that a robot can be ‘treated as a living creature while it is handled as a material thing’ (p. 533), we found that AV1 is not just a plastic bust; it is also a toy, a creep, an avatar, and a reverse cyborg. In short, AV1 as a technology multiple is messy; its manyfoldedness is vast, and the current analysis cannot describe it in full. Still, the conflicting perceptions of AV1 that we found suggest that the technology is multiple: it is made up of a manyfoldedness that depends on the technology’s interaction with human bodies. Just as a body can emerge and be transformed with technology (Kember and Zylinska, 2012; Stage et al., 2020), technology is also co-constructed with bodies. Technologies are thus more than technological, and similar to how the cyborg assemblage can be understood as a configuration of, for instance, other bodies and objects (Lupton, 2015; Lupton et al., 2022), the technology is also entangled with bodies, other things, and places. For AV1, the technology thus comes into being with the homebound student, but the homebound student, and the illnesses, disabilities, or challenges this student might have, make up only part of what the technology becomes.
The idea of the multiple is relevant across technologies. Burnett’s (2017) iPad multiple is an example of this, but other research that does not explicitly refer to the manyfoldedness of technology nonetheless shows that technology can be multiple. In Rose’s (2023) research on Furbies, for instance, she shows that they can function as avatars and companions, be hybridized with plush toys, or be reworked into handbags. The fluid materiality of technologies can, for instance, influence how technologies are domesticated and what they afford, and it may help to explain contrasting experiences of and thoughts about the introduction and use of technology. Thus, acknowledging a technology’s manyfoldedness can help to further understand the complexity of what even a simple technology can be.
Acknowledgements
We would like to thank Lars E. F. Johannessen for conducting most of the data gathering, and for providing feedback on an earlier draft of this article. In addition, we would like to thank Siri Hjellum for conducting some of the interviews with teachers. We are also thankful for feedback and comments we received from Moa Eriksson Krutrök on an earlier draft of the article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The project received funding from The Gjensidige Foundation and from the Research Council of Norway (funding ID: 301840).
