Abstract
This article examines entanglements between a fitness wearable device, the data it collects and visualizes, and the body-mind they claim to represent. Drawing on embodied insights from my experience as a transmasculine-identified member of a ‘science-backed, technology-tracked’ fitness experience and employing discourse and visual analysis of marketing materials and conversations on a public subreddit for enthusiasts, the article places ‘misfit bodies’ – rather than the unmarked, universal ‘body’ – at the centre of conversations about fitness wearables and self-tracking data. Employing queer/trans critique enables analysis of forms of difference that mediate and compose all bodies and illuminates the regulatory norms and technologies through which they are produced. Throughout, the article foregrounds how selves, bodies, data, and technologies are entangled in mutual and open-ended becomings that exceed the assumed transformation of wearable users into neoliberal healthist subjects.
In his book-length examination of technologies of physical fitness, philosopher Brian Pronger (2002: 211) employed the charged phrase body fascism to theorize the rise of ‘normative cults of the fit body’ that cultivate a desire for that which exploits us. Technologies of physical fitness call upon the body to account for itself according to metrics, imagery, and measures (Maguire, 2002). For Pronger (2002: 114), identifying with such technologies cultivates a religious devotion to logics of accumulation that direct the desires, energies, and potentials of the body narrowly into the economics of usefulness.
Body fascism was an early iteration of a now-familiar critique of the neoliberal and capitalist designs of fitness technologies that mould us into responsibilized consumer-subjects. Pronger’s analysis focused on fitness technologies that were increasingly accessible to the masses amid a fitness boom beginning in the 1980s: Jane Fonda exercise videos, fitness apparel, representations of the fit body, and in-home equipment such as Bowflex or Stairmaster (Millington, 2016: 1185). The second fitness boom of the early-aughts, meanwhile, was marked by the popularity of personal trainers and group-exercise fitness classes. These experiences now incorporate technologies that place personalized data at our fingertips; fitness enthusiasts have embraced wearable fitness devices such as Fitbits and seek out pricey technology-enhanced fitness experiences such as Peloton or Orangetheory Fitness (OTF). A recent efflorescence of scholarship shows how accessibility of digital data and self-tracking technologies such as wearables has ushered in an era of optimizable, quantifiable, and agentive bodies, and reconfigured definitions of health and the self (Lupton, 2016; Millington, 2016; Nafus, 2016; Pink et al., 2017; Ruckenstein and Schüll, 2016, 2017).
I came across Pronger’s book because a friend recommended it to me after he became aware of my enthusiasm for OTF, a personalized fitness experience branded as ‘science-backed and technology-tracked’. Started in 2010 by CEO Ellen Latham in Boca Raton, Florida, OTF’s group workouts incorporate proprietary wearable heart-rate monitors, a smartphone app, and other in-studio technologies that convert bodily processes into continuous flows of data. During my first workout beneath OTF’s signature orange lights, I immediately registered discomfort with my willingness to ‘consent’ to continuous collection of my bodily data (heart rate, caloric burn, attendance patterns, and so on) via the wearable affixed to my arm and my seeming nonchalance about where these data go and who has access to them. I am suspicious of the flimsiness of the algorithm used to measure heart rate – which is, as will be elaborated, at the core of OTF’s science and claims about workouts’ efficacy. During workouts and across web interfaces, I roll my eyes at the gendered and ableist assumptions that seep into messaging and branding. And yet, I am an avid member of a technophilic ‘normative cult of the fit body’ (Pronger, 2002: 211), a contradiction that prompts me to ask: How might trans ambivalences and pleasures in body–data–technology entanglements unsettle what we think we know about the body and its datafication amid proliferating wearable technologies? Based on ethnographic engagement with OTF and its media, I argue that people are entangled in mutual and open-ended becomings with data and technologies that exceed the ‘given’ of their transformation into neoliberal healthist subjects in North American fitness cultures.
To make sense of my ambivalent intimacies (love/hate, desire/suspicion) with OTF’s signature wearable technology and the metrics it collects, I turned to scholarly accounts of datafication, sport, and wearable technologies. Reading this literature through the lens of my own experience navigating a fitness datasphere,1 I noticed a few things. First, while many scholars draw empirical evidence from interviews with wearable technology users, there was a noticeable absence of embodied first-hand critical analyses of body–data or body–technology interactions (with some exceptions, e.g., Berk, 2018; Clark and Thorpe, 2020; Forlano, 2017; Fotopoulou and O’Riordan, 2016). I observed the quiet assumption of an abstract, unmarked ideal-type body (‘the body’) that engages with data and wearables. The body in such accounts is variously enhanced, surveilled, or mediated by extra-somatic technologies such as fitness wearables. Deborah Lupton (2020b: 50–51) challenges such body–technology parsings, suggesting that, ‘when wearables come together with humans and other nonhuman actors, they generate dynamic human-nonhuman assemblages that create specific agential capacities . . . distributed between the humans and nonhumans involved’. As a transmasculine genderqueer person, I am acutely aware that the materiality of the body is a ‘suturing of disparate parts’ and always subject to biotechnical alteration (Barad, 2015: 393; Gill-Peterson, 2014; Malatino, 2020). The relationship between a trans body and a wearable, then, builds on layered relations of that body with other technologies; yet this is actually the case for all bodies.
Methodological Orientations
This study is an ethnographic discourse analysis of how OTF mobilizes the concepts of data and science in its workouts and advertising and how these concepts are taken up and modified within user-created web cultures. As a medical anthropologist, my analysis also brings critical ethnographic sensibilities to bear on my embodied experiences as a trans-identified enthusiastic participant in OTF for over 5 years. I do so to illuminate aspects of a broader cultural moment characterized by the conversion of qualitative aspects of life into quantified data points.
Inspired by trans studies’ problematizing of the ontological separation of technics and living things, this article unpacks the entanglements of a body (rather than the body), data, and technology. Trans critique invites analysis of forms of difference that mediate all practices, bodies, and identities, and the regulatory norms, technologies, and institutions through which they are produced (Aizura et al., 2020; Beauchamp, 2019: 13; McGlotten, 2016). I use the phrase ‘misfit bodies’ to refer to subject positions and embodied experiences such as my own that exceed or defy normative categories and coordinates that draw the contours of the taken-for-granted abstract body that tends to anchor conversations about data, wearable technologies, algorithms, and datafication. Here, I take inspiration from Rosemarie Garland-Thomson’s (2011: 594) coinage of the critical keyword ‘misfit’ to reframe dominant understandings of disability. Her attention to how built and arranged spaces fit majority bodies and create misfits with minority forms of embodiment prompts consideration of how data, algorithms, and wearable devices are sociomaterial infrastructures that awkwardly fit (most) bodies. Self-tracking metrics, technologies, and measurements, for example, encode presumptions of ability (consider a wheelchair user and pedometers), race (automated technologies that cannot detect darker skin), gender (algorithms that fit bodies into gendered slots), and safety (tracking movement in some geopolitical contexts exposes one to risk) (Beauchamp, 2019; Browne, 2015; Elman, 2018; Meneley, 2019; Noble, 2018).
OTF and other fitness experiences (e.g. Peloton, Soul Cycle, or CrossFit) are often cast as cults whose privileged wealthy adherents glom on to what they are selling with religious devotion and come to proselytize these experiences to others, fortifying a white, able-bodied, capitalist status quo (Dawson, 2017; Hejtmanek, 2020). The purpose of this article is not to redeem OTF technologies or practices by uncovering latent queer potential or trans liberation in fitness wearables or in boutique fitness experiences. Rather, I contend that critically examining embodied engagements with fitness wearables within the datasphere of a specific fitness culture might help us understand emergent bio-technical subjectivities and the norm of living with datafication, data, and wearable technologies. In the next section, I provide an overview of how OTF mobilizes aesthetics and discourses of science, technology, and data to shore up its claims to accuracy and efficacy. In the process, OTF conjures a personalized wearable experience and stabilizes ‘the body’ as a universal substrate amenable to metrical measurement and optimization. Then, I present personal vignettes that excavate queer entanglements that defy the containerizing impulse of the figure/concept of this ‘body’ and unsettle an assumed autonomy or separation between body/data, body/technology, or body/algorithm. Finally, I gesture towards how trans and queer critique and analytics unsettle terms and assumptions that characterize discussions of agency, surveillance, and the body in existing literature on wearables, healthism, and datafication.
‘You Can’t Improve What You Don’t Measure’: Conjuring ‘the Body’
Sweating as I run up an incline on my treadmill, I glance at the screen above the row of numbered treadmills, a checkerboard of multicoloured squares: grey, blue, green, orange, and red. Each contains a name, a continuous calorie count, a number indicating heart rate, and a ‘splat points’ count. Each square (mine labelled CAL_11) represents one of 20 individuals in the room, variously running or power-walking on a treadmill, rowing, or engaging in weight or other floor exercises, guided by a coach in OTF attire, and modulated by the beat of a Top50 soundtrack. My orange-coloured armband (see Figure 1), worn on my bicep just above my right elbow, links my body’s interior timekeeper (heart) to the outside world.

Figure 1. The author’s wearable.
This composite reflects trends documented by media and surveillance studies scholars. The particulars of OTF (colour-coded zones visualized onscreen, the splat points, the OTF-wearable) stand out, even as they merge with other instruments, technologies, and apps that swirl around us in a data-rich world. Many scholars conceptualize our digital selves as Deleuzian dividuals (possessing a ‘real’ and a datafied/shadow self) or data-doubles (Horrocks, 2019), but Tobias Matzner (2016) suggests that derivatives are a better metaphor for the multiplying selves produced by apps (Facebook, Google Fit, Instagram) and institutions (TSA, immigration) that extract valuable portions of ourselves: as I wipe sweat from my brow, my data, too, ‘sweats’ into a vast digital economy (Gregg, 2015: 44–46; Lupton, 2018).
Orange is at the core of the OTF experience and OTF’s science; the overriding goal of those who complete a one-hour OTF session is 12+ minutes in the orange zone. According to OTF marketing materials, this colour-coded science is rooted in the efficacy of a cardiovascular workout that produces results through excess post-exercise oxygen consumption (EPOC). Chief executive officer (CEO) Ellen Latham (2015: 37) explains: ‘This fancy phrase [EPOC] . . . means that your body revs its metabolism so high . . . that you continue to burn extra calories (think of splatting those fat cells) for up to 24–36 hours after your workout’. In ‘push’ (Orange) and ‘all out’ (Red) zones, more oxygen is delivered to cells’ mitochondria (as compared to the less desirable grey, blue, and green zones), a process called mitochondrial biogenesis. The display screen spectacle described above reveals in real time which coloured zone an individual is in by beaming their data from their wearable to the screen. The progressive spectrum of colours to be achieved claims accuracy (colour reflects effort) even as it embeds moralized assumptions. Each minute one spends in the orange (and/or red) heart rate zone earns a splat point towards the goal of 12+ per hour, key to achieving Latham’s (2015: 12) metabolically charged body (‘more capillaries = more oxygen delivered = more powerful cellular energies = more calories burned = more fat cells go SPLAT’). Splat points are object fetishes (the OTF logo itself represents a ‘splat’) not unlike Weight Watchers points recorded into food journals (Heyes, 2006) or Fitbit step counts; my personal archive contains 13,961 splat points to date accumulated over 517 classes.
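The zone-and-splat arithmetic described above can be sketched as follows. Only the orange/red split and the 12-splat goal come from OTF’s materials; the percentage thresholds dividing the five zones are illustrative assumptions, since the company’s actual cut-offs are proprietary.

```python
# Hedged sketch of the splat-point logic described above. The zone
# thresholds (fractions of maximum heart rate) are illustrative
# assumptions, not OTF's proprietary values.

ZONES = [  # (colour, lower bound as fraction of max heart rate)
    ("grey", 0.00),
    ("blue", 0.61),
    ("green", 0.71),
    ("orange", 0.84),
    ("red", 0.92),
]

def zone_for(heart_rate: int, max_hr: int) -> str:
    """Return the colour zone for a single heart-rate reading."""
    frac = heart_rate / max_hr
    current = ZONES[0][0]
    for colour, lower in ZONES:
        if frac >= lower:
            current = colour
    return current

def splat_points(minute_readings: list[int], max_hr: int) -> int:
    """One splat point per minute spent in the orange or red zone."""
    return sum(
        1 for hr in minute_readings
        if zone_for(hr, max_hr) in ("orange", "red")
    )
```

Under these assumed thresholds, a member with a 190 bpm maximum who spends 15 of 60 minutes above 84% of that maximum would earn 15 splat points, clearing the 12-splat goal.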
Measurability and the conversion of bodily processes into data are cornerstones of contemporary fitness cultures. As Latham (2015: 13) puts it in her self-published book, PUSH: A Guide to Living an All Out Life, ‘You can’t improve what you don’t measure . . . it’s not just that you feel like you’re becoming metabolically charged – it’s not just that you look like you’re getting fitter – you’ve got the data to prove it’. (In asserting that fitter bodies manifest visibly, Latham rehearses incipient fatphobic logics that inaccurately assume fat to be unhealthy. OTF’s logo, a fat cell going ‘splat’, also embeds anti-fat sentiments.) The science-backed experience hinges on OTF’s proprietary heart rate monitor wearable, which employs an algorithm that captures heartbeats and converts them into the metrics accessible to members in the personal app and on display screens. OTF mobilizes discourse anchored in the apparently objective qualities of data and technology to make claims to efficacy, notwithstanding scholarship in exercise science and physiology journals that finds fitness wearables and algorithms are not as accurate as they claim.
In OTF’s marketing materials and social media presence, the accuracy of its wearable technologies and algorithms is a central theme, shoring up the broader assumption that better data make better selves (Crawford et al., 2015: 489). On OTF’s website, clicking the ‘Why it works’ tab takes you to another ‘Science’ tab. The centrepiece of this page is a video titled ‘What makes Orangetheory work? Science’.2 Against a backdrop of music that builds to a crescendo, three individuals with credentialed letters following their names are profiled (Ellen Latham, M.S., Creator/Co-founder; Joel French, PhD, Senior Director of Research; and Michael Piermarini, M.S., Director of Fitness). The letters signal mastery of scientific knowledge and lend credibility to the speakers, who use metaphors to make science accessible to viewers. Latham says, ‘We call [OTF] the multivitamin of metabolic training’, and French explains, ‘We push you during class where your body is essentially writing a check that it can’t cash’.3 A graph – whose x-axis tracks the one-hour workout and whose y-axis tracks metabolic rate – is populated in real time with two lines that perform a choreographed aesthetics of expertise. An orange squiggly line, representing enhanced caloric afterburn from elevated metabolism (labelled ‘OTF’), takes on active agency relative to the flat one beneath it (labelled ‘typical workout’).
OTF plugs the baseline science of its workouts (metabolic and caloric afterburn), while also presenting itself as responsive to new scientific trends and inputs from members. For example, OTF’s central Instagram account posted a message against a bright orange background, reading, ‘Each of our hearts beats at its own speed and efficiency’. The @orangetheory account manager writes, ‘At Orangetheory, no two workouts are the same and no two hearts beat the same. We’re making our heartrate formula even more personalized so you can get the most out of every workout’ (29 May 2019). A link in the post carries me to an article on OTF’s website, titled ‘Do you “fit” the heartrate equation?’ (Barker, 2019). The article describes a new formula for calculating heart rates (rolled out in May 2019) that uses data from an individual’s past 20 workouts to estimate new zones: ‘The zones being set are yours and yours alone’. It also provides the reader with some background on heart rate zone science, describing the traditional age-based formula (framed by the article as outdated) for calculating maximum heart rate. This briefly recounted history rhetorically casts doubt on ‘one-size-fits-all’ age-based equations and prefigures a better metric, buttressing an autonomous subject (‘yours and yours alone’), and opening space for an individual’s enhanced self-understanding via personalized wearable technology fitted to their body’s interiors.
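The contrast the article stages can be sketched in code. The ‘220 minus age’ equation below is the conventional age-based formula that such accounts typically frame as outdated; OTF’s actual 2019 personalized calculation from a member’s past 20 workouts has not been published, so the rolling average of recent workout peaks is only a hypothetical stand-in.

```python
# Sketch contrasting the conventional age-based maximum heart rate
# formula with a hypothetical personalized estimate. OTF's real
# formula is proprietary; the rolling-peak average below is an
# illustrative assumption, not the company's method.

def age_based_max_hr(age: int) -> int:
    """The 'one-size-fits-all' estimate: 220 minus age."""
    return 220 - age

def personalized_max_hr(workout_peaks: list[int]) -> float:
    """Hypothetical estimate from the peaks of the last 20 workouts."""
    recent = workout_peaks[-20:]
    return sum(recent) / len(recent)
```

On this sketch, a 40-year-old whose recent workout peaks hover around 188 bpm would have zones anchored to 188, not to the generic 220 − 40 = 180.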
Becoming ‘the Body’: Dataveillance as Queer Pleasure
OTF members are taught and expected (by coaches, fellow members, and OTF media) to crave the orange zone, an aesthetic and physical sensation and pleasure that operates by casting bodies as transparent and allowing members to ‘see’ under the skin and ‘feel’ their data (Lupton, 2017; Sherman, 2016). When I am on the treadmill or rower, I cannot help but glance up at the screen to see how my zone compares to the others in my class. I observe that men seem to burn more calories than women and take great satisfaction on days when my calorie count is more ‘male’ (even as I doubt the accuracy of the counts and resist the male/female binary). I have come to understand my rhythms and bodily states through the colours displayed on the screen: on days when my square remains red for most of class, I am affirmed in my felt sense of a poor night of sleep or fear that I am likely to have a bad day. So, too, do I feel embarrassed if I’m ‘in the red’ for too long, as it seems to indicate my inability to achieve what coaches incessantly remind us evidences good fitness: getting one’s heart rate back to normal quickly after a period of exertion (such as a sprint on the treadmill or rower). In addition to prompting self-reflection and facilitating the data-stories I tell myself, my wearable’s visual externalization of bodily data, paradoxically given the public display of my name and associated metrics, provides a sense of relief and anonymity. For bodies such as mine that internalize their abnormality or monstrosity – bodies that are often subjected to surplus scrutiny by those who seek to authoritatively classify them as this or that – the public projection of personal metrics and the magnetic hold the colourful display screen has on those who admire it provide brief respite from curious stares; athletic spaces tend to expose bodies and their characteristics to public commentary and inspection.
The numbers and colours on the screen are a welcome distraction from my ‘misfit’ physical body, with its felt imperfections: numbers reference rather than represent the body and my fellow members’ gazes are fixed upon the screens rather than me. In other words, a specific, awkward body and its slippery movement between social categories such as male and female projects, momentarily, as just ‘the body’. Abstraction, reduction of complexity, and becoming a data point are in this case felt as a source of comfort, making the fitness wearable into a kind of prosthesis of selfhood (Lupton, 2016: 70) that enables a particular user to see themselves, momentarily, as something other than a misfit body. While collection of such data and their display on a public screen seem creepy and raise privacy concerns for friends I have introduced to OTF classes, they nonetheless may present misfit bodies such as mine with the possibility of hiding in plain sight, making dataveillance into a queer pleasure: these are the ambivalences of datafication that come to light when we think from a body rather than ‘the body’. While OTF presents its science and wearable technologies as accurate and personalized for each of its over one million members, the next section will thus foreground how diverse members – some who are likewise ‘misfit bodies’ – themselves engage with, make sense of, and improvise with data, algorithms, and wearables in ways that presume they are never perfect, finished, or complete (Pink et al., 2018: 5).
Embodied Metrical Tinkering, Citizen Science, and Splat Imaginaries
As Figure 1 depicts, my OTF wearable – now over 5 years old – is worn out. Its orange colour is faded, its band is stretched out from so much use, and the Velcro sometimes does not stick very well, prompting the wearable to misread my heart rate if it gets too loose. The grey middle piece that sits against my skin and collects my metrics is scratched. I have to charge it more often than I used to because it does not hold its charge very well. My wearable is outdated, a relic of OTF material culture. Most of my classmates wear the newer version of OTF’s wearable (described as a ‘life changing monitor’ in marketing materials), which is a smaller, sleek black device. I have considered updating my wearable but cannot seem to part with the one I have grown fond of; it knows me. To part with my trusty wearable would seem to be abandoning a part of myself and discarding an archive of my workouts and data, even if that archive is, in actuality, stored on my smartphone. The worn, dated wearable is a reminder of how long I have been doing OTF, a badge of pride. Material and affective characteristics of my wearable facilitate my connection to it and my data. As I show below, they also mediate how OTF members like myself understand its algorithm and tinker with their own metrics. Too often, discussions of algorithms and data presume them to be abstract, ungraspable, mysterious, and free-floating things, overlooking the ways that they are made, used, imagined, and assessed in embodied and material encounters between bodies and technologies (Thomas et al., 2018).
OTF’s decision to roll out a new ‘cutting-edge’ algorithm in 2019 was debated on a public subreddit of OTF members and enthusiasts for months ahead of this announcement. The subreddit might be considered a space of distributed public science that hybridizes embodied knowledge with evidence drawn from scientific papers and corporate marketing materials consulted by redditors. This subreddit has multiple purposes. Based on posts and commentary, such as excitement that the workout template is available before a person goes to bed, it is evident that some North America-based users (including me) consult the subreddit before a workout; members in Australia or New Zealand typically post the workouts after completing them in their time zone and commenters provide their opinions on the quality, difficulty, and their enjoyment (or not) of daily workouts. Moderators post surveys where interested OTF members can report their stats for various rowing, running, or other challenges that take place every few weeks; submitted results are curated into an Excel sheet of data where one can compare their own performance against others’ data, organized by age, gender, weight, height, and geographic location variables. The subreddit also generates voluminous commentary regarding splat points, heart rate algorithm(s), wearables, weight loss, and performance data: it acts as an important site where data become a means of relating to and learning from others as people assign meanings to technologies and data that far exceed those imagined by their designers or developers. In one post, for example, a self-identified older redditor who compares himself to Jackson Pollock (an artist known for his drip technique that resembles ‘splats’) attributes his unusually high collection of splat points to the algorithm: He comments that he is well aware that his splat count is due to the ‘grading on the age curve embedded in the proprietary algorithm that is used’.
Fellow redditors endorse his theory and suggest that splat metrics may work on average but not for everyone. Some mention they have no faith in the algorithm and rely on ‘perceived exertion’ as a better performance metric. One poster writes: ‘Everyone (OTF, doctors) uses an algorithm . . . some people fit the algorithm perfectly, and then there are outliers. Those people that don’t fit the bell curve (like me)’. This comment generates enthusiastic responses; one redditor notes that they can’t help feeling frustrated by the algorithm even though they know it is ‘not exact science’; they ‘assume [they are] still getting stronger and improving even though [they] don’t get to see it in [their] spreadsheets and graphs’.
In other cases, however, redditors use embodied knowledge to shore up the accuracy of the algorithm: ‘I’ve become very familiar with the feeling when I’ve just entered the orange zone. I describe it as my heart fluttering for a moment as I get that uncomfortable feeling . . . I think their algorithm is spot on because it is always that same feeling corroborating the moment I get into [the] orange [zone]’. These tidbits, representative of redditors’ algorithm-centred conversations, confound assumptions that wearable technologies prompt us to ‘understand . . . the body’s activity not in terms of the aching of one’s legs or the amount of sweat issuing from one’s pores, but in the numbers collected’ (Gilmore, 2016: 2534). They align with Nick Seaver’s (2017) insight that algorithms are unstable objects that emerge through multiple processes people use to engage them; splat metrics generated by OTF-wearables are not predictable, stable, or consistent, but emerge in queer and non-linear fashion in interactions with diverse body-minds4 (Nafus and Sherman, 2014). As Neff and Nafus (2016: 69) put it, ‘to get something out of data, many people . . . have to tinker to make data suit their own contexts and purposes’ (see also Lupton, 2020b: 62–63).
Whereas some argue that bodily intuition is outsourced to ‘unbodied data’, leading people to distrust sensing or feeling as ways of knowing the body (Smith and Vonthethoff, 2017), I rely on embodied epistemologies and methodologies (‘feeling’) not only to make meaning from visual data, but to critically reflect on what does or does not constitute good data, and to blur distinct lines between bodies and data: they cannot be distinguished as individual objects that pre-exist their intra-actions (Barad, 2007). Wearables and the splat metrics they make visible may thus enhance, rather than displace, embodied meaning-making practices (Ruckenstein, 2014). Looking at OTF’s checkerboard spectacle of colourful squares, self-in-others, is mesmerizing; while I cannot physically touch the screen, my ability to alter the incline/speed of my treadmill (and thus tinker with on-screen colours reflecting exertion) provides a haptic queer pleasure of intimate surveillance that entangles voyeurism and seeing beneath the skin. In the lobby, digital displays depict members’ resting heart rates while they wait for class to begin; at studios I have attended, patrons posit explanations for these social rhythms of the heart (menstruation, stress, hangover, a sleepless night, being in close proximity to a crush in class) (Pantzar et al., 2017). Having personal data at one’s fingertips, then, may amplify bodily sensations and meanings and prompt conversion of data into stories, quanta into qualia. Bodies, data, and algorithms are perpetually caught up in non-linear becomings. The body itself is a kind of algorithm attuned to the world, one that processes inputs, adjusting for errors or wrong predictions, and improving future responses. The subreddit posts above that consider the algorithm’s validity emphasize this point: ‘the algorithm’ is not an autonomous thing to be embraced or rejected, endorsed or questioned.
For instance, I read my embodied ‘feeling’ or sensorial data alongside the data I see during and after class. This critical engagement entails drawing on a personal embodied archive of experiences and self-knowledge that falls outside datafied representations of performance or heart rate (according to posts on the subreddit, these include: medications taken, stress levels, health conditions, recent surgeries, or pregnancies). The exchange and looping back between data and bodies or algorithms and bodies illuminates how people, environments, objects, classificatory categories, social relations, and value are made and remade in queerly unfolding transactions (Mauss, 2016; Strathern, 1988).
I posit that it was collective frustration with OTF’s algorithm, voiced on the public subreddit, that, at least partly, prompted OTF to dramatically roll out a well-advertised personalized heart rate metric as a corrective to what was framed in extensive discussions on the subreddit as a problem. OTF’s research director indicted members for ‘judging results by the numbers’ and losing sight of more qualitative signs of improvement (clothes fit, lower blood pressure, feeling better). ‘Rather than question the technology’, he suggests, people ‘figure this is just another thing that doesn’t work for their body, and they quit. But now, with the new calculations, there’s especially no reason to question the technology’. Rather than recognizing multifaceted embodied ways of knowing employed by those who evaluated the algorithm, he fortifies ‘data’ themselves, suggesting that new numbers are more accurate, sure to please even ‘number junkies’; he mimics data science’s recursive resilience in the face of error.
Media scholars take interest in the potential of sousveillance to undermine surveillance (Mann et al., 2003; Newell, 2020), but attending closely to how people talk about and learn from their data shifts attention away from questions of how to resist or evade dataveillance and its instruments of subjection (Ball et al., 2016: 75). Markets, embodied knowledge, pleasure, and complaint loop into one another. This does not conclusively prove cause and effect (i.e. that redditors prompted OTF to devise a better heart rate metric); instead, it captures how data economies and affective economies, and scientific and sensorial ways of knowing, are entangled in feedback loops. The OTF subreddit curates distributed expertise and collective tinkering with technologies and metrics, tinkerings that will continue long after rollouts of new algorithms or technologies; they are in fact constitutive of these things. As Boellstorff (2013: 104) suggests, queer potential for resistance to surveillance, ‘lies in the dialectic [of surveillance and recognition], not through some pure, imagined oppositional stance or project’ (see also Poirier, 2017). Algorithms and digital technologies that embed them are framed as revolutionary and innovative, always on paths of improvement towards perfection. Scholars have convincingly argued, though, that it is moments of breakdown, noise, glitch, and failure that constitute, rather than threaten, their usability and function (Jackson, 2013; Tanweer, 2016). Yet, this focus on breakdown and failure risks overlooking the affective functions of technologies and data. The function of my wearable far exceeds the pragmatic axes of its reliability, accuracy, or performance. It encompasses as well the pleasure and joy that emerge from my embodied and sensorial encounters with it. 
My interactions with OTF data and technology – whether or not they ‘fail’ or succeed or are accurate or not – affirm, for me, the open-endedness of my body and the possibility of its transformation(s). While my wearable and the datafication of my bodily processes facilitate healthist ideology, so, too, do they challenge some of the oppositions that structure our worlds: male/female, nature/culture, body/technology.
‘We All Sweat the Same’: When ‘the Body’ Doesn’t Fit
‘You feel welcomed walking into any studio. No matter your race, age, skin color, sexuality, income. It doesn’t matter here.’
This Instagram post (11 June 2019) captures OTF’s attempts to carve a niche: Latham (2015: 14) notes that what makes Orangetheory different from other fitness experiences is that ‘It’s for everyone’. She describes how she integrates options into workout formats: ‘If you want to walk, you walk. If you want to jog, you jog. If you want to run, you run’. The choice as to which activities one does is completely one’s own, a narration that presumes an agentive, autonomous, and self-possessed actor with full control over ‘their’ body. The body itself is taken as a universal given, even if its capacities to handle exertion differ. Data that appear on studio screens, metrics extracted from processes happening beneath one’s skin, are modulated by one’s choice to exert oneself or to take it easy (what some members term ‘taking a Green Day’).
The emphasis on choice enacted by a human actor ‘naked’ of differences imagines the studio as a suspended ritual space, marked by loud music and orange light where initiates might be stripped of social personhood and markers that locate them in sociopolitical hierarchies (Sassatelli, 1999; cf. Crossley, 2004: 55; Turner, 1967): ‘We all sweat the same’, OTF suggests. Yet, as this section will demonstrate, the supposed universality of the human substrate – the material to be tracked and measured (glossed as ‘the body’) by wearable technology – is betrayed by axes such as gender, ability, neurotypicality, race, and age that constitute body-minds. Sylvia Wynter (2003) has taught us that the exclusion of processes such as racialization from theorizations of the category Human has furthered the mythology that crafts universal Man in the image of white cis-heterosexual maleness and has naturalized the unmarked substrate from which algorithms and technologies that measure humans spring (Benjamin, 2019). OTF’s splat metrics are products of this mythology, and fitness wearable devices came about through efforts, undergirded by carceral logics, to subjugate, monitor, and maximize exploitation of marginalized populations. Surveillance is not a new process tied to digital technologies, but an ongoing mechanism for control and sorting of bodies by raced, gendered, classed, and other differences (Beauchamp, 2019; Benjamin, 2019; Beutin, 2017; Browne, 2015; Wernimont, 2019).
The moment you enter an OTF studio, you are classified into male or female categories that filter, sort, and measure your datafied (and ‘real’) selves. Your data and qualitative experience are prefigured by the sociotechnical system architected around your categorical home (app, algorithm, in-studio rhetoric, expectations of performance, and pronouns). During the floor portion of a workout (involving free weights and rowers), an OTF coach models the template, verbally cuing dumbbell selections for exercises. Suggestions often target two genders, and the feminine token always carries less weight: ‘Ladies – 10 pounds, men – 20’.5 These directives uphold a binary system that assumes clean correspondence between gender category and weight-bearing capacity, and between gender category and body-mind, and their daily repetition creates a milieu in which other gender partitionings are naturalized, as Schrijnder et al. (2020) and Nash (2018) document in the case of CrossFit. Coaches and members uphold gendered norms in the jokes they make and in in-studio prizes, such as T-shirts that come in tight v-neck or baggy crew varieties that assume gendered preferences, or raffle prizes such as spa certificates and golf passes (given to female and male members, respectively). Publicly posted leaderboards for benchmark challenges such as 1-mile sprints are divided into ‘male’ and ‘female’.
Even as users of wearables are told little about the cultural and scientific assumptions that undergird notions of the normal user, OTF patrons themselves craft their own explanations of felt normalization; they are, for example, well aware of being gendered by algorithms in the studio. A married couple at my former studio attended class at different times, so when the woman’s wearable stopped functioning, she borrowed her husband’s instead of purchasing a replacement. She quickly observed that she burned significantly more calories and earned more splat points with less effort than when using her own. She hypothesized that the device is ostensibly prefigured to ‘cook’ data according to the recipe ‘male’ or ‘female’ entered into OTF’s system (which members can alter using their personal app). Gender is assumed coterminous with biological difference, locating difference within the body’s processes (heartbeat, blood flow, muscle fibre length). Gender, in fitness cultures, has long been imagined firmly in biology’s realm (Besnier et al., 2018; Karkazis and Carpenter, 2018), and this legacy manifests in anxious debates about how to classify bodies that defy hormonal and phenotypic thresholds of male and female, categories caught up in racialization (Anderson and Travers, 2017; Cavanagh and Sykes, 2006; Jones, 2021; Nyong’o, 2010; Pape, 2019).
Discourses of trans and gender non-conforming bodies ‘gaming’ the system by winning unfairly (such as we saw in the recent case of college swimmer Lia Thomas) assign universal assumptions to those who participate in sport, generating suspicions of trans bodies as inauthentic or conniving. Yet, their presumed dirtying of performance data and expectations pre-slotted into a sex-gender binary illustrates how gendered bodies are copies of copies, imperfectly citing and re-citing ‘data’ (in the form of categories) that swirl around them. Trans is one among many identitarian categories that presume an always emerging self, rather than one that is known, static, on a linear path of development, or complete: Karen Barad (2015: 411) suggests, ‘trans is not a matter of changing in time, from this to that, but an undoing of “this” and “that”’. Some of the genderqueer and trans OTF members I know are ambivalent about technologies of gender that encroach upon them, electing to identify their gender in the system as they see fit (male or female), perhaps confounding algorithmic sorting mechanisms and, potentially, ‘dirtying’ data that presume biological difference.6 Yet, to focus solely on hormones as molecules that corrupt streams of bodily data overlooks how data are filtered through leaky conceptual sieves and always misfit that which they reference. A body named female with ‘too much testosterone’ is as impure as a body managed by anxiety medications, or one that has absorbed environmental toxins, is under stress, or is outfitted with an insulin pump or pacemaker. Redditors frequently hypothesize about how such impurities affect algorithmic outputs, as in a post where the poster expresses dismay about their inability to collect splat points. They and their interlocutors come to the consensus that anti-anxiety medications reduce the number of splat points one can collect.
Redditors regularly assert that pacemakers, heart medication, thyroid medication, ADHD medication, birth control pills, alcohol, insulin, marijuana, and anti-depressants affect splat metrics.
The tacit binary gender recipe that guides algorithmic production, and the symbolic context that naturalizes it, obscures how gendered assumptions prefigure ‘raw’ data (Gitelman, 2013). This observation intervenes in the data/body divide, which is foundational to the notion that data are encroaching on bodies, making them subservient, enhancing them, or dulling their sensorial mechanisms. If we do not take the ‘body’ as a given, however, data’s externalized autonomy from bodies melts into air. (Big) Data – in their effort to count, measure, and monitor bodies – do not so much newly surveill or normalize bodies previously ‘safe’ from dataveillance or naked of technologies as entangle body-minds in yet another citational matrix, another technology of gender (Beauchamp, 2019; Cifor and Garcia, 2019; de Lauretis, 1987; Gieseking, 2018: 152; Sundén, 2015; see also Note 9). While critical discussions of data often treat as novel the fragmentation of selves into derivatives that furthers projects of normalization and control, generations of queer, racialized, disabled, and chronically ill people are familiar with redistributions or multiplications of the self that defy Cartesian dualism or binary conceptualizations of private and public, male and female. The common-sense separation of body and data ‘underwrites the notion of an integral human body, according to which the incorporation of technology is a fall from the original wholeness of birth’ (Gill-Peterson, 2014: 405), obscuring how wearables and data are yet another participant in bodies’ partial emergences, caught up in larger techno-economies. Jasbir Puar (2017: 46–49) suggests that transpeople engage in ‘piecing’ – body modification through technology – to craft a body that can ‘pass as trans’, making the trans body available to neoliberalism’s machinations.
While piecing helpfully draws attention to how all bodies are cobbled together, her account inadvertently assumes a ‘given’ body that is a kind of potato head substrate to which things are added or from which they are deleted. This image’s materialist bent resonates with popular renderings of data as external to, separable from, or enhancements of otherwise ‘pure’ and universal bodies, and casts trans bodies’ pieces as additive or artificial rather than of-the-body. So, too, does parsing wearable technologies from bodies risk overlooking how they are entangled in mutual bio-technic becomings and transformations.
Beyond ‘the Body’
How we imagine ‘the body’ in relation to data or wearable technologies has implications for how we theorize dataveillance such as that within OTF’s datasphere. While we should not resign ourselves to letting our data be digitally vacuumed up by corporations, it is crucial that we rethink key terms of these debates, namely data ownership, consent, and privacy, rooted in assumptions of ‘the (liberal subject’s) body’ (Epstein, 2016; Lupton, 2020a; Matzner, 2019). Consent, for example, is the linchpin of Western liberal personhood: it guards our interiors against intrusion and protects society’s vulnerable members. Limited imaginaries of agency restrict our ability to think consent outside an idealized dyad of giving consent (say, signing privacy policies), the authenticity of which is commonly verified through the signature, an imprimatur that performatively calls into being a single, rational, and willful self. Bodies are produced in this dyad as objectified, impersonal, and homogeneous categories. Yet, to attempt to escape the reach of surveillance capitalism by refusing to sign a piece of paper relies on a fiction that selves are pure and whole entities whose data-sweat we can contain within our bodies (Sanders, 2017: 55; Schüll, 2017). This indeterminacy, the leakiness of data-sweat beyond the body’s bounds, rather than solely funnelling data off to ‘them’ for nefarious uses, embeds, as well, ‘thoughtful interconnection and the perverse proliferation of pleasures and expressions’ (Haber, 2016: 161). In data economies, as elsewhere, the fiction of a ‘whole’ body-mind is exposed by the ways that datafied portions of the consenting ‘I’ have already been stolen away. Some people refuse to consent (say, to sharing medical data with third-parties) at their embodied peril: Type 1 diabetics who will not consent to clinic privacy policies (which compel sharing of personal data) are unable to access lifesaving care, for instance.
Normative theorizations of privacy and ethics tend not to account for the small voyeuristic pleasures or the pragmatic realities of give-and-take and overlook the tinkering, hacks, and becomings that the surveilling logics of datafication enable. Trans, disabled, chronically ill, and queer people have employed such hacks and enlisted technologies into their self-making and other projects for generations.
Thinking data and wearables through misfit bodies reveals a ‘disjunction between the “felt sense” of the body and the body’s corporeal contours’ (Salamon, 2010: 2). Bodies are not necessarily contiguous with externally perceived physical bodies, nor can we assume boundaries between human/nonhuman, data/bodies, or social/natural. Centring the felt sense of body-minds long conceptually excluded from dominant theorizations in body studies defies presumptions about any subject’s relationship to the materiality of their body. Data reaffirm ‘the order they have already prefabricated’ (Moore and Robinson, 2016: 2783), yet this account has shown that fitness wearables are objects that do much more than just siphon data to corporations, normalize healthism, or track heartbeats. These technologies and the data practices they engender speak to both the ‘profoundly exposed nature of contemporary bodies’ and ‘unanticipated and surprising potentialities for embodiment’ (French and Smith, 2016: 4). Centring marginal, socially marked, misfit, and non-normative bodies in data discussions reminds us that the proliferation of digital data and wearable technologies is not just grounds for critique. It is an opportunity to rethink ontologies of the imagined pure and stable categories (e.g. race, gender, sexuality, disability) at the core of bodily metrics and technologies, and to draw attention to how such axes of difference are neutralized by the fiction of ‘the body’ as it operates in popular and critical discussions of data and bodies, and in the design of algorithms and data-technologies.
